US9666166B2 - Information processing apparatus, method, and recording medium - Google Patents
- Publication number: US9666166B2 (application No. US13/308,678)
- Authority: US (United States)
- Prior art keywords: update, section, association, sections, information
- Prior art date
- Legal status: Active, expires (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS; G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS; G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION (common to all classifications below)
- G09G5/14—Display of multiple viewports (under G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators)
- G09G2310/04—Partial updating of the display screen (under G09G2310/00—Command of the display device)
- G09G2320/0223—Compensation for problems related to R-C delay and attenuation in electrodes of matrix panels, e.g. in gate electrodes or on-substrate video signal electrodes (under G09G2320/00—Control of display operating conditions; G09G2320/02—Improving the quality of display appearance)
- G09G2320/103—Detection of image changes, e.g. determination of an index representative of the image change (under G09G2320/10—Special adaptations of display systems for operation with variable images)
- G09G2330/021—Power management, e.g. power saving (under G09G2330/00—Aspects of power supply; aspects of display protection and defect management; G09G2330/02—Details of power systems and of start or stop of display operation)
- G09G2340/0407—Resolution change, inclusive of the use of different resolutions for different screen areas (under G09G2340/00—Aspects of display data processing; G09G2340/04—Changes in size, position or resolution of an image)
- G09G2340/0435—Change or adaptation of the frame rate of the video stream
- G09G2350/00—Solving problems of bandwidth in display systems
- G09G2370/20—Details of the management of multiple sources of image data (under G09G2370/00—Aspects of data communication)
Definitions
- the embodiments discussed herein are related to an information processing apparatus, a method, and a recording medium.
- in a thin client system, the client behaves as if it were playing the major roles of executing the processing and/or holding the data, although the server actually does so.
- One example of a method for task execution in such a client system is a method in which a server apparatus executes applications for document creation tasks, mail management tasks, and so on and a client apparatus displays results of the processing of the applications.
- however, when a large-load task, such as a CAD creation task or a moving-image-handling task, is executed, the amount of information transferred from the server apparatus to the client apparatus increases, which may delay responses to operations executed by the client apparatus.
- one example of technologies for improving the response speed is a technology in which a display screen is divided into multiple blocks, a block in which the update frequency is high is detected from the blocks, and an image in the detected block is determined as a moving image and is read and transferred with high priority (e.g., Japanese Laid-open Patent Publication No. 11-275532).
- an information processing apparatus includes: a storage unit that stores an image to be transmitted; an update-frequency setter that sets, for respective sections set in the image to be transmitted, update frequencies of the images stored for the sections in a predetermined period of time; an association-degree setter that sets association degrees indicating degrees of association between the sections based on the update frequencies; a priority setter that identifies the section on which an operation is performed and sets a higher priority for the identified section and the section having the highest degree of association with the identified section than the priorities for the other sections; and a transmitter that transmits the image stored in the storage unit, with the images for the sections whose set priority is higher transmitted first.
- FIG. 1 is a diagram of an overview of an information processing apparatus according to a first embodiment
- FIG. 2 is a diagram of an information processing system according to a second embodiment
- FIG. 3 is a block diagram illustrating functions of a server apparatus and a client apparatus in the second embodiment
- FIG. 4 illustrates how the screen to be displayed on a display device is divided
- FIGS. 5A to 5C illustrate how the frequencies of updates on the screen to be displayed on the display device are determined
- FIG. 6 illustrates how a coupled block group is corrected
- FIG. 7 illustrates how frequent-update region candidates are combined
- FIGS. 8A to 8C illustrate region-position identification information
- FIG. 9 illustrates associations between operation tool sections and data rendering sections
- FIG. 10 illustrates a cursor-position management table created by an association-degree setter
- FIG. 11 is a flowchart of processing performed by the server apparatus
- FIG. 12 is a flowchart illustrating operation-section detection processing
- FIG. 13 is a flowchart illustrating channel-band detection processing
- FIG. 14 is a flowchart illustrating window-edge detection processing
- FIG. 15 is a flowchart illustrating update-frequency calculation processing
- FIG. 16 is a flowchart illustrating association-degree setting processing
- FIG. 17 is a flowchart illustrating priority setting processing
- FIG. 18 illustrates a specific example of association-degree setting processing
- FIG. 19 is a flowchart illustrating first window-edge detection processing in a third embodiment
- FIG. 20 is a flowchart illustrating second window-edge detection processing in the third embodiment
- FIG. 21 is a block diagram illustrating a server apparatus in a fourth embodiment
- FIG. 22 illustrates information held by an operation-state detector
- FIG. 23 is a flowchart illustrating priority setting processing in the fourth embodiment
- FIG. 24 is a block diagram illustrating a server apparatus in a fifth embodiment
- FIG. 25 illustrates history information held by an operation-information history accumulator
- FIG. 26 is a flowchart illustrating association-degree setting processing in the fifth embodiment
- FIG. 27 illustrates a specific example of the association-degree setting processing in the fifth embodiment
- FIG. 28 is a block diagram illustrating a server apparatus in a sixth embodiment
- FIG. 29 illustrates an operation of an association-degree setter in the sixth embodiment.
- FIG. 30 is a flowchart illustrating association-degree setting processing in the sixth embodiment.
- applications for CAD and so on may display a set of multiple windows on a single monitor.
- the windows displayed include, for example, a window having a relatively high update frequency (e.g., a window in which data is rendered) and a window having a relatively low update frequency (e.g., a window in which operation tools are displayed).
- a time lag may occur in movement of the mouse cursor although the moving-image data is quickly updated.
- Delay in transmission of a desired image occurs not only in cases in which still images and moving images are handled, but is also common to cases in which the amount of information transmitted between a client apparatus and a server apparatus in a thin client system increases during screen update.
- FIG. 1 is a diagram of an overview of an information processing apparatus according to a first embodiment.
- An information processing apparatus 1 (e.g., a computer) according to an embodiment has a function for transmitting image data to a client apparatus 2 in connection with a request for displaying an image on a display unit 2 a of the client apparatus 2 .
- the information processing apparatus 1 includes a storage unit 1 a , an update-frequency setter 1 b , an association-degree setter 1 c , a priority setter 1 d , and a transmitter 1 e.
- the storage unit 1 a stores an image to be transmitted.
- the update-frequency setter 1 b sets, for respective sections set in the image to be transmitted, update frequencies of the images stored for the sections in a predetermined period of time.
- the information processing apparatus 1 further includes a section setter 1 f for setting the sections.
- the section setter 1 f detects edges of each window displayed on the display unit 2 a .
- the section setter 1 f detects the frames of the windows a to i as edges.
- the section setter 1 f sets the sections on a screen displayed on the display unit 2 a .
- the section setter 1 f divides the screen into sections by using the detected edges.
- the section setter 1 f then deletes the divided sections having sizes that are smaller than or equal to a certain size and the sections existing inside other sections.
- the section setter 1 f sets the regions in the edges of the windows a to i as sections.
- the section setter 1 f then deletes the sections in the edges of the windows e and f, since their sizes are smaller than or equal to the certain size.
- the section setter 1 f also deletes the sections in the edges of the windows g, h, and i, since they exist inside the section of the window c.
- sections a 1 to d 1 of the windows a to d remain.
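- as a rough illustration of the filtering described above, the following sketch (Python; all names and size thresholds are hypothetical, not taken from the patent) keeps only the sections that exceed a minimum size and are not contained inside another section:

```python
from typing import List, Tuple

# A section is (x, y, width, height); names and thresholds are illustrative only.
Section = Tuple[int, int, int, int]

def contains(outer: Section, inner: Section) -> bool:
    """Return True if 'inner' lies completely inside 'outer'."""
    ox, oy, ow, oh = outer
    ix, iy, iw, ih = inner
    return ox <= ix and oy <= iy and ix + iw <= ox + ow and iy + ih <= oy + oh

def filter_sections(sections: List[Section], min_w: int = 100, min_h: int = 100) -> List[Section]:
    """Drop sections smaller than a certain size and sections nested inside other sections."""
    large = [s for s in sections if s[2] >= min_w and s[3] >= min_h]
    return [s for s in large
            if not any(other is not s and contains(other, s) for other in large)]

# Example roughly matching FIG. 1: windows e/f are small, g/h/i sit inside window c.
windows = {"a": (0, 0, 400, 300), "b": (0, 300, 400, 300), "c": (400, 0, 600, 600),
           "d": (1000, 0, 280, 600), "e": (50, 50, 40, 20), "f": (60, 350, 40, 20),
           "g": (420, 20, 150, 150), "h": (600, 20, 150, 150), "i": (780, 20, 150, 150)}
print(filter_sections(list(windows.values())))  # only the sections of windows a to d remain
```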
- the update-frequency setter 1 b sets screen update frequencies for the sections a 1 to d 1 , respectively.
- the update frequencies may be determined based on the amounts of data updated for the respective sections for each amount of time. More specifically, the update-frequency setter 1 b compares given frames of an image displayed on the display unit 2 a with each other and sets an update region in which the number of updates in the image is larger than or equal to a certain value. The update-frequency setter 1 b then superimposes the update region on the corresponding section(s) to thereby make it possible to set a screen update frequency for each section.
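- a minimal sketch of this idea, assuming a toy 2-D pixel-grid representation of the frames (nothing here is the patent's actual implementation): compare two frames pixel by pixel and attribute each changed pixel to the section that covers it.

```python
from typing import Dict, List, Tuple

Section = Tuple[int, int, int, int]   # (x, y, width, height)
Frame = List[List[int]]               # toy 2-D grid of pixel values

def update_counts_per_section(prev: Frame, curr: Frame,
                              sections: Dict[str, Section]) -> Dict[str, int]:
    """Count, for each section, how many pixels changed between two frames."""
    counts = {name: 0 for name in sections}
    for y, (row_prev, row_curr) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_prev, row_curr)):
            if p == c:
                continue
            for name, (sx, sy, sw, sh) in sections.items():
                if sx <= x < sx + sw and sy <= y < sy + sh:
                    counts[name] += 1
                    break
    return counts

# Summing these counts over all frame pairs observed in a predetermined period of time
# gives the per-section update frequencies referred to in the text.
```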
- the association-degree setter 1 c sets association degrees indicating the degrees of association between the sections.
- the association-degree setter 1 c refers to the update frequencies to identify the section having a highest update frequency.
- the association-degree setter 1 c further sets a high association degree for the section where an update is occurring simultaneously with the identified section.
- a combination of the section a 1 and the section c 1 is assumed to have a higher association degree than combinations of the other sections.
- the priority setter 1 d identifies a section on which an operation is being performed and sets a higher priority for both the identified section and the section that is the most highly associated with the identified section than the priorities for the other sections.
- the section in which a cursor (e.g., a mouse cursor) 2 b is located and the section that is the most highly associated with the section in which the cursor 2 b is located are set as a combination of the sections having the highest association degree.
- the transmitter 1 e transmits the image, stored in the storage unit 1 a , in sequence with the images for the sections whose set priority is higher first.
- FIG. 1 illustrates image data A to D to be transmitted by the transmitter 1 e .
- the image data A is data to be displayed in the section a 1
- the image data B is data to be displayed in the section b 1
- the image data C is data to be displayed in the section c 1
- the image data D is data to be displayed in the section d 1 .
- the transmitter 1 e transmits the image data A, B, C, and D to the client apparatus 2 , for example, in that order.
- the transmitter 1 e transmits the image for the section c 1 in which the mouse cursor 2 b is located and the image for the section a 1 that is the most highly associated with the section c 1 , prior to the images for the other sections b 1 and d 1 .
- the transmitter 1 e transmits the image data to the client apparatus 2 , for example, in order of A, C, B, and D.
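- the ordering described above can be expressed as a simple sort key: the section containing the cursor and its most highly associated section come first, and the rest follow in their original order. The sketch below uses illustrative names and assumes the association degrees are already available as a dictionary.

```python
from typing import Dict, List

def transmission_order(sections: List[str],
                       cursor_section: str,
                       association: Dict[str, Dict[str, float]]) -> List[str]:
    """Order sections so the cursor section and its most associated section go first."""
    partner = max(association[cursor_section], key=association[cursor_section].get)
    prioritized = {cursor_section, partner}
    # Prioritized sections keep their relative order, followed by the remaining sections.
    return sorted(sections, key=lambda s: 0 if s in prioritized else 1)

# Example matching the description: cursor in c1, a1 most associated with c1.
assoc = {"c1": {"a1": 0.9, "b1": 0.1, "d1": 0.2}}
print(transmission_order(["a1", "b1", "c1", "d1"], "c1", assoc))  # ['a1', 'c1', 'b1', 'd1']
```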
- the client apparatus 2 receives the image data and then displays the image data in the specified sections. In the example of FIG. 1 , the client apparatus 2 first displays the image data A in the section a 1 and the image data C in the section c 1 , and then displays the image data B in the section b 1 and the image data D in the section d 1 .
- the processing described above makes it possible to reduce the amount of delay in the processing performed on the section c 1 having a high degree of association with the section a 1 having a high update frequency.
- the arrangement may also be such that the association-degree setter 1 c determines whether or not the association degrees are higher than or equal to a threshold and transmits the images for sections having association degrees that are higher than or equal to the threshold prior to the images for the other sections. With this arrangement, it is possible to reduce the possibility that the images for sections having low association degrees are transmitted prior to the images for the other sections.
- the update-frequency setter 1 b , the association-degree setter 1 c , the priority setter 1 d , the transmitter 1 e , and the section setter 1 f may be realized by functions of a CPU included in the information processing apparatus 1 .
- the storage unit 1 a may also be realized by a data storage area in a RAM (random access memory), a HDD (hard disk drive), or the like included in the information processing apparatus 1 .
- the transmitter 1 e may also transmit a large volume of data of still images, moving images, and so on to the client apparatus 2 as the image data in accordance with, for example, an RDP (remote desktop protocol) or an RFB (remote frame buffer) protocol for use in VNC (virtual network computing).
- FIG. 2 is a diagram of an information processing system according to a second embodiment.
- An information processing system 5 according to the present embodiment has a server apparatus 10 and a client apparatus 20 .
- the server apparatus 10 and the client apparatus 20 are interconnected through a predetermined network 50 so as to enable mutual communication.
- the network 50 may be implemented by any type of communication network, such as the Internet, a LAN (local area network), or a VPN (virtual private network), regardless of whether it is wired or wireless.
- a description will be given of a case in which an RFB protocol for VNC is employed as one example of a protocol for communication between the server apparatus 10 and the client apparatus 20 .
- although a case in which one client apparatus 20 is connected to one server apparatus 10 is illustrated in FIG. 2 , two or more client apparatuses 20 may also be connected to one server apparatus 10 .
- the server apparatus 10 may be a computer that offers a service for remotely controlling a screen to be displayed on the client apparatus 20 .
- the client apparatus 20 may be a computer that receives the remote-screen control service offered by the server apparatus 10 .
- Examples of the client apparatus 20 include mobile terminals, such as a mobile phone, a PHS (personal handyphone system) phone, and a PDA (personal digital assistant), as well as stationary terminals, such as a personal computer.
- the server apparatus 10 sequentially checks a screen in a desktop environment in which an OS (operating system) and applications are running and transmits any update to the client apparatus 20 .
- the client apparatus 20 displays screen data received from the server apparatus 10 and also transmits a command, generated by an operation, to the server apparatus 10 .
- it is assumed that the user uses the screen in the desktop environment and that the size of the desktop screen on the server apparatus 10 and the size of the screen of the client apparatus 20 are the same. It is also assumed that, in the desktop environment, an application (e.g., a CAD application) that has multiple child windows within one application window or uses multiple windows is used with the entire area or a large area of the screen occupied by those windows, and that transmission/reception of a large amount of update data is triggered by the user's mouse operation.
- the information processing system 5 may also be advantageously applied to situations in which another user directly operates data of a three-dimensional (3D) object.
- a CPU 101 controls overall operations of the server apparatus 10 .
- a RAM 102 and peripherals are coupled to the CPU 101 through a bus 108 .
- the RAM 102 is used as a primary storage device for the server apparatus 10 .
- the RAM 102 temporarily stores at least part of the OS program and application programs to be executed by the CPU 101 .
- the RAM 102 stores various types of data used for processing to be executed by the CPU 101 .
- Examples of the peripherals coupled to the bus 108 include a hard disk drive (HDD) 103 , a graphics processing device 104 , an input interface 105 , an optical drive device 106 , and a communication interface 107 .
- the hard disk drive 103 magnetically writes/reads data to/from its built-in disk.
- the hard disk drive 103 is used as a secondary storage device for the server apparatus 10 .
- the hard disk drive 103 stores the OS program, application programs, and various types of data.
- the secondary storage device may also be implemented by a semiconductor storage device, such as a flash memory.
- a monitor 104 a is coupled to the graphics processing device 104 .
- the graphics processing device 104 displays an image on a screen of the monitor 104 a .
- the monitor 104 a may be implemented by a liquid crystal display device, a display device using a CRT (cathode ray tube), or the like.
- a keyboard 105 a and a mouse 105 b are coupled to the input interface 105 .
- the input interface 105 sends signals, transmitted from the keyboard 105 a and the mouse 105 b , to the CPU 101 .
- the mouse 105 b is one example of a pointing device and may be implemented by another pointing device. Examples of another pointing device include a touch panel, a graphics tablet, a touchpad, and a trackball.
- the optical drive device 106 uses laser light or the like to read data recorded on an optical disk 200 .
- the optical disk 200 is a portable recording medium to which data is recorded so as to be readable via light reflection.
- Examples of the optical disk 200 include a Blu-Ray® disc, a DVD (Digital Versatile Disc), a DVD-RAM, a CD-ROM (Compact Disc—Read Only Memory), and a CD-R/RW (Recordable/ReWritable).
- the communication interface 107 is linked to the network 50 .
- the communication interface 107 transmits/receives data to/from the client apparatus 20 over the network 50 .
- a CPU 201 controls overall operations of the client apparatus 20 .
- a RAM 202 , a flash memory 203 , a graphics processing device 204 , an input interface 205 , and a communication interface 206 are coupled to the CPU 201 through a bus 207 .
- the functions of the RAM 202 , the graphics processing device 204 , the input interface 205 , and the communication interface 206 are similar to those of the RAM 102 , the graphics processing device 104 , the input interface 105 , and the communication interface 107 , respectively.
- the client apparatus 20 lacks a hard disk drive and has the flash memory 203 .
- a display device 204 a displays various types of information, for example, a desktop screen transmitted from the server apparatus 10 .
- Examples of the display device 204 a include a monitor, a display, and a touch panel.
- the client apparatus 20 illustrated in FIG. 2 may be a mobile terminal device that is equipped with the display device 204 a.
- An input device 205 a receives an instruction input by the user.
- Examples of the input device 205 a include a keyboard and a mouse.
- the display device 204 a may realize a pointing device function in cooperation with the mouse.
- Hardware configurations as described above can realize processing functions in the present embodiment.
- FIG. 3 is a block diagram illustrating functions of the server apparatus and the client apparatus in the second embodiment.
- a remote-screen controlling application for a server is preinstalled or installed on the server apparatus 10 .
- the remote-screen controlling application for a server will hereinafter be referred to as a “server-side remote-screen controlling application”.
- the server-side remote-screen controlling application has, as its basic function, a function for offering a remote-screen control service.
- the server-side remote-screen controlling application obtains information of an operation on the client apparatus 20 and causes an application running on the server apparatus 10 to perform processing requested by the operation information.
- the server-side remote-screen controlling application generates a screen for displaying a result of the processing executed by the application and transmits the generated screen to the client apparatus 20 over the network 50 .
- the server-side remote-screen controlling application transmits an image of a region having pixels in a portion that has changed relative to a bitmap image that has been displayed on the client apparatus 20 before the screen is generated this time (e.g., transmits an image of an update rectangle). While a case in which an image of an update portion is a rectangular image is described below by way of example, the disclosed apparatus is also applicable to a case in which an image of an update portion has a shape other than the rectangle.
- the server-side remote-screen controlling application further has a function for compressing data of a portion involving a large amount of inter-frame motion into data based on a moving-image compression system and transmitting the compressed data to the client apparatus 20 .
- the server-side remote-screen controlling application divides a screen, generated from the result of the processing executed by the application, into multiple regions and monitors update frequencies for the respective divided regions.
- the server-side remote-screen controlling application transmits, to the client apparatus 20 , information of the update frequency of the region in which the update frequency exceeds a threshold (the region is hereinafter referred to as a “frequent-update region”).
- the server-side remote-screen controlling application also encodes a bitmap image of the frequent-update region into data based on an MPEG (Moving Picture Experts Group) system, such as MPEG-2 or MPEG-4, and transmits the encoded data to the client apparatus 20 .
- the compression system is not limited thereto.
- the compression system may be any moving-image compression coding system, for example, Motion-JPEG (Joint Photographic Experts Group) or the like.
- a remote-screen controlling application for a client is preinstalled or installed on the client apparatus 20 .
- the remote-screen controlling application for a client will hereinafter be referred to as a “client-side remote-screen controlling application”.
- the client-side remote-screen controlling application has a function for reporting operation information, received via the input device 205 a , to the server apparatus 10 .
- Examples of the operation information reported by the client-side remote-screen controlling application include left and right clicks, double click, and drag of the mouse, as well as the position and a displacement of a mouse cursor which are obtained as a result of a movement operation of the mouse.
- Other examples of the operation information include the amount of rotation of a mouse wheel and the type of pressed key on the keyboard.
- the client-side remote-screen controlling application has a function for causing an image, received from the server apparatus 10 , to be displayed on the display device 204 a included in the client apparatus 20 .
- upon reception of a bitmap image for an update rectangle from the server apparatus 10 , the client-side remote-screen controlling application causes the image for the update rectangle to be displayed at a position changed from the position of a previous bitmap image.
- upon receiving update frequency information of a frequent-update region from the server apparatus 10 , the client-side remote-screen controlling application sets, as a blank region in which no bitmap image is to be displayed, a region that lies on the display screen and that corresponds to a position included in the update frequency information.
- upon reception of data based on the moving-image compression system, the client-side remote-screen controlling application decodes the data and displays the decoded data in the blank region.
- The functions of the server apparatus 10 and the client apparatus 20 will now be described in detail.
- the server apparatus 10 includes an OS executor 11 , a display-screen generator 12 , a frame buffer 13 , and a remote-screen controller 14 .
- the server apparatus 10 may further include various functions, such as a function of an input device and a function of a display device, of a known computer.
- the OS executor 11 controls execution of the OS.
- the OS executor 11 detects, from the operation information obtained by an operation-information acquirer 14 b (described below), an instruction for launching an application and a command for the application.
- the OS executor 11 issues, to the display-screen generator 12 , an instruction for launching the application associated with the icon.
- upon detecting an operation for requesting execution of a command on an operation screen, e.g., a window, of a running application, the OS executor 11 issues, to the display-screen generator 12 , an instruction for execution of the command.
- the display-screen generator 12 has a function (an application execution controlling function) for controlling execution of the application and a function (a rendering processing function) for rendering an image in the frame buffer 13 in accordance with an instruction from the OS executor 11 .
- the application execution controlling function operates the corresponding application.
- the application execution controlling function issues a request to the rendering processing function so as to render, in the frame buffer 13 , a display image of a processing result obtained by the execution of the application.
- the application execution controlling function notifies the rendering processing function about the display image and the rendering position of the display image.
- the application executed by the application execution controlling function may be preinstalled or may be installed after the shipment of the server apparatus 10 .
- the application executed by the application execution controlling function may also be an application that runs in a network environment based on Java® or the like.
- upon receiving the rendering request from the application execution controlling function, the rendering processing function causes an image for displaying an application processing result to be rendered, in a bitmap format, at the rendering position located in the frame buffer 13 and specified by the application execution controlling function.
- the rendering request may also be received from the OS executor 11 .
- the rendering processing function may cause an image for displaying the mouse cursor to be rendered, for example, in a bitmap format, at the rendering position located in the frame buffer 13 and specified by the OS.
- the frame buffer 13 stores image data rendered by the rendering processing function and used for updating the desktop (the image data will hereinafter be referred to as “update image data”).
- Examples of the frame buffer 13 include semiconductor memory devices, such as a RAM (e.g., a VRAM (video random access memory)), a ROM (read only memory), and a flash memory.
- the frame buffer 13 may also be implemented by a hard disk drive or a storage device for an optical disk or the like.
- the remote-screen controller 14 offers the remote-screen control service to the client apparatus 20 via the server-side remote-screen controlling application.
- the remote-screen controller 14 includes a communicator 14 a , an operation-information acquirer 14 b , an operation-position detector 14 c , a window-edge detector 14 d , an update-difference creator 14 e , an update-region setter 14 f , an update-frequency calculator 14 g , an association-degree setter 14 h , a priority setter 14 i , an update-difference converter 14 j , a screen-update reporter 14 k , and a channel-band detector 14 m.
- the communicator 14 a transmits/receives data to/from the client apparatus 20 over the network 50 (not illustrated in FIG. 3 ).
- the operation-information acquirer 14 b acquires operation information received from the client apparatus 20 .
- Examples of the operation information include left and right clicks, double click, and drag of the mouse, as well as a mouse-cursor displacement obtained as a result of a movement operation of the mouse.
- Other examples of the operation information include the amount of rotation of the mouse wheel and the type of pressed key on the keyboard.
- the operation-position detector 14 c has a function for detecting the current mouse-cursor position information from the user operation information transmitted from the client apparatus 20 .
- Examples of the position information include the X and Y coordinates of the point where the mouse cursor is located.
- the operation-position detector 14 c reports the obtained mouse-cursor position information to the priority setter 14 i and the association-degree setter 14 h.
- the window-edge detector 14 d has a function for detecting edges from application windows and so on included in the update image data and dividing a desktop screen into multiple sections. More specifically, the window-edge detector 14 d periodically obtains the update image data held in the frame buffer 13 . Application windows running on the desktop are rendered in the update image data. In this envisaged situation, multiple windows or child windows may be displayed on the desktop. For example, since rectangular sections having various sizes, such as windows, buttons, and toolbars, are displayed on the screen, the number of sections at this point may be too large to perform processing. Thus, the window-edge detector 14 d detects the frames of those windows as edges by using a processing scheme called “edge detection”, which is an image processing technique.
- the window-edge detector 14 d divides the desktop screen into multiple sections and performs processing for sorting the divided sections into sections for large windows.
- the processing for sorting the sections utilizes characteristics of application images rendered on the screen.
- the application provides buttons, toolbars, sub-windows, rendering data, and so on.
- the buttons, toolbars, and so on have a characteristic in that they are small and are typically surrounded by a section having a larger area.
- the rendering data also has a characteristic in that it is surrounded by a section having a larger area.
- the sub-windows have a characteristic in that each thereof has a relatively large area and is adjacent to a rectangle having a larger area.
- the window-edge detector 14 d sends information of the divided sections to the update-frequency calculator 14 g .
- the section information includes, for example, information indicating an X coordinate, a Y coordinate, a width, and a height of each section.
- the update-difference creator 14 e checks the frame buffer 13 to detect a different portion (an update difference) resulting from an update.
- One example of detection of the update difference will be described below.
- the update-difference creator 14 e generates a screen image to be displayed on the display device 204 a of the client apparatus 20 . For example, each time the display-screen generator 12 stores bitmap data in the frame buffer 13 , the update-difference creator 14 e starts processing as described below.
- the update-difference creator 14 e compares image data displayed on the client apparatus 20 during generation of a previous frame with the update image data written in the frame buffer 13 during generation of a current frame.
- the update-difference creator 14 e then generates an image of an update rectangle, which is obtained by coupling pixels in a portion that has changed from the previous frame and shaping the coupled pixels into a rectangle, and then generates packets for transmitting the update rectangle.
- the update-difference creator 14 e determines inter-frame update frequencies for the respective regions obtained by dividing the update image data stored in the frame buffer 13 . For example, the update-difference creator 14 e stores the generated update rectangle in an internal work memory (not illustrated) for a predetermined period of time.
- the update-difference creator 14 e also stores attribute information that allows the position and the size of the update rectangle to be specified.
- the attribute information includes, for example, information regarding the coordinates of an upper-left vertex of the update rectangle and the width and height of the update rectangle.
- the period of time in which the update rectangle is stored is correlated with the accuracy of setting a frequent-update region: false detection of the frequent-update region decreases as the period of time is increased. In this example, it is assumed that the image of the update rectangle is stored for one second.
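- as a simplified stand-in for the comparison performed by the update-difference creator 14 e (the real implementation is not shown in this form), an update rectangle can be sketched as the bounding rectangle of all pixels that differ between the previous and current frames:

```python
from typing import List, Optional, Tuple

Frame = List[List[int]]  # toy 2-D grid of pixel values

def update_rectangle(prev: Frame, curr: Frame) -> Optional[Tuple[int, int, int, int]]:
    """Return (x, y, width, height) bounding all changed pixels, or None if nothing changed."""
    xs, ys = [], []
    for y, (row_prev, row_curr) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_prev, row_curr)):
            if p != c:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (min(xs), min(ys), max(xs) - min(xs) + 1, max(ys) - min(ys) + 1)
```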
- the update-difference creator 14 e determines frequencies of updates on the desktop screen by using a map having sections obtained by dividing the screen to be displayed on the display device 204 a into blocks in a meshed pattern.
- FIG. 4 illustrates how the screen to be displayed on the display device is divided.
- FIG. 4 illustrates an update-frequency determination map 30 .
- a portion surrounded by a circle in FIG. 4 indicates details of one of blocks 31 .
- the update-difference creator 14 e divides the pixels constituting the map 30 into the blocks 31 , each having eight pixels × eight pixels.
- each block 31 includes 64 pixels 32 .
- the update-difference creator 14 e sequentially deploys the images of the update rectangles onto the update-frequency determination map. Each time the update rectangle is deployed on the map, the update-difference creator 14 e accumulates and adds up the numbers of updates in each of the block(s) 31 at a portion that overlaps the update rectangle on the map. In this case, when the update rectangle deployed on the map overlaps a predetermined number of pixels included in one block, the update-difference creator 14 e increments the number of updates in the block by “1”. In this case, a description will be given of an example in which, when the update rectangle overlaps even one pixel included in one block, the number of updates in the block is incremented.
- FIGS. 5A to 5C illustrate how the frequencies of updates on the screen to be displayed on the display device are determined.
- Numerals indicated in nine of the blocks 31 in the update-frequency determination map 30 illustrated in FIG. 5A indicate the numbers of updates in the corresponding blocks 31 when an update rectangle 31 a is deployed.
- Numerals indicated in some of the blocks 31 in the map 30 illustrated in FIG. 5B indicate the numbers of updates in the corresponding blocks 31 when an update rectangle 31 b is deployed.
- Numerals indicated in some of the blocks 31 in the map 30 illustrated in FIG. 5C indicate the numbers of updates in the corresponding blocks 31 when all update rectangles accumulated in the internal work memory are deployed.
- the number of updates in each of the blocks 31 in which no numerals are indicated is assumed to be zero.
- when the update rectangle 31 a is deployed on the map 30 , it overlaps the blocks 31 in the hatched portion.
- the update-difference creator 14 e increments the number of updates in each of the blocks 31 in the hatched portion by “1”.
- the number of updates in the hatched portion is incremented from “0” to “1”.
- when the update rectangle 31 b is deployed on the map 30 , it overlaps the blocks 31 in the hatched portion.
- the update-difference creator 14 e increments the number of updates in each of the blocks 31 in the hatched portion by “1”. In this case, since the number of updates in each of the blocks 31 has been “1”, the number of updates in the hatched portion is incremented from “1” to “2”.
- FIG. 5C illustrates one example of the map 30 when all of update rectangles are deployed thereon.
- the update-difference creator 14 e obtains the block(s) 31 in which the number(s) of updates in a predetermined period, e.g., the update frequency or update frequencies, exceeds a threshold.
- when the threshold is assumed to be “4”, the blocks 31 in the hatched portion are obtained.
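- the block-map bookkeeping described above can be sketched as follows, using 8×8-pixel blocks, the “overlap of even one pixel” rule, and the example threshold of “4”; all helper names are illustrative:

```python
from collections import defaultdict
from typing import Dict, List, Tuple

BLOCK = 8  # each block covers eight pixels x eight pixels

Rect = Tuple[int, int, int, int]  # (x, y, width, height) of an update rectangle

def accumulate(update_rects: List[Rect]) -> Dict[Tuple[int, int], int]:
    """Increment the update count of every block that an update rectangle overlaps."""
    counts: Dict[Tuple[int, int], int] = defaultdict(int)
    for x, y, w, h in update_rects:
        for by in range(y // BLOCK, (y + h - 1) // BLOCK + 1):
            for bx in range(x // BLOCK, (x + w - 1) // BLOCK + 1):
                counts[(bx, by)] += 1  # overlap of even one pixel counts as an update
    return counts

def frequent_blocks(counts: Dict[Tuple[int, int], int], threshold: int = 4) -> List[Tuple[int, int]]:
    """Blocks whose number of updates in the accumulation period exceeds the threshold."""
    return [block for block, n in counts.items() if n > threshold]
```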
- a portion in which a moving image is more likely to be displayed on the desktop screen can be encoded by the update-difference converter 14 j .
- for this threshold, an end user may select one of the values preset in a stepwise manner by the creator of the server-side remote-screen controlling application or may directly set a value.
- the update-region setter 14 f uses the update difference to set, as a frequent-update region, a region that is included in the update image data in the frame buffer 13 and that has a high update frequency.
- the update-region setter 14 f couples adjacent ones of the blocks into a block group (which is hereinafter referred to as a “coupled block group”) and corrects the coupled block group into a rectangle.
- the update-region setter 14 f derives an interpolation region to be interpolated into a coupled block group and then adds the interpolation region to the coupled block group to thereby correct the coupled block group into a rectangle.
- the interpolation region may be derived by an algorithm for deriving a region with which a coupled block group is shaped into a rectangle with a minimum amount of interpolation therebetween.
- FIG. 6 illustrates how a coupled block group is corrected.
- the update-region setter 14 f adds an interpolation region 52 to a pre-correction coupled block group 51 to thereby correct the coupled block group 51 into a rectangle 53 .
- at this point, the rectangle combination described below has not been completed, and thus the rectangle 53 has not been determined as a frequent-update region yet.
- the post-correction rectangle 53 is hereinafter referred to as a “frequent-update region candidate”.
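- one simple way to realize “add an interpolation region so that the coupled block group becomes a rectangle with a minimum amount of interpolation” is to take the bounding rectangle of the coupled blocks, as in this sketch (block coordinates assumed, names illustrative):

```python
from typing import List, Tuple

Block = Tuple[int, int]  # (block-x, block-y) indices of blocks in a coupled block group

def correct_to_rectangle(group: List[Block]) -> Tuple[int, int, int, int]:
    """Return the smallest block-aligned rectangle (bx, by, bw, bh) covering the whole group."""
    xs = [bx for bx, _by in group]
    ys = [by for _bx, by in group]
    return (min(xs), min(ys), max(xs) - min(xs) + 1, max(ys) - min(ys) + 1)

# e.g. an L-shaped group is padded out to its bounding rectangle:
# correct_to_rectangle([(0, 0), (0, 1), (1, 1)]) -> (0, 0, 2, 2)
```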
- when multiple frequent-update region candidates exist, the update-region setter 14 f combines the frequent-update region candidates between which the distance is smaller than or equal to a predetermined value into a rectangle including the frequent-update region candidates.
- the expression “distance between the frequent-update region candidates” as used herein refers to a smallest one of the distances between the post-correction rectangles.
- the update-region setter 14 f derives an interpolation region to be fit into a gap between the frequent-update region candidates and adds the interpolation region to the frequent-update region candidates, to thereby combine the frequent-update region candidates into a rectangle including the candidates.
- the interpolation region may be derived by an algorithm for deriving a region with which frequent-update region candidates are shaped into a combination with a minimum amount of interpolation therebetween.
- FIG. 7 illustrates how frequent-update region candidates are combined.
- the update-region setter 14 f adds an interpolation region 62 to frequent-update region candidates 61 a and 61 b to thereby create a combination 63 including the frequent-update region candidates 61 a and 61 b .
- the update-region setter 14 f then sets the thus-obtained combination 63 as a frequent-update region.
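- the candidate-combination step could be sketched like this: if the gap between two frequent-update region candidates is at most a given number of pixels, replace them with the rectangle covering both (the distance measure and threshold below are assumptions for illustration):

```python
from typing import Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height)

def gap(a: Rect, b: Rect) -> int:
    """Gap between two rectangles (0 if they touch or overlap); one possible distance measure."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    dx = max(bx - (ax + aw), ax - (bx + bw), 0)
    dy = max(by - (ay + ah), ay - (by + bh), 0)
    return max(dx, dy)

def combine_if_close(a: Rect, b: Rect, max_gap: int = 16) -> Tuple[Rect, ...]:
    """Merge two candidates into their bounding rectangle when they are close enough."""
    if gap(a, b) > max_gap:
        return (a, b)
    x1, y1 = min(a[0], b[0]), min(a[1], b[1])
    x2 = max(a[0] + a[2], b[0] + b[2])
    y2 = max(a[1] + a[3], b[1] + b[3])
    return ((x1, y1, x2 - x1, y2 - y1),)
```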
- upon setting the frequent-update region in the manner described above, the update-region setter 14 f sends update frequency information to the update-frequency calculator 14 g and the client apparatus 20 .
- the update frequency information includes information for identifying the position and the size of the identified frequent-update region, an ID (a region identifier) for identifying the frequent-update region, and the number of updates.
- a portion that is included in the image data for the desktop screen to be displayed on the client apparatus 20 and that corresponds to the frequent-update region is displayed blank.
- the update-region setter 14 f clears the number of updates in each of the blocks mapped in the internal work memory.
- the update-region setter 14 f stores the update frequency information in the internal work memory.
- FIGS. 8A to 8C illustrate region-position identification information.
- a desktop screen 70 A realized by the update image data stored in the frame buffer 13 includes a browser screen 71 and a moving-image playback screen 72 .
- when changes on the desktop screen 70 A are kept track of time-sequentially, an update rectangle of the browser screen 71 , which is a still image, is not detected, whereas a mouse movement trace 73 and an update rectangle associated with a moving-image playback region 74 based on an application are detected, as illustrated on a screen 70 B in FIG. 8B .
- the update-region setter 14 f identifies, in the moving-image playback region 74 , blocks in which the numbers of updates exceed a threshold, e.g., a portion indicated by hatching.
- the update-region setter 14 f creates update frequency information by adding the largest number of updates of the numbers of updates in the regions specified by the region-position identification information, including the coordinates (x, y) of the upper-left vertex, the width w, and the height h of the frequent-update region in a hatched portion illustrated on a screen 70 C in FIG. 8C , to the region-position identification information.
- the update-region setter 14 f then stores the created update frequency information in the internal work memory and also sends it to the update-frequency calculator 14 g.
- any point, such as a barycenter, that enables designation of the position of a frequent-update region may also be used instead of the upper-left vertex.
- any point in or outside the screen may be used as the origin. A description will be given with reference back to FIG. 3 .
- the update-frequency calculator 14 g generates section-specific update frequency information indicating the update frequency for each section on the desktop on the basis of in-desktop section information received from the window-edge detector 14 d and the update frequency information received from the update-region setter 14 f.
- the update-frequency calculator 14 g maps the received pieces of information to the in-desktop section information and calculates the update frequency (the number of updates) for each in-desktop section to obtain “0, 0, 30, 40, 3” (X coordinate, Y coordinate, width, height, the number of updates) and so on. After the calculation, the update-frequency calculator 14 g sends the generated section-specific update frequency information to the association-degree setter 14 h and the priority setter 14 i.
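- a sketch of this mapping step, with assumed data shapes: intersect each reported frequent-update region with each in-desktop section and attribute its update count to the overlapping sections, yielding entries of the form (X coordinate, Y coordinate, width, height, number of updates).

```python
from typing import List, Tuple

Section = Tuple[int, int, int, int]          # (x, y, width, height)
UpdateInfo = Tuple[int, int, int, int, int]  # (x, y, width, height, number of updates)

def overlaps(a: Section, b: Section) -> bool:
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def section_update_frequency(sections: List[Section],
                             regions: List[UpdateInfo]) -> List[UpdateInfo]:
    """For each section, sum the update counts of the frequent-update regions overlapping it."""
    result = []
    for sx, sy, sw, sh in sections:
        n = sum(count for rx, ry, rw, rh, count in regions
                if overlaps((sx, sy, sw, sh), (rx, ry, rw, rh)))
        result.append((sx, sy, sw, sh, n))
    return result

# e.g. section_update_frequency([(0, 0, 30, 40)], [(5, 5, 10, 10, 3)]) -> [(0, 0, 30, 40, 3)]
```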
- the association-degree setter 14 h has a function for calculating, upon receiving the section-specific update frequency information from the update-frequency calculator 14 g and receiving the current mouse-cursor position information from the operation-position detector 14 c , the frequency (co-occurrence frequency) of updates occurring simultaneously in multiple sections and for setting an association degree indicating the degree of association between the sections. The reason why the association degree is set will be described below.
- An application for use in CAD or the like is typically executed using the entire area or a large area of the desktop screen.
- the application is constituted and used with sections including multiple windows, child windows, or the like.
- the sections can be broadly classified into two types of section according to their functions.
- a first one of the types is a section having buttons, sliders, checkboxes, and so on, and it is aimed at performing some type of operation on the data currently created by the user.
- Such a section will be referred to as an “operation tool section” hereinafter.
- a second type is a section for displaying data, such as a 2D or 3D object or a wire frame, currently created by the user.
- Such a section will be referred to as a “data rendering section” hereinafter.
- the operation tool section is rendered in response to clicking/dragging of a button or slider with the mouse and is generally updated at relatively short intervals.
- the average amount of update data is small.
- data operated by the user is directly or indirectly rendered in the data rendering section and it is intermittently updated.
- the average amount of update data in the data rendering section is large.
- FIG. 9 illustrates associations between the operation tool sections and the data rendering sections.
- the vertical axis indicates the amount of update data and the horizontal axis indicates time (second).
- an operation on an operation tool section C 1 has a large influence on a data rendering section A 1 and an operation on an operation tool section D 1 has a large influence on a data rendering section B 1 .
- upon receiving the section-specific update frequency information from the update-frequency calculator 14 g and the current mouse-cursor position information from the operation-position detector 14 c , the association-degree setter 14 h creates a cursor-position management table that includes the received pieces of information. On the basis of the created cursor-position management table, the association-degree setter 14 h calculates the associations between the sections.
- the mouse-cursor section and a section having a highest degree of association with the mouse-cursor section are extracted and image data to be displayed in those sections are transmitted to the client apparatus 20 prior to image data to be displayed in other sections.
- a portion that has a low update frequency but is associated with a region having a high update frequency can be updated on the screen displayed on the display device 204 a , prior to the other portions. Accordingly, it is possible to increase the speed of response to a user operation.
- the operation tool sections and the data rendering sections are not distinguished from each other and are simply referred to as “sections A 1 , B 1 , C 1 , and D 1 ”.
- the cursor-position management table created by the association-degree setter 14 h will be described next.
- FIG. 10 illustrates a cursor-position management table created by the association-degree setter.
- a cursor-position management table T 1 illustrated in FIG. 10 has a “time” column, an “update-frequency information” column, and a “mouse-cursor position (x, y)” column. Pieces of information that are horizontally arranged are associated with each other.
- in the “time” column, the time at which the mouse-cursor position information is obtained from the operation-position detector 14 c is contained.
- in the “update-frequency information” column, the X coordinate, the Y coordinate, the width, the height, and the number of updates which are included in the section-specific update frequency information received from the update-frequency calculator 14 g are contained.
- in the “mouse-cursor position (x, y)” column, the mouse-cursor position information received from the operation-position detector 14 c is contained. A description will be given with reference back to FIG. 3 .
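- putting the table and the co-occurrence idea together, a sketch might count, for every pair of sections, at how many sampling times both sections were updated simultaneously; the row layout and counting rule below are assumptions based on the description above.

```python
from collections import defaultdict
from itertools import combinations
from typing import Dict, List, Tuple

# One row of the cursor-position management table: sampling time, per-section update
# counts observed at that time, and the mouse-cursor position (x, y).
Row = Tuple[float, Dict[str, int], Tuple[int, int]]

def association_degrees(table: List[Row]) -> Dict[frozenset, int]:
    """Co-occurrence frequency: in how many rows were both sections of a pair updated?"""
    degrees: Dict[frozenset, int] = defaultdict(int)
    for _time, counts, _cursor in table:
        updated = sorted(name for name, n in counts.items() if n > 0)
        for a, b in combinations(updated, 2):
            degrees[frozenset((a, b))] += 1
    return degrees

table = [
    (0.0, {"A1": 5, "B1": 0, "C1": 2, "D1": 0}, (120, 300)),
    (1.0, {"A1": 4, "B1": 0, "C1": 1, "D1": 0}, (125, 300)),
    (2.0, {"A1": 0, "B1": 3, "C1": 0, "D1": 2}, (400, 310)),
]
# association_degrees(table)[frozenset(("A1", "C1"))] == 2, the highest co-occurrence here.
```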
- the priority setter 14 i uses the section on which the user is currently performing an operation, the available band of the network 50 , and the section association degrees to set priorities for data transfer of the respective sections of the update image data in the frame buffer 13 .
- the priority setter 14 i uses the user's current mouse-cursor position detected by the operation-position detector 14 c and the section association degrees set by the association-degree setter 14 h .
- the priority setter 14 i uses the association degrees, set by the association-degree setter 14 h , to determine a section to be given a high priority and sends, to the update-difference converter 14 j , an instruction for transmitting data for the determined section prior to data for the other sections.
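- how the available band factors into this choice is not spelled out here; one plausible reading, purely an assumption for illustration, is that a wider band allows more of the associated sections to be promoted, as in the sketch below (all names and costs are hypothetical).

```python
from typing import Dict, List

def high_priority_sections(cursor_section: str,
                           association: Dict[str, float],
                           available_band_mbps: float,
                           per_section_cost_mbps: float = 2.0) -> List[str]:
    """Pick the cursor section plus as many of its most associated sections as the band allows."""
    budget = max(int(available_band_mbps // per_section_cost_mbps) - 1, 1)
    partners = sorted(association, key=association.get, reverse=True)[:budget]
    return [cursor_section] + partners

# e.g. high_priority_sections("C1", {"A1": 0.9, "B1": 0.1, "D1": 0.2}, 6.0)
#      -> ["C1", "A1", "D1"]
```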
- the update-difference converter 14 j converts the update image data into moving-image data or still-image data in accordance with the update frequency of the display region.
- the update-difference converter 14 j determines whether or not the generated update rectangle is included in the frequent-update region stored in the internal work memory, e.g., is included in a region to which the moving-image data is being transmitted by the communicator 14 a .
- when the instruction is received from the priority setter 14 i , the update rectangle for the corresponding section is processed prior to the update rectangles for the other sections.
- the update-region setter 14 f causes the communicator 14 a to transmit the image data of the update rectangle and the update frequency information.
- the update-difference converter 14 j determines whether or not the update frequency information of a frequent-update region is registered in the internal memory in the update-region setter 14 f .
- the update-difference converter 14 j cuts out, in the update image data stored in the frame buffer 13 , a bitmap image for the portion corresponding to the frequent-update region.
- the update-difference converter 14 j then encodes the cut-out bitmap image.
- the update-difference converter 14 j encodes the bitmap image for the frequent-update region.
- An encoding system may be, for example, an MPEG system, such as MPEG-2 or MPEG-4 system, or a Motion-JPEG system.
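- the conversion decision can be summarized as a simple dispatch, sketched below (illustrative only; the actual encoders are not shown): update rectangles falling inside a registered frequent-update region go down the moving-image path, and everything else is sent as a still-image update rectangle.

```python
from typing import Optional, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height)

def inside(inner: Rect, outer: Rect) -> bool:
    ix, iy, iw, ih = inner
    ox, oy, ow, oh = outer
    return ox <= ix and oy <= iy and ix + iw <= ox + ow and iy + ih <= oy + oh

def choose_path(update_rect: Rect, frequent_region: Optional[Rect]) -> str:
    """Decide whether an update is handled as moving-image data or still-image data."""
    if frequent_region is not None and inside(update_rect, frequent_region):
        return "moving-image path (encode the frequent-update region, e.g. MPEG)"
    return "still-image path (send the update rectangle as-is)"
```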
- the screen-update reporter 14 k performs processing for transmitting the update data converted by the update-difference converter 14 j.
- the screen-update reporter 14 k transmits the update-rectangle image data and the update frequency information, generated by the update-difference creator 14 e , to the client apparatus 20 .
- a communication protocol for transmitting the update-rectangle image data is, for example, an RFB protocol in VNC.
- the screen-update reporter 14 k transmits image data of the frequent-update region, the image data being encoded by the update-difference converter 14 j , (the data is hereinafter referred to as “encoded frequent-update-region image data”) to the client apparatus 20 in conjunction with the corresponding update frequency information.
- a communication protocol for transmitting the encoded frequent-update-region image data may be, for example, an RTP (Real-time Transport Protocol).
- the channel-band detector 14 m has a function for detecting a currently available band on a network channel used for data transmission/reception between the server apparatus 10 and the client apparatus 20 . More specifically, the channel-band detector 14 m obtains the amount of data transmitted from the screen-update reporter 14 k and the amount of data actually transmitted from the communicator 14 a to the client apparatus 20 . The channel-band detector 14 m then sends the amounts of the two pieces of data to the priority setter 14 i .
- when the amount of data transmitted from the screen-update reporter 14 k and the amount of data actually transmitted from the communicator 14 a are equal to each other, or when no data is transmitted from the screen-update reporter 14 k , the channel-band detector 14 m periodically transmits data for measuring an available band to the client apparatus 20 . By doing so, the channel-band detector 14 m estimates the currently available band and sends information of the currently available band to the priority setter 14 i.
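- a rough sketch of this band estimation (the probing logic is simplified and the probe size is invented for illustration): if fewer bytes were actually sent than were requested, the observed throughput is the estimate; otherwise, time a probe transmission.

```python
import time
from typing import Callable

def estimate_available_band(bytes_requested: int, bytes_sent: int,
                            send_probe: Callable[[bytes], None]) -> float:
    """Estimate the available band in bytes per second.

    If fewer bytes were actually sent than requested, the channel is the bottleneck and the
    observed throughput (per an assumed one-second measurement interval) is the estimate;
    otherwise, transmit probe data and time it.
    """
    if bytes_requested > 0 and bytes_sent < bytes_requested:
        return float(bytes_sent)
    probe = b"\x00" * 64 * 1024          # 64 KiB probe payload (illustrative size)
    start = time.monotonic()
    send_probe(probe)                    # caller-supplied function that blocks until sent
    elapsed = max(time.monotonic() - start, 1e-6)
    return len(probe) / elapsed
```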
- the client apparatus 20 has a remote-screen controller 21 .
- the client apparatus 20 may further have various functions, such as a function of a sound output unit, of a known computer.
- the remote-screen controller 21 receives the remote-screen control service, offered by the server apparatus 10 , via the client-side remote-screen controlling application. As illustrated in FIG. 3 , the remote-screen controller 21 includes a communicator 21 a , an operation-information acquirer 21 b , a screen-update information acquirer 21 c , an image-data display unit 21 d , and a moving-image data display unit 21 e.
- the communicator 21 a transmits/receives information to/from the server apparatus 10 over the network 50 .
- the operation-information acquirer 21 b acquires information of an operation input via the input device 205 a and reports the acquired operation information to the server apparatus 10 via the communicator 21 a .
- Examples of the operation information reported by the operation-information acquirer 21 b include left and right clicks, double click, and drag of the mouse, as well as a mouse-cursor displacement obtained as a result of a movement operation of the mouse.
- Other examples of the operation information reported by the operation-information acquirer 21 b include the amount of rotation of the mouse wheel and the type of pressed key on the keyboard.
- the screen-update information acquirer 21 c receives the update-rectangle image and the update frequency information of the frequent-update region via the communicator 21 a , the update-rectangle image and the update frequency information being transmitted by the communicator 14 a in the server apparatus 10 .
- the screen-update information acquirer 21 c also receives the frequent-update-region update frequency information transmitted by the update-region setter 14 f in the server apparatus 10 .
- the screen-update information acquirer 21 c receives the encoded frequent-update-region image data, transmitted by the communicator 14 a in the server apparatus 10 , and the update-rectangle update frequency information, transmitted along with the encoded frequent-update-region image data, via the communicator 21 a.
- the image-data display unit 21 d causes the display device 204 a to display the update-rectangle image received by the screen-update information acquirer 21 c .
- the image-data display unit 21 d causes the bitmap image of the update rectangle to be displayed on a screen region that lies on the display device 204 a and that corresponds to the frequent-update-region position and size included in the update-rectangle update frequency information received from the screen-update information acquirer 21 c.
- When the screen-update information acquirer 21 c receives the frequent-update-region update frequency information, the image-data display unit 21 d performs processing in the following manner. That is, the image-data display unit 21 d sets, as a blank region in which no bitmap image is to be displayed, a screen region that lies on the display device 204 a and that corresponds to the frequent-update-region position and size included in the frequent-update-region update frequency information.
- the moving-image data display unit 21 e decodes the encoded frequent-update-region image data received by the screen-update information acquirer 21 c .
- the moving-image data display unit 21 e may be equipped with a decoder employing a decoding system corresponding to the encoding system employed by the server apparatus 10 .
- the moving-image data display unit 21 e causes the display device 204 a to display the decoded image of the frequent-update region.
- the moving-image data display unit 21 e causes the decoded image of the frequent-update region to be displayed on a screen region that lies on the display device 204 a and that corresponds to the frequent-update-region position and size included in the update frequency information of the frequent-update region.
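- The client-side handling described above can be summarized as a small dispatch routine. The sketch below is illustrative only; the message types and display calls are assumptions, since the document does not define a concrete programming interface for the units 21 c , 21 d , and 21 e .

```python
from dataclasses import dataclass

@dataclass
class UpdateRectangle:
    """A plain bitmap update for part of the desktop (hypothetical message type)."""
    x: int
    y: int
    width: int
    height: int
    bitmap: bytes

@dataclass
class EncodedRegion:
    """An encoded image of a frequent-update region (hypothetical message type)."""
    x: int
    y: int
    width: int
    height: int
    payload: bytes

class ScreenUpdateAcquirer:
    """Routes received screen updates, mirroring the roles of 21c, 21d, and 21e."""

    def __init__(self, image_display, movie_display):
        self.image_display = image_display   # plays the role of the image-data display unit 21d
        self.movie_display = movie_display   # plays the role of the moving-image data display unit 21e

    def on_message(self, message):
        if isinstance(message, UpdateRectangle):
            # Still-image path: draw the bitmap at the reported position and size.
            self.image_display.draw(message)
        elif isinstance(message, EncodedRegion):
            # Frequent-update-region path: decode first, then draw the decoded frame.
            frame = self.movie_display.decode(message.payload)
            self.movie_display.draw(frame, message.x, message.y, message.width, message.height)
```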
- the remote-screen controller 21 may be implemented by various types of integrated circuit or electronic circuit. At least one of the functional units included in the remote-screen controller 21 may also be implemented by another integrated circuit or electronic circuit. Examples of the integrated circuit include an ASIC (application specific integrated circuit) and an FPGA (field programmable gate array). Examples of the electronic circuit include a CPU and an MPU (micro processing unit).
- FIG. 11 is a flowchart of processing performed by the server apparatus. After the processing of the units is briefly described first with reference to FIG. 11 , functions thereof are described in detail with reference to more detailed flowcharts.
- In operation S 1 , the server apparatus 10 determines whether or not a certain amount of time has passed. When the certain amount of time has not passed (No in operation S 1 ), the process proceeds to operation S 4 . When the certain amount of time has passed (Yes in operation S 1 ), the process proceeds to operation S 2 .
- In operation S 2 , the window-edge detector 14 d detects edges of windows in the screen. Thereafter, the process proceeds to operation S 3 .
- In operation S 3 , the window-edge detector 14 d divides the screen into multiple sections by using the detected edges. Thereafter, the process proceeds to operation S 4 .
- In operation S 4 , the update-difference creator 14 e determines whether or not update image data exists in the frame buffer 13 . When update image data exists (Yes in operation S 4 ), the process proceeds to operation S 5 . When no update image data exists in the frame buffer 13 (No in operation S 4 ), the process returns to operation S 1 .
- In operation S 5 , the update-difference creator 14 e compares image data displayed on the client apparatus 20 during generation of a previous frame with the update image data written in the frame buffer 13 during generation of a current frame. The update-difference creator 14 e then generates an update rectangle, which is obtained by coupling pixels in a portion that has changed from the previous frame and shaping the coupled pixels into a rectangle. Thereafter, the process proceeds to operation S 6 .
- the update-frequency calculator 14 g calculates the number of updates in the respective sections for each certain amount of time. Thereafter, the process proceeds to operation S 9 .
- In operation S 9 , the association-degree setter 14 h compares the numbers of updates in the same amount of time and calculates the association degrees between the sections. Thereafter, the process proceeds to operation S 10 .
- In operation S 10 , the channel-band detector 14 m determines whether or not update image data that exceeds the bandwidth of the channel between the communicator 14 a and the communicator 21 a exists. When update image data that exceeds the bandwidth exists (Yes in operation S 10 ), the process proceeds to operation S 11 . When update image data that exceeds the bandwidth does not exist (No in operation S 10 ), the process proceeds to operation S 13 .
- In operation S 11 , the priority setter 14 i sets priorities for the update image data for the corresponding sections, on the basis of the user's current mouse-cursor position detected by the operation-position detector 14 c and the association degrees determined in operation S 9 . Thereafter, the process proceeds to operation S 12 .
- In operation S 12 , the priority setter 14 i rearranges the sections to be given a high priority according to the priorities set in operation S 11 and sends information of the rearranged sections to the update-difference converter 14 j . Thereafter, the process proceeds to operation S 13 .
- In operation S 13 , the update-difference converter 14 j obtains the update image data from the frame buffer 13 . When the information of the rearranged sections has been received from the priority setter 14 i , the update image data for the received section(s) is processed prior to the update image data for the other sections.
- The update-difference converter 14 j then determines whether or not a region in which the update image data is to be displayed is included in the frequent-update region. When the region is included in the frequent-update region, the process proceeds to operation S 14 . When the region is not included in the frequent-update region, the process proceeds to operation S 16 .
- In operation S 14 , the update-difference converter 14 j converts the update image data into encoded image data by cutting out, in the update image data stored in the frame buffer 13 , a bitmap image corresponding to the frequent-update region and encoding the bitmap image. Thereafter, the process proceeds to operation S 15 .
- In operation S 15 , the screen-update reporter 14 k transmits the encoded image data, encoded by the update-difference converter 14 j , to the client apparatus 20 via the communicator 14 a . Thereafter, the processing illustrated in FIG. 11 ends.
- On the client apparatus 20 side, the screen-update information acquirer 21 c sends the received encoded image data to the moving-image data display unit 21 e .
- The moving-image data display unit 21 e decodes the received encoded image data and displays the decoded image data on the display device 204 a.
- In operation S 16 , the screen-update reporter 14 k transmits the image data of the update rectangle to the client apparatus 20 via the communicator 14 a . Thereafter, the processing illustrated in FIG. 11 ends.
- On the client apparatus 20 side, the screen-update information acquirer 21 c sends the received image data to the image-data display unit 21 d .
- The image-data display unit 21 d displays the received image data on the display device 204 a.
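- The server-side flow of FIG. 11 can be condensed into a per-frame loop. The following sketch is a non-authoritative paraphrase; the helper names (detect_edges, make_update_rectangles, and so on) and the one-second re-division interval are placeholders standing in for the units 14 d , 14 e , and 14 g to 14 k .

```python
import time

REDIVIDE_INTERVAL = 1.0  # seconds; assumed value for the "certain amount of time" in operation S1

def server_frame_loop(frame_buffer, units, client):
    """One reading of operations S1-S16 of FIG. 11 (simplified, non-authoritative sketch)."""
    last_division = 0.0
    sections = []
    while True:
        if time.monotonic() - last_division >= REDIVIDE_INTERVAL:  # S1
            edges = units.detect_edges(frame_buffer.screen())       # S2
            sections = units.divide_into_sections(edges)            # S3
            last_division = time.monotonic()
        if not frame_buffer.has_update():                           # S4
            time.sleep(0.01)
            continue
        rects = units.make_update_rectangles(frame_buffer)          # S5
        priority = units.set_priorities(sections)                   # frequencies, association degrees,
                                                                    # and priorities (through S12)
        for rect in units.order_by_priority(rects, priority):       # S13
            if units.in_frequent_update_region(rect):
                client.send(units.encode_region(rect))              # S14/S15: encoded path
            else:
                client.send(rect)                                   # S16: plain update rectangle
```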
- FIG. 12 is a flowchart illustrating operation-section detection processing.
- In operation S 21 , the operation-position detector 14 c determines whether or not reception of operation information is detected at the communicator 14 a . When reception of operation information is detected (Yes in operation S 21 ), the process proceeds to operation S 22 . When reception of operation information is not detected (No in operation S 21 ), the operation-section detection processing ends.
- In operation S 22 , the operation-position detector 14 c determines whether or not the mouse is operated. More specifically, the operation-position detector 14 c extracts mouse-cursor position information from the received operation information and compares the extracted position information with previously extracted position information. When the pieces of position information do not match each other, it is determined that the mouse is operated. When it is determined that the mouse is operated (Yes in operation S 22 ), the process proceeds to operation S 23 . When it is determined that the mouse is not operated (No in operation S 22 ), the operation-section detection processing ends.
- In operation S 23 , the operation-position detector 14 c obtains mouse-operation information and extracts the mouse-cursor position therefrom. Thereafter, the process proceeds to operation S 24 .
- In operation S 24 , the operation-position detector 14 c compares the extracted mouse-cursor position with a previous mouse-cursor position to determine whether or not the position of the mouse cursor has been changed. When the position has been changed (Yes in operation S 24 ), the process proceeds to operation S 25 . When the position has not been changed (No in operation S 24 ), the operation-section detection processing ends.
- In operation S 25 , the operation-position detector 14 c sends the mouse-cursor position information to the association-degree setter 14 h and the priority setter 14 i . Thereafter, the operation-section detection processing ends.
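- A minimal sketch of the operation-section detection flow of FIG. 12 , assuming the operation information arrives as a dictionary containing the cursor position; the field and callback names are hypothetical.

```python
def detect_operation_section(operation_info, state, downstream):
    """Operations S21-S25 of FIG. 12 as a sketch; `state` remembers the previous cursor position."""
    if operation_info is None:                       # S21: no operation information received
        return
    position = operation_info.get("cursor")          # S22/S23: extract the mouse-cursor position
    if position is None or position == state.get("previous"):
        return                                       # S22/S24: mouse not operated or cursor not moved
    state["previous"] = position
    downstream.notify_cursor_position(position)      # S25: report to 14h and 14i (placeholder call)
```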
- FIG. 13 is a flowchart illustrating channel-band detection processing.
- In operation S 31 , the channel-band detector 14 m measures the amount of data transmitted by the screen-update reporter 14 k . Thereafter, the process proceeds to operation S 32 .
- In operation S 32 , the channel-band detector 14 m measures the amount of data currently transmitted by the communicator 14 a . Thereafter, the process proceeds to operation S 33 .
- In operation S 33 , the channel-band detector 14 m sends the values of the two amounts of data, measured in operations S 31 and S 32 , to the priority setter 14 i . Thereafter, the process proceeds to operation S 34 .
- In operation S 34 , the channel-band detector 14 m determines whether or not the two amounts of data measured in operations S 31 and S 32 are equal to each other. When the amounts of data are equal to each other (Yes in operation S 34 ), the process proceeds to operation S 37 . When the amounts of data are not equal to each other (No in operation S 34 ), the process proceeds to operation S 35 . Also, when no data is transmitted from the screen-update reporter 14 k , the process proceeds to operation S 37 .
- In operation S 35 , the channel-band detector 14 m transmits data for measurement to the client apparatus 20 to measure an available band. Thereafter, the process proceeds to operation S 36 .
- In operation S 36 , the channel-band detector 14 m sends the value of the available band, measured in operation S 35 , to the priority setter 14 i . Thereafter, the channel-band detection processing ends.
- In operation S 37 , the channel-band detector 14 m sends the amount of data, currently transmitted by the communicator 14 a , to the priority setter 14 i as the value of the available band. Thereafter, the channel-band detection processing ends.
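- The channel-band detection flow of FIG. 13 reduces to a comparison of two measured amounts of data. The sketch below follows the branch structure described above; the probe and report callbacks are placeholders for the measurement transmission and for the hand-off to the priority setter 14 i .

```python
def detect_channel_band(generated_bytes, transmitted_bytes, probe, report):
    """Operations S31-S37 of FIG. 13 as a sketch (all callback names are placeholders).

    generated_bytes:   amount of data handed over by the screen-update reporter 14k (S31)
    transmitted_bytes: amount of data actually sent by the communicator 14a (S32)
    probe():           sends measurement data to the client and returns a band estimate (S35)
    report(value):     passes a measurement to the priority setter 14i
    """
    report((generated_bytes, transmitted_bytes))                      # S33: raw measurements
    if generated_bytes == 0 or generated_bytes == transmitted_bytes:  # S34
        report(transmitted_bytes)       # S37: use the current transmission rate as the band value
    else:
        report(probe())                 # S35/S36: actively measure and report the free band
```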
- FIG. 14 is a flowchart illustrating window-edge detection processing.
- In operation S 41 , the window-edge detector 14 d obtains screen data stored in the frame buffer 13 . Thereafter, the process proceeds to operation S 42 .
- In operation S 42 , the window-edge detector 14 d performs edge detection processing on the screen data, obtained in operation S 41 , to detect edges and divides the screen into multiple sections by using the detected edges. Thereafter, the process proceeds to operation S 43 .
- In operation S 43 , the window-edge detector 14 d selects one of the unselected sections of the sections divided in operation S 42 . Thereafter, the process proceeds to operation S 44 .
- In operation S 44 , the window-edge detector 14 d determines whether or not the section selected in operation S 43 is adjacent to another section with a certain gap therebetween or is superimposed on another section. When the selected section is adjacent to or superimposed on another section (Yes in operation S 44 ), the process proceeds to operation S 46 . When it is not (No in operation S 44 ), the process proceeds to operation S 45 .
- the window-edge detector 14 d determines whether or not the section selected in operation S 43 exists inside another section. When the selected section exists inside another section (Yes in operation S 45 ), the process proceeds to operation S 48 . When the selected section does not exist inside another section (No in operation S 45 ), the process proceeds to operation S 46 .
- In operation S 46 , the window-edge detector 14 d determines whether or not the area of the section selected in operation S 43 is smaller than or equal to a predetermined area threshold (e.g., 200 pixels × 200 pixels).
- In operation S 47 , the window-edge detector 14 d couples the section selected in operation S 43 with the adjacent or superimposed section. Thereafter, the process proceeds to operation S 49 .
- In operation S 49 , the window-edge detector 14 d determines whether or not the processing in operations S 44 to S 48 has been performed on all sections. When it is determined that the processing in operations S 44 to S 48 has been performed on all sections (Yes in operation S 49 ), the process proceeds to operation S 50 . When it is determined that the processing in operations S 44 to S 48 has not been performed on all sections (No in operation S 49 ), the process returns to operation S 43 .
- In operation S 50 , the window-edge detector 14 d determines whether or not any sections on which the coupling processing in operation S 47 or the deletion processing in operation S 48 has been performed exist among all of the selected sections. When such sections exist, the process proceeds to operation S 51 ; operation S 50 is performed because a section on which the coupling processing or the deletion processing is to be further performed may exist. When no such sections exist, the process proceeds to operation S 52 .
- In operation S 52 , the window-edge detector 14 d sends, to the update-frequency calculator 14 g , section information including the sections that remain as a result of the coupling processing and the deletion processing. Thereafter, the window-edge detection processing ends.
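- The coupling/deletion pass of FIG. 14 can be sketched as follows. Because the exact branch structure between operations S 44 and S 48 is only partly recoverable from the text, this is one plausible reading rather than a faithful reproduction: small sections adjacent to or superimposed on another section are coupled with it, while sections inside other sections and small isolated sections are deleted.

```python
AREA_THRESHOLD = 200 * 200  # pixels; the example threshold from operation S46

def union(a, b):
    """Bounding box covering both rectangles (one way to 'couple' two sections)."""
    x1, y1 = min(a["x"], b["x"]), min(a["y"], b["y"])
    x2 = max(a["x"] + a["w"], b["x"] + b["w"])
    y2 = max(a["y"] + a["h"], b["y"] + b["h"])
    return {"x": x1, "y": y1, "w": x2 - x1, "h": y2 - y1}

def refine_sections(sections, is_adjacent_or_overlapping, is_inside):
    """One coupling/deletion pass over detected sections (operations S43-S49, simplified).

    `sections` is a list of {"x", "y", "w", "h"} dicts; the two predicates stand in
    for the geometric tests that the document leaves abstract.  The real flow repeats
    this pass while any coupling or deletion occurred (operation S50).
    """
    remaining = list(sections)
    result = []
    while remaining:
        section = remaining.pop(0)                                   # S43: pick an unselected section
        if any(is_inside(section, other) for other in remaining + result):
            continue                                                 # inside another section: delete
        neighbor = next((o for o in remaining
                         if is_adjacent_or_overlapping(section, o)), None)
        small = section["w"] * section["h"] <= AREA_THRESHOLD        # S46: area check
        if neighbor is not None and small:                           # small fragment next to a window
            remaining.remove(neighbor)
            remaining.append(union(section, neighbor))               # S47: couple with the neighbor
            continue
        if small:
            continue                                                 # small isolated section: delete
        result.append(section)
    return result
```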
- FIG. 15 is a flowchart illustrating update-frequency calculation processing.
- In operation S 61 , the update-frequency calculator 14 g determines whether or not the section information is received from the window-edge detector 14 d . When the section information is received (Yes in operation S 61 ), the process proceeds to operation S 62 . When no section information is received (No in operation S 61 ), the update-frequency calculation processing ends.
- In operation S 62 , the update-frequency calculator 14 g determines whether or not the update frequency information is received from the update-region setter 14 f . When the update frequency information is received (Yes in operation S 62 ), the process proceeds to operation S 63 . When no update frequency information is received (No in operation S 62 ), the update-frequency calculation processing ends.
- In operation S 63 , the update-frequency calculator 14 g generates section-specific update frequency information indicating the update frequency for each section, on the basis of the section information received in operation S 61 and the update frequency information received in operation S 62 . Thereafter, the process proceeds to operation S 64 .
- In operation S 64 , the update-frequency calculator 14 g sends the section-specific update frequency information, generated in operation S 63 , to the association-degree setter 14 h and the priority setter 14 i . Thereafter, the update-frequency calculation processing ends.
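- One way to realize operation S 63 is to attribute each update region reported by the update-region setter 14 f to the sections it overlaps and to count updates per section. The document does not state the exact attribution rule, so the overlap test below is an assumption.

```python
def intersects(region, section):
    """True when an update region overlaps a section (both are x/y/w/h dicts)."""
    return (region["x"] < section["x"] + section["w"] and
            section["x"] < region["x"] + region["w"] and
            region["y"] < section["y"] + section["h"] and
            section["y"] < region["y"] + region["h"])

def section_update_frequencies(sections, update_regions):
    """Sketch of operation S63: count, per section, the update regions that overlap it.

    `sections` maps a section name to its rectangle; `update_regions` is the list of
    rectangles reported by the update-region setter 14f for one measurement period.
    """
    counts = {name: 0 for name in sections}
    for region in update_regions:
        for name, rect in sections.items():
            if intersects(region, rect):
                counts[name] += 1
    return counts

# Hypothetical usage:
# sections = {"A1": {"x": 0, "y": 0, "w": 400, "h": 300}, ...}
# counts = section_update_frequencies(sections, regions_from_14f)
```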
- Processing (i.e., association-degree setting processing) of the association-degree setter 14 h will be described next.
- FIG. 16 is a flowchart illustrating association-degree setting processing.
- In operation S 71 , the association-degree setter 14 h determines whether or not the section-specific update frequency information is received from the update-frequency calculator 14 g . When the section-specific update frequency information is received (Yes in operation S 71 ), the process proceeds to operation S 72 . When no section-specific update frequency information is received (No in operation S 71 ), the association-degree setting processing ends.
- In operation S 72 , the association-degree setter 14 h determines whether or not the mouse-cursor position information is received from the operation-position detector 14 c . When the mouse-cursor position information is received (Yes in operation S 72 ), the process proceeds to operation S 73 . When no mouse-cursor position information is received (No in operation S 72 ), the association-degree setting processing ends.
- In operation S 73 , the association-degree setter 14 h attaches (associates) the time at which the mouse-cursor position information was received in operation S 72 to (with) the received mouse-cursor position information and stores the associated information in the cursor-position management table T 1 . Thereafter, the process proceeds to operation S 74 .
- In operation S 74 , the association-degree setter 14 h determines whether or not the amount of information stored in the cursor-position management table T 1 is larger than or equal to a predetermined amount. When the amount of stored information is larger than or equal to the predetermined amount (Yes in operation S 74 ), the process proceeds to operation S 75 . When it is smaller than the predetermined amount (No in operation S 74 ), the association-degree setting processing ends.
- In operation S 75 , the association-degree setter 14 h calculates update rates for the respective sections for each certain amount of time. Thereafter, the process proceeds to operation S 76 .
- In operation S 76 , the association-degree setter 14 h calculates, for each section, the degrees of association with the other sections. The association-degree setter 14 h then stores the determined association degrees in an association-degree management table T 6 (described below). Thereafter, the association-degree setting processing ends.
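- A sketch of the update-rate calculation in operation S 75 . Expressing the rate as the percentage of a section's pixels updated during a time slot is an assumption; the document only states that the rates are recorded in units of %.

```python
def update_rate_table(section_pixel_updates, section_areas):
    """Build a T2-like table of per-section update rates for each time slot.

    `section_pixel_updates[t][name]` is the number of pixels updated in section
    `name` during time slot `t`; `section_areas[name]` is the section's total area.
    """
    table = {}
    for t, per_section in section_pixel_updates.items():
        table[t] = {name: round(100 * per_section.get(name, 0) / section_areas[name])
                    for name in section_areas}
    return table

# Hypothetical usage reproducing one row of table T2 (all input numbers are invented):
# update_rate_table({"10:10:14": {"A1": 20400, "C1": 25200}},
#                   {"A1": 120000, "C1": 120000})
# -> {"10:10:14": {"A1": 17, "C1": 21}}
```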
- FIG. 17 is a flowchart illustrating priority setting processing.
- In operation S 81 , the priority setter 14 i determines whether or not the data transmitted by the communicator 14 a reaches an upper limit of the available band of the network 50 . When the transmitted data reaches the upper limit (Yes in operation S 81 ), the process proceeds to operation S 82 . When the transmitted data does not reach the upper limit (No in operation S 81 ), the priority setting processing ends.
- In operation S 82 , the priority setter 14 i receives the mouse-cursor position information from the operation-position detector 14 c , and identifies the section in which the mouse cursor is located, on the basis of the section information created by the window-edge detector 14 d . Thereafter, the process proceeds to operation S 83 .
- In operation S 83 , the priority setter 14 i refers to the association-degree management table to retrieve the section having the highest degree of association with the mouse-cursor-located section identified in operation S 82 . Thereafter, the process proceeds to operation S 84 .
- In operation S 84 , the priority setter 14 i determines whether or not the association degree of the section retrieved in operation S 83 is higher than or equal to a predetermined threshold. When the association degree is higher than or equal to the threshold (Yes in operation S 84 ), the process proceeds to operation S 85 . When the association degree of the section retrieved in operation S 83 is lower than the predetermined threshold (No in operation S 84 ), the priority setting processing ends.
- In operation S 85 , the priority setter 14 i sets a combination of the mouse-cursor-located section and the section that is the most highly associated therewith as highest-priority sections. Thereafter, the process proceeds to operation S 86 .
- In operation S 86 , the priority setter 14 i sends information of the combination of the highest-priority sections to the update-difference converter 14 j . Thereafter, the process proceeds to operation S 87 .
- In operation S 87 , a determination is made as to whether or not the data transmitted by the communicator 14 a reaches the upper limit of the available band of the network 50 , on the basis of the currently available network band detected by the channel-band detector 14 m . When the transmitted data reaches the upper limit, the process proceeds to operation S 88 . When it does not, the priority setting processing ends.
- In operation S 88 , the priority setter 14 i refers to the association-degree management table T 6 to retrieve the section having the second highest degree of association with the mouse-cursor-located section after the section identified in operation S 83 . Thereafter, the process proceeds to operation S 89 .
- In operation S 89 , the priority setter 14 i determines whether or not the association degree of the section retrieved in operation S 88 is higher than or equal to a predetermined threshold. When the association degree is higher than or equal to the threshold (Yes in operation S 89 ), the process returns to operation S 86 . When the association degree of the section retrieved in operation S 88 is lower than the predetermined threshold (No in operation S 89 ), the priority setting processing ends.
- Since every section has some value of the association degree, the threshold is set, and information regarding a combination of the mouse-cursor-located section and sections whose degrees of association therewith are higher than or equal to the threshold is transmitted prior to information for the other sections.
- When the amount of data generated is larger than the amount of data that has already been transmitted, transmitting the data of only the sections having the highest association degree makes it possible to reduce the amount of data generated and also makes it possible to transmit all of the generated data.
- The use of the association degree also allows the data for a section having a relatively low update frequency to be transmitted and displayed with high priority, thus making it possible to improve the operation response felt by the user.
- Although the priority setter 14 i determines whether or not the retrieved association degree is higher than or equal to the threshold in the description above, the arrangement may be such that, after the association-degree setter 14 h calculates the association degrees, any association degree that is lower than the threshold is deleted and operation S 84 is eliminated from the priority setting processing.
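- The priority selection of FIG. 17 can be sketched as a loop that keeps adding the next most highly associated section while the channel remains saturated. The threshold value and the callback used to check the band are assumptions.

```python
ASSOCIATION_THRESHOLD = 80  # assumed value; the document only says "a predetermined threshold"

def select_priority_sections(cursor_section, association_table, band_is_saturated):
    """Operations S82-S89 of FIG. 17 as a sketch.

    `association_table[cursor_section]` maps every other section to its association
    degree (the T6 row for the cursor-located section); `band_is_saturated()` stands
    in for the check against the band detected by the channel-band detector 14m.
    """
    priority = [cursor_section]                         # S82: section under the mouse cursor
    candidates = sorted(association_table[cursor_section].items(),
                        key=lambda item: item[1], reverse=True)
    for section, degree in candidates:                  # S83/S88: highest association first
        if degree < ASSOCIATION_THRESHOLD:              # S84/S89: stop below the threshold
            break
        priority.append(section)                        # S85/S86: add to the high-priority set
        if not band_is_saturated():                     # S87: enough band left, stop adding
            break
    return priority
```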
- FIG. 18 illustrates a specific example of association-degree setting processing.
- In the cursor-position management table T 1 illustrated in FIG. 18 , the "update-frequency information" column is omitted.
- the position of the mouse cursor is assumed to be located in the section C 1 between time 10:10:10 and time 10:10:14.
- FIG. 18 illustrates a table T 2 that includes the calculated update rates.
- the table T 2 indicates that, for example, update rates of 0, 1, 0, 0, and 17 (in units of %) are obtained in the section A 1 between time 10:10:10 and time 10:10:14.
- The association-degree setter 14 h uses the table T 2 to calculate, between the sections, the differences between the update rates at each update occurrence time at which operations were performed.
- FIG. 18 illustrates a table T 3 that includes the calculated update-rate differences.
- For example, the update-rate differences between the section A 1 and the section C 1 at the respective times are 0, 1, 0, 0, and 4, which add up to 5. This value "5" is used as an update-rate difference of the section A 1 relative to the operation on the section C 1 .
- an update-rate difference of the section B 1 relative to the section C 1 is determined to be 32 and an update-rate difference of the section D 1 relative to the section C 1 is determined to be 32.
- The association-degree setter 14 h adds up the update-rate differences of all of the sections and divides each update-rate difference by the sum to calculate an update-rate mismatch degree relative to the section C 1 .
- FIG. 18 illustrates a table T 4 that includes the calculated update-rate mismatch degrees.
- the association-degree setter 14 h multiplies values, obtained by subtracting the values included in the table T 4 from “1”, by 100 to thereby calculate the association degrees.
- FIG. 18 illustrates a table T 5 indicating the calculated association degrees.
- FIG. 18 illustrates an association-degree management table T 6 that includes all the association degrees between the sections. Checking the degree of association of the mouse-cursor-located section with another section, as described above, makes it possible to determine the association degrees between the sections.
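- The calculation behind tables T 3 to T 6 can be reproduced with a few lines of code. In the sketch below, the update rates for the sections B 1 and D 1 are hypothetical values chosen only so that their difference sums match the stated totals of 32; with them, the figures quoted in the text (a difference sum of 5 for the section A 1 and an association degree of 93) are reproduced.

```python
def association_degrees(update_rates, reference="C1"):
    """Association degrees of the other sections with `reference`, as in FIG. 18.

    `update_rates[name]` is the list of update rates (in %) for one section over the
    time slots during which the reference section was being operated.
    """
    reference_rates = update_rates[reference]
    # Table T3: per-section sums of absolute update-rate differences.
    diffs = {name: sum(abs(a - b) for a, b in zip(rates, reference_rates))
             for name, rates in update_rates.items() if name != reference}
    total = sum(diffs.values())
    # Table T4: mismatch degree = share of the total difference;
    # Table T5: association degree = (1 - mismatch) * 100.
    return {name: round((1 - d / total) * 100) for name, d in diffs.items()}

rates = {"A1": [0, 1, 0, 0, 17],
         "B1": [9, 3, 12, 4, 17],   # hypothetical values whose differences sum to 32
         "C1": [0, 0, 0, 0, 21],
         "D1": [6, 6, 7, 8, 16]}    # hypothetical values whose differences sum to 32
print(association_degrees(rates))   # {'A1': 93, 'B1': 54, 'D1': 54}
```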
- the priority setter 14 i refers to the row of C 1 in the association-degree management table T 6 to retrieve the section that is the most highly associated with the section C 1 . Since the section A 1 having an association degree of 93 has the highest degree of association with the section C 1 in the association-degree management table T 6 , the highest association degree is given to the section C 1 and the section A 1 . The priority setter 14 i then determines that data for the sections C 1 and A 1 are to be transmitted with high priority. The priority setter 14 i then sends information of the combination of the sections C 1 and A 1 to the update-difference converter 14 j .
- When the transmitted data still reaches the upper limit of the available band, image data of an update rectangle belonging to the section B 1 , which has the second highest degree of association with the section C 1 after the section A 1 , is also sent to the update-difference converter 14 j.
- the server apparatus 10 detects edges from the desktop screen, divides the desktop screen into multiple sections, calculates update frequencies of the respective sections, determines degrees of association with the mouse-cursor-located section, and transmits the image data of update rectangles for the sections having the highest association degree to the client apparatus 20 prior to the image data for the other sections. Accordingly, it is possible to increase the speed of response to a user operation.
- the processing performed by the server apparatus 10 may also be executed by a plurality of apparatuses in a distributed manner.
- the arrangement may be such that one apparatus performs processing up to the association-degree setting processing to generate the association-degree management table T 6 and another apparatus determines a combination of the sections having the highest association degree by using the association-degree management table T 6 .
- the information processing system according to the third embodiment is different from the information processing system according to the second embodiment in that the window-edge detector 14 d refers to the association-degree management table T 6 , set by the association-degree setter 14 h , to change conditions for setting the sections. Two methods will be described below.
- In the first method, when a section whose degrees of association with all other sections are low exists, the window-edge detector 14 d performs processing for excluding that section in the window-edge detection processing. For example, when the section F 1 is such a section, the window-edge detector 14 d performs processing for excluding the section F 1 in the window-edge detection processing.
- FIG. 19 is a flowchart illustrating first window-edge detection processing in the third embodiment. Operations that are different from those in the second embodiment will be particularly described below.
- In operation S 42 a , the window-edge detector 14 d refers to the association-degree management table T 6 to determine whether or not a section (i.e., an unassociated section) whose degrees of association with all other sections are lower than or equal to a predetermined value exists in the sections stored in the association-degree management table T 6 . When an unassociated section exists (Yes in operation S 42 a ), the process proceeds to operation S 42 b . When no unassociated section exists (No in operation S 42 a ), the process proceeds to operation S 43 .
- In operation S 42 b , the window-edge detector 14 d deletes the unassociated section from the sections divided in operation S 42 . Thereafter, the process proceeds to operation S 43 .
- In the second method, the window-edge detector 14 d changes the conditions for the edge detection and for the subsequent section division so as to perform finer section division. For example, the threshold in operation S 45 may be varied in the edge detection processing so as to increase the number of sections in the screen.
- FIG. 20 is a flowchart illustrating second window-edge detection processing in the third embodiment. Operations that are different from those in the second embodiment will be particularly described below.
- The window-edge detector 14 d refers to the association-degree management table T 6 to determine whether or not all of the association degrees included in the association-degree management table T 6 are lower than or equal to a predetermined value. When all of the association degrees are lower than or equal to the predetermined value, the process proceeds to operation S 42 d . When they are not, the process proceeds to operation S 43 .
- In operation S 42 d , the window-edge detector 14 d reduces the value of the threshold to be used in operation S 46 . Thereafter, the process proceeds to operation S 43 .
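- Both third-embodiment adjustments can be sketched as one post-processing step over the association-degree management table T 6 . The concrete limit for "low" association degrees and the factor by which the area threshold is reduced are assumptions.

```python
UNASSOCIATED_LIMIT = 20   # assumed value for "lower than or equal to a predetermined value"

def adapt_section_conditions(association_table, sections, area_threshold):
    """Third-embodiment adjustments sketched as one function.

    First method (S42a/S42b): drop sections whose association degrees with every other
    section are low.  Second method (S42c/S42d in FIG. 20): when all association degrees
    are low, shrink the area threshold so that the screen is divided more finely.
    """
    kept = []
    for name in sections:
        degrees = association_table.get(name)
        if degrees is None or any(d > UNASSOCIATED_LIMIT for d in degrees.values()):
            kept.append(name)                # keep sections associated with at least one other
    all_low = all(degree <= UNASSOCIATED_LIMIT
                  for row in association_table.values() for degree in row.values())
    if all_low:
        area_threshold //= 2                 # e.g., halve the threshold used in operation S46
    return kept, area_threshold
```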
- the information processing system according to the third embodiment provides substantially the same advantages as the information processing system according to the second embodiment.
- In the first method, since the unassociated section is excluded, the amount of calculation is further reduced, thereby making it possible to speed up the response to a user operation.
- In the second method, since the threshold is reduced, the number of sections increases. This arrangement makes it possible to increase the possibility that sections having a high association degree are identifiable.
- In the second embodiment, the condition for transmitting the update-difference data in descending order of priority is that the data transfer is not performed in time, as illustrated in operation S 81 in the priority setting processing (in FIG. 17 ) performed by the priority setter 14 i.
- The information processing system according to the fourth embodiment reduces the amount of erroneous determination in this priority setting processing.
- FIG. 21 is a block diagram illustrating a server apparatus in the fourth embodiment.
- a server apparatus 10 a in the present embodiment further includes an operation-state detector 14 n for detecting an operation state.
- the operation-state detector 14 n obtains operation information from the operation-information acquirer 14 b .
- the operation-state detector 14 n then holds information indicating whether or not the user was performing an operation at a certain time and information indicating on which section the operation was performed when he or she was performing an operation.
- By using the information held by the operation-state detector 14 n , the priority setter 14 i sets priorities.
- FIG. 22 illustrates the information held by the operation-state detector.
- FIG. 22 illustrates a tabularized form of the information held by the operation-state detector 14 n.
- An operation-state management table T 7 illustrated in FIG. 22 has a “time” column, a “tool operation state” column, and a “section” column. Pieces of information that are horizontally arranged are associated with each other.
- In the "time" column, the time at which the operation-state detector 14 n obtained the operation information from the operation-information acquirer 14 b is contained.
- In the "section" column, information that identifies the section on which an operation was performed when the user was operating the mouse is contained.
- FIG. 23 is a flowchart illustrating priority setting processing in the fourth embodiment. Operations that are different from those in the second embodiment will be particularly described below.
- The priority setter 14 i refers to the operation-state management table T 7 to determine whether or not the information at the most recent time indicates that an operation was performed. When the information indicates that an operation was performed, the process proceeds to operation S 83 . When it does not, the priority setting processing ends.
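- A sketch of the operation-state management table T 7 and of the additional check introduced in the fourth embodiment; the record layout and the sample entries are hypothetical, based on FIG. 22 .

```python
from datetime import datetime

# Hypothetical records modelled on the T7 columns "time", "tool operation state", and "section".
operation_state_table = [
    {"time": datetime(2011, 1, 1, 10, 10, 13), "operating": True,  "section": "C1"},
    {"time": datetime(2011, 1, 1, 10, 10, 14), "operating": False, "section": None},
]

def operation_in_progress(table):
    """True when the most recent entry of T7 indicates that the user was operating."""
    if not table:
        return False
    latest = max(table, key=lambda row: row["time"])
    return latest["operating"]

# The priority setter would run its S82-S89 logic only when this returns True,
# which reduces erroneous prioritization while no operation is being performed.
```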
- the information processing system according to the fourth embodiment provides substantially the same advantages as the information processing system according to the second embodiment.
- the information processing system according to the fourth embodiment can further reduce the amount of erroneous determination in the priority processing.
- In the fifth embodiment, although the association degree is a value indicating that updates occurred at the same time, it is, more specifically, a value indicating that updates occurred at the same time when an operation was performed.
- the server apparatus in the present embodiment utilizes operation history to enhance the reliability of the association degrees set by the association-degree setter 14 h.
- FIG. 24 is a block diagram illustrating a server apparatus in the fifth embodiment.
- a server apparatus 10 b in the present embodiment includes an operation-information history accumulator 14 p having a function for accumulating the user's operation history, in addition to the configuration of the server apparatus 10 , described above.
- the operation-information history accumulator 14 p accumulates, as history information, information indicating in which section an instruction operation (e.g., a click operation) at the mouse-cursor position is performed for each unit time.
- the association-degree setter 14 h then sets association degrees on the basis of the accumulated history information.
- FIG. 25 illustrates the history information held by the operation-information history accumulator.
- FIG. 25 illustrates a tabularized form of the history information held by the operation-information history accumulator 14 p.
- An operation-information history management table T 8 illustrated in FIG. 25 has a “time” column, a “section” column, and a “mouse position (x, y)” column. Pieces of information that are horizontally arranged are associated with each other.
- In the "time" column, the time at which the operation-information history accumulator 14 p obtains the mouse-cursor position information from the operation-position detector 14 c is contained.
- In the "section" column, information that identifies the number of times the mouse click operation is performed in each section is contained.
- the section information may be obtained from the window-edge detector 14 d.
- In the "mouse position (x, y)" column, mouse-cursor position information obtained from the operation-information acquirer 14 b is contained.
- FIG. 26 is a flowchart illustrating association-degree setting processing in the fifth embodiment. Operations that are different from those in the second embodiment will be particularly described below.
- In operation S 76 a , the association-degree setter 14 h refers to the operation-information history management table T 8 to determine whether or not operation history information corresponding to a time of interest exists. When operation history information corresponding to the time of interest exists (Yes in operation S 76 a ), the process proceeds to operation S 76 b . When no operation history information corresponding to the time of interest exists (No in operation S 76 a ), the process proceeds to operation S 77 .
- In operation S 76 b , the association-degree setter 14 h sets an association degree by reducing the update-rate difference in a time slot when an operation was performed and increasing the update-rate difference in a time slot when no operation was performed. Thereafter, the process proceeds to operation S 77 .
- FIG. 27 illustrates a specific example of the association-degree setting processing in the fifth embodiment.
- a cursor-position management table T 1 and a table T 2 illustrated in FIG. 27 are substantially the same as those illustrated in FIG. 18 and used in the description of the specific example of the second embodiment.
- The association-degree setter 14 h uses the table T 2 to calculate, between the sections, the differences between the update rates at each of the update occurrence times at which operations were performed. Using the operation-information history management table T 8 , the association-degree setter 14 h applies a weight to the update-rate differences.
- FIG. 27 illustrates a table T 9 that includes values obtained by applying a weight to the calculated update-rate differences.
- the update-rate difference in the time slot when a mouse click operation was performed is reduced to one half and the update-rate difference in the time slot when no mouse click operation was performed is doubled.
- For example, update rates of 0, 1, 0, 0, and 17 are obtained in the section A 1 between time 10:10:10 and time 10:10:14, and update rates of 0, 0, 0, 0, and 21 are obtained in the section C 1 in the same period. According to the operation-information history management table T 8 , the mouse was clicked twice in the section C 1 at time 10:10:14. Accordingly, the update-rate difference at 10:10:14 is reduced to one half and the update-rate differences at the other times are doubled, yielding weighted differences of 0, 2, 0, 0, and 2, which add up to 4. This value "4" is used as an update-rate difference of the section A 1 relative to the operation on the section C 1 .
- Similarly, an update-rate difference of the section B 1 relative to the section C 1 is determined to be 35.5 and an update-rate difference of the section D 1 relative to the section C 1 is determined to be 34. Since the method of the subsequent association-degree determination is substantially the same as the method in the second embodiment, a description thereof is not given hereinafter.
- a table T 10 illustrated in FIG. 27 corresponds to the table T 4 and a table T 11 corresponds to the table T 5 .
- In this case, the degree of association between the section C 1 , in which the mouse cursor was operated, and the section A 1 is 95%, which is higher than the corresponding association degree of 93% in the case of the second embodiment.
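- The weighting used in this example can be verified with a short calculation: halving the difference in the clicked time slot and doubling the others turns the raw differences 0, 1, 0, 0, and 4 into 0, 2, 0, 0, and 2, whose sum is 4, and with the weighted differences of 35.5 and 34 for the sections B 1 and D 1 the association degree becomes (1 − 4/73.5) × 100 ≈ 95. A minimal sketch follows.

```python
def weighted_difference(rates_a, rates_ref, clicked_slots):
    """Fifth-embodiment weighting of update-rate differences.

    Differences in time slots where a click was recorded are halved and the others
    are doubled, as described for table T9; `clicked_slots` is a set of slot indices.
    """
    total = 0.0
    for i, (a, b) in enumerate(zip(rates_a, rates_ref)):
        diff = abs(a - b)
        total += diff / 2 if i in clicked_slots else diff * 2
    return total

# Reproducing the worked example: A1 = [0, 1, 0, 0, 17], C1 = [0, 0, 0, 0, 21],
# with a click recorded only in the last slot (10:10:14).
print(weighted_difference([0, 1, 0, 0, 17], [0, 0, 0, 0, 21], {4}))  # 4.0
```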
- the information processing system according to the fifth embodiment provides substantially the same advantages as the information processing system according to the second embodiment.
- a weight is further applied to thereby make it possible to further enhance the reliability of the association degrees set by the association-degree setter 14 h.
- In general, the mouse is used to move the cursor across various points on the desktop.
- Depending on the task, however, deviation occurs in the points between which the cursor is moved. For example, suppose a case in which, using a 3D CAD application, the user changes a setting for drawing or the like while adjusting an overall shape of a 3D object in a data-rendering section by rotating the 3D object. In this case, a larger amount of mouse operation than usual may occur for moving the cursor between the data-rendering section in which the 3D object is rendered and an operation-tool section for setting the drawing or the like.
- In the sixth embodiment, the deviation of the points between which the cursor is moved is reflected in the setting of the association degrees, to thereby enhance the reliability of the association degrees.
- FIG. 28 is a block diagram illustrating a server apparatus in the sixth embodiment.
- a server apparatus 10 c in the present embodiment further has a mouse-movement-history vector extractor 14 q having functions for generating information regarding a trace of mouse movement between multiple sections on the basis of the history information and accumulating the generated information.
- FIG. 29 illustrates an operation of the association-degree setter in the sixth embodiment.
- FIG. 29 illustrates a table T 12 for managing information regarding the mouse-operation traces extracted by the mouse-movement-history vector extractor 14 q.
- On the basis of the extracted trace information, the association-degree setter 14 h increases, in the data of the association-degree management table T 6 , the degree of association of a corresponding section relative to a data-rendering section.
- the table T 12 illustrated in FIG. 29 indicates a case in which a mouse operation for moving the cursor from the section C 1 to the section A 1 and a mouse operation for moving the cursor from the section D 1 to the section B 1 are performed ten times or more in a certain period of time. Adjustment is performed so that the values in the association-degree management table T 6 which correspond to the extracted combination are increased according to the number of times the operation was performed. As a result, in the association-degree management table T 6 illustrated in FIG. 18 , the value of the combination of the section C 1 and the section A 1 increases from 93 to 98 and the value of the combination of the section D 1 and the section B 1 increases from 89 to 93.
- This operation allows for setting of more accurate association degrees.
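- A sketch of the adjustment described above. The amount by which an association degree is raised per movement is an assumption; the document only states that the values are increased according to the number of times the operation was performed.

```python
from collections import Counter

MOVE_COUNT_LIMIT = 10   # "ten times or more in a certain period of time"
BOOST_PER_MOVE = 1      # assumed: how much each extra movement raises the degree

def boost_by_movement(association_table, cursor_trace):
    """Raise association degrees for section pairs the cursor moves between often.

    `cursor_trace` is the time-ordered list of sections the cursor visited, as
    extracted by the mouse-movement-history vector extractor 14q (table T12).
    """
    moves = Counter(zip(cursor_trace, cursor_trace[1:]))   # transitions between sections
    for (src, dst), count in moves.items():
        if src != dst and count >= MOVE_COUNT_LIMIT:
            bump = BOOST_PER_MOVE * (count - MOVE_COUNT_LIMIT + 1)
            row = association_table.setdefault(src, {})
            row[dst] = min(row.get(dst, 0) + bump, 100)    # cap the degree at 100
    return association_table
```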
- FIG. 30 is a flowchart illustrating association-degree setting processing in the sixth embodiment. Operations that are different from those in the fifth embodiment will be particularly described below.
- In operation S 77 a , the association-degree setter 14 h calculates, for each section, the degrees of association with the other sections. Thereafter, the process proceeds to operation S 77 b.
- In operation S 77 b , the association-degree setter 14 h refers to the operation-information history management table T 8 to determine whether or not operation history information corresponding to a time of interest exists. When operation history information corresponding to the time of interest exists (Yes in operation S 77 b ), the process proceeds to operation S 77 c . When no operation history information corresponding to the time of interest exists (No in operation S 77 b ), the processing ends.
- In operation S 77 c , the association-degree setter 14 h generates vector data on the basis of the operation history information. The association-degree setter 14 h then applies a weight to the association degree in accordance with the number of vectors. Thereafter, the association-degree setting processing ends.
- the information processing system according to the sixth embodiment provides substantially the same advantages as the information processing system according to the fifth embodiment.
- a weight is further applied to thereby make it possible to further enhance the reliability of the association degrees set by the association-degree setter 14 h.
- the above-described processing functions may be realized by a computer.
- In that case, a program in which details of the processing functions of the information processing apparatus 1 and the server apparatus 10 , 10 a , 10 b , or 10 c are written is supplied. When the program is executed by the computer, the above-described processing functions are realized on the computer.
- the program in which the details of the processing are written may be recorded to a computer-readable recording medium.
- Examples of the computer-readable recording medium include a magnetic storage device, an optical disk, a magneto-optical recording medium, and a semiconductor memory.
- Examples of the magnetic storage device include a hard disk drive, a flexible disk (FD), and a magnetic tape.
- Examples of the optical disk include a DVD, a DVD-RAM, and a CD-ROM/RW.
- One example of the magneto-optical recording medium is an MO (magneto-optical disk).
- For distribution of the program, portable recording media, such as DVDs and CD-ROMs, on which the program is recorded may be used. The program may also be stored in a storage device in a server computer so that the program can be transferred therefrom to another computer over a network.
- the computer that executes the program may store, in the storage device thereof, the program recorded on the portable recording medium or the like or transferred from the server computer. The computer then reads the program from the storage device thereof and executes processing according to the program. The computer may also directly read the program from the portable recording medium and execute the processing according to the program. In addition, each time the program is transferred from the server computer linked through a network, the computer may sequentially execute the processing according to the received program.
- At least one of the above-described processing functions may also be implemented by an electronic circuit, such as a DSP (digital signal processor), an ASIC, or a PLD (programmable logic device).
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Information Transfer Between Computers (AREA)
- Digital Computer Display Output (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
An information processing apparatus includes, a storage unit that stores an image to be transmitted, an update-frequency setter that sets, for respective sections set in the image to be transmitted, update frequencies of images stored for the sections in a predetermined period of time, an association-degree setter that sets association degrees to indicate degrees of association between the sections based on the update frequencies, a priority setter that identifies the section on which an operation is performed and sets a higher priority for the identified section and the section having a highest degree of association with the identified section than priorities for other sections, and a transmitter that transmits the image, stored by the storage unit, in sequence with the images stored for the sections whose set priority is higher first.
Description
This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2010-269617, filed on Dec. 2, 2010, the entire contents of which are incorporated herein by reference.
The embodiments discussed herein are related to an information processing apparatus, a method, and a recording medium.
In recent years, thin client systems have been known in which a server apparatus manages resources, such as applications and files, so that functionality equipped in a client apparatus is reduced as much as possible.
In the thin client system, with a result of processing executed by the server apparatus and/or data held thereby being displayed on the client apparatus, the client behaves as if it were playing major roles of executing the processing and/or were holding the data.
One example of a method for task execution in such a client system is a method in which a server apparatus executes applications for document creation tasks, mail management tasks, and so on and a client apparatus displays results of the processing of the applications.
In recent years, in addition to such document creation tasks and mail management tasks, there have been demands for extensively applying tasks to be executed by thin client systems to, for example, high-definition-image handling tasks, such as CAD (computer-aided design) creation tasks, and moving-image playback and edit tasks.
When a CPU (central processing unit) of a client apparatus executes a large-load task such as a CAD creation task or a moving-image-handling task, the amount of information transferred from the server apparatus to the client apparatus increases, which may delay responses to operations executed by the client apparatus. One known example of technologies for improving the response speed is a technology in which a display screen is divided into multiple blocks, a block in which the update frequency is high is detected from the blocks, and an image in the detected block is determined as a moving image and is read and transferred with high priority (e.g., Japanese Laid-open Patent Publication No. 11-275532).
According to an aspect of the embodiment, an information processing apparatus includes, a storage unit that stores an image to be transmitted, an update-frequency setter that sets, for respective sections set in the image to be transmitted, update frequencies of images stored for the sections in a predetermined period of time, an association-degree setter that sets association degrees to indicate degrees of association between the sections based on the update frequencies, a priority setter that identifies the section on which an operation is performed and sets a higher priority for the identified section and the section having a highest degree of association with the identified section than priorities for other sections, and a transmitter that transmits the image, stored by the storage unit, in sequence with the images for the sections whose set priority is higher first.
The object and advantages of the invention will be realized and attained by at least the features, elements, and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention, as claimed.
In the related art, there are cases in which an image desired by a user is transmitted subsequent to other images, thus taking a long time for the user to view the contents of the desired image. For example, applications for CAD and so on may display a set of multiple windows on a single monitor. The windows displayed include, for example, a window having a relatively high update frequency (e.g., a window in which data is rendered) and a window having a relatively low update frequency (e.g., a window in which operation tools are displayed). When rendering processing for the window having a high update frequency is processed with higher priority, the priority of the rendering processing for the window having a low update frequency is relatively reduced. Consequently, for example, when the user is performing, in a window in which operation tools are displayed, a mouse operation associated with a window in which moving-image data is rendered while viewing the window, a time lag may occur in movement of the mouse cursor although the moving-image data is quickly updated.
Delay in transmission of a desired image occurs not only in cases in which still images and moving images are handled, but is also common to cases in which the amount of information transmitted between a client apparatus and a server apparatus in a thin client system increases during screen update.
Embodiments of an information processing technology that is capable of reducing the amount of delay in transmission of desired images will be described below in detail with reference to the accompanying drawings.
An information processing apparatus according to an embodiment is first described and then other specific embodiments are described.
An information processing apparatus 1 (e.g., a computer) according to an embodiment has a function for transmitting image data to a client apparatus 2 in connection with a request for displaying an image on a display unit 2 a of the client apparatus 2.
The information processing apparatus 1 includes a storage unit 1 a, an update-frequency setter 1 b, an association-degree setter 1 c, a priority setter 1 d, and a transmitter 1 e.
The storage unit 1 a stores an image to be transmitted.
The update-frequency setter 1 b sets, for respective sections set in the image to be transmitted, update frequencies of the images stored for the sections in a predetermined period of time.
The information processing apparatus 1 according to the embodiment further includes a section setter 1 f for setting the sections. The section setter 1 f detects edges of each window displayed on the display unit 2 a. In the example of FIG. 1 , the section setter 1 f detects the frames of the windows a to i as edges. On the basis of the detected edges, the section setter 1 f sets the sections on a screen displayed on the display unit 2 a. For example, the section setter 1 f divides the screen into sections by using the detected edges. The section setter 1 f then deletes the divided sections having sizes that are smaller than or equal to a certain size and the sections existing inside other sections. In the example of FIG. 1 , the section setter 1 f sets the regions in the edges of the windows a to i as sections. The section setter 1 f then deletes the sections in the edges of the windows e and f, regarding that the sizes thereof are smaller than or equal to the certain size. The section setter 1 f also deletes the sections in the edges of the windows g, h, and i, since they exist inside the section of the window c. As a result of the processing, sections a1 to d1 of the windows a to d remain.
In the example of FIG. 1 , the update-frequency setter 1 b sets screen update frequencies for the sections a1 to d1, respectively. The update frequencies may be determined based on the amounts of data updated for the respective sections for each amount of time. More specifically, the update-frequency setter 1 b compares given frames of an image displayed on the display unit 2 a with each other and sets an update region in which the number of updates in the image is larger than or equal to a certain value. The update-frequency setter 1 b then superimposes the update region on the corresponding section(s) to thereby make it possible to set a screen update frequency for each section.
On the basis of the update frequencies obtained by the update-frequency setter 1 b, the association-degree setter 1 c sets association degrees indicating the degrees of association between the sections. For example, the association-degree setter 1 c refers to the update frequencies to identify the section having the highest update frequency. The association-degree setter 1 c further sets a high association degree for the section where an update is occurring simultaneously with the identified section. In the present embodiment, a combination of the section a1 and the section c1 is assumed to have a higher association degree than combinations of the other sections.
The priority setter 1 d identifies a section on which an operation is being performed and sets a higher priority for both the identified section and the section that is the most highly associated with the identified section than the priorities for the other sections. In the example of FIG. 1 , the section in which a cursor (e.g., a mouse cursor) 2 b is located and the section that is the most highly associated with the section in which the cursor 2 b is located are set as a combination of the sections having the highest association degree.
The transmitter 1 e transmits the image, stored in the storage unit 1 a, in sequence with the images for the sections whose set priority is higher first. FIG. 1 illustrates image data A to D to be transmitted by the transmitter 1 e. The image data A is data to be displayed in the section a1, the image data B is data to be displayed in the section b1, the image data C is data to be displayed in the section c1, and the image data D is data to be displayed in the section d1. When the priorities are not considered, the transmitter 1 e transmits the image data A, B, C, and D to the client apparatus 2, for example, in that order. On the other hand, when the priorities are considered, the transmitter 1 e transmits the image for the section c1 in which the mouse cursor 2 b is located and the image for the section a1 that is the most highly associated with the section c1, prior to the images for the other sections b1 and d1. As a result, the transmitter 1 e transmits the image data to the client apparatus 2, for example, in order of A, C, B, and D. With this arrangement, since the image for the section on which the user is actually performing an operation is transmitted prior to the images for the other sections, the amount of delay in responding to the user operation can be reduced. The client apparatus 2 receives the image data and then displays the image data in specified sections. In the example of FIG. 1 , the client apparatus 2 first displays the image data A in the section a1 and displays the image data C in the section c1, and then displays the image data B in the section b1 and displays the image data D in the section d1. The processing described above makes it possible to reduce the amount of delay in the processing performed on the section c1 having a high degree of association with the section a1 having a high update frequency.
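The transmission ordering in this example can be reproduced with a stable partition of the per-section image data: the high-priority sections keep their original relative order and are simply moved to the front, which yields the order A, C, B, D. The sketch below is illustrative only; the data representation is an assumption.

```python
def order_transmission(image_data, priority_sections):
    """Reorder per-section image data so that high-priority sections go first.

    `image_data` is a list of (section, payload) pairs in their original order;
    a stable partition reproduces the A, C, B, D order given in the example.
    """
    first = [item for item in image_data if item[0] in priority_sections]
    rest = [item for item in image_data if item[0] not in priority_sections]
    return first + rest

queue = [("a1", "A"), ("b1", "B"), ("c1", "C"), ("d1", "D")]
print(order_transmission(queue, {"c1", "a1"}))
# [('a1', 'A'), ('c1', 'C'), ('b1', 'B'), ('d1', 'D')]
```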
The arrangement may also be such that the association-degree setter 1 c determines whether or not the association degrees are higher than or equal to a threshold and transmits the images for sections having association degrees that are higher than or equal to the threshold prior to the images for the other sections. With this arrangement, it is possible to reduce the possibility that the images for sections having low association degrees are transmitted prior to the images for the other sections.
The update-frequency setter 1 b, the association-degree setter 1 c, the priority setter 1 d, the transmitter 1 e, and the section setter 1 f may be realized by functions of a CPU included in the information processing apparatus 1. The storage unit 1 a may also be realized by a data storage area in a RAM (random access memory), a HDD (hard disk drive), or the like included in the information processing apparatus 1.
The transmitter 1 e may also transmit a large volume of data of still images, moving images, and so on to the client apparatus 2 as the image data in accordance with, for example, an RDP (remote desktop protocol) or an RFB (remote frame buffer) protocol for use in VNC (virtual network computing).
Other embodiments will be described below more specifically.
The server apparatus 10 and the client apparatus 20 are interconnected through a predetermined network 50 so as to enable mutual communication. The network 50 may be implemented by any type of communication network, such as the Internet, a LAN (local area network), or a VPN (virtual private network), regardless of whether it is wired or wireless. A description will be given of a case in which an RFB protocol for VNC is employed as one example of a protocol for communication between the server apparatus 10 and the client apparatus 20.
Although a case in which one client apparatus 20 is connected to one server apparatus 10 is illustrated in FIG. 2 , two or more client apparatuses 20 may also be connected to one server apparatus 10.
The server apparatus 10 may be a computer that offers a service for remotely controlling a screen to be displayed on the client apparatus 20.
The client apparatus 20 may be a computer that receives the remote-screen control service offered by the server apparatus 10. Examples of the client apparatus 20 include mobile terminals, such as a mobile phone, a PHS (personal handyphone system) phone, and a PDA (personal digital assistant), as well as stationary terminals, such as a personal computer.
The server apparatus 10 sequentially checks a screen in a desktop environment in which an OS (operating system) and applications are running and transmits any update to the client apparatus 20. The client apparatus 20 displays screen data received from the server apparatus 10 and also transmits a command, generated by an operation, to the server apparatus 10.
A description below is given of a case in which the user operates the client apparatus 20 to receive and use the desktop-environment screen, transmitted from the server apparatus 10, over the network 50.
It is assumed that, in this case, the user uses the screen in the desktop environment and the size of a desktop screen on the server apparatus 10 and the size of a screen of the client apparatus 20 are the same. It is also assumed that, in the desktop environment, an application (e.g., a CAD application) that features having multiple child windows within one application window or using multiple windows to implement the application is used with the entire area or a large area of the screen being occupied by those windows and transmission/reception of a large amount of update data is triggered by the user's mouse operation.
Although a case in which the user operates tools such as buttons is mainly envisaged as a situation in which the information processing system 5 of the embodiments is effective, the information processing system 5 may also be advantageously applied to situations in which another user directly operates data of a three-dimensional (3D) object.
The hardware configurations of the server apparatus 10 and the client apparatus 20 will be described below. A CPU 101 controls overall operations of the server apparatus 10. A RAM 102 and peripherals are coupled to the CPU 101 through a bus 108.
The RAM 102 is used as a primary storage device for the server apparatus 10. The RAM 102 temporarily stores at least part of the OS program and application programs to be executed by the CPU 101. The RAM 102 stores various types of data used for processing to be executed by the CPU 101.
Examples of the peripherals coupled to the bus 108 include a hard disk drive (HDD) 103, a graphics processing device 104, an input interface 105, an optical drive device 106, and a communication interface 107.
The hard disk drive 103 magnetically writes/reads data to/from its built-in disk. The hard disk drive 103 is used as a secondary storage device for the server apparatus 10. The hard disk drive 103 stores the OS program, application programs, and various types of data. The secondary storage device may also be implemented by a semiconductor storage device, such as a flash memory.
A monitor 104 a is coupled to the graphics processing device 104. In accordance with an instruction from the CPU 101, the graphics processing device 104 displays an image on a screen of the monitor 104 a. The monitor 104 a may be implemented by a liquid crystal display device, a display device using a CRT (cathode ray tube), or the like.
A keyboard 105 a and a mouse 105 b are coupled to the input interface 105. The input interface 105 sends signals, transmitted from the keyboard 105 a and the mouse 105 b, to the CPU 101. The mouse 105 b is one example of a pointing device and may be implemented by another pointing device. Examples of another pointing device include a touch panel, a graphics tablet, a touchpad, and a trackball.
The optical drive device 106 uses laser light or the like to read data recorded on an optical disk 200. The optical disk 200 is a portable recording medium to which data is recorded so as to be readable via light reflection. Examples of the optical disk 200 include a Blu-Ray® disc, a DVD (Digital Versatile Disc), a DVD-RAM, a CD-ROM (Compact Disc—Read Only Memory), and a CD-R/RW (Recordable/ReWritable).
The communication interface 107 is linked to the network 50. The communication interface 107 transmits/receives data to/from the client apparatus 20 over the network 50.
A CPU 201 controls overall operations of the client apparatus 20. A RAM 202, a flash memory 203, a graphics processing device 204, an input interface 205, and a communication interface 206 are coupled to the CPU 201 through a bus 207.
The functions of the RAM 202, the graphics processing device 204, the input interface 205, and the communication interface 206 are similar to those of the RAM 102, the graphics processing device 104, the input interface 105, and the communication interface 107, respectively.
The client apparatus 20 lacks a hard disk drive and has the flash memory 203.
A display device 204 a displays various types of information, for example, a desktop screen transmitted from the server apparatus 10. Examples of the display device 204 a include a monitor, a display, and a touch panel. The client apparatus 20 illustrated in FIG. 2 may be a mobile terminal device that is equipped with the display device 204 a.
An input device 205 a receives an instruction input by the user. Examples of the input device 205 a include a keyboard and a mouse. The display device 204 a may realize a pointing device function in cooperation with the mouse.
Hardware configurations as described above can realize processing functions in the present embodiment.
A remote-screen controlling application for a server is preinstalled or installed on the server apparatus 10. The remote-screen controlling application for a server will hereinafter be referred to as a “server-side remote-screen controlling application”.
The server-side remote-screen controlling application has, as its basic function, a function for offering a remote-screen control service. As one example, the server-side remote-screen controlling application obtains information of an operation on the client apparatus 20 and causes an application running on the server apparatus 10 to perform processing requested by the operation information. The server-side remote-screen controlling application generates a screen for displaying a result of the processing executed by the application and transmits the generated screen to the client apparatus 20 over the network 50. In this case, the server-side remote-screen controlling application transmits an image of a region having pixels in a portion that has changed relative to a bitmap image that has been displayed on the client apparatus 20 before the screen is generated this time (e.g., transmits an image of an update rectangle). While a case in which an image of an update portion is a rectangular image is described below by way of example, the disclosed apparatus is also applicable to a case in which an image of an update portion has a shape other than the rectangle.
The server-side remote-screen controlling application further has a function for compressing data of a portion involving a large amount of inter-frame motion into data based on a moving-image compression system and transmitting the compressed data to the client apparatus 20. For example, the server-side remote-screen controlling application divides a screen, generated from the result of the processing executed by the application, into multiple regions and monitors update frequencies for the respective divided regions. The server-side remote-screen controlling application transmits, to the client apparatus 20, information of the update frequency of the region in which the update frequency exceeds a threshold (the region is hereinafter referred to as a “frequent-update region”). The server-side remote-screen controlling application also encodes a bitmap image of the frequent-update region into data based on an MPEG (Moving Picture Experts Group) system, such as MPEG-2 or MPEG-4, and transmits the encoded data to the client apparatus 20. Although a case in which the data is compressed into data based on the MPEG system is described by way of example, the compression system is not limited thereto. For example, the compression system may be any moving-image compression coding system, for example, Motion-JPEG (Joint Photographic Experts Group) or the like.
A remote-screen controlling application for a client is preinstalled or installed on the client apparatus 20. The remote-screen controlling application for a client will hereinafter be referred to as a “client-side remote-screen controlling application”.
The client-side remote-screen controlling application has a function for reporting operation information, received via the input device 205 a, to the server apparatus 10. Examples of the operation information reported by the client-side remote-screen controlling application include left and right clicks, double click, and drag of the mouse, as well as the position and a displacement of a mouse cursor which are obtained as a result of a movement operation of the mouse. Other examples of the operation information include the amount of rotation of a mouse wheel and the type of pressed key on the keyboard.
In addition, the client-side remote-screen controlling application has a function for causing an image, received from the server apparatus 10, to be displayed on the display device 204 a included in the client apparatus 20. As one example, upon reception of a bitmap image for an update rectangle from the server apparatus 10, the client-side remote-screen controlling application causes the image for the update rectangle to be displayed at a position changed from the position of a previous bitmap image.
As another example, upon receiving update frequency information of a frequent-update region from the server apparatus 10, the client-side remote-screen controlling application sets, as a blank region in which no bitmap image is to be displayed, a region that lies on the display screen and that corresponds to a position included in the update frequency information. In addition, upon reception of data based on the moving-image compression system, the client-side remote-screen controlling application decodes the data and displays the decoded data in the blank region.
The functions of the server apparatus 10 and the client apparatus 20 will now be described in detail.
The server apparatus 10 includes an OS executor 11, a display-screen generator 12, a frame buffer 13, and a remote-screen controller 14. In the example of FIG. 3 , in addition to the functional units illustrated in FIG. 3 , the server apparatus 10 may further include various functions, such as a function of an input device and a function of a display device, of a known computer.
The OS executor 11 controls execution of the OS. For example, the OS executor 11 detects, from the operation information obtained by an operation-information acquirer 14 b (described below), an instruction for launching an application and a command for the application. For example, upon detecting a double click on an icon associated with an application, the OS executor 11 issues, to the display-screen generator 12, an instruction for launching the application associated with the icon. As another example, upon detecting an operation for requesting execution of a command on an operation screen, e.g., a window, of a running application, the OS executor 11 issues, to the display-screen generator 12, an instruction for execution of the command.
The display-screen generator 12 has a function (an application execution controlling function) for controlling execution of the application and a function (a rendering processing function) for rendering an image in the frame buffer 13 in accordance with an instruction from the OS executor 11.
For example, when the OS executor 11 issues an instruction for launching an application or when a running application is instructed to execute a command, the application execution controlling function operates the corresponding application. The application execution controlling function issues a request to the rendering processing function so as to render, in the frame buffer 13, a display image of a processing result obtained by the execution of the application. For issuing such a rendering request, the application execution controlling function notifies the rendering processing function about the display image and the rendering position of the display image. The application executed by the application execution controlling function may be preinstalled or may be installed after the shipment of the server apparatus 10. The application executed by the application execution controlling function may also be an application that runs in a network environment based on Java® or the like.
Upon receiving the rendering request from the application execution controlling function, the rendering processing function causes an image for displaying an application processing result to be rendered, in a bitmap format, at the rendering position located in the frame buffer 13 and specified by the application execution controlling function. Although a case in which the rendering request is received from the application execution controlling function has been described above, the rendering request may also be received from the OS executor 11. As one example, upon receiving a mouse-cursor rendering request from the OS executor 11, the rendering processing function may cause an image for displaying the mouse cursor to be rendered, for example, in a bitmap format, at the rendering position located in the frame buffer 13 and specified by the OS.
The frame buffer 13 stores image data rendered by the rendering processing function and used for updating the desktop (the image data will hereinafter be referred to as "update image data"). Examples of the frame buffer 13 include semiconductor memory devices, such as a RAM (e.g., a VRAM (video random access memory)), a ROM (read only memory), and a flash memory. The frame buffer 13 may also be implemented by a hard disk drive or a storage device for an optical disk or the like.
The remote-screen controller 14 offers the remote-screen control service to the client apparatus 20 via the server-side remote-screen controlling application.
The remote-screen controller 14 includes a communicator 14 a, an operation-information acquirer 14 b, an operation-position detector 14 c, a window-edge detector 14 d, an update-difference creator 14 e, an update-region setter 14 f, an update-frequency calculator 14 g, an association-degree setter 14 h, a priority setter 14 i, an update-difference converter 14 j, a screen-update reporter 14 k, and a channel-band detector 14 m.
The communicator 14 a transmits/receives data to/from the client apparatus 20 over the network 50 (not illustrated in FIG. 3 ).
The operation-information acquirer 14 b acquires operation information received from the client apparatus 20. Examples of the operation information include left and right clicks, double click, and drag of the mouse, as well as a mouse-cursor displacement obtained as a result of a movement operation of the mouse. Other examples of the operation information include the amount of rotation of the mouse wheel and the type of pressed key on the keyboard.
The operation-position detector 14 c has a function for detecting the current mouse-cursor position information from the user operation information transmitted from the client apparatus 20. Examples of the position information include the X and Y coordinates of the point where the mouse cursor is located.
The operation-position detector 14 c reports the obtained mouse-cursor position information to the priority setter 14 i and the association-degree setter 14 h.
The window-edge detector 14 d has a function for detecting edges from application windows and so on included in the update image data and for dividing the desktop screen into multiple sections. More specifically, the window-edge detector 14 d periodically obtains the update image data held in the frame buffer 13. The application windows running on the desktop are rendered in the update image data, and, in the envisaged situation, multiple windows or child windows may be displayed on the desktop. Since rectangular sections having various sizes, such as windows, buttons, and toolbars, are displayed on the screen, the number of sections at this point may be too large to process efficiently. Thus, the window-edge detector 14 d detects the frames of those windows as edges by using an image processing technique called "edge detection". Using the detected edges, the window-edge detector 14 d divides the desktop screen into multiple sections and performs processing for sorting the divided sections into sections corresponding to large windows. The sorting processing utilizes characteristics of application images rendered on the screen. An application provides buttons, toolbars, sub-windows, rendering data, and so on. Buttons and toolbars are small and are typically surrounded by a section having a larger area, and rendering data is likewise surrounded by a section having a larger area. Each sub-window has a relatively large area and is adjacent to a rectangle having a larger area. These characteristics are utilized to sort the small sections into the large sections corresponding to the sub-windows. The window-edge detector 14 d sends information of the divided sections to the update-frequency calculator 14 g. The section information includes, for example, information indicating an X coordinate, a Y coordinate, a width, and a height of each section.
The update-difference creator 14 e checks the frame buffer 13 to detect a different portion (an update difference) resulting from an update. One example of detection of the updated difference will be described below.
First, the update-difference creator 14 e generates a screen image to be displayed on the display device 204 a of the client apparatus 20. For example, each time the display-screen generator 12 stores bitmap data in the frame buffer 13, the update-difference creator 14 e starts processing as described below.
That is, the update-difference creator 14 e compares image data displayed on the client apparatus 20 during generation of a previous frame with the update image data written in the frame buffer 13 during generation of a current frame. The update-difference creator 14 e then generates an image of an update rectangle, which is obtained by coupling pixels in a portion that has changed from the previous frame and shaping the coupled pixels into a rectangle, and then generates packets for transmitting the update rectangle.
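As a rough illustration of this comparison, the following Python sketch (not the patented implementation; the frame and pixel representations are assumptions) derives an update rectangle by bounding the pixels that differ between the previous frame and the current frame.

```python
# A minimal sketch of deriving an update rectangle by bounding the pixels that
# differ between two frames. Frames are assumed to be equal-sized lists of
# rows of pixel values.

def update_rectangle(prev_frame, curr_frame):
    """Return (x, y, width, height) of the changed region, or None."""
    min_x = min_y = None
    max_x = max_y = -1
    for y, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for x, (p, c) in enumerate(zip(prev_row, curr_row)):
            if p != c:  # pixel changed between the previous and current frame
                if min_x is None or x < min_x:
                    min_x = x
                if min_y is None:
                    min_y = y
                max_x = max(max_x, x)
                max_y = y
    if min_x is None:
        return None  # nothing changed; no update rectangle is generated
    return (min_x, min_y, max_x - min_x + 1, max_y - min_y + 1)
```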
The update-difference creator 14 e then determines inter-frame update frequencies for the respective regions obtained by dividing the update image data stored in the frame buffer 13. For example, the update-difference creator 14 e stores the generated update rectangle in an internal work memory (not illustrated) for a predetermined period of time.
In this case, the update-difference creator 14 e also stores attribute information that allows the position and the size of the update rectangle to be specified. The attribute information includes, for example, information regarding the coordinates of an upper-left vertex of the update rectangle and the width and height of the update rectangle. The period of time in which the update rectangle is stored is correlated with the accuracy of setting a frequent-update region, and false detection of the frequent-update region decreases as the period of time is increased. In this example, it is assumed that the image of the update rectangle is stored for one second. When the predetermined period of time passes after the image of the update rectangle is stored, the update-difference creator 14 e determines frequencies of updates on the desktop screen by using a map having sections obtained by dividing the screen to be displayed on the display device 204 a into blocks in a meshed pattern.
In accordance with the positions and the sizes of the update rectangles accumulated in the internal work memory, the update-difference creator 14 e sequentially deploys the images of the update rectangles onto the update-frequency determination map. Each time an update rectangle is deployed on the map, the update-difference creator 14 e accumulates the numbers of updates in the block(s) 31 at the portion that overlaps the update rectangle on the map. That is, when the update rectangle deployed on the map overlaps a predetermined number of pixels included in one block, the update-difference creator 14 e increments the number of updates in that block by "1". In the following description, the number of updates in a block is incremented when the update rectangle overlaps even one pixel included in the block.
Numerals indicated in nine of the blocks 31 in the update-frequency determination map 30 illustrated in FIG. 5A indicate the numbers of updates in the corresponding blocks 31 when an update rectangle 31 a is deployed. Numerals indicated in some of the blocks 31 in the map 30 illustrated in FIG. 5B indicate the numbers of updates in the corresponding blocks 31 when an update rectangle 31 b is deployed. Numerals indicated in some of the blocks 31 in the map 30 illustrated in FIG. 5C indicate the numbers of updates in the corresponding blocks 31 when all update rectangles accumulated in the internal work memory are deployed. In FIGS. 5A to 5C , the number of updates in each of the blocks 31 in which no numerals are indicated is assumed to be zero.
As illustrated in FIG. 5A , when the update rectangle 31 a is deployed on the map 30, the update rectangle 31 a overlaps the blocks 31 in a hatched portion. Thus, the update-difference creator 14 e increments the number of updates in each of the blocks 31 in the hatched portion by “1”. In the example of FIG. 5A , since the number of updates in each block 31 has been zero, the number of updates in the hatched portion is incremented from “0” to “1”.
As illustrated in FIG. 5B , when the update rectangle 31 b is deployed on the map 30, the update rectangle 31 b overlaps the blocks 31 in a hatched portion. Thus, the update-difference creator 14 e increments the number of updates in each of the blocks 31 in the hatched portion by “1”. In this case, since the number of updates in each of the blocks 31 has been “1”, the number of updates in the hatched portion is incremented from “1” to “2”.
When all of the update rectangles accumulated in the internal work memory have been deployed on the map 30, the update-difference creator 14 e obtains the block(s) 31 in which the number(s) of updates in a predetermined period, e.g., the update frequency or update frequencies, exceeds a threshold. In the example of FIG. 5C , when the threshold is assumed to be “4”, the blocks 31 in a hatched portion are obtained. As a larger value is set for the threshold, a portion in which a moving image is more likely to be displayed on the desktop screen can be encoded by the update-difference converter 14 j. With respect to the threshold, an end user may select one of values preset in a stepwise manner by the creator of the server-side remote-screen controlling application or may directly set a value.
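The accumulation on the update-frequency determination map can be sketched as follows; the rule that overlapping even one pixel counts as an update follows the description above, while the block size, map dimensions, and rectangle format are illustrative assumptions.

```python
# A hedged sketch of the update-frequency determination map: each accumulated
# update rectangle increments the count of every block it overlaps (here, by
# even one pixel), and blocks whose counts exceed a threshold are returned.

BLOCK = 16  # assumed block edge length in pixels

def accumulate(update_rects, map_w_blocks, map_h_blocks, threshold):
    counts = [[0] * map_w_blocks for _ in range(map_h_blocks)]
    for x, y, w, h in update_rects:  # (x, y, width, height) in pixels
        bx0, by0 = x // BLOCK, y // BLOCK
        bx1, by1 = (x + w - 1) // BLOCK, (y + h - 1) // BLOCK
        for by in range(by0, min(by1, map_h_blocks - 1) + 1):
            for bx in range(bx0, min(bx1, map_w_blocks - 1) + 1):
                counts[by][bx] += 1  # the rectangle overlaps this block
    # blocks whose number of updates in the period exceeds the threshold
    return [(bx, by) for by in range(map_h_blocks)
            for bx in range(map_w_blocks) if counts[by][bx] > threshold]
```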
A description will be given with reference back to FIG. 3 .
The update-region setter 14 f uses the update difference to set, as a frequent-update region, a region that is included in the update image data in the frame buffer 13 and that has a high update frequency.
One example of a method for setting the frequent-update region will be described below.
When the update-difference creator 14 e obtains blocks in which the numbers of updates exceed the threshold, the update-region setter 14 f couples adjacent ones of the blocks into a block group (which is hereinafter referred to as a “coupled block group”) and corrects the coupled block group into a rectangle. For example, the update-region setter 14 f derives an interpolation region to be interpolated into a coupled block group and then adds the interpolation region to the coupled block group to thereby correct the coupled block group into a rectangle. The interpolation region may be derived by an algorithm for deriving a region with which a coupled block group is shaped into a rectangle with a minimum amount of interpolation therebetween.
As illustrated in FIG. 6 , the update-region setter 14 f adds an interpolation region 52 to a pre-correction coupled block group 51 to thereby correct the coupled block group 51 into a rectangle 53. At this point, however, rectangle combination described below is not completed and thus the rectangle 53 has not been determined as a frequent-update region yet. Hence, the post-correction rectangle 53 is hereinafter referred to as a “frequent-update region candidate”.
When multiple frequent-update region candidates exist, the update-region setter 14 f combines the frequent-update region candidates between which the distance is smaller than or equal to a predetermined value into a rectangle including the frequent-update region candidates. The expression “distance between the frequent-update region candidates” as used herein refers to a smallest one of the distances between the post-correction rectangles. For example, the update-region setter 14 f derives an interpolation region to be fit into a gap between the frequent-update region candidates and adds the interpolation region to the frequent-update region candidates, to thereby combine the frequent-update region candidates into a rectangle including the candidates. The interpolation region may be derived by an algorithm for deriving a region with which frequent-update region candidates are shaped into a combination with a minimum amount of interpolation therebetween.
As illustrated in FIG. 7 , the update-region setter 14 f adds an interpolation region 62 to frequent- update region candidates 61 a and 61 b to thereby create a combination 63 including the frequent- update region candidates 61 a and 61 b. The update-region setter 14 f then sets the thus-obtained combination 63 as a frequent-update region.
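A simplified sketch of this combination step is given below; the distance metric and the data layout are assumptions, and only the idea of merging nearby candidates into one bounding rectangle follows the description.

```python
# Frequent-update region candidates (already corrected into rectangles) whose
# mutual distance is at most `gap` are merged into a single bounding rectangle.

def _distance(a, b):
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    dx = max(bx - (ax + aw), ax - (bx + bw), 0)  # horizontal gap, 0 if overlapping
    dy = max(by - (ay + ah), ay - (by + bh), 0)  # vertical gap, 0 if overlapping
    return max(dx, dy)

def combine_candidates(cands, gap):
    cands = list(cands)
    merged = True
    while merged:
        merged = False
        for i in range(len(cands)):
            for j in range(i + 1, len(cands)):
                if _distance(cands[i], cands[j]) <= gap:
                    ax, ay, aw, ah = cands[i]
                    bx, by, bw, bh = cands[j]
                    x0, y0 = min(ax, bx), min(ay, by)
                    x1 = max(ax + aw, bx + bw)
                    y1 = max(ay + ah, by + bh)
                    cands[i] = (x0, y0, x1 - x0, y1 - y0)  # bounding rectangle
                    del cands[j]
                    merged = True
                    break
            if merged:
                break
    return cands
```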
Upon setting the frequent-update region in the manner described above, the update-region setter 14 f sends update frequency information to the update-frequency calculator 14 g and the client apparatus 20. The update frequency information includes information for identifying the position and the size of the identified frequent-update region, an ID (a region identifier) for identifying the frequent-update region, and the number of updates. Upon reception of the update frequency information, a portion that is included in the image data for the desktop screen to be displayed on the client apparatus 20 and that corresponds to the frequent-update region is displayed blank. Thereafter, the update-region setter 14 f clears the number of updates in each of the blocks mapped in the internal work memory. The update-region setter 14 f stores the update frequency information in the internal work memory.
As illustrated in FIG. 8A , a desktop screen 70A realized by the update image data stored in the frame buffer 13 includes a browser screen 71 and a moving-image playback screen 72. When changes on the desktop screen 70A are tracked time-sequentially, no update rectangle is detected for the browser screen 71, which is a still image, while a mouse movement trace 73 and an update rectangle associated with a moving-image playback region 74 rendered by an application are detected, as illustrated on a screen 70B in FIG. 8B .
It is assumed that the update-region setter 14 f identifies, in the moving-image playback region 74, blocks in which the numbers of updates exceed a threshold, e.g., the portion indicated by hatching. In this case, the update-region setter 14 f creates update frequency information by adding, to region-position identification information (the coordinates (x, y) of the upper-left vertex, the width w, and the height h of the frequent-update region in the hatched portion illustrated on a screen 70C in FIG. 8C ), the largest number of updates among the numbers of updates in the regions specified by that identification information. The update-region setter 14 f then stores the created update frequency information in the internal memory and also sends it to the update-frequency calculator 14 g.
Although a case in which the coordinates of the upper-left vertex are used to represent a point for designating the position of the frequent-update region has been described above, another vertex may also be used.
Instead of a vertex, any point, such as a barycenter, that enables designation of the position of a frequent-update region may also be used. Although a case in which the upper-left vertex on the screen is used as the origin of the coordinate axes X and Y has been described above, any point in or outside the screen may be used as the origin. A description will be given with reference back to FIG. 3 .
The update-frequency calculator 14 g generates section-specific update frequency information indicating the update frequency for each section on the desktop on the basis of in-desktop section information received from the window-edge detector 14 d and the update frequency information received from the update-region setter 14 f.
For example, upon receiving pieces of section information “0, 0, 30, 40” and “30, 0, 60, 80” (X coordinate, Y coordinate, width, height) and pieces of update frequency information “0, 0, 16, 16, 3” and “16, 16, 16, 16, 4” (X coordinate, Y coordinate, width, height, the number of updates), the update-frequency calculator 14 g maps the received pieces of information to the in-desktop section information and calculates the update frequency (the number of updates) for each in-desktop section to obtain “0, 0, 30, 40, 3” (X coordinate, Y coordinate, width, height, the number of updates) and so on. After the calculation, the update-frequency calculator 14 g sends the generated section-specific update frequency information to the association-degree setter 14 h and the priority setter 14 i.
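One possible way to perform this mapping is sketched below; the tuple formats mirror the "(X coordinate, Y coordinate, width, height[, number of updates])" notation above, and the rule of keeping the largest overlapping count is an assumption based on the description of the update-region setter 14 f.

```python
# Map block-level update frequency information onto the sections detected by
# the window-edge detector. Each section receives, in this sketch, the largest
# number of updates among the update regions it overlaps; the aggregation rule
# is an assumption.

def _overlaps(a, b):
    ax, ay, aw, ah = a[:4]
    bx, by, bw, bh = b[:4]
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def section_update_frequencies(sections, update_infos):
    result = []
    for sec in sections:                      # (x, y, w, h)
        count = 0
        for info in update_infos:             # (x, y, w, h, updates)
            if _overlaps(sec, info):
                count = max(count, info[4])   # keep the largest overlapping count
        result.append((*sec, count))          # (x, y, w, h, updates)
    return result
```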
The association-degree setter 14 h has a function for calculating, upon receiving the section-specific update frequency information from the update-frequency calculator 14 g and receiving the current mouse-cursor position information from the operation-position detector 14 c, the frequency (co-occurrence frequency) of updates occurring simultaneously in multiple sections and for setting an association degree indicating the degree of association between the sections. The reason why the association degree is set will be described below.
An application for use in CAD or the like is typically executed using the entire area or a large area of the desktop screen. In such a case, the application is constituted and used with sections including multiple windows, child windows, or the like. The sections can be broadly classified into two types according to their functions. The first type is a section having buttons, sliders, checkboxes, and so on and is intended for performing some type of operation on the data currently created by the user. Such a section will be referred to as an "operation tool section" hereinafter. The second type is a section for displaying data, such as a 2D or 3D object or a wire frame, currently created by the user. Such a section will be referred to as a "data rendering section" hereinafter. The operation tool section is rendered by clicking/dragging a button or slider with the mouse and is generally updated at relatively short intervals. However, in the operation tool section, since the amount of data for each update is considerably small, the average amount of update data is small. On the other hand, the data operated by the user is directly or indirectly rendered in the data rendering section, and it is updated intermittently. However, since a vast amount of updating occurs at the same time, the average amount of update data in the data rendering section is large.
Many of such applications have multiple operation tool sections and multiple data rendering sections, and the operation tool sections and the data rendering sections often have certain associations therebetween.
In the graph illustrated in FIG. 9 , the vertical axis indicates the amount of update data and the horizontal axis indicates time (second). As illustrated in FIG. 9 , an operation on an operation tool section C1 has a large influence on a data rendering section A1, and an operation on an operation tool section D1 has a large influence on a data rendering section B1. In the present embodiment, upon receiving the section-specific update frequency information from the update-frequency calculator 14 g and receiving the current mouse-cursor position information from the operation-position detector 14 c, the association-degree setter 14 h creates a cursor-position management table that includes the received pieces of information. On the basis of the created cursor-position management table, the association-degree setter 14 h calculates the associations between the sections. In addition, the section in which the mouse cursor is located and the section having the highest degree of association with that section are extracted, and the image data to be displayed in those sections is transmitted to the client apparatus 20 prior to the image data to be displayed in the other sections. As a result, a portion that has a low update frequency but is associated with a region having a high update frequency can be updated on the screen displayed on the display device 204 a prior to the other portions. Accordingly, it is possible to increase the speed of response to a user operation. In the following description, the operation tool sections and the data rendering sections are not distinguished from each other and are simply referred to as "sections A1, B1, C1, and D1".
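A sketch of one way to derive association degrees from the co-occurrence of updates is shown below; the per-interval update history format and the simple co-occurrence count are assumptions, not the exact metric used by the association-degree setter 14 h.

```python
# Count, for each pair of sections, the number of intervals in which both
# sections were updated. A higher count suggests a stronger association.

from collections import defaultdict
from itertools import combinations

def association_degrees(history):
    """history: e.g. [{"C1": 5, "A1": 120}, {"D1": 3, "B1": 90}, ...]"""
    co_occurrence = defaultdict(int)
    for interval in history:
        updated = [sec for sec, n in interval.items() if n > 0]
        for a, b in combinations(sorted(updated), 2):
            co_occurrence[(a, b)] += 1  # both sections updated in the same interval
    return dict(co_occurrence)

# e.g. association_degrees([{"C1": 5, "A1": 120}, {"C1": 2, "A1": 80},
#                           {"D1": 3, "B1": 90}])
# gives {("A1", "C1"): 2, ("B1", "D1"): 1}, suggesting that C1 is strongly
# associated with A1 and D1 with B1, in the spirit of the FIG. 9 example.
```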
The cursor-position management table created by the association-degree setter 14 h will be described next.
A cursor-position management table T1 illustrated in FIG. 10 has a “time” column, an “update-frequency information” column, and a “mouse-cursor position (x, y)” column. Pieces of information that are horizontally arranged are associated with each other.
In the “time” column, the time at which the mouse-cursor position information is obtained from the operation-position detector 14 c is contained.
In the “update-frequency information” column, the X coordinate, the Y coordinate, the width, the height, and the number of updates which are included in the section-specific update frequency information received from the update-frequency calculator 14 g are contained.
In the “mouse-cursor position (x, y)” column, the mouse-cursor position information received from the operation-position detector 14 c is contained. A description will be given with reference back to FIG. 3 .
The priority setter 14 i uses a section on which the user is currently performing an operation, an available band of the network 50, and the section association degrees in the frame buffer 13 to set priorities for data transfer of the respective sections. During the setting of the priorities, the priority setter 14 i uses the user's current mouse-cursor position detected by the operation-position detector 14 c and the section association degrees set by the association-degree setter 14 h. In accordance with the mouse-cursor position, the priority setter 14 i uses the association degrees, set by the association-degree setter 14 h, to determine a section to be given a high priority and sends, to the update-difference converter 14 j, an instruction for transmitting data for the determined section prior to data for the other sections.
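The priority decision can be sketched as follows, assuming the data formats of the earlier sketches: the section containing the current mouse cursor is found, and the remaining sections are ranked by their association degree with it so that their update data can be transmitted first.

```python
# Rank sections for transmission: the cursor section first, then the other
# sections in descending order of association degree with the cursor section.

def prioritize(sections, cursor, degrees):
    """sections: {sec_id: (x, y, w, h)}, cursor: (x, y),
    degrees: {(sec_a, sec_b): association degree} with sorted key pairs."""
    cx, cy = cursor
    current = next((sid for sid, (x, y, w, h) in sections.items()
                    if x <= cx < x + w and y <= cy < y + h), None)
    if current is None:
        return list(sections)          # no cursor section: keep the original order
    def degree(sid):
        key = tuple(sorted((current, sid)))
        return degrees.get(key, 0)
    others = sorted((s for s in sections if s != current), key=degree, reverse=True)
    return [current] + others          # highest-priority sections first
```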
The update-difference converter 14 j converts the update image data into moving-image data or still-image data in accordance with the update frequency of the display region.
More specifically, each time the update-difference creator 14 e generates an update rectangle, the update-difference converter 14 j determines whether or not the generated update rectangle is included in the frequent-update region stored in the internal work memory, e.g., is included in a region to which the moving-image data is being transmitted by the communicator 14 a. In this case, when information regarding the section to be given a high priority is received from the priority setter 14 i, the update rectangle for the received section is processed prior to update rectangles for the other sections. When the generated update rectangle is not included in the frequent-update region, the update-region setter 14 f causes the communicator 14 a to transmit the image data of the update rectangle and the update frequency information.
Each time update image data is stored in the frame buffer 13, the update-difference converter 14 j determines whether or not the update frequency information of a frequent-update region is registered in the internal memory in the update-region setter 14 f. When the update frequency information of a frequent-update region is registered, the update-difference converter 14 j cuts out, in the update image data stored in the frame buffer 13, a bitmap image for the portion corresponding to the frequent-update region. The update-difference converter 14 j then encodes the cut-out bitmap image. In this case, for example, at a point when the amount of input bitmap image for the frequent-update region reaches the number of frames from which a stream can be generated, the update-difference converter 14 j encodes the bitmap image for the frequent-update region. An encoding system may be, for example, an MPEG system, such as MPEG-2 or MPEG-4 system, or a Motion-JPEG system.
The screen-update reporter 14 k performs processing for transmitting the update data converted by the update-difference converter 14 j.
More specifically, the screen-update reporter 14 k transmits the update-rectangle image data and the update frequency information, generated by the update-difference creator 14 e, to the client apparatus 20. A communication protocol for transmitting the update-rectangle image data is, for example, an RFB protocol in VNC.
The screen-update reporter 14 k transmits image data of the frequent-update region, the image data being encoded by the update-difference converter 14 j, (the data is hereinafter referred to as “encoded frequent-update-region image data”) to the client apparatus 20 in conjunction with the corresponding update frequency information. A communication protocol for transmitting the encoded frequent-update-region image data may be, for example, an RTP (Real-time Transport Protocol).
The channel-band detector 14 m has a function for detecting a currently available band on a network channel used for data transmission/reception between the server apparatus 10 and the client apparatus 20. More specifically, the channel-band detector 14 m obtains the amount of data transmitted from the screen-update reporter 14 k and the amount of data actually transmitted from the communicator 14 a to the client apparatus 20. The channel-band detector 14 m then sends the amounts of the two pieces of data to the priority setter 14 i. When the amount of data transmitted from the screen-update reporter 14 k and the amount of data transmitted from the communicator 14 a are equal to each other or when no data is transmitted from the screen-update reporter 14 k, the channel-band detector 14 m periodically transmits data for measuring an available band to the client apparatus 20. By doing so, the channel-band detector 14 m estimates a currently available band and sends information of the currently available band to the priority setter 14 i.
Functions of the client apparatus 20 will be described next.
<Functions of Client Apparatus>
The client apparatus 20 has a remote-screen controller 21. In the example of FIG. 3 , in addition to the functional units illustrated in FIG. 3 , the client apparatus 20 may further have various functions, such as a function of a sound output unit, of a known computer.
The remote-screen controller 21 receives the remote-screen control service, offered by the server apparatus 10, via the client-side remote-screen controlling application. As illustrated in FIG. 3 , the remote-screen controller 21 includes a communicator 21 a, an operation-information acquirer 21 b, a screen-update information acquirer 21 c, an image-data display unit 21 d, and a moving-image data display unit 21 e.
The communicator 21 a transmits/receives information to/from the server apparatus 10 over the network 50.
The operation-information acquirer 21 b acquires information of an operation input via the input device 205 a and reports the acquired operation information to the server apparatus 10 via the communicator 21 a. Examples of the operation information reported by the operation-information acquirer 21 b include left and right clicks, double click, and drag of the mouse, as well as a mouse-cursor displacement obtained as a result of a movement operation of the mouse. Other examples of the operation information reported by the operation-information acquirer 21 b include the amount of rotation of the mouse wheel and the type of pressed key on the keyboard.
The screen-update information acquirer 21 c receives the update-rectangle image and the update frequency information of the frequent-update region via the communicator 21 a, the update-rectangle image and the update frequency information being transmitted by the communicator 14 a in the server apparatus 10. The screen-update information acquirer 21 c also receives the frequent-update-region update frequency information transmitted by the update-region setter 14 f in the server apparatus 10.
The screen-update information acquirer 21 c receives the encoded frequent-update-region image data, transmitted by the communicator 14 a in the server apparatus 10, and the update-rectangle update frequency information, transmitted along with the encoded frequent-update-region image data, via the communicator 21 a.
The image-data display unit 21 d causes the display device 204 a to display the update-rectangle image received by the screen-update information acquirer 21 c. For example, the image-data display unit 21 d causes the bitmap image of the update rectangle to be displayed on a screen region that lies on the display device 204 a and that corresponds to the frequent-update-region position and size included in the update-rectangle update frequency information received from the screen-update information acquirer 21 c.
When the screen-update information acquirer 21 c receives the frequent-update-region update frequency information, the image-data display unit 21 d performs processing in the following manner. That is, the image-data display unit 21 d sets, as a blank region in which no bitmap image is to be displayed, a screen region that lies on the display device 204 a and that corresponds to the frequent-update-region position and size included in the frequent-update-region update frequency information.
The moving-image data display unit 21 e decodes the encoded frequent-update-region image data received by the screen-update information acquirer 21 c. The moving-image data display unit 21 e may be equipped with a decoder employing a decoding system corresponding to the encoding system employed by the server apparatus 10.
On the basis of the frequent-update-region update frequency information received by the screen-update information acquirer 21 c, the moving-image data display unit 21 e causes the display device 204 a to display the decoded image of the frequent-update region. For example, the moving-image data display unit 21 e causes the decoded image of the frequent-update region to be displayed on a screen region that lies on the display device 204 a and that corresponds to the frequent-update-region position and size included in the update frequency information of the frequent-update region.
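The client-side handling described above might be dispatched roughly as in the following sketch; the message types, field names, and the display and decoder objects are all assumptions, and only the control flow mirrors the description.

```python
# Dispatch incoming screen updates on the client: blit update-rectangle
# bitmaps, blank the frequent-update region, and decode/display encoded data.

def handle_message(msg, display, decoder):
    if msg["type"] == "update_rectangle":
        # image-data display unit: blit the bitmap at the reported position
        display.blit(msg["bitmap"], msg["x"], msg["y"])
    elif msg["type"] == "frequent_update_info":
        # reserve a blank region in which no bitmap images will be drawn
        display.blank(msg["x"], msg["y"], msg["w"], msg["h"])
    elif msg["type"] == "encoded_region":
        # moving-image data display unit: decode and show in the blank region
        frame = decoder.decode(msg["payload"])
        display.blit(frame, msg["x"], msg["y"])
```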
The remote-screen controller 21 may be implemented by various types of integrated circuit or electronic circuit. At least one of the functional units included in the remote-screen controller 21 may also be implemented by another integrated circuit or electronic circuit. Examples of the integrated circuit include an ASIC (application specific integrated circuit) and an FPGA (field programmable gate array). Examples of the electronic circuit include a CPU and an MPU (micro processing unit).
Processing performed by the server apparatus 10 will now be described with reference to flowcharts.
In operation S1, the server apparatus 10 determines whether or not a certain amount of time has passed. When the certain amount of time has not passed (No in operation S1), the process proceeds to operation S4. When the certain amount of time has passed (Yes in operation S1), the process proceeds to operation S2.
In operation S2, the window-edge detector 14 d detects edges of windows in the screen. Thereafter, the process proceeds to operation S3.
In operation S3, the window-edge detector 14 d divides the screen into multiple sections by using the detected edges. Thereafter, the process proceeds to operation S4.
In operation S4, the update-difference creator 14 e determines whether or not update image data exists in the frame buffer 13. When update image data exists in the frame buffer 13 (Yes in operation S4), the process proceeds to operation S5. When no update image data exists in the frame buffer 13 (No in operation S4), the process returns to operation S1.
In operation S5, the update-difference creator 14 e compares image data displayed on the client apparatus 20 during generation of a previous frame with the update image data written in the frame buffer 13 during generation of a current frame. The update-difference creator 14 e then generates an update rectangle, which is obtained by coupling pixels in a portion that has changed from the previous frame and shaping the coupled pixels into a rectangle. Thereafter, the process proceeds to operation S6.
In operation S6, the update-difference creator 14 e stores the created update rectangle. Thereafter, the process proceeds to operation S7.
In operation S7, the update-region setter 14 f sets a frequent-update region. Thereafter, the process proceeds to operation S8.
In operation S8, the update-frequency calculator 14 g calculates the numbers of updates in the respective sections for each certain amount of time. Thereafter, the process proceeds to operation S9.
In operation S9, the association-degree setter 14 h compares the numbers of updates in the same amount of time and calculates the association degrees between the sections. Thereafter, the process proceeds to operation S10.
In operation S10, the channel-band detector 14 m determines whether or not update image data that exceeds the bandwidth of the channel between the communicator 14 a and the communicator 21 a exists. When update image data that exceeds the bandwidth exists (Yes in operation S10), the process proceeds to operation S11. When update image data that exceeds the bandwidth does not exist (No in operation S10), the process proceeds to operation S13.
In operation S11, the priority setter 14 i sets priorities for the update image data for the corresponding sections, on the basis of the user's current mouse-cursor position detected by the operation-position detector 14 c and the association degrees determined in operation S9. Thereafter, the process proceeds to operation S12.
In operation S12, the priority setter 14 i rearranges the sections to be given a high priority according to the priorities set in operation S11 and sends information of the rearranged sections to the update-difference converter 14 j. Thereafter, the process proceeds to operation S13.
In operation S13, the update-difference converter 14 j obtains the update image data from the frame buffer 13. In this case, when the information of the section(s) to be given a priority is received from the priority setter 14 i, the update image data for the received section(s) is processed prior to the update image data for the other sections. The update-difference converter 14 j then determines whether or not a region in which the update image data is to be displayed is included in the frequent-update region. When it is determined that a region in which the update image data is to be displayed is included in the frequent-update region (Yes in operation S13), the process proceeds to operation S14. When it is determined that a region in which the update image data is to be displayed is not included in the frequent-update region (No in operation S13), the process proceeds to operation S16.
In operation S14, the update-difference converter 14 j converts the update image data into encoded image data by cutting out, in the update image data stored in the frame buffer 13, a bitmap image corresponding to the frequent-update region and encoding the bitmap image. Thereafter, the process proceeds to operation S15.
In operation S15, the screen-update reporter 14 k transmits the encoded image data, encoded by the update-difference converter 14 j, to the client apparatus 20 via the communicator 14 a. Thereafter, the processing illustrated in FIG. 11 ends. When the client apparatus 20 receives the encoded image data, the screen-update information acquirer 21 c sends the received encoded image data to the moving-image data display unit 21 e. The moving-image data display unit 21 e decodes the received encoded image data and displays the decoded image data on the display device 204 a.
In operation S16, the screen-update reporter 14 k transmits the image data of the update rectangle to the client apparatus 20 via the communicator 14 a. Thereafter, the processing illustrated in FIG. 11 ends. When the client apparatus 20 receives the image data of the update rectangle, the screen-update information acquirer 21 c sends the received image data to the image-data display unit 21 d. The image-data display unit 21 d displays the received image data on the display device 204 a.
The description of the processing in FIG. 11 is finished at this point.
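For reference, the flow of FIG. 11 (operations S1 to S16) can be compressed into the following sketch; every helper object is a stand-in for the corresponding functional unit, so this illustrates the control flow only, not the actual program.

```python
# A compressed sketch of the server-side loop in FIG. 11 (operations S1 to S16).

def server_loop(server):
    while True:
        if server.certain_time_elapsed():                                # S1
            edges = server.window_edge_detector.detect_edges()           # S2
            server.sections = server.window_edge_detector.divide(edges)  # S3
        if not server.frame_buffer.has_update():                         # S4
            continue
        rect = server.update_difference_creator.create_rectangle()       # S5
        server.update_difference_creator.store(rect)                     # S6
        server.update_region_setter.set_frequent_region()                # S7
        freqs = server.update_frequency_calculator.calculate()           # S8
        degrees = server.association_degree_setter.calculate(freqs)      # S9
        if server.channel_band_detector.data_exceeds_band():             # S10
            order = server.priority_setter.set(degrees)                  # S11
            server.update_difference_converter.reorder(order)            # S12
        if server.update_difference_converter.in_frequent_region(rect):  # S13
            encoded = server.update_difference_converter.encode(rect)    # S14
            server.screen_update_reporter.send_encoded(encoded)          # S15
        else:
            server.screen_update_reporter.send_rectangle(rect)           # S16
```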
Processing (i.e., operation-section detection processing) of the operation-position detector 14 c will be described next.
In operation S21, the operation-position detector 14 c determines whether or not reception of operation information is detected at the communicator 14 a. When reception of operation information is detected (Yes in operation S21), the process proceeds to operation S22. When reception of operation information is not detected (No in operation S21), the operation-section detection processing ends.
In operation S22, the operation-position detector 14 c determines whether or not the mouse is operated. More specifically, the operation-position detector 14 c extracts mouse-cursor position information from the received operation information and compares the extracted position information with previously extracted position information. When the pieces of position information do not match each other, it is determined that the mouse is operated. When it is determined that the mouse is operated (Yes in operation S22), the process proceeds to operation S23. When it is determined that the mouse is not operated (No in operation S22), the operation-section detection processing ends.
In operation S23, the operation-position detector 14 c obtains mouse-operation information and extracts the mouse-cursor position therefrom. Thereafter, the process proceeds to operation S24.
In operation S24, the operation-position detector 14 c compares the extracted mouse-cursor position with a previous mouse-cursor position to determine whether or not the position of the mouse cursor has been changed. When it is determined that the position of the mouse cursor has been changed (Yes in operation S24), the process proceeds to operation S25. When it is determined that the position of the mouse cursor has not been changed (No in operation S24), the operation-section detection processing ends.
In operation S25, the operation-position detector 14 c sends the mouse-cursor position information to the association-degree setter 14 h and the priority setter 14 i. Thereafter, the operation-section detection processing ends.
The description of the operation-section detection processing is finished at this point.
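The operation-position detection flow (operations S21 to S25) can be sketched as follows; the operation-information format is an assumption.

```python
# Extract the mouse-cursor position from received operation information and
# forward it only when it has changed since the previous report.

class OperationPositionDetector:
    def __init__(self):
        self.prev_pos = None

    def on_operation_info(self, info):
        pos = info.get("cursor_pos")          # S22/S23: extract mouse-cursor position
        if pos is None or pos == self.prev_pos:
            return None                       # S24: position unchanged, nothing to report
        self.prev_pos = pos
        return pos                            # S25: report to the association-degree
                                              #      setter and the priority setter
```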
Processing (i.e., channel-band detection processing) of the channel-band detector 14 m will be described next.
In operation S31, the channel-band detector 14 m measures the amount of data transmitted by the screen-update reporter 14 k. Thereafter, the process proceeds to operation S32.
In operation S32, the channel-band detector 14 m measures the amount of data currently transmitted by the communicator 14 a. Thereafter, the process proceeds to operation S33.
In operation S33, the channel-band detector 14 m sends the values of the two amounts of data, measured in operations S31 and S32, to the priority setter 14 i. Thereafter, the process proceeds to operation S34.
In operation S34, the channel-band detector 14 m determines whether or not the two amounts of data measured in operations S31 and S32 are equal to each other. When the amounts of data are equal to each other (Yes in operation S34), the process proceeds to operation S37. When the amounts of data are not equal to each other (No in operation S34), the process proceeds to operation S35. Also, when no data is transmitted from the screen-update reporter 14 k, the process proceeds to operation S37.
In operation S35, the channel-band detector 14 m transmits data for measurement to the client apparatus 20 to measure an available band. Thereafter, the process proceeds to operation S36.
In operation S36, the channel-band detector 14 m sends the value of the available band, measured in operation S35, to the priority setter 14 i. Thereafter, the channel-band detection processing ends.
In operation S37, on the other hand, the channel-band detector 14 m sends the amount of data, currently transmitted by the communicator 14 a, to the priority setter 14 i as the value of the available band. Thereafter, the channel-band detection processing ends.
The description of the channel-band detection processing is finished at this point.
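The channel-band detection flow (operations S31 to S37) reduces to the following sketch; the probe measurement itself is only stubbed out.

```python
# If the reported and actually transmitted amounts match (or nothing was
# sent), the current transmission rate is used as the available band;
# otherwise a probe measurement is performed.

def detect_band(reported_bytes, transmitted_bytes, probe_band):
    """probe_band: callable that measures the available band with test data."""
    if reported_bytes == 0 or reported_bytes == transmitted_bytes:
        return transmitted_bytes              # S37: use the current rate as the band
    return probe_band()                       # S35/S36: measure with probe data
```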
Processing (i.e., window-edge detection processing) of the window-edge detector 14 d will be described next.
In operation S41, the window-edge detector 14 d obtains screen data stored in the frame buffer 13. Thereafter, the process proceeds to operation S42.
In operation S42, the window-edge detector 14 d performs edge detection processing on the screen data obtained in operation S41 to detect edges and divides the screen into multiple sections by using the detected edges. Thereafter, the process proceeds to operation S43.
In operation S43, the window-edge detector 14 d selects one of unselected sections of the sections divided in operation S42. Thereafter, the process proceeds to operation S44.
In operation S44, the window-edge detector 14 d determines whether or not the section selected in operation S43 is adjacent to another section with a certain gap therebetween or is superimposed on another section. When the section is adjacent to another section with a certain gap therebetween or is superimposed on another section (Yes in operation S44), the process proceeds to operation S46. When the section is neither adjacent to another section with a certain gap therebetween nor superimposed on another section (No in operation S44), the process proceeds to operation S45.
In operation S45, the window-edge detector 14 d determines whether or not the section selected in operation S43 exists inside another section. When the selected section exists inside another section (Yes in operation S45), the process proceeds to operation S48. When the selected section does not exist inside another section (No in operation S45), the process proceeds to operation S46.
In operation S46, the window-edge detector 14 d determines whether or not the area of the section selected in operation S43 is smaller than or equal to a predetermined area threshold (e.g., 200 pixels×200 pixels). When the area of the section selected in operation S43 is smaller than or equal to the predetermined area threshold (Yes in operation S46), the process proceeds to operation S47. When the area of the section selected in operation S43 is larger than the predetermined area threshold (No in operation S46), the process proceeds to operation S49.
In operation S47, the window-edge detector 14 d couples the section selected in operation S43 with the adjacent or superimposed section. Thereafter, the process proceeds to operation S49.
In operation S48, on the other hand, the window-edge detector 14 d deletes the section selected in operation S43. Thereafter, the process returns to operation S43.
In operation S49, the window-edge detector 14 d determines whether or not the processing in operations S44 to S48 has been performed on all sections. When it is determined that the processing in operations S44 to S48 has been performed on all sections (Yes in operation S49), the process proceeds to operation S50. When it is determined that the processing in operations S44 to S48 has not been performed on all sections (No in operation S49), the process returns to operation S43.
In operation S50, the window-edge detector 14 d determines whether or not, among all of the selected sections, there is any section on which the coupling processing in operation S47 or the deletion processing in operation S48 has been performed. When such a section exists (Yes in operation S50), the process proceeds to operation S51; this check is performed because further coupling or deletion may still be applicable to some sections. When no such section exists (No in operation S50), the process proceeds to operation S52.
In operation S51, the window-edge detector 14 d puts all of the sections into unselected states. Thereafter, the process returns to operation S43.
In operation S52, the window-edge detector 14 d sends, to the update-frequency calculator 14 g, section information including the sections that remain as a result of the coupling processing and deletion processing. Thereafter, the window-edge detection processing ends.
The description of the window-edge detection processing is finished at this point.
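As an informal illustration of operations S43 to S51, the coupling and deletion loop may be sketched as follows. This is a minimal sketch under several assumptions: sections are modeled as axis-aligned rectangles, the Rect type and the helper predicates are introduced only for the example, the adjacency gap and the 200 pixel by 200 pixel area threshold are example values, and checking for a nested section before checking for a coupling partner simplifies the branching of operations S44 and S45.

```python
# Minimal sketch of the section coupling/deletion loop (operations S43 to S51).
# Sections are modeled as axis-aligned rectangles; the helper predicates, the
# adjacency gap, and the 200x200-pixel area threshold are assumptions.
from dataclasses import dataclass

AREA_THRESHOLD = 200 * 200   # area threshold checked in operation S46
GAP = 5                      # "certain gap" used in operation S44 (assumed value)

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

    def area(self) -> int:
        return self.w * self.h

def near_or_overlapping(a: Rect, b: Rect, gap: int = GAP) -> bool:
    """Operation S44: a is adjacent to b within `gap` pixels or overlaps b."""
    return not (a.x > b.x + b.w + gap or b.x > a.x + a.w + gap or
                a.y > b.y + b.h + gap or b.y > a.y + a.h + gap)

def contains(outer: Rect, inner: Rect) -> bool:
    """Operation S45: `inner` lies completely inside `outer`."""
    return (outer.x <= inner.x and outer.y <= inner.y and
            inner.x + inner.w <= outer.x + outer.w and
            inner.y + inner.h <= outer.y + outer.h)

def couple(a: Rect, b: Rect) -> Rect:
    """Operation S47: couple two sections into their bounding rectangle."""
    x1, y1 = min(a.x, b.x), min(a.y, b.y)
    x2, y2 = max(a.x + a.w, b.x + b.w), max(a.y + a.h, b.y + b.h)
    return Rect(x1, y1, x2 - x1, y2 - y1)

def refine_sections(sections: list[Rect]) -> list[Rect]:
    """Repeat deletion (S48) and coupling (S47) until nothing changes (S50)."""
    sections = list(sections)
    changed = True
    while changed:
        changed = False
        for i, sec in enumerate(sections):
            # Delete a section that lies entirely inside another section.
            if any(contains(o, sec) for j, o in enumerate(sections) if j != i):
                del sections[i]
                changed = True
                break
            # Couple a small section with a neighbouring or overlapping section.
            partner = next((j for j, o in enumerate(sections)
                            if j != i and near_or_overlapping(sec, o)), None)
            if partner is not None and sec.area() <= AREA_THRESHOLD:
                merged = couple(sec, sections[partner])
                sections = [s for j, s in enumerate(sections) if j not in (i, partner)]
                sections.append(merged)
                changed = True
                break
    return sections
```

Each pass either deletes a nested section or couples two sections into their bounding rectangle, so the number of sections strictly decreases and the loop ends once a pass makes no change, which mirrors the repeat-until-stable check of operation S50.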
Processing (i.e., update-frequency calculation processing) of the update-frequency calculator 14 g will be described next.
In operation S61, the update-frequency calculator 14 g determines whether or not the section information is received from the window-edge detector 14 d. When the section information is received (Yes in operation S61), the process proceeds to operation S62. When no section information is received (No in operation S61), the update-frequency calculation processing ends.
In operation S62, the update-frequency calculator 14 g determines whether or not the update frequency information is received from the update-region setter 14 f. When the update frequency information is received (Yes in operation S62), the process proceeds to operation S63. When no update frequency information is received (No in operation S62), the update-frequency calculation processing ends.
In operation S63, the update-frequency calculator 14 g generates section-specific update frequency information indicating the update frequency for each section, on the basis of the section information received in operation S61 and the update frequency information received in operation S62. Thereafter, the process proceeds to operation S64.
In operation S64, the update-frequency calculator 14 g sends the section-specific update frequency information, generated in operation S63, to the association-degree setter 14 h and the priority setter 14 i. Thereafter, the update-frequency calculation processing ends.
The description of the update-frequency calculation processing is finished at this point.
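One straightforward way to realize operation S63 is to attribute each update rectangle reported in the update frequency information to the section that contains its centre and to sum the counts per section. The sketch below follows that rule; the tuple representation of rectangles and the centre-based attribution are assumptions, since the text does not specify how an update rectangle spanning several sections is counted.

```python
# Sketch of operation S63: tally the updates reported in the update frequency
# information per section. Rectangles are (x, y, width, height) tuples, and each
# update rectangle is attributed to the section containing its centre point
# (an assumed rule).
from collections import defaultdict

def section_specific_update_frequency(sections, update_rects):
    """sections: list of (x, y, w, h); update_rects: list of ((x, y, w, h), update_count)."""
    freq = defaultdict(int)
    for (rx, ry, rw, rh), count in update_rects:
        cx, cy = rx + rw // 2, ry + rh // 2
        for idx, (sx, sy, sw, sh) in enumerate(sections):
            if sx <= cx < sx + sw and sy <= cy < sy + sh:
                freq[idx] += count
                break
    return dict(freq)   # section index -> number of updates in the period
```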
Processing (i.e., association-degree setting processing) of the association-degree setter 14 h will be described next.
In operation S71, the association-degree setter 14 h determines whether or not the section-specific update frequency information is received from the update-frequency calculator 14 g. When the section-specific update frequency information is received (Yes in operation S71), the process proceeds to operation S72. When no section-specific update frequency information is received (No in operation S71), the association-degree setting processing ends.
In operation S72, the association-degree setter 14 h determines whether or not the mouse-cursor position information is received from the operation-position detector 14 c. When the mouse-cursor position information is received (Yes in operation S72), the process proceeds to operation S73. When no mouse-cursor position information is received (No in operation S72), the association-degree setting processing ends.
In operation S73, the association-degree setter 14 h associates the time at which the mouse-cursor position information was received in operation S72 with the received mouse-cursor position information and stores the associated information in the cursor-position management table T1.
In operation S74, the association-degree setter 14 h determines whether or not the amount of information stored in the cursor-position management table T1 is larger than or equal to a predetermined amount. When the amount of information stored in the cursor-position management table T1 is larger than or equal to the predetermined amount (Yes in operation S74), the process proceeds to operation S75. When the amount of information stored in the cursor-position management table T1 is smaller than the predetermined amount (No in operation S74), the association-degree setting processing ends.
In operation S75, the association-degree setter 14 h calculates update rates for the respective sections for each certain amount of time. Thereafter, the process proceeds to operation S76.
In operation S76, the association-degree setter 14 h calculates differences between the update rates of the sections. Thereafter, the process proceeds to operation S77.
In operation S77, the association-degree setter 14 h calculates, for each section, the degrees of association with the other sections. The association-degree setter 14 h then stores the calculated association degrees in an association-degree management table T6 (described below). Thereafter, the association-degree setting processing ends.
The description of the association-degree setting processing is finished at this point.
Processing (i.e., priority setting processing) of the priority setter 14 i will be described next.
In operation S81, on the basis of the currently available network band detected by the channel-band detector 14 m, the priority setter 14 i determines whether or not data transmitted by the communicator 14 a reaches an upper limit of the available band of the network 50. When the data transmitted by the communicator 14 a reaches the upper limit of the available band of the network 50 (Yes in operation S81), the process proceeds to operation S82. When the data transmitted by the communicator 14 a does not reach the upper limit of the available band of the network 50 (No in operation S81), the priority setting processing ends.
In operation S82, the priority setter 14 i receives the mouse-cursor position information from the operation-position detector 14 c, and identifies the section in which the mouse cursor is located, on the basis of the section information created by the window-edge detector 14 d. Thereafter, the process proceeds to operation S83.
In operation S83, the priority setter 14 i refers to the association-degree management table to retrieve the section having a highest degree of association with the mouse-cursor-located section identified in operation S82. Thereafter, the process proceeds to operation S84.
In operation S84, the priority setter 14 i determines whether or not the association degree of the section retrieved in operation S83 is higher than or equal to a predetermined threshold. When the association degree of the section retrieved in operation S83 is higher than or equal to the predetermined threshold (Yes in operation S84), the process proceeds to operation S85. When the association degree of the section retrieved in operation S83 is lower than the predetermined threshold (No in operation S84), the priority setting processing ends.
In operation S85, the priority setter 14 i sets a combination of the mouse-cursor-located section and the section that is the most highly associated therewith as highest-priority sections. Thereafter, the process proceeds to operation S86.
In operation S86, the priority setter 14 i sends information of the combination of the highest-priority sections to the update-difference converter 14 j. Thereafter, the process proceeds to operation S87.
In operation S87, a determination is made as to whether or not data transmitted by the communicator 14 a reaches the upper limit of the available band of the network 50, on the basis of the currently available network band detected by the channel-band detector 14 m. When the data transmitted by the communicator 14 a reaches the upper limit of the available band of the network 50 (Yes in operation S87), the process proceeds to operation S88. When the data transmitted by the communicator 14 a does not reach the upper limit of the available band of the network 50 (No in operation S87), the priority setting processing ends.
In operation S88, the priority setter 14 i refers to the association-degree management table T6 to retrieve the section having a second highest degree of association with the mouse-cursor-located section after the section identified in operation S83. Thereafter, the process proceeds to operation S89.
In operation S89, the priority setter 14 i determines whether or not the association degree of the section retrieved in operation S88 is higher than or equal to a predetermined threshold. When the association degree of the section retrieved in operation S88 is higher than or equal to the predetermined threshold (Yes in operation S89), the process returns to operation S86. When the association degree of the section retrieved in operation S88 is lower than the predetermined threshold (No in operation S89), the priority setting processing ends.
The description of the priority setting processing is finished at this point.
Every section has some value of the association degree. Thus, if priority were simply given to the section having the highest association degree, data for a section whose association degree is not high in absolute terms would also be transmitted prior to data for the other sections. Accordingly, in the priority setting processing, the threshold is set, and information regarding the combination of the mouse-cursor-located section and the sections whose degrees of association therewith are higher than or equal to the threshold is transmitted prior to information for the other sections. At the point when the priority setting processing is executed, the amount of data generated is larger than the amount of data already transmitted. Thus, transmitting the data of only the sections having the highest association degree reduces the amount of data and makes it possible to transmit all of the generated data. In addition, the use of the association degree allows the data for a section having a low update frequency to be transmitted and displayed with high priority, thus improving the operation response felt by the user.
In the present embodiment, in operation S84 described above, the priority setter 14 i determines whether or not the retrieved association degree is higher than or equal to the threshold. Alternatively, however, the arrangement may be such that, after the association-degree setter 14 h calculates the association degrees, the association degree(s) that is lower than the threshold is deleted and operation S84 is eliminated in the priority setting processing.
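Under those conditions, operations S81 to S89 reduce to the loop sketched below. The layout of the association table, the numeric threshold, and the bandwidth_saturated and send_high_priority callbacks are placeholders introduced for this sketch and are not part of the described apparatus.

```python
# Sketch of the priority setting processing (operations S81 to S89): starting
# from the mouse-cursor-located section, keep promoting the next most highly
# associated section while the channel is saturated and the association degree
# stays at or above the threshold. Table layout, threshold value, and the two
# callbacks are assumptions of this sketch.
ASSOCIATION_THRESHOLD = 80   # assumed threshold for operations S84 and S89

def set_priorities(cursor_section, association_table,
                   bandwidth_saturated, send_high_priority):
    """association_table: {(section, other_section): association degree in %}."""
    if not bandwidth_saturated():                       # operation S81
        return
    candidates = sorted(                                # operations S83/S88
        ((other, degree) for (sec, other), degree in association_table.items()
         if sec == cursor_section),
        key=lambda item: item[1], reverse=True)
    for other, degree in candidates:
        if degree < ASSOCIATION_THRESHOLD:              # operations S84/S89
            return
        send_high_priority((cursor_section, other))     # operations S85/S86
        if not bandwidth_saturated():                   # operation S87
            return
```

If, as suggested above, below-threshold association degrees are deleted when they are calculated, the degree check inside the loop simply disappears.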
Specific examples of the processing operations will be described next.
In a cursor-position management table T1 illustrated in FIG. 18 , the “update-frequency information” column is omitted. In the cursor-position management table T1 illustrated in FIG. 18 , the position of the mouse cursor is assumed to be located in the section C1 between time 10:10:10 and time 10:10:14.
When the amount of information accumulated in the cursor-position management table T1 is larger than or equal to a certain amount, the association-degree setter 14 h adds up the numbers of updates in each section and calculates, for each update occurrence time, an update rate indicating what percentage of the total number of updates in the period occurred at that time. FIG. 18 illustrates a table T2 that includes the calculated update rates.
The table T2 indicates that, for example, update rates of 0, 1, 0, 0, and 17 (in units of %) are obtained in the section A1 between time 10:10:10 and time 10:10:14.
After calculating the update rates, the association-degree setter 14 h uses the table T2 to calculate differences between the update rates for each update occurrence time at which operations were performed on the sections. FIG. 18 illustrates a table T3 that includes the calculated update-rate differences.
For example, when update rates of 0, 1, 0, 0, and 17 are obtained in the section A1 between time 10:10:10 and time 10:10:14, update rates of 0, 0, 0, 0, and 21 are obtained in the section C1 between time 10:10:10 and time 10:10:14, and the mouse cursor is located in the section C1 during the update time, the difference of the two update rates is calculated to be 5 (=|0−0|+|0−1|+|0−0|+|0−0|+|21−17|). This value “5” is used as an update-rate difference of the section A1 relative to the operation on the section C1. Similarly, an update-rate difference of the section B1 relative to the section C1 is determined to be 32 and an update-rate difference of the section D1 relative to the section C1 is determined to be 32.
The association-degree setter 14 h adds up the update-rate differences of all of the sections and divides each section's update-rate difference by the total to calculate an update-rate mismatch degree relative to the section C1. FIG. 18 illustrates a table T4 that includes the calculated mismatch degrees.
For example, the mismatch degree of the section A1 relative to the operation on the section C1 may be determined in the following manner. Summation of the update-rate differences in the table T3 yields 69 (=5+32+32). Division of the update-rate difference "5" of the section A1 relative to the operation on the section C1 by 69 (i.e., 5/69) yields approximately 0.072. The other mismatch degrees can be determined in the same manner.
The association-degree setter 14 h multiplies values, obtained by subtracting the values included in the table T4 from "1", by 100 to thereby calculate the association degrees. FIG. 18 illustrates a table T5 indicating the calculated association degrees. For example, the association degree of the section A1 relative to the operation on the section C1 may be determined in the following manner: since the mismatch degree of the section A1 relative to the operation on the section C1 is 0.07 in the table T4, the association degree is 93% (=(1−0.07)×100). The other association degrees can be determined in the same manner.
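The figures in tables T3 to T5 can be reproduced with a few lines of arithmetic. In the sketch below, the update rates of the sections A1 and C1 are the values quoted from the table T2; for the sections B1 and D1, only the resulting differences (32 each) are given in the text, so those values are used directly.

```python
# Reproduces the worked example for the operation on section C1: the update-rate
# difference of section A1 (table T3), the mismatch degrees (table T4), and the
# association degrees (table T5).

def update_rate_difference(rates_a, rates_b):
    """Sum of absolute per-time differences between two sections' update rates."""
    return sum(abs(a - b) for a, b in zip(rates_a, rates_b))

rates_a1 = [0, 1, 0, 0, 17]     # section A1, 10:10:10 to 10:10:14 (in %)
rates_c1 = [0, 0, 0, 0, 21]     # section C1 (cursor-located section)

differences = {
    "A1": update_rate_difference(rates_a1, rates_c1),   # -> 5
    "B1": 32,                                           # value given in table T3
    "D1": 32,                                           # value given in table T3
}

total = sum(differences.values())                                   # -> 69
mismatch = {sec: d / total for sec, d in differences.items()}       # table T4
association = {sec: round((1 - m) * 100) for sec, m in mismatch.items()}  # table T5

print(differences)    # {'A1': 5, 'B1': 32, 'D1': 32}
print(association)    # {'A1': 93, 'B1': 54, 'D1': 54}
```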
A specific example of the priority setter will be described next with reference to an association-degree management table T6.
A description in this specific example will be given of an example of a case in which the position information received by the priority setter 14 i indicates that the mouse cursor is located in the section C1.
The priority setter 14 i refers to the row of C1 in the association-degree management table T6 to retrieve the section that is the most highly associated with the section C1. Since the section A1, with an association degree of 93, has the highest degree of association with the section C1 in the association-degree management table T6, the highest priority is given to the sections C1 and A1. The priority setter 14 i then determines that data for the sections C1 and A1 is to be transmitted with high priority and sends information of the combination of the sections C1 and A1 to the update-difference converter 14 j. If the transmitted data still reaches the upper limit of the available band of the network 50, image data of an update rectangle belonging to the section B1, which has the second highest degree of association with the section C1 after the section A1, is also sent to the update-difference converter 14 j.
As described above, according to the information processing system 5 of the present embodiment, the server apparatus 10 detects edges from the desktop screen, divides the desktop screen into multiple sections, calculates update frequencies of the respective sections, determines degrees of association with the mouse-cursor-located section, and transmits the image data of update rectangles for the sections having the highest association degree to the client apparatus 20 prior to the image data for the other sections. Accordingly, it is possible to increase the speed of response to a user operation.
The processing performed by the server apparatus 10 may also be executed by a plurality of apparatuses in a distributed manner. For example, the arrangement may be such that one apparatus performs processing up to the association-degree setting processing to generate the association-degree management table T6 and another apparatus determines a combination of the sections having the highest association degree by using the association-degree management table T6.
An information processing system according to a third embodiment will be described next.
The information processing system according to the third embodiment will be described below in conjunction with, mainly, points that are different from those of the second embodiment described above, and a description of similar points is not given hereinafter.
The information processing system according to the third embodiment is different from the information processing system according to the second embodiment in that the window-edge detector 14 d refers to the association-degree management table T6, set by the association-degree setter 14 h, to change conditions for setting the sections. Two methods will be described below.
In a first method, when the association degree of one section in the association-degree management table T6 set by the association-degree setter 14 h has no difference from the association degrees of any other sections and is equal to the association degrees thereof (or has a difference that is smaller than or equal to a predetermined threshold), the window-edge detector 14 d performs processing for excluding the section in the window-edge detection processing.
For example, when sections A1, B1, C1, D1, E1, and F1 exist in the association-degree management table T6 and the association degree of the section F1 has no difference from the association degrees of any other sections A1 to E1 and is equal to the association degrees thereof (or has a difference that is smaller than or equal to a predetermined threshold), the window-edge detector 14 d performs processing for excluding the section F1 in the window-edge detection processing.
In operation S42 a, the window-edge detector 14 d refers to the association-degree management table T6 to determine whether or not a section whose degrees of association with all other sections are lower than or equal to a predetermined value (hereinafter referred to as an "unassociated section") exists in the sections stored in the association-degree management table T6. When an unassociated section exists (Yes in operation S42 a), the process proceeds to operation S42 b. When no unassociated section exists (No in operation S42 a), the process proceeds to operation S43.
In operation S42 b, the window-edge detector 14 d deletes the unassociated section from the sections divided in operation S42. Thereafter, the process proceeds to operation S43.
In a second method, when the association degrees in the association-degree management table T6 set by the association-degree setter 14 h have no differences from one another and are all equal (or have differences that are smaller than or equal to a predetermined threshold), the window-edge detector 14 d changes the conditions for the edge detection and for the subsequent section division so as to perform finer section division.
For example, when sections A1, B1, C1, and D1 exist and these sections have no differences in the association degrees therebetween (or have differences that are smaller than or equal to the predetermined threshold), the threshold used in operation S46 may be varied in the edge detection processing so as to increase the number of sections in the screen.
In operation S42 c, the window-edge detector 14 d refers to the association-degree management table T6 to determine whether or not all of the association degrees included in the association-degree management table T6 are lower than or equal to a predetermined value. When all of the association degrees are lower than or equal to the predetermined value (Yes in operation S42 c), the process proceeds to operation S42 d. When not all of the association degrees are lower than or equal to the predetermined value (No in operation S42 c), the process proceeds to operation S43.
In operation S42 d, the window-edge detector 14 d reduces the value of the threshold to be used in operation S46. Thereafter, the process proceeds to operation S43.
Although such an arrangement is not illustrated in the flowchart in FIG. 19, once the value of the threshold to be used in operation S46 has been reduced, processing for returning the threshold to its original value may also be performed when an association degree larger than the predetermined value appears among the association degrees included in the association-degree management table T6.
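Both adjustments can be expressed as a small pre-processing step ahead of operation S43, as in the sketch below. The floor value for an unassociated section and the factor by which the area threshold is reduced are assumptions introduced for the example.

```python
# Sketch of the third embodiment's extra checks: drop sections whose association
# degrees with every other section are at or below a floor (operations S42a/S42b,
# first method), and, when all association degrees are low, shrink the area
# threshold used in operation S46 so that the screen is divided more finely
# (operations S42c/S42d, second method). Threshold values are assumptions.
LOW_ASSOCIATION = 20        # assumed floor for an "unassociated" section
FINER_DIVISION_FACTOR = 0.5

def adjust_sections(sections, association_table, area_threshold):
    """sections: list of section identifiers; association_table: {(sec_a, sec_b): degree}."""
    # First method: exclude sections unassociated with every other section.
    def max_degree(sec):
        return max((deg for (a, b), deg in association_table.items()
                    if sec in (a, b)), default=0)
    sections = [s for s in sections if max_degree(s) > LOW_ASSOCIATION]

    # Second method: if every association degree is low, reduce the coupling
    # threshold so that more, smaller sections survive the window-edge detection.
    if association_table and all(deg <= LOW_ASSOCIATION
                                 for deg in association_table.values()):
        area_threshold *= FINER_DIVISION_FACTOR
    return sections, area_threshold
```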
The information processing system according to the third embodiment provides substantially the same advantages as the information processing system according to the second embodiment.
According to the first method described above, the amount of calculation is further reduced, thereby making it possible to speed up the response to a user operation. According to the second method described above, the threshold is reduced, so that the number of sections increases. This arrangement increases the possibility that sections having a high association degree can be identified.
An information processing system according to a fourth embodiment will be described next.
The information processing system according to the fourth embodiment will be described below in conjunction with, mainly, points that are different from those of the second embodiment described above, and a description of similar points is not given hereinafter.
In the second embodiment described above, the condition for transmitting the update-difference data in descending order of priority is that the data transfer is not performed in time, as illustrated in operation S81 in the priority setting processing (in FIG. 17 ) performed by the priority setter 14 i.
However, in the priority setting processing in the second embodiment, since the sections are identified from only the current mouse-cursor position information, the priority setting processing is executed even when the user happens to locate the mouse cursor in one section without intentional mouse operation.
Consequently, data for a section having a high degree of association with the mouse-cursor-located section is transmitted prior to data for the other sections. In practice, however, the user may simply be viewing a section in which the update frequency is high, so an erroneous determination may occur in the priority processing. Accordingly, the information processing system according to the fourth embodiment reduces such erroneous determinations in the priority processing.
A server apparatus 10 a in the present embodiment further includes an operation-state detector 14 n for detecting an operation state.
The operation-state detector 14 n obtains operation information from the operation-information acquirer 14 b. The operation-state detector 14 n then holds information indicating whether or not the user was performing an operation at a certain time and information indicating on which section the operation was performed when he or she was performing an operation.
On the basis of the information held by the operation-state detector 14 n, the priority setter 14 i sets priorities.
An operation-state management table T7 illustrated in FIG. 22 has a “time” column, a “tool operation state” column, and a “section” column. Pieces of information that are horizontally arranged are associated with each other.
In the “time” column, the time at which operation information that the operation-state detector 14 n obtained from the operation-information acquirer 14 b is contained.
In the “tool operation state” column, information indicating whether or not the user operated the mouse at a certain time. In the “tool operation state” column, “O” indicates that the mouse was operated and “×” indicates that the mouse was not operated.
In the “section” column, information that identifies the section on which an operation was performed when the user was operating the mouse is contained.
Priority setting processing in the fourth embodiment will be described next.
In operation S82 a, the priority setter 14 i refers to the operation-state management table T7 to determine whether or not the information at the most recent time indicates that an operation was performed. When the information at the most recent time indicates that an operation was performed (Yes in operation S82 a), the process proceeds to operation S83. When the information at the most recent time indicates that no operation was performed (No in operation S82 a), the priority setting processing ends.
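The added gate in operation S82 a amounts to checking the newest row of the operation-state management table T7 before the rest of the priority setting processing runs, as in the sketch below; the list-of-dictionaries representation of the table is an assumption.

```python
# Sketch of operation S82a: continue with priority setting only when the most
# recent entry of the operation-state management table T7 indicates that the
# mouse was actually being operated.
def operated_at_most_recent_time(operation_state_table):
    """operation_state_table: rows such as
    {'time': '10:10:14', 'operated': True, 'section': 'C1'}, ordered oldest first."""
    if not operation_state_table:
        return False
    return operation_state_table[-1]["operated"]
```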
The information processing system according to the fourth embodiment provides substantially the same advantages as the information processing system according to the second embodiment. The information processing system according to the fourth embodiment can further reduce the amount of erroneous determination in the priority processing.
An information processing system according to a fifth embodiment will be described next.
The information processing system according to the fifth embodiment will be described below in conjunction with, mainly, points that are different from those of the second embodiment described above, and a description of similar points is not given hereinafter.
In the second embodiment described above, only the update frequencies of the sections are used to calculate the association degrees (refer to, for example, the association-degree calculation processing illustrated in FIG. 16 ). While the association degree is a value indicating that updates occurred at the same time, it is, more specifically, a value indicating that updates occurred at the same time that an operation was performed. Thus, when only the update frequencies are used to calculate the association degrees, the degree of association between two sections in which updates happen to occur at the same time becomes relatively high. The server apparatus in the present embodiment utilizes operation history to enhance the reliability of the association degrees set by the association-degree setter 14 h.
A server apparatus 10 b in the present embodiment includes an operation-information history accumulator 14 p having a function for accumulating the user's operation history, in addition to the configuration of the server apparatus 10, described above.
The operation-information history accumulator 14 p accumulates, as history information, information indicating in which section an instruction operation (e.g., a click operation) to the mouse cursor position is performed for each unit time.
The association-degree setter 14 h then sets association degrees on the basis of the accumulated history information.
An operation-information history management table T8 illustrated in FIG. 25 has a “time” column, a “section” column, and a “mouse position (x, y)” column. Pieces of information that are horizontally arranged are associated with each other.
In the “time” column, the time at which the operation-information history accumulator 14 p obtains the mouse-cursor position information from the operation-position detector 14 c is contained.
In the “section” column, information that identifies the number of times the mouse click operation is performed for each section is contained. The section information may be obtained from the window-edge detector 14 d.
In the “mouse position” column, the mouse-cursor position information obtained from the operation-information acquirer 14 b is contained.
In operation S76 a, the association-degree setter 14 h refers to the operation-information history management table T8 to determine whether or not operation history information corresponding to time of interest exists. When operation history information corresponding to the time of interest exists (Yes in operation S76 a), the process proceeds to operation S76 b. When no operation history information corresponding to the time of interest exists (No in operation S76 a), the process proceeds to operation S77.
In operation S76 b, the association-degree setter 14 h sets an association degree by reducing the update-rate difference in a time slot when an operation was performed and increasing the update-rate difference in a time slot when no operation was performed. Thereafter, the process proceeds to operation S77.
A cursor-position management table T1 and a table T2 illustrated in FIG. 27 are substantially the same as those illustrated in FIG. 18 and used in the description of the specific example of the second embodiment.
After calculating the update rates, the association-degree setter 14 h uses the table T2 to calculate differences between the update rates for each of the update occurrence times at which operations were performed between the sections. Using the operation-information history management table T8, the association-degree setter 14 h applies a weight to the update-rate differences.
For example, update rates of 0, 1, 0, 0, and 17 are obtained in the section A1 between time 10:10:10 and time 10:10:14, and update rates of 0, 0, 0, 0, and 21 are obtained in the section C1 in the same period. Referring to the operation-information history management table T8, the mouse was operated twice in the section C1 at time 10:10:14. Thus, the update-rate difference at 10:10:14 is reduced to one half and the update-rate differences at the other times are doubled. As a result, the difference between the two update rates is calculated to be 4 (=|0−0|+2×|0−1|+|0−0|+|0−0|+1/2×|21−17|). This value "4" is used as the update-rate difference of the section A1 relative to the operation on the section C1. Similarly, the update-rate difference of the section B1 relative to the section C1 is determined to be 35.5 and the update-rate difference of the section D1 relative to the section C1 is determined to be 34. Since the method of the subsequent association-degree determination is substantially the same as the method in the second embodiment, a description thereof is not given hereinafter. A table T10 illustrated in FIG. 27 corresponds to the table T4 and a table T11 corresponds to the table T5. As illustrated in the table T11, the association degree of the section A1 relative to the operation on the section C1 is 95%, which is larger than the corresponding association degree of 93% in the second embodiment.
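The weighted difference of the section A1 in this example can be reproduced as follows. The halving and doubling factors follow the text, and the per-time-slot operation flags are read off the operation-information history management table T8.

```python
# Reproduces the fifth embodiment's weighted update-rate difference of section A1
# relative to the operation on section C1: the 10:10:14 slot, in which the mouse
# was operated in C1, is halved and the remaining slots are doubled.
rates_a1 = [0, 1, 0, 0, 17]                       # section A1, 10:10:10 to 10:10:14 (%)
rates_c1 = [0, 0, 0, 0, 21]                       # section C1 (cursor-located section)
operated = [False, False, False, False, True]     # clicks recorded only at 10:10:14

def weighted_difference(rates_a, rates_b, operated_slots):
    total = 0.0
    for a, b, op in zip(rates_a, rates_b, operated_slots):
        weight = 0.5 if op else 2.0               # shrink the slot with an operation
        total += weight * abs(a - b)
    return total

print(weighted_difference(rates_a1, rates_c1, operated))   # -> 4.0
```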
The information processing system according to the fifth embodiment provides substantially the same advantages as the information processing system according to the second embodiment.
In addition, according to the information processing system of the fifth embodiment, a weight is further applied to thereby make it possible to further enhance the reliability of the association degrees set by the association-degree setter 14 h.
An information processing system according to a sixth embodiment will be described next.
The information processing system according to the sixth embodiment will be described below in conjunction with, mainly, points that are different from those of the fifth embodiment described above, and a description of similar points is not given hereinafter.
The mouse is used to move the cursor across various points on the desktop. When a specific application is used, however, deviation occurs in the points between which the cursor is moved. For example, suppose a case in which, using a 3D CAD application, the user changes a setting for writing or the like while adjusting the overall shape of a 3D object in a data-rendering section by rotating the 3D object. In this case, more mouse operations than usual occur for moving the cursor between the data-rendering section in which the 3D object is rendered and an operation-tool section for setting the writing or the like.
Accordingly, in the server apparatus of the sixth embodiment, the deviation of the points between which the cursor is moved is reflected in the setting of the association degrees, to thereby enhance the reliability of the association degrees.
A server apparatus 10 c in the present embodiment further has a mouse-movement-history vector extractor 14 q having functions for generating information regarding a trace of mouse movement between multiple sections on the basis of the history information and accumulating the generated information.
In the table T12 illustrated in FIG. 29 , information indicating from which section to which section a mouse operation is performed in a certain amount of time is contained. In the table T12, information indicating that a mouse operation for moving the cursor from the section C1 to the section A1 was performed ten times is contained.
On the basis of the table T12, the association-degree setter 14 h increases, in the data of the association-degree management table T6, the degree of association of a corresponding section relative to a data-rendering section.
For example, a combination of sections between which a mouse operation was performed ten times or more in a certain amount of time is extracted. The table T12 illustrated in FIG. 29 indicates a case in which a mouse operation for moving the cursor from the section C1 to the section A1 and a mouse operation for moving the cursor from the section D1 to the section B1 are performed ten times or more in a certain period of time. Adjustment is performed so that the values in the association-degree management table T6 which correspond to the extracted combination are increased according to the number of times the operation was performed. As a result, in the association-degree management table T6 illustrated in FIG. 18 , the value of the combination of the section C1 and the section A1 increases from 93 to 98 and the value of the combination of the section D1 and the section B1 increases from 89 to 93.
This operation allows for setting of more accurate association degrees.
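A possible form of this adjustment is sketched below. The per-move boost and the threshold of ten movements are assumptions chosen so that the value of 93 for the combination of the sections C1 and A1 in the example becomes 98; the text only states that the value is increased according to the number of operations.

```python
# Sketch of the sixth embodiment's adjustment: raise the association degree of
# every pair of sections between which the cursor was moved at least MIN_MOVES
# times in the observation window. The boost formula is an assumption.
MIN_MOVES = 10

def boost_association(association_table, movement_counts, per_move_boost=0.5):
    """association_table: {(sec_a, sec_b): degree}; movement_counts: {(sec_from, sec_to): n}."""
    boosted = dict(association_table)
    for pair, n in movement_counts.items():
        if n >= MIN_MOVES and pair in boosted:
            boosted[pair] = min(100, boosted[pair] + per_move_boost * n)
    return boosted

print(boost_association({("C1", "A1"): 93}, {("C1", "A1"): 10}))  # {('C1', 'A1'): 98.0}
```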
In operation S77 a, the association-degree setter 14 h calculates, for each section, the degrees of association with the other sections. Thereafter, the process proceeds to operation S77 b.
In operation S77 b, the association-degree setter 14 h refers to the operation-information history management table T8 to determine whether or not operation history information corresponding to time of interest exists. When operation history information corresponding to the time of interest exists (Yes in operation S77 b), the process proceeds to operation S77 c. When no operation history information corresponding to the time of interest exists (No in operation S77 b), the processing ends.
In operation S77 c, the association-degree setter 14 h generates vector data on the basis of the operation history information. The association-degree setter 14 h then applies a weight to the association degree in accordance with the number of vectors. Thereafter, the association-degree setting processing ends.
The description of the association-degree setting processing is finished at this point.
The information processing system according to the sixth embodiment provides substantially the same advantages as the information processing system according to the fifth embodiment.
In addition, according to the information processing system of the sixth embodiment, a weight is further applied to thereby make it possible to further enhance the reliability of the association degrees set by the association-degree setter 14 h.
Although the information processing apparatus, the information processing method, and the information processing program according to the present invention have been described above in conjunction with the illustrated embodiments, the present invention is not limited thereto. The configurations of the units may be replaced with any elements having similar functions. Any other element or process may also be added to the present invention.
Additionally, in the present invention, two or more arbitrary elements (or features) in the above-described embodiments may also be combined together.
The above-described processing functions may be realized by a computer. In such a case, a program in which details of the processing functions of the information processing apparatus 1 and the server apparatus 10, 10 a, 10 b, or 10 c are written is supplied. When the program is executed by the computer, the above-described processing functions are realized on the computer. The program in which the details of the processing are written may be recorded to a computer-readable recording medium. Examples of the computer-readable recording medium include a magnetic storage device, an optical disk, a magneto-optical recording medium, and a semiconductor memory. Examples of the magnetic storage device include a hard disk drive, a flexible disk (FD), and a magnetic tape. Examples of the optical disk include a DVD, a DVD-RAM, and a CD-ROM/RW. One example of the magneto-optical recording medium is an MO (magneto-optical disk).
For distribution of the program, portable recording media (such as DVDs and CD-ROMs) on which the program is recorded may be made commercially available. The program may also be stored in a storage device in a server computer so that the program can be transferred therefrom to another computer over a network.
The computer that executes the program may store, in the storage device thereof, the program recorded on the portable recording medium or the like or transferred from the server computer. The computer then reads the program from the storage device thereof and executes processing according to the program. The computer may also directly read the program from the portable recording medium and execute the processing according to the program. In addition, each time the program is transferred from the server computer linked through a network, the computer may sequentially execute the processing according to the received program.
At least one of the above-described processing functions may also be implemented by an electronic circuit, such as a DSP (digital signal processor), an ASIC, or a PLD (programmable logic device).
Technologies disclosed in the above-described embodiments encompass the technologies described in appendices below.
All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the principles of the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority or inferiority of the invention. Although the embodiments of the present invention have been described in detail, it should be understood that various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
Claims (11)
1. A method implemented by a computer, the method comprising:
setting, by the computer, for respective sections set in an image to be transmitted, update frequencies of images stored in a storage unit for the sections in a predetermined period of time;
setting, by the computer, association degrees indicating degrees of association between the sections based on the update frequencies, wherein the association degrees are based on calculating an update rate for the respective sections for each certain amount of time in the predetermined period of time;
first determining, by the computer, whether history information corresponding to the each certain amount of time in the predetermined period of time exists;
applying, by the computer, a weight to the association degrees based on the history information when the history information exists based on the first determining;
second determining, by the computer, whether an amount of data transmitted reaches an upper limit of an available band of a network used for the image transmission;
identifying, by the computer, a section on which an operation by a user is performed;
setting, by the computer, a higher priority for the identified section and another section having a highest degree of association with the identified section than priorities for other sections only when the amount of data transmitted reaches an upper limit of an available band of a network used for the image transmission;
and transmitting, by the computer, the image, stored in the storage unit, in sequence with the images stored for the sections whose set priority is higher first.
2. The method as claimed in claim 1 , wherein the section on which an operation is performed is identified based on the section in which a cursor is located.
3. The method as claimed in claim 1 , wherein when the association degree of the another section that is most highly associated with the identified section is higher than or equal to a threshold, a higher priority is set for the most highly associated section than priorities for the other sections.
4. The method as claimed in claim 2 , further comprising:
storing a time at which the cursor is operated in conjunction with information regarding the cursor-located section, wherein the section on which an operation is performed is identified based on the time at which the cursor is operated.
5. The method as claimed in claim 1 , further comprising:
setting an update region in which a number of updates is larger than or equal to a certain value by comparing given frames of images with each other, wherein the update frequencies are set based on the set update region and the set sections.
6. The method as claimed in claim 1 , wherein the sections are set by detecting edges of windows in the image to be transmitted.
7. The method as claimed in claim 6 , wherein the section is changed after the edges are detected based on the association degrees set.
8. The method as claimed in claim 7 , wherein a number of sections to be set is increased when all of the association degrees are lower than or equal to a threshold.
9. The method as claimed in claim 7 , wherein a section whose degrees of association with all of the other sections are smaller than or equal to a threshold is excluded from the sections to be set.
10. A non-transitory computer-readable recording medium that stores a program for causing a computer to execute processing comprising:
setting, for respective sections set in an image to be transmitted, update frequencies of images stored in a storage unit for the sections in a predetermined period of time;
setting association degrees indicating degrees of association between the sections based on the update frequencies, wherein the association degrees are based on calculating an update rate for the respective sections for each certain amount of time in the predetermined period of time;
first determining whether history information corresponding to the each certain amount of time in the predetermined period of time exists;
applying a weight to the association degrees based on the history information when the history information exists based on the first determining;
second determining whether an amount of data transmitted reaches an upper limit in an available band of a network used for the image transmission;
identifying a section on which an operation by a user is performed and setting a higher priority for the identified section and another section having a highest degree of association with the identified section than priorities for other sections, only when the amount of data transmitted reaches an upper limit of an available band of a network used for the image transmission;
and transmitting the image, stored in the storage unit, in sequence with the images stored for the sections whose set priority is higher first.
11. A system comprising:
processing circuitry configured to
set, for respective sections set in an image to be transmitted, update frequencies of images stored in a memory for the sections in a predetermined period of time;
set association degrees indicating degrees of association between the sections based on the update frequencies, wherein the association degrees are based on calculating an update rate for the respective sections for each certain amount of time in the predetermined period of time;
first determine whether history information corresponding to the each certain amount of time in the predetermined period of time exists;
apply a weight to the association degrees based on the history information when the history information exists based on the first determining;
second determine whether an amount of data transmitted reaches an upper limit in an available band of a network used for the image transmission;
identify a section on which an operation by a user is performed; and
set a higher priority for the identified section and another section having a highest degree of association with the identified section than priorities for other sections only when the amount of data transmitted reaches an upper limit of an available band of a network used for the image transmission; and
communication interface circuitry configured to transmit the image, stored in the memory, in sequence with the images stored for the sections whose set priority is higher first.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-269617 | 2010-12-02 | ||
JP2010269617A JP5678619B2 (en) | 2010-12-02 | 2010-12-02 | Information processing apparatus, information processing method, and information processing program |
Publications (2)
Publication Number | Publication Date |
---|---|
US20120144397A1 US20120144397A1 (en) | 2012-06-07 |
US9666166B2 true US9666166B2 (en) | 2017-05-30 |
Family
ID=46163510
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/308,678 Active 2033-10-17 US9666166B2 (en) | 2010-12-02 | 2011-12-01 | Information processing apparatus, method, and recording medium |
Country Status (2)
Country | Link |
---|---|
US (1) | US9666166B2 (en) |
JP (1) | JP5678619B2 (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20130004737A (en) * | 2011-07-04 | 2013-01-14 | 삼성전자주식회사 | Image display method and apparatus |
US9536044B2 (en) * | 2011-12-06 | 2017-01-03 | Microsoft Technology Licensing, Llc | Metadata extraction pipeline |
US10430036B2 (en) * | 2012-03-14 | 2019-10-01 | Tivo Solutions Inc. | Remotely configuring windows displayed on a display device |
EP2882164A4 (en) * | 2012-08-06 | 2016-04-06 | Nec Corp | Communication system, server apparatus, server apparatus controlling method and program |
WO2014077307A1 (en) * | 2012-11-15 | 2014-05-22 | 日本電気株式会社 | Server device, terminal, thin client system, screen transmission method and program |
JP2014174710A (en) * | 2013-03-08 | 2014-09-22 | Casio Comput Co Ltd | Information processing device, information processing method, and program |
KR102143618B1 (en) | 2014-01-17 | 2020-08-11 | 삼성전자주식회사 | Method for controlling a frame rate and an electronic device |
FI127221B (en) * | 2014-10-17 | 2018-01-31 | Rightware Oy | Dynamic rendering of graphics |
US10283078B2 (en) * | 2016-01-20 | 2019-05-07 | Mediatek Inc. | Adaptive display partial update methods and apparatus thereof for power saving in pixel processing |
US10163184B2 (en) * | 2016-08-17 | 2018-12-25 | Adobe Systems Incorporated | Graphics performance for complex user interfaces |
JP7252444B2 (en) | 2019-03-13 | 2023-04-05 | 富士通株式会社 | Display control program, display control method and information processing device |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11275532A (en) | 1998-03-24 | 1999-10-08 | Sanyo Electric Co Ltd | Video signal transmission system |
US20040151390A1 (en) * | 2003-01-31 | 2004-08-05 | Ryuichi Iwamura | Graphic codec for network transmission |
US20060155735A1 (en) * | 2005-01-07 | 2006-07-13 | Microsoft Corporation | Image server |
US20080239414A1 (en) * | 2007-03-26 | 2008-10-02 | Brother Kogyo Kabushiki Kaisha | Image Reader and Image Forming Apparatus |
US20100322523A1 (en) * | 2007-06-29 | 2010-12-23 | Akitake Mitsuhashi | Screen data transmitting system, screen data transmitting server, screen data transmitting method and program recording medium |
US20120184227A1 (en) * | 2009-09-28 | 2012-07-19 | Nec Corporation | Wireless transmission apparatus, wireless transmission method and computer program |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008289030A (en) * | 2007-05-21 | 2008-11-27 | Nec Corp | Picture drawing transfer system |
JP4946667B2 (en) * | 2007-07-02 | 2012-06-06 | カシオ計算機株式会社 | Server apparatus and program |
Also Published As
Publication number | Publication date |
---|---|
JP2012118881A (en) | 2012-06-21 |
JP5678619B2 (en) | 2015-03-04 |
US20120144397A1 (en) | 2012-06-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9666166B2 (en) | Information processing apparatus, method, and recording medium | |
EP2487911B1 (en) | Remote server, image transmission program, and image display method | |
JP5471903B2 (en) | Information processing apparatus, image transmission program, and image display method | |
US8819270B2 (en) | Information processing apparatus, computer-readable non transitory storage medium storing image transmission program, and computer-readable storage medium storing image display program | |
US20130155075A1 (en) | Information processing device, image transmission method, and recording medium | |
JP5471794B2 (en) | Information processing apparatus, image transmission program, and image display method | |
JP5899897B2 (en) | Information processing apparatus, information processing method, and program | |
US9300818B2 (en) | Information processing apparatus and method | |
US8411972B2 (en) | Information processing device, method, and program | |
US20170269709A1 (en) | Apparatus, method for image processing, and non-transitory medium storing program | |
US20200265635A1 (en) | Measurement method, measurement device, and recording medium | |
US10078383B2 (en) | Apparatus and method to display moved image data processed via a server at a predicted position on a screen | |
US9269281B2 (en) | Remote screen control device, remote screen control method, and recording medium | |
JP5874257B2 (en) | Information processing apparatus, image transmission method, and image transmission program | |
CN113657518A (en) | Training method, target image detection method, device, electronic device, and medium | |
US20160155429A1 (en) | Information processing apparatus and terminal device | |
US20150281699A1 (en) | Information processing device and method | |
US20140086550A1 (en) | System, terminal device, and image capturing method | |
WO2014080440A1 (en) | Information processing device, control method, and control program | |
KR20240037557A (en) | Method, computer device, and computer program for saving video storage space |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: FUJITSU LIMITED, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:IMAI, TOMOHARU;MATSUI, KAZUKI;REEL/FRAME:027393/0122 Effective date: 20111117 |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |