WO2010050593A1 - Communication apparatus and display apparatus - Google Patents


Info

Publication number
WO2010050593A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
processing time
unit
software
pixels
Prior art date
Application number
PCT/JP2009/068699
Other languages
French (fr)
Japanese (ja)
Inventor
泰如 西林
村井 信哉
後藤 真孝
山口 健作
博史 川添
Original Assignee
Toshiba Corporation (株式会社東芝)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Corporation (株式会社東芝)
Publication of WO2010050593A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/507 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction using conditional replenishment
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/42 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation

Definitions

  • The present invention relates to a communication device and a display device that realize a function of sharing an application screen between devices.
  • Patent Document 1 proposes a technique related to a system that projects screen information of a main device (such as a personal computer or a server computer) onto a remote display device (display terminal) via a network.
  • Input information (such as pen input from a digitizer) on the display terminal is likewise transmitted to the main device via the network, and the actual application program processing is executed by the main device. The execution result and the screen update information are then transferred to the display terminal via the network.
  • The display terminal executes output processing (drawing processing) based on the received screen update information.
  • VNC (Virtual Network Computing)
  • The value of the read pixel information is compared with the value of the pixel information previously transmitted to the display terminal, and the updated screen area that has changed since the previous transmission is determined. After still-image compression of the updated screen area, only the compressed screen difference information is transmitted to the display terminal, which suppresses consumption of communication bandwidth. Accordingly, the amount of screen information transmitted increases when the screen change is large, such as when a window is moved, and conversely decreases when the screen change is small.
  • A method can be considered in which the display terminal executes decompression by hardware when the number of pixels of the compressed image received from the main device is larger than a predetermined threshold, and by software otherwise.
  • As a method for selecting and executing one process from a plurality of processes based on image information, Patent Document 2 proposes a technique that selects either hardware or software to decompress compressed data based on still-image additional (header) information. Patent Document 3 proposes a technique that selects either lossless or lossy compression for encoding based on image resolution information.
  • However, simply switching so that a compressed image with more pixels than the threshold is decompressed by hardware and a compressed image with the threshold number of pixels or fewer is decompressed by software does not always improve the efficiency of image decompression processing on the terminal side.
  • For example, when a plurality of difference areas are integrated into one area and the corresponding compressed image is decompressed by hardware once, the processing time may be shorter than when the compressed images corresponding to the individual areas are decompressed by software a plurality of times.
  • The present invention has been made in view of the above, and its purpose is to provide a communication device and a display device that can improve the processing efficiency of a display device capable of decompressing, by software and by hardware, a compressed image transmitted from the communication device.
  • According to one aspect, the present invention is a communication device connectable via a network to a display device capable of displaying an image, comprising: an image storage unit that stores a display image to be displayed on the display device; an update image generation unit that generates an update image for updating the display image; a detection unit that detects difference regions, each indicating a region where pixel information does not match between the update image and the display image; a determination unit that, when a plurality of difference regions are detected, determines whether a hardware processing time, representing the processing time when hardware decompresses a compressed image of an integrated region image representing one region containing the plurality of difference regions, is smaller than a software processing time, representing the processing time when software decompresses each compressed image obtained by compressing the images of the plurality of difference regions; a compressed image generation unit that generates a compressed image of the integrated region image when the hardware processing time is determined to be smaller than the software processing time; and a transmission unit that transmits the generated compressed image to the display device.
  • According to another aspect, the present invention is a display device connected to a communication device via a network, comprising: a receiving unit that receives, from the communication device, a compressed image obtained by compressing an image of an updated region together with the number of pixels of that image; a decompression circuit capable of decompressing the compressed image by hardware; a software decompression unit capable of decompressing the compressed image by software; an image generation unit that compares the number of pixels with a predetermined threshold and generates a display image corresponding to the updated region, using the software decompression unit to decompress the compressed image when the number of pixels is smaller than the threshold and the decompression circuit otherwise; a display unit that displays the display image; a measurement unit that measures a software processing time representing the processing time of decompression by the software decompression unit and a hardware processing time representing the processing time of decompression by the decompression circuit; an information generation unit that generates a first function for calculating the software processing time from the number of pixels, based on the numbers of pixels of images decompressed by the software decompression unit and the measured software processing times, and a second function for calculating the hardware processing time from the number of pixels, based on the numbers of pixels of images decompressed by the decompression circuit and the measured hardware processing times; and a transmission unit that transmits the first function and the second function to the communication device.
  • The communication system includes a main device (communication device) that executes an application and a display terminal (display device) that displays a screen updated by executing the application.
  • When a plurality of difference areas are detected, the main device determines whether integrating them into one rectangular area shortens the processing time of the image decompression process executed on the display terminal.
  • If it does, the main device compresses the image information corresponding to the rectangular area obtained by integrating the plurality of difference areas and transmits it to the display terminal, which improves the efficiency of image decompression processing at the display terminal.
  • FIG. 1 is a block diagram showing a configuration of a communication system according to the present embodiment.
  • The communication system according to the present embodiment transmits to a display terminal the image of the part of the main device's screen that is updated by an event occurring on that screen.
  • Hereinafter, such a communication system is referred to as a screen transfer system.
  • A screen transfer system 10 includes a main device 100 as a communication device, a wireless base station 300 as an access point connected to the main device 100 via a network 400, and a display terminal 200 as a display device that performs wireless communication with the wireless base station 300 over a wireless LAN.
  • The screen transfer system 10 has a function of wirelessly transferring the screen of application software running on the main device 100 to the display terminal 200 via the wireless base station 300, displaying the application screen of the main device 100 on each display terminal 200.
  • In the screen transfer system 10, in order to transfer the screen updated on the main device 100 side to the display terminal 200 in real time, only the image information of the updated portion of the main device 100's screen is transferred. That is, the main device 100 transmits the image information via the wireless base station 300 to the display terminal 200 that displays it.
  • The display terminal 200 receives the image information from the main device 100, decompresses it, and displays it on the corresponding part of the screen.
  • The wireless base station 300 is a wireless communication base station that complies with a wireless communication protocol such as IEEE 802.11.
  • The network 400 is a network that conforms to a wired communication protocol such as IEEE 802.3, for example.
  • The network form is not limited to these, and connections may use other protocols. Further, the display terminal 200 and the main device 100 may be connected via a wired network.
  • FIG. 2 is a block diagram of main device 100 according to the present embodiment.
  • The main device 100 includes a display 101, an input device 102, an image buffer 121, a condition storage unit 122, a session information storage unit 123, an event acquisition unit 111, a difference area detection unit 112, and the further units described below.
  • The display 101 is a display device realized by an LCD (Liquid Crystal Display) or the like.
  • The input device 102 is realized by a mouse or the like that moves the cursor displayed on the screen of the display 101.
  • A keyboard, a trackball, or the like may also be used as the input device 102.
  • The image buffer 121 is a storage unit that stores images. The condition storage unit 122 stores predetermined conditions for calculating, from the number of pixels of an image, the processing time when the display terminal 200 decompresses a compressed image by software (hereinafter, software processing time) and the processing time when the display terminal 200 decompresses a compressed image by hardware (hereinafter, hardware processing time).
  • Specifically, the condition storage unit 122 stores as conditions the function information for deriving a function (first function) that takes the number of pixels as input and outputs the software processing time, and the function information for deriving a function (second function) that takes the number of pixels as input and outputs the hardware processing time.
  • For a linear function, for example, the slope and intercept of the linear function correspond to the function information for deriving the function.
  • Note that the function is not limited to a linear function.
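As a concrete illustration, the condition information for a linear function reduces to a slope and an intercept. The following sketch (the patent prescribes no implementation language, and all numeric values here are hypothetical) derives the first and second functions from such condition information:

```python
def make_time_fn(slope_us_per_pixel: float, intercept_us: float):
    """Derive a processing-time function (pixels -> microseconds) from
    the slope/intercept condition information stored for it."""
    return lambda pixels: slope_us_per_pixel * pixels + intercept_us

# Hypothetical conditions: software has a steep slope but little fixed
# overhead; hardware has a shallow slope but a larger setup overhead.
sw_time = make_time_fn(0.002, 5.0)    # first function (software)
hw_time = make_time_fn(0.0005, 20.0)  # second function (hardware)
```

With these illustrative numbers the two lines cross at 10,000 pixels, below which software is the faster choice.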
  • The session information storage unit 123 stores session information representing information about the display terminal 200 that has established a session with the main device 100.
  • Specifically, the session information storage unit 123 stores session information that associates user identification information for identifying a user, state information indicating whether the session is in use, transmission control information indicating whether transmission control uses TCP (Transmission Control Protocol) or UDP (User Datagram Protocol), and the like.
  • The session information storage unit 123 also stores session information including identification information for identifying the display terminal 200 that is the destination of a message transmitted from the main device 100.
  • The image buffer 121, the condition storage unit 122, and the session information storage unit 123 can be configured from any commonly used storage medium such as an HDD (Hard Disk Drive), an optical disk, a memory card, or a RAM (Random Access Memory).
  • The event acquisition unit 111 acquires events generated by the operation of an application program or the like.
  • The event acquisition unit 111 is realized with an operating system (OS) that performs overall control of the computer, a virtual display driver having a function equivalent to the display driver incorporated in the OS, and application programs such as application software running on the OS.
  • When the screen is updated by application software, or when the cursor is moved by a mouse operation or the like and the image of an arbitrary area of the screen is updated, the event acquisition unit 111 acquires that screen (image) update as an event.
  • The event acquisition unit 111 includes an update image generation unit 111a as a detailed configuration.
  • The update image generation unit 111a can be realized as, for example, a virtual display driver incorporated in the OS.
  • The update image generation unit 111a acquires drawing commands from the graphics engine of the OS, generates by drawing processing a display image representing an image to be displayed on the display terminal 200, and sequentially outputs it to the image buffer 121 for storage. As a result, display images are sequentially held in the image buffer 121.
  • Hereinafter, an image already stored in the image buffer 121 is referred to as a display image, and an image newly generated by the update image generation unit 111a and stored in the image buffer 121 is referred to as an update image.
  • The update image generation unit 111a generates an update image to be displayed on the display terminal 200 according to an event generated by the operation of an application program.
  • The difference area detection unit 112 detects a difference area representing an area where pixel information does not match between the old and new display images sequentially held in the image buffer 121. That is, when notified by the update image generation unit 111a that an update image has been generated, the difference area detection unit 112 detects the difference area between the new image information (update image) and the image information buffered in the image buffer 121 (display image).
  • The difference area detection unit 112 detects, for example, the minimum rectangle containing the mismatching portion of the two pieces of image information as the difference area. Note that the difference area detection unit 112 may instead be configured to check for differences at predetermined time intervals.
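A minimal sketch of such difference detection (illustrative only; the patent does not prescribe an implementation) finds the smallest rectangle covering every mismatching pixel between two equal-size images held as 2-D lists:

```python
def diff_region(old, new):
    """Return (x, y, width, height) of the minimum rectangle containing
    all pixels that differ between two equal-size images, or None."""
    rows = [i for i, (a, b) in enumerate(zip(old, new)) if a != b]
    if not rows:
        return None  # no update occurred
    cols = [j for j in range(len(old[0]))
            if any(old[i][j] != new[i][j] for i in rows)]
    return (cols[0], rows[0], cols[-1] - cols[0] + 1, rows[-1] - rows[0] + 1)
```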
  • The compressed image generation unit 113 generates, for transmission, a compressed image obtained by compressing the image of the difference area detected by the difference area detection unit 112.
  • The compressed image may be generated using lossy compression such as JPEG (Joint Photographic Experts Group) or using a lossless compression method.
  • The compressed image generation unit 113 and the above-described difference area detection unit 112 are realized by an application program for screen transfer.
  • The region integration determination unit 114 determines, when a plurality of difference regions are detected, whether integrating them is more efficient for the display terminal 200. For example, the region integration determination unit 114 receives from the difference region detection unit 112 a notification of difference region information including the number of pixels of the image indicated by each difference region, and determines from the notified numbers of pixels and the condition stored in the condition storage unit 122 whether integrating the plurality of difference regions is more efficient.
  • For example, for a difference area with a vertical width of 320 pixels and a horizontal width of 240 pixels, the difference area detection unit 112 notifies the area integration determination unit 114 of 76,800 pixels, the product of the two, as the number of pixels of the image indicated by the difference area.
  • The region integration determination unit 114 uses the processing time calculation unit 115 to calculate the hardware processing time for a single region into which the plurality of difference regions are integrated (hereinafter, an integrated region) and the software processing time for each of the plurality of difference regions.
  • That is, the processing time calculation unit 115 calculates the hardware processing time for decompressing by hardware a compressed image obtained by compressing the image of the integrated region, and the software processing time for decompressing by software each compressed image obtained by compressing the images of the plurality of difference regions.
  • Specifically, the processing time calculation unit 115 calculates the software processing time by inputting the numbers of pixels of the plurality of difference region images notified by the difference region detection unit 112 into the first function stored in the condition storage unit 122. The processing time calculation unit 115 also calculates the hardware processing time by inputting into the second function the number of pixels of the integrated region image obtained by integrating the plurality of difference regions detected by the difference region detection unit 112.
  • The region integration determination unit 114 determines whether the calculated hardware processing time is smaller than the software processing time. If the hardware processing time is smaller, it determines that integrating the plurality of difference regions is more efficient and notifies the difference region detection unit 112 of the determination result. When the region integration determination unit 114 so determines, the compressed image generation unit 113 generates a compressed image by compressing the image of the integrated region obtained by integrating the plurality of difference regions.
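The determination above can be sketched as follows (a hypothetical illustration, where `sw_time` and `hw_time` stand in for the first and second functions). Summing the software function over the regions naturally charges the per-call overhead (the function's intercept) once per region:

```python
def should_integrate(region_pixel_counts, merged_pixel_count, sw_time, hw_time):
    """True if hardware-decompressing one merged region is predicted to be
    faster than software-decompressing each difference region separately."""
    total_sw = sum(sw_time(p) for p in region_pixel_counts)
    return hw_time(merged_pixel_count) < total_sw
```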
  • FIG. 3 is a diagram illustrating an example of the relationship between the number of pixels of an image and the expansion processing time.
  • The rectangular area 301 represents an integrated area obtained by integrating rectangular areas α and β into one.
  • The rectangular area 301 can be obtained, for example, as the minimum rectangular area containing rectangular areas α and β.
  • An area γ indicates the portion of the rectangular area 301 that is not actually updated.
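The integrated area 301 can be computed as a minimum enclosing rectangle; a small sketch, with rectangles as hypothetical (x, y, width, height) tuples:

```python
def min_enclosing_rect(r1, r2):
    """Smallest rectangle containing both input rectangles; any pixels it
    covers beyond r1 and r2 correspond to the non-updated area (gamma)."""
    x = min(r1[0], r2[0])
    y = min(r1[1], r2[1])
    right = max(r1[0] + r1[2], r2[0] + r2[2])
    bottom = max(r1[1] + r1[3], r2[1] + r2[3])
    return (x, y, right - x, bottom - y)
```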
  • FIG. 3 also shows a function (first function) representing the relationship between the decompression processing time by software decompression and the number of pixels, and a function (second function) representing the relationship between the decompression processing time by hardware decompression and the number of pixels, with the number of pixels of the image on the horizontal axis and the decompression processing time of the compressed image on the display terminal 200 on the vertical axis.
  • For software decompression, a function expression representing the processing time when one rectangular area is processed is shown as a solid line (SW (1 rectangle)), and a function expression representing the total processing time, from start to end, when two rectangular areas are processed separately is shown as a dotted line (SW (2 rectangles)).
  • The decompression processing times corresponding to the horizontal-axis values “α” and “β” on the SW (1 rectangle) function expression are the decompression processing times for the respective rectangular areas.
  • Because each processing run incurs an overhead corresponding to the function's intercept, the decompression processing time corresponding to the horizontal-axis value “α + β” on the SW (2 rectangles) function expression of FIG. 3 is the total processing time when the two rectangular areas are processed separately by software.
  • The main device 100 can integrate the rectangular areas α and β into one rectangular area 301 in the display screen by adding the non-updated area γ.
  • The main device 100 can calculate the processing time when the rectangular area 301 is decompressed by hardware using the HW (1 rectangle) function expression of FIG. 3.
  • In the example of FIG. 3, the decompression processing time corresponding to the horizontal-axis value “α + β + γ” on the HW (1 rectangle) function expression is shorter than the decompression processing time corresponding to “α + β” on the SW (2 rectangles) function expression.
  • In this case, the region integration determination unit 114 can determine that cutting out from the display screen one rectangular region containing the two rectangular regions and generating its compressed image is more efficient for the display terminal 200 than generating compressed images of the plurality of rectangular regions separately.
  • In this case, the difference area detection unit 112 cuts out one rectangular area containing the plurality of rectangular areas and notifies the compressed image generation unit 113 of it as the target area for compressed image generation.
  • That is, when a plurality of difference areas are detected, the total processing time when each area is decompressed separately by software is calculated from a function expression whose intercept includes an offset for the number of detected difference areas. The calculated total processing time is then compared with the processing time when the areas are integrated into one rectangular area and decompressed by hardware; if the hardware decompression time is shorter, one rectangular area containing the plurality of rectangular areas is cut out from the display screen and a compressed image is generated.
  • The message analysis unit 116 analyzes messages received from the display terminal 200. For example, when time information including the time required for decompression of a compressed image is received from the display terminal 200, the message analysis unit 116 derives the first function and the second function from the time information and stores them in the condition storage unit 122. In this way, in the present embodiment, the time actually required for decompression is acquired from the display terminal 200, and the functions for calculating the processing times can be obtained dynamically from the acquired times. Alternatively, the functions may be obtained in advance and stored in the condition storage unit 122 instead of being derived by the message analysis unit 116.
  • The communication processing unit 117 transmits and receives messages to and from external devices such as the display terminal 200.
  • The communication processing unit 117 includes a transmission unit 117a that transmits messages and a reception unit 117b that receives messages.
  • The transmission unit 117a transmits a transmission image message including the compressed image generated by the compressed image generation unit 113 to the display terminal 200.
  • The transmission image message includes the compressed image of the rectangular area to be updated and rectangular information.
  • The compressed image is compressed still-image information such as JPEG.
  • The rectangular information indicates the drawing position of the image of the rectangular area to be updated.
  • For example, the rectangular information indicates the image drawing position with a structure such as (horizontal start coordinate, vertical start coordinate, horizontal width in pixels, vertical width in pixels).
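The patent names the fields of the transmission image message but not a byte-level wire format; the following struct layout (four 16-bit fields followed by the compressed payload) is purely an illustrative assumption:

```python
import struct

HEADER = struct.Struct("!4H")  # x-start, y-start, width, height (big-endian)

def pack_image_message(x, y, width, height, compressed: bytes) -> bytes:
    """Prepend the rectangular information to the compressed image."""
    return HEADER.pack(x, y, width, height) + compressed

def unpack_image_message(message: bytes):
    """Recover the rectangular information and the compressed image."""
    rect = HEADER.unpack_from(message)
    return rect, message[HEADER.size:]
```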
  • The transmission unit 117a transmits messages via the wireless base station 300, addressed to the display terminal 200 specified by the session manager 118 described later.
  • The receiving unit 117b receives messages from the display terminal 200 and passes them to the message analysis unit 116. For example, the receiving unit 117b receives from the display terminal 200 time information including the time required for decompression of a compressed image.
  • The session manager 118 manages communication sessions established with the display terminal 200. For example, when the session manager 118 establishes a session with a display terminal 200, it generates session information associating the user identification information of that terminal's user, session state information, transmission control information, and the like, and stores it in the session information storage unit 123. The session manager 118 then refers to the session information to specify the display terminal 200 that is the message communication destination, and the communication processing unit 117 transmits and receives messages with the specified display terminal 200 as the communication destination.
  • FIG. 4 is a block diagram of the display terminal 200 according to the present embodiment.
  • The display terminal 200 includes a display 201, an input device 202, an antenna 203, an image buffer 221, a generation time storage unit 222, a session information storage unit 223, an input/output interface 211, a software decompression unit 212, a hardware decompression circuit 213, an image generation unit 214, a generation time measurement unit 215, a message generation unit 216, a wireless communication processing unit 217, and a session manager 218.
  • The display 201 is a display device realized by an LCD or the like.
  • The input device 202 is realized by a digitizer, a touch screen, or the like that moves a cursor displayed on the screen of the display 201.
  • The input information acquired by the input device 202 is passed to the input/output interface 211 described later.
  • The antenna 203 transmits and receives radio waves for wireless communication with an external device such as the main device 100.
  • The image buffer 221 is a storage unit that stores images. The generation time storage unit 222 stores the generation time (decompression processing time) of decompressed images measured by the generation time measurement unit 215 described later. Specifically, the generation time storage unit 222 stores, in association with one another, the method used for image decompression (hardware decompression or software decompression), the number of pixels of the image (the product of the vertical and horizontal widths of the compressed image), and the time actually required for decompression (decompression processing time).
  • The session information storage unit 223 stores session information representing information about the main device 100 with which a session is established.
  • Specifically, the session information storage unit 223 stores session information including session state information and transmission control information.
  • The image buffer 221, the generation time storage unit 222, and the session information storage unit 223 can be configured from any commonly used storage medium such as an HDD, an optical disk, a memory card, or a RAM.
  • The input/output interface 211 is an input/output interface for the display 201 and the input device 202, and is realized by an application program such as a GUI (Graphical User Interface).
  • The input/output interface 211 acquires image information from the image buffer 221 and displays it on the display 201. In addition to writing image information for the GUI generated independently on the display terminal 200, the input/output interface 211 also writes to the image buffer 221 the image information sent from the main device 100 and obtained via the image generation unit 214.
  • The software decompression unit 212 executes decompression of compressed images received from the main device 100 by software, in accordance with instructions from the image generation unit 214 described later.
  • The hardware decompression circuit 213 executes decompression of compressed images received from the main device 100 by hardware, in accordance with instructions from the image generation unit 214.
  • The image generation unit 214 decompresses a compressed image received from the main device 100 using the software decompression unit 212 or the hardware decompression circuit 213, and then writes the decompressed image information to the designated drawing position in the image buffer 221 for drawing. That is, the image generation unit 214 displays, at the designated position on the display 201, the partial image generated by decompressing the compressed image transmitted from the main device 100 and received by the wireless communication processing unit 217.
  • The image generation unit 214 switches between software decompression (decompression by the software decompression unit 212) and hardware decompression (decompression by the hardware decompression circuit 213) by comparing the number of pixels of the compressed image with a predetermined threshold. Note that the image generation unit 214 can obtain the number of pixels of the compressed image from the rectangular information in the transmission image message or from the pixel count (resolution) information in the header of the compressed image.
  • For example, suppose the threshold is 10,000 pixels, corresponding to a rectangular area 100 pixels high by 100 pixels wide.
  • If the received compressed image is 76,800 pixels (320 pixels high by 240 pixels wide), the number of pixels exceeds the threshold, so the image generation unit 214 decompresses the compressed image by hardware using the hardware decompression circuit 213.
  • If the received compressed image is 1,024 pixels (32 pixels high by 32 pixels wide), the number of pixels is smaller than the threshold, so the image generation unit 214 decompresses the compressed image by software using the software decompression unit 212.
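The switching rule illustrated by these examples can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and the 100 × 100 = 10,000-pixel threshold are taken from the example above.

```python
# Hypothetical sketch of the image generation unit 214's switching rule.
# The 10,000-pixel threshold follows the 100 x 100 example above.

PIXEL_THRESHOLD = 100 * 100  # 10,000 pixels

def choose_decompressor(height: int, width: int) -> str:
    """Return which decompression path handles an h x w compressed image."""
    num_pixels = height * width
    if num_pixels < PIXEL_THRESHOLD:
        return "software"  # software decompression unit 212
    return "hardware"      # hardware decompression circuit 213

print(choose_decompressor(320, 240))  # 76,800 pixels -> hardware
print(choose_decompressor(32, 32))    # 1,024 pixels -> software
```

An image exactly at the threshold goes to hardware here, matching the "equal to or greater than the threshold" branch of the display-side flowchart (step S902: NO).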
  • FIG. 5 is a diagram illustrating an example of the relationship between the number of pixels of an image and the expansion processing time.
  • This figure plots functions with the resolution (number of pixels) on the horizontal axis and the decompression processing time on the vertical axis.
  • SW represents the relationship between the number of pixels and the expansion process time when the expansion process is performed by software.
  • HW represents the relationship between the number of pixels and the expansion processing time when the expansion processing is performed by hardware.
  • For the portion of the processing in which the compressed image is actually decompressed, the processing time is shorter in hardware than in software.
  • Accordingly, the slope of the function relating processing time to the number of pixels is smaller for hardware than for software. It is therefore more efficient to execute decompression in software when the number of pixels of the image is small, and in hardware when the number of pixels of the image is large.
  • It is therefore desirable for the image generation unit 214 to set the pixel count corresponding to the intersection of the two functions as the threshold for switching between software decompression and hardware decompression.
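Because both processing times are modeled as linear functions of the pixel count, that switching threshold is simply the point where the two lines cross. A sketch of the calculation, with slope and intercept values chosen purely for illustration (the patent does not give concrete numbers):

```python
def crossover_pixels(sw_slope, sw_intercept, hw_slope, hw_intercept):
    """Pixel count at which software and hardware decompression take equal time.

    time_sw(n) = sw_slope * n + sw_intercept
    time_hw(n) = hw_slope * n + hw_intercept
    Software is faster below this count; hardware is faster above it.
    """
    if sw_slope <= hw_slope:
        raise ValueError("software slope must exceed hardware slope")
    return (hw_intercept - sw_intercept) / (sw_slope - hw_slope)

# Illustrative costs in microseconds: software takes 2 us per pixel with no
# setup cost; hardware takes 1 us per pixel plus a 5000 us fixed overhead
# (initialization, completion interrupt, and so on).
threshold = crossover_pixels(sw_slope=2, sw_intercept=0,
                             hw_slope=1, hw_intercept=5000)
print(threshold)  # 5000.0 pixels
```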
  • The information for deriving functions such as those shown in the figure can be obtained by collecting statistics (the type of decompression, the number of pixels of the decompressed image, the decompression processing time, and so on) while the display terminal 200 actually decompresses compressed images, and storing them in the generation time storage unit 222. Alternatively, predetermined statistical information may be stored in advance in the generation time storage unit 222 of the display terminal 200 or in the condition storage unit 122 of the main device 100.
  • The generation time measurement unit 215 measures the decompression processing time taken by the image generation unit 214 and stores it in the generation time storage unit 222. Specifically, the generation time measurement unit 215 associates the method the image generation unit 214 used to decompress the compressed image (the decompression method), the number of pixels of the decompressed image, and the decompression processing time, and stores them in the generation time storage unit 222.
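From the accumulated (decompression method, pixel count, processing time) records, the linear functions of FIG. 5 can be derived with an ordinary least-squares fit. The patent does not prescribe a fitting method, so the following is one possible sketch:

```python
def fit_line(samples):
    """Least-squares fit of time = slope * pixels + intercept.

    samples: list of (num_pixels, processing_time) measurements recorded
    for one decompression method (software or hardware).
    """
    n = len(samples)
    sx = sum(p for p, _ in samples)
    sy = sum(t for _, t in samples)
    sxx = sum(p * p for p, _ in samples)
    sxy = sum(p * t for p, t in samples)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Synthetic, perfectly linear measurements: time_us = 2 * pixels + 5000.
hw_samples = [(1000, 7000), (10000, 25000), (76800, 158600)]
slope, intercept = fit_line(hw_samples)
print(slope, intercept)  # 2.0 5000.0
```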
  • the message generation unit 216 generates a message to be transmitted to an external device such as the main device 100.
  • For example, the message generation unit 216 generates, from the information stored in the generation time storage unit 222, a transmission message containing time information that associates the decompression method, the number of pixels of the decompressed image, and the decompression processing time.
  • FIG. 6 is a diagram illustrating an example of a transmission message for transmitting time information.
  • FIG. 6 shows a configuration example of a transmission message including the decompression processing time of the compressed image actually measured by the display terminal 200.
  • the time information transmission message includes an IP (Internet Protocol) header, a UDP / TCP header, a message type, an expansion method, area information, and an expansion processing time.
  • the IP header is used to identify the destination address of the transmission / reception terminal.
  • the UDP / TCP header indicates transmission control protocol information.
  • the message type is used for identifying that the message is a message including time information generated by the display terminal 200.
  • the expansion method is information for identifying whether the expansion processing time indicates the software processing time or the hardware processing time.
  • the area information includes the number of pixels of the expanded image.
  • The decompression processing time field describes the measured processing time of the decompression, for example in units of milliseconds. Note that N sets of the fields from decompression method through decompression processing time are appended, one for each measurement made by the display terminal 200.
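As one concrete (hypothetical) serialization of the payload of FIG. 6 — message type followed by N repetitions of decompression method, area information, and decompression time — the fields might be packed as below. The field widths, codes, and byte order are illustrative assumptions; the patent specifies only which fields the message contains.

```python
import struct

MSG_TIME_INFO = 1            # assumed code for the "time information" message type
METHOD_SW, METHOD_HW = 0, 1  # assumed decompression-method codes

def pack_time_info(entries):
    """Serialize N (method, num_pixels, time_ms) measurement entries.

    Assumed layout (big-endian): 1-byte message type, 1-byte entry count,
    then per entry a 1-byte method, 4-byte pixel count, 4-byte time in ms.
    The IP and UDP/TCP headers of FIG. 6 are added by the network stack.
    """
    msg = struct.pack("!BB", MSG_TIME_INFO, len(entries))
    for method, num_pixels, time_ms in entries:
        msg += struct.pack("!BII", method, num_pixels, time_ms)
    return msg

payload = pack_time_info([(METHOD_SW, 1024, 3), (METHOD_HW, 76800, 82)])
print(len(payload))  # 2 header bytes + 2 entries x 9 bytes = 20
```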
  • When the message analysis unit 116 of the main device 100 receives a transmission message containing time information as shown in FIG. 6, it derives, from the time information obtained by analyzing the message, the functions expressing the relationship between the number of pixels and the decompression processing time. That is, when the message format shown in FIG. 6 is used, it is the main device 100, not the display terminal 200, that derives the functions.
  • the display terminal 200 can also be configured to obtain a function expression from the time information and transmit the function information representing the obtained function expression to the main device 100.
  • FIG. 7 is a diagram illustrating an example of a transmission message for transmitting function information from the display terminal 200 to the main device 100 when configured in this manner.
  • FIG. 7 shows a configuration example of a transmission message containing slope information and intercept information for deriving a first function expressing the relationship between the number of pixels and the software processing time, and a second function expressing the relationship between the number of pixels and the hardware processing time.
  • the expansion method is information for identifying whether the expansion processing time indicates the software processing time or the hardware processing time.
  • the inclination information indicates the inclination of a functional expression indicating the relationship between the number of pixels and the expansion processing time as shown in FIG.
  • the intercept information indicates the intercept of the functional expression. As shown in FIG. 7, two pieces of information corresponding to software processing and hardware processing are added to the information from the decompression method to the intercept information.
  • The message generation unit 216 transmits the generated time information (or function information) to the main device 100 via the wireless communication processing unit 217, for example when a communication session with the main device 100 is started, or at regular time intervals.
  • the radio communication processing unit 217 transmits and receives signals to and from the radio base station 300 via the antenna 203.
  • the wireless communication processing unit 217 includes a transmission unit 217a that transmits a message and a reception unit 217b that receives the message.
  • the wireless communication processing unit 217 is realized by a wireless LAN function compliant with IEEE 802.11 or the like.
  • the reception unit 217b of the wireless communication processing unit 217 demodulates the received wireless signal to generate a packet, and passes the corresponding data to the image generation unit 214 according to the message type of the packet. For example, when the packet is a packet of a transmission image message including a compressed image, information such as the compressed image and the number of pixels extracted from the packet is passed to the image generation unit 214.
  • When input information is entered on the display terminal 200, the input / output interface 211 analyzes the coordinate information from the input information and then transmits the input information to the main device 100 via the transmission unit 217a of the wireless communication processing unit 217. In this case, the main device 100 executes application processing based on the input information received from the display terminal 200. When the application processing results in an update to a screen area, the main device 100 acquires the image information to be updated, generates a transmission image message, and transmits the message to the display terminal 200.
  • The transmission image message is generated after it is determined whether the difference regions should be integrated.
  • the session manager 218 manages communication (session) established with the main device 100. For example, when the session manager 218 establishes a session with the main device 100, the session manager 218 generates session information in which session state information, transmission control information, and the like are associated with each other, and stores the session information in the session information storage unit 223.
  • FIG. 8 is a flowchart showing the overall flow of the image transmission process in the present embodiment.
  • The difference area detection unit 112 compares, at predetermined time intervals, the image information (display image) stored in the image buffer 121 with the new image information (update image) generated by the update image generation unit 111a, and detects difference areas (step S801).
  • the area integration determination unit 114 determines whether or not a plurality of difference areas are detected (step S802).
  • If a plurality of difference areas are detected (step S802: YES), the processing time calculation unit 115 calculates the decompression processing time (software processing time) for the case where each of the plurality of difference areas is decompressed by software (step S803). For example, if two difference areas are detected, the processing time calculation unit 115 calculates the software processing time corresponding to the number of pixels of the images of the two difference areas by using the SW (2 rectangles) function formula shown in the figure.
  • Next, the processing time calculation unit 115 calculates the decompression processing time (hardware processing time) for the case where one integrated region obtained by integrating the plurality of difference regions is decompressed by hardware (step S804). For example, the processing time calculation unit 115 calculates the hardware processing time corresponding to the number of pixels of the image of the integrated region by using the HW (1 rectangle) function formula shown in the figure.
  • The region integration determination unit 114 then determines whether the calculated hardware processing time is smaller than the calculated software processing time (step S805). If the hardware processing time is smaller than the software processing time (step S805: YES), the difference area detection unit 112 notifies the compressed image generation unit 113 of the integrated area obtained by integrating the plurality of difference areas as the target area for generating a compressed image (step S806).
  • The compressed image generation unit 113 generates a compressed image by compressing the image of the integrated area (step S807). If a plurality of difference areas are not detected in step S802 (step S802: NO), the compressed image generation unit 113 generates a compressed image by compressing the image of the single detected difference area. If it is determined in step S805 that the hardware processing time is not shorter than the software processing time (step S805: NO), the compressed image generation unit 113 compresses the images of the detected difference areas individually to generate a plurality of compressed images.
  • the transmission unit 117a transmits a transmission image message including the compressed image generated by the compressed image generation unit 113 and the number of pixels of the compressed image to the display terminal 200 (step S808), and the image transmission process is terminated.
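Steps S802 to S806 boil down to comparing the total software time for the individual difference regions against the hardware time for one integrated (bounding) rectangle. A minimal sketch of that decision follows; the linear cost models are hypothetical stand-ins for the functions derived from the display terminal's measurements.

```python
def bounding_rect(rects):
    """Smallest rectangle covering all (x, y, w, h) difference regions."""
    x1 = min(x for x, _, _, _ in rects)
    y1 = min(y for _, y, _, _ in rects)
    x2 = max(x + w for x, _, w, _ in rects)
    y2 = max(y + h for _, y, _, h in rects)
    return (x1, y1, x2 - x1, y2 - y1)

def should_integrate(rects, sw_time, hw_time):
    """True if decompressing one integrated region in hardware beats
    decompressing each region separately in software (step S805)."""
    total_sw = sum(sw_time(w * h) for _, _, w, h in rects)
    _, _, w, h = bounding_rect(rects)
    return hw_time(w * h) < total_sw

# Hypothetical cost models in ms: software scales worse per pixel;
# hardware adds a fixed 5 ms overhead but scales better.
sw = lambda n: 0.002 * n
hw = lambda n: 5.0 + 0.001 * n

# Two adjacent 100x100 regions: integrating wins.
print(should_integrate([(0, 0, 100, 100), (100, 0, 100, 100)], sw, hw))  # True
# Two small 32x32 regions: separate software decompression wins.
print(should_integrate([(0, 0, 32, 32), (32, 32, 32, 32)], sw, hw))      # False
```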
  • FIG. 9 is a flowchart showing the overall flow of the image display processing in the present embodiment.
  • the receiving unit 217b of the wireless communication processing unit 217 receives a transmission image message including the compressed image and the number of pixels sent from the main device 100 (step S901).
  • the reception unit 217b extracts the compressed image and the number of pixels from the transmission image message, and notifies the image generation unit 214 of them.
  • the image generation unit 214 determines whether or not the notified number of pixels of the compressed image is smaller than a predetermined threshold (step S902).
  • If the number of pixels is smaller than the threshold (step S902: YES), the image generation unit 214 decompresses the compressed image by software using the software decompression unit 212 (step S903). If the number of pixels is equal to or greater than the threshold (step S902: NO), the image generation unit 214 decompresses the compressed image by hardware using the hardware decompression circuit 213 (step S904).
  • the image generation unit 214 draws the image information generated by the decompression process at the designated position of the image buffer 221 via the input / output interface 211 (step S905). As a result, the updated portion is reflected on the display 201.
  • When a communication session is started, the message generation unit 216 of the display terminal 200 transmits to the main device 100 time information including the decompression processing times of compressed images by software and by hardware (step S1001).
  • When receiving the time information from the display terminal 200, the message analysis unit 116 of the main device 100 obtains, from the received time information, the slope information and intercept information (function information) of the linear functions for calculating the decompression processing time, and updates the function information stored in the condition storage unit 122.
  • the difference area detection unit 112 of the main device 100 detects an update part (difference area) in the display screen according to the result of the application process (step S1002).
  • the region integration determination unit 114 performs integration determination of a plurality of difference regions when a plurality of difference regions are detected (step S1003).
  • the compressed image generation unit 113 performs generation of a compressed image and transmission processing of a transmission image message including the compressed image (step S1004).
  • a compressed image obtained by compressing the image in the difference area is transmitted to the display terminal 200 (step S1005).
  • The transmission of the time information may be performed not only at the start of the communication session but also at regular time intervals.
  • the main apparatus 100 can generate and transmit a compressed image having the optimum size for the display apparatus (display terminal 200).
  • This improves the efficiency of the processing on the display terminal 200 side, which is performed by switching between decompression by software and decompression by hardware.
  • FIG. 11 is an explanatory diagram illustrating a hardware configuration of the communication device and the display device according to the present embodiment.
  • The communication device and the display device according to the present embodiment each include a control device such as a CPU (Central Processing Unit) 51, storage devices such as a ROM (Read Only Memory) 52 and a RAM 53, a communication I / F 54 that communicates via a network, an external storage device such as an HDD (Hard Disk Drive) or a CD (Compact Disc) drive device, a display device such as a display, an input device such as a keyboard and a mouse, and a bus 61 that connects each part.
  • The image transmission program executed by the communication device according to the present embodiment and the communication program executed by the display device are provided as files in an installable or executable format, recorded on a computer-readable recording medium such as a CD-ROM (Compact Disk Read Only Memory), flexible disk (FD), CD-R (Compact Disk Recordable), or DVD (Digital Versatile Disk).
  • The image transmission program executed by the communication device according to the present embodiment and the communication program executed by the display device may instead be stored on a computer connected to a network such as the Internet and provided by being downloaded via the network.
  • the image transmission program executed by the communication apparatus according to the present embodiment and the communication program executed by the display apparatus may be provided or distributed via a network such as the Internet.
  • The image transmission program and the communication program according to the present embodiment may also be provided by being incorporated in a ROM or the like in advance.
  • The image transmission program executed by the communication device has a module configuration including the above-described units (event acquisition unit, difference region detection unit, compressed image generation unit, region integration determination unit, processing time calculation unit, message analysis unit, and communication processing unit). As actual hardware, the CPU 51 (processor) reads the image transmission program from the storage medium and executes it, whereby the above-described units are loaded onto and generated on the main storage device.
  • Likewise, the communication program executed by the display device has a module configuration including the above-described units (input / output interface, software decompression unit, hardware decompression circuit, image generation unit, generation time measurement unit, message generation unit, wireless communication processing unit, and session manager). As actual hardware, the CPU 51 (processor) reads the communication program from the storage medium and executes it, whereby each unit is loaded onto and generated on the main storage device.
  • The apparatus, method, and program according to the present invention are suitable for an apparatus that realizes a function of sharing an application screen between apparatuses.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Of Band Width Or Redundancy In Fax (AREA)
  • Facsimiles In General (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

Provided is a communication apparatus comprising: an updated image generation unit (111a) for generating an updated image in order to update a displayed image; a difference region detection unit (112) for detecting a difference region between an updated image and a displayed image; a processing time calculation unit (115) for calculating the software processing time when an image of a plurality of difference regions is expanded using software, and the hardware processing time when an image of an integrated region that includes the plurality of difference regions is expanded using hardware in the case where the plurality of difference regions are detected; a region integration determination unit (114) for determining whether the hardware processing time is shorter than the software processing time; a compressed image generation unit (113) for generating a compressed image in which the image of the integration region is compressed when the hardware processing time is shorter than the software processing time; and a transmission unit (117a) for transmitting the compressed image to a display apparatus.

Description

Communication apparatus and display apparatus
The present invention relates to a communication device and a display device that realize a function of sharing an application screen between devices.
For the purpose of improving usability, there exist computing systems in which a terminal device having a minimal input / output interface is placed on the user side and complicated arithmetic processing is executed on a communication device (main device) installed at a remote location. For example, Patent Document 1 proposes a technique related to a system that projects screen information of a main device (such as a personal computer or a server computer) onto a remote display device (display terminal) via a network.
In such a system, input information from the display terminal (such as pen input from a digitizer) is likewise transmitted to the main device via the network, and the actual application program processing is executed by the main device. The execution result and screen update information are then transferred to the display terminal via the network. The display terminal executes output processing (drawing processing) based on the received screen update information.
Meanwhile, VNC (Virtual Network Computing) is known as a technique for efficiently transmitting screen information from a main device on a remote network to a display terminal. In VNC, when a screen update is detected, the values of the read pixel information are compared with the values of the pixel information previously transmitted to the display terminal, and the updated screen area that has changed since the previous transmission is determined. The updated screen area is then compressed as a still image, and only the compressed difference information of the screen is transmitted to the display terminal. This suppresses consumption of the communication band. Accordingly, the amount of screen information to be transmitted increases when the screen change is large, such as when a window is moved, and conversely decreases when the screen change is small.
It is therefore conceivable to improve the responsiveness of update-screen display by having the display terminal perform the decompression of the compressed image received from the main device with dedicated hardware. However, hardware processing generally incurs overhead such as the time required for hardware initialization, the time required for decompressing the compressed image, and the waiting time until the interrupt-based notification of processing completion. For this reason, depending on the number of pixels of the image to be decompressed, there can be cases where processing in software takes less time than processing in hardware.
To address this problem, a method is conceivable in which the display terminal switches processing so that it performs hardware decompression when the number of pixels of the compressed image received from the main device is larger than a predetermined threshold, and performs software decompression otherwise.
As a method for selecting and executing one of a plurality of processes based on image information, Patent Document 2 proposes a technique for selecting either hardware or software to decompress compressed data based on additional (header) information of a still image. Patent Document 3 proposes a technique for selecting and encoding with either lossless or lossy compression based on image resolution information.
Patent Document 1: US Pat. No. 6,784,855. Patent Document 2: JP 2000-50263 A. Patent Document 3: JP 2008-42681 A.
However, with the above method of switching so that compressed images whose number of pixels exceeds the threshold are decompressed by hardware and compressed images whose number of pixels is at or below the threshold are decompressed by software, the image decompression processing on the display terminal side cannot always be made efficient.
For example, when there are multiple update regions whose compressed images each have fewer pixels than the threshold, decompressing in hardware a single compressed image corresponding to a rectangular region that integrates the multiple update regions may take less time than decompressing in software, multiple times, the compressed images corresponding to the individual update regions. With the above method, however, when there are multiple update regions whose compressed images have fewer pixels than the threshold, the only option is to execute software decompression multiple times, once for each of the corresponding compressed images.
The present invention has been made in view of the above, and an object thereof is to provide a communication device and a display device that can improve the efficiency of processing in a display device capable of decompressing, by software and by hardware, a compressed image transmitted from the communication device.
The present invention is a communication device connectable via a network to a display device capable of displaying an image, comprising: an image storage unit that stores a display image to be displayed on the display device; an update image generation unit that generates an update image for updating the display image; a detection unit that detects difference regions, each indicating a region where pixel information does not match between the update image and the display image; a calculation unit that, when a plurality of difference regions are detected, calculates a software processing time representing the processing time when decompression of each compressed image obtained by compressing the images of the plurality of difference regions is executed by software, and a hardware processing time representing the processing time when decompression of a compressed image obtained by compressing the image of an integrated region, which is one region containing the plurality of difference regions, is executed by hardware; a determination unit that determines whether the hardware processing time is smaller than the software processing time; a compressed image generation unit that generates a compressed image by compressing the image of the integrated region when the hardware processing time is determined to be smaller than the software processing time; and a transmission unit that transmits the generated compressed image to the display device.
The present invention is also a display device connected to a communication device via a network, comprising: a receiving unit that receives, from the communication device, a compressed image obtained by compressing the image of an updated region and the number of pixels of the image of the updated region; a decompression circuit capable of executing decompression of the compressed image; a software decompression unit capable of executing decompression of the compressed image by software; an image generation unit that compares the number of pixels with a predetermined threshold, decompresses the compressed image with the software decompression unit to generate an update image corresponding to the updated region when the number of pixels is smaller than the threshold, and decompresses the compressed image with the decompression circuit to generate a display image corresponding to the updated region when the number of pixels is equal to or greater than the threshold; a display unit that displays the display image; a measurement unit that measures a software processing time representing the processing time of decompression by the software decompression unit and a hardware processing time representing the processing time of decompression by the decompression circuit; an information generation unit that generates a first function for calculating the software processing time from the number of pixels, based on the number of pixels of images decompressed by the software decompression unit and the software processing time, and a second function for calculating the hardware processing time from the number of pixels, based on the number of pixels of images decompressed by the decompression circuit and the hardware processing time; and a transmission unit that transmits the first function and the second function to the communication device.
According to the present invention, it is possible to improve the efficiency of processing of a display device that can decompress, by software and by hardware, a compressed image transmitted from a communication device.
本実施の形態にかかる通信システムの構成を示すブロック図である。A block diagram showing the configuration of the communication system according to the present embodiment.
本実施の形態にかかる通信装置のブロック図である。A block diagram of the communication apparatus according to the present embodiment.
画像の画素数と伸張処理時間との関係の一例を示す図である。A diagram showing an example of the relationship between the number of pixels of an image and the decompression processing time.
本実施の形態にかかる表示装置のブロック図である。A block diagram of the display apparatus according to the present embodiment.
画像の画素数と伸張処理時間との関係の一例を示す図である。A diagram showing an example of the relationship between the number of pixels of an image and the decompression processing time.
時間情報を送信するための送信メッセージの一例を示す図である。A diagram showing an example of a transmission message for transmitting time information.
関数情報を送信するための送信メッセージの一例を示す図である。A diagram showing an example of a transmission message for transmitting function information.
本実施の形態における画像送信処理の全体の流れを示すフローチャートである。A flowchart showing the overall flow of the image transmission processing in the present embodiment.
本実施の形態における画像表示処理の全体の流れを示すフローチャートである。A flowchart showing the overall flow of the image display processing in the present embodiment.
表示装置と通信装置との間で通信される情報の流れを示すシーケンス図である。A sequence diagram showing the flow of information communicated between the display apparatus and the communication apparatus.
本実施の形態にかかる通信装置および表示装置のハードウェア構成を示す説明図である。An explanatory diagram showing the hardware configuration of the communication apparatus and the display apparatus according to the present embodiment.
 以下に添付図面を参照して、この発明にかかる装置、方法およびプログラムの最良な実施の形態を詳細に説明する。 DETAILED DESCRIPTION Exemplary embodiments of an apparatus, a method, and a program according to the present invention will be described below in detail with reference to the accompanying drawings.
 本実施の形態にかかる通信システムは、アプリケーションを実行する本体装置(通信装置)と、アプリケーションの実行により更新された画面を表示する表示端末(表示装置)とを備える。本体装置は、アプリケーションプログラムの動作により発生する画像情報の差分領域が複数存在することを検出した場合に、当該複数の差分領域を1つの矩形領域に統合することによって、表示端末で実行される画像伸張処理の処理時間が短縮されるかを判定する。そして処理時間が短縮されると判定した場合、本体装置は複数の差分領域を統合した矩形領域に対応する画像情報を圧縮して表示端末に送信する。これにより、表示端末での画像伸張処理を効率化することができる。 The communication system according to the present embodiment includes a main device (communication apparatus) that executes an application and a display terminal (display apparatus) that displays a screen updated by execution of the application. When the main device detects that there are a plurality of difference regions in the image information generated by the operation of the application program, it determines whether integrating the plurality of difference regions into one rectangular region would shorten the image decompression time on the display terminal. When it determines that the processing time would be shortened, the main device compresses the image information corresponding to the rectangular region into which the plurality of difference regions are integrated, and transmits it to the display terminal. This makes the image decompression processing on the display terminal more efficient.
 図1は、本実施の形態にかかる通信システムの構成を示すブロック図である。本実施の形態にかかる通信システムは、本体装置の画面に生じたイベントによる更新部分の画像を表示端末に伝送するシステムである。以下では、このような通信システムを画面転送システムと呼ぶ。 FIG. 1 is a block diagram showing a configuration of a communication system according to the present embodiment. The communication system according to the present embodiment is a system that transmits an image of an updated part due to an event occurring on the screen of the main device to a display terminal. Hereinafter, such a communication system is referred to as a screen transfer system.
 図1に示すように、本実施の形態にかかる画面転送システム10は、通信装置としての本体装置100と、本体装置100とネットワーク400を介して接続されたアクセスポイントとしての無線基地局300と、無線基地局300と無線LANにより無線通信する表示装置としての表示端末200とを備えている。 As shown in FIG. 1, the screen transfer system 10 according to the present embodiment includes a main device 100 as a communication apparatus, a radio base station 300 as an access point connected to the main device 100 via a network 400, and a display terminal 200 as a display apparatus that communicates wirelessly with the radio base station 300 over a wireless LAN.
 画面転送システム10は、本体装置100上で動作するアプリケーションソフトウェアの画面を、無線基地局300を介して表示端末200へ無線転送し、各表示端末200へ本体装置100のアプリケーション画面を表示する機能を有している。画面転送システム10では、本体装置100の側で更新された画面をリアルタイムに表示端末200へ転送するために、本体装置100の画面内で更新された部分の画像情報のみを転送する。すなわち、本体装置100は、画像情報を表示する表示端末200に対して、無線基地局300を介して画像情報を送信可能である。 The screen transfer system 10 has a function of wirelessly transferring the screen of application software running on the main device 100 to the display terminals 200 via the radio base station 300, so that each display terminal 200 displays the application screen of the main device 100. In the screen transfer system 10, in order to transfer the screen updated on the main device 100 side to the display terminal 200 in real time, only the image information of the updated portion of the screen of the main device 100 is transferred. That is, the main device 100 can transmit image information via the radio base station 300 to the display terminal 200 that displays the image information.
 表示端末200は、本体装置100から画像情報を受信し、受信した画像情報を伸張して画面内の対応する部分に表示する。 The display terminal 200 receives the image information from the main device 100, expands the received image information, and displays it on the corresponding part in the screen.
 無線基地局300は、IEEE802.11などの無線通信プロトコルに準拠した無線通信の基地局である。ネットワーク400は、例えば、IEEE802.3などの有線通信プロトコルに準拠したネットワークである。なお、ネットワーク形態はこれに限られず、他のプロトコルで接続するように構成してもよい。また、表示端末200と本体装置100とを有線ネットワークで接続するように構成してもよい。 The wireless base station 300 is a wireless communication base station that complies with a wireless communication protocol such as IEEE802.11. The network 400 is a network that conforms to a wired communication protocol such as IEEE802.3, for example. The network form is not limited to this, and it may be configured to connect with another protocol. Further, the display terminal 200 and the main device 100 may be connected via a wired network.
 次に、本体装置100の詳細な構成について図2を用いて説明する。図2は、本実施の形態にかかる本体装置100のブロック図である。同図に示すように、本体装置100は、ディスプレイ101と、入力デバイス102と、画像バッファ121と、条件記憶部122と、セッション情報記憶部123と、イベント取得部111と、差分領域検出部112と、圧縮画像生成部113と、領域統合判定部114と、処理時間算出部115と、メッセージ解析部116と、通信処理部117と、セッションマネージャ118と、を備えている。 Next, the detailed configuration of the main device 100 will be described with reference to FIG. FIG. 2 is a block diagram of main device 100 according to the present embodiment. As shown in the figure, the main apparatus 100 includes a display 101, an input device 102, an image buffer 121, a condition storage unit 122, a session information storage unit 123, an event acquisition unit 111, and a difference area detection unit 112. A compressed image generation unit 113, a region integration determination unit 114, a processing time calculation unit 115, a message analysis unit 116, a communication processing unit 117, and a session manager 118.
 ディスプレイ101は、LCD(Liquid Crystal Display)などで実現される表示装置である。入力デバイス102は、ディスプレイ101の画面に表示されたカーソルを移動操作するマウスなどで実現される。この他、入力デバイス102としては、キーボード、トラックボールなどを用いてもよい。 The display 101 is a display device realized by an LCD (Liquid Crystal Display) or the like. The input device 102 is realized by a mouse or the like that moves the cursor displayed on the screen of the display 101. In addition, as the input device 102, a keyboard, a trackball, or the like may be used.
 画像バッファ121は、画像を記憶する記憶部である。条件記憶部122は、画像の画素数から、圧縮画像を表示端末200でソフトウェアで伸張処理した場合の処理時間(以下、ソフトウェア処理時間という)と、圧縮画像を表示端末200でハードウェアで伸張処理した場合の処理時間(以下、ハードウェア処理時間という)を算出するための所定の条件を記憶する。例えば、条件記憶部122は、画素数を入力としてソフトウェア処理時間を出力する関数(第1関数)を導出するための関数情報、および、画素数を入力としてハードウェア処理時間を出力する関数(第2関数)を導出するための関数情報を条件として記憶する。 The image buffer 121 is a storage unit that stores images. The condition storage unit 122 stores a predetermined condition for calculating, from the number of pixels of an image, the processing time required when the display terminal 200 decompresses the compressed image by software (hereinafter, software processing time) and the processing time required when the display terminal 200 decompresses the compressed image by hardware (hereinafter, hardware processing time). For example, the condition storage unit 122 stores, as the condition, function information for deriving a function (first function) that takes a pixel count as input and outputs the software processing time, and function information for deriving a function (second function) that takes a pixel count as input and outputs the hardware processing time.
 例えば、画素数とソフトウェア処理時間またはハードウェア処理時間との関係が一次関数で表される場合は、一次関数の傾き情報および切片情報が関数を導出するための関数情報に相当する。なお、関数は一次関数に限られるものではない。 For example, when the relationship between the number of pixels and software processing time or hardware processing time is expressed by a linear function, the slope information and intercept information of the linear function correspond to function information for deriving the function. The function is not limited to a linear function.
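As a concrete illustration of the linear model above, the following sketch builds the first and second functions from slope and intercept information. The coefficient values are hypothetical assumptions for illustration; the patent does not give concrete numbers.

```python
# Sketch of the linear time model described above: processing time is
# slope * pixels + intercept. Coefficient values are illustrative
# assumptions, not taken from the patent.

def make_time_function(slope_us_per_pixel, intercept_us):
    """Return a function mapping a pixel count to a processing time (us)."""
    return lambda pixels: slope_us_per_pixel * pixels + intercept_us

# Hypothetical coefficients: software decompression has a small per-call
# overhead (intercept) but a steep per-pixel cost, while the hardware
# circuit has a large fixed overhead but a shallow per-pixel cost.
sw_time = make_time_function(slope_us_per_pixel=0.5, intercept_us=100.0)   # first function
hw_time = make_time_function(slope_us_per_pixel=0.1, intercept_us=2000.0)  # second function

pixels = 320 * 240  # 76800 pixels
print(sw_time(pixels))  # 38500.0
print(hw_time(pixels))  # about 9680.0
```

With these (hypothetical) coefficients the crossover is visible: small images favor the software path, large ones the hardware path, which is exactly the trade-off the threshold comparison in the claims exploits.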
 セッション情報記憶部123は、本体装置100とセッションを確立中の表示端末200に関する情報を表すセッション情報を記憶する。例えば、セッション情報記憶部123は、ユーザを識別するユーザ識別情報などと、セッションが利用中か否かを示す状態情報、および伝送制御がTCP(Transmission Control Protocol)であるかUDP(User Datagram Protocol)であるかを表す伝送制御情報などの情報とを対応づけたセッション情報を記憶する。 The session information storage unit 123 stores session information representing information on the display terminals 200 that have established a session with the main device 100. For example, the session information storage unit 123 stores session information that associates user identification information identifying a user with information such as status information indicating whether the session is in use and transmission control information indicating whether transmission control uses TCP (Transmission Control Protocol) or UDP (User Datagram Protocol).
 1台の本体装置100に対して、複数の表示端末200がセッションを確立している状態では、セッション情報記憶部123は、本体装置100から送信するメッセージの宛先とする表示端末200を識別する端末識別情報を含むセッション情報を記憶する。 In a state where a plurality of display terminals 200 have established sessions with one main device 100, the session information storage unit 123 stores session information including terminal identification information that identifies the display terminal 200 to which a message transmitted from the main device 100 is addressed.
 なお、画像バッファ121、条件記憶部122およびセッション情報記憶部123は、HDD(Hard Disk Drive)、光ディスク、メモリカード、RAM(Random Access Memory)などの一般的に利用されているあらゆる記憶媒体により構成することができる。 The image buffer 121, the condition storage unit 122, and the session information storage unit 123 can each be configured by any commonly used storage medium such as an HDD (Hard Disk Drive), an optical disk, a memory card, or a RAM (Random Access Memory).
 イベント取得部111は、アプリケーションプログラム等の動作により発生するイベントを取得する。例えば、イベント取得部111は、コンピュータを統括制御するオペレーティングシステム(OS)と、このOSに組み込まれたディスプレイドライバと同等の機能を有する仮想ディスプレイドライバと、OS上で動作するアプリケーションソフトウェアなどのアプリケーションプログラムで実現される。 The event acquisition unit 111 acquires events generated by the operation of an application program or the like. For example, the event acquisition unit 111 is realized by an operating system (OS) that controls the computer as a whole, a virtual display driver having functions equivalent to the display driver incorporated in the OS, and application programs such as application software running on the OS.
 イベント取得部111は、アプリケーションソフトウェアにより画面が更新される場合や、マウス操作などでカーソルが移動操作されて画面内の任意の領域の画像が更新される場合に、画面(画像)が更新されたことをイベントとして取得する。 The event acquisition unit 111 acquires, as an event, the fact that the screen (image) has been updated, for example when the screen is updated by application software or when the cursor is moved by a mouse operation and the image of an arbitrary region of the screen is updated.
 イベント取得部111は、詳細な構成として更新画像生成部111aを備えている。更新画像生成部111aは、例えばOSに組み込まれた仮想ディスプレイドライバとして実現できる。更新画像生成部111aは、画面更新のイベントが取得された場合、OSのグラフィックエンジンから描画命令を取得し、描画処理を行うことで表示端末200に表示させる画像を表す表示画像を生成して画像バッファ121へ順次出力して記憶する。これにより、画像バッファ121には、表示画像が順次保持される。 The event acquisition unit 111 includes an updated image generation unit 111a as a detailed configuration. The updated image generation unit 111a can be realized as, for example, a virtual display driver incorporated in the OS. When a screen update event is acquired, the update image generation unit 111a acquires a drawing command from the graphic engine of the OS, and generates a display image representing an image to be displayed on the display terminal 200 by performing a drawing process. The data is sequentially output to the buffer 121 and stored. As a result, display images are sequentially held in the image buffer 121.
 なお、以下では、画像バッファ121に保持されている画像を表示画像といい、更新画像生成部111aにより新たに生成され、画像バッファ121に保存する前の画像を更新画像という。このように、更新画像生成部111aは、アプリケーションプログラムの動作により発生するイベントに従って表示端末200に表示させる更新画像を生成する。 In the following description, an image stored in the image buffer 121 is referred to as a display image, and an image that is newly generated by the update image generation unit 111a and is stored in the image buffer 121 is referred to as an update image. As described above, the update image generation unit 111a generates an update image to be displayed on the display terminal 200 according to an event generated by the operation of the application program.
 差分領域検出部112は、画像バッファ121に順次保持される新旧の表示画像で画素情報が一致しない領域を表す差分領域を検出する。すなわち、差分領域検出部112は、更新画像生成部111aにより更新画像が生成されたことを通知された場合に、生成された新たな画像情報(更新画像)と画像バッファ121にバッファリングされた画像情報(表示画像)との間の差分領域を検出する。差分領域検出部112は、例えば、2つの画像情報間で一致しない部分を含む最小の矩形を、差分領域として検出する。なお、差分領域検出部112が、所定の時間間隔毎に差分の有無を確認するように構成してもよい。 The difference region detection unit 112 detects a difference region, i.e., a region where the pixel information does not match between the old and new display images sequentially held in the image buffer 121. That is, when notified by the update image generation unit 111a that an updated image has been generated, the difference region detection unit 112 detects the difference region between the newly generated image information (updated image) and the image information buffered in the image buffer 121 (display image). For example, the difference region detection unit 112 detects, as the difference region, the minimum rectangle containing the portions that do not match between the two pieces of image information. Alternatively, the difference region detection unit 112 may be configured to check for differences at predetermined time intervals.
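The detection step above can be sketched as follows. Frames are modeled as 2-D lists of pixel values purely for illustration; the actual frame representation inside the virtual display driver is not specified by the patent.

```python
# Sketch: find the minimal rectangle containing every pixel that differs
# between the old display image and the new updated image. The 2-D list
# frame representation is an illustrative assumption.

def detect_diff_rect(old, new):
    """Return (x, y, width, height) of the minimal difference rectangle,
    or None when the two frames are identical."""
    xs, ys = [], []
    for y, (row_old, row_new) in enumerate(zip(old, new)):
        for x, (p_old, p_new) in enumerate(zip(row_old, row_new)):
            if p_old != p_new:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # no update occurred
    return (min(xs), min(ys), max(xs) - min(xs) + 1, max(ys) - min(ys) + 1)

old = [[0] * 6 for _ in range(4)]
new = [row[:] for row in old]
new[1][2] = 1  # two updated pixels ...
new[2][4] = 1  # ... at (2, 1) and (4, 2)
print(detect_diff_rect(old, new))  # (2, 1, 3, 2)
```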
 圧縮画像生成部113は、差分領域検出部112により検出された差分領域の画像を送信用に圧縮処理した圧縮画像を生成する。圧縮画像は、JPEG(Joint Photographic Experts Group)のような不可逆圧縮を用いて生成してもよいし、可逆圧縮方式を用いて生成してもよい。 The compressed image generation unit 113 generates a compressed image obtained by compressing the difference region image detected by the difference region detection unit 112 for transmission. The compressed image may be generated using irreversible compression such as JPEG (Joint Photographic Experts Group) or may be generated using a lossless compression method.
 なお、圧縮画像生成部113および上述の差分領域検出部112は、画面転送用のアプリケーションプログラムなどにより実現される。 Note that the compressed image generation unit 113 and the above-described difference area detection unit 112 are realized by an application program for screen transfer.
 領域統合判定部114は、複数の差分領域が検出された場合に、当該複数の差分領域を統合する方が表示端末200にとって効率的となるか否かを判定する。例えば、領域統合判定部114は、差分領域検出部112から差分領域によって示される画像の画素数を含む差分領域情報の通知を受け、通知された画素数と条件記憶部122に記憶された条件とから、複数の差分領域を統合する方が効率的か否かを判定する。 When a plurality of difference regions are detected, the region integration determination unit 114 determines whether integrating the plurality of difference regions is more efficient for the display terminal 200. For example, the region integration determination unit 114 receives, from the difference region detection unit 112, a notification of difference region information including the number of pixels of the image indicated by each difference region, and determines from the notified pixel counts and the condition stored in the condition storage unit 122 whether integrating the plurality of difference regions is more efficient.
 差分領域によって示される画像の画素数としては、例えば、差分領域の画像の縦幅と横幅とを掛け合わせた値を用いる。例えば、縦幅320画素、横幅240画素の矩形領域を差分領域として検出した場合、差分領域検出部112は、双方を掛け合わせた76800画素を差分領域によって示される画像の画素数として領域統合判定部114に通知する。 As the number of pixels of the image indicated by a difference region, for example, the product of the height and width of the image of the difference region is used. For example, when a rectangular region 320 pixels high and 240 pixels wide is detected as a difference region, the difference region detection unit 112 notifies the region integration determination unit 114 of 76800 pixels, the product of the two, as the number of pixels of the image indicated by the difference region.
 以下に、領域統合判定部114による判定処理の詳細について説明する。領域統合判定部114は、まず、処理時間算出部115を用いて複数の差分領域を1つに統合した領域(以下、統合領域という)に対するハードウェア処理時間と、複数の差分領域それぞれに対するソフトウェア処理時間とを算出する。 Details of the determination processing by the region integration determination unit 114 are described below. The region integration determination unit 114 first uses the processing time calculation unit 115 to calculate the hardware processing time for the region obtained by integrating the plurality of difference regions into one (hereinafter, integrated region) and the software processing time for each of the plurality of difference regions.
 すなわち、処理時間算出部115は、統合領域の画像を圧縮した圧縮画像をハードウェアにより伸張処理したときのハードウェア処理時間と、複数の差分領域の画像を圧縮した圧縮画像それぞれをソフトウェアにより伸張処理したときのソフトウェア処理時間とを算出する。 That is, the processing time calculation unit 115 calculates the hardware processing time required when the compressed image obtained by compressing the image of the integrated region is decompressed by hardware, and the software processing time required when each of the compressed images obtained by compressing the images of the plurality of difference regions is decompressed by software.
 具体的には、処理時間算出部115は、条件記憶部122に記憶した第1関数に、差分領域検出部112から通知された複数の差分領域の画像の画素数を入力し、ソフトウェア処理時間を算出する。また、処理時間算出部115は、条件記憶部122に記憶した第2関数に、差分領域検出部112が検出した複数の差分領域を統合した統合領域の画像の画素数を入力し、ハードウェア処理時間を算出する。 Specifically, the processing time calculation unit 115 calculates the software processing time by inputting the pixel counts of the images of the plurality of difference regions notified from the difference region detection unit 112 into the first function stored in the condition storage unit 122. The processing time calculation unit 115 also calculates the hardware processing time by inputting, into the second function stored in the condition storage unit 122, the pixel count of the image of the integrated region into which the plurality of difference regions detected by the difference region detection unit 112 are integrated.
 そして、領域統合判定部114は、算出したハードウェア処理時間がソフトウェア処理時間より小さいか否かを判定し、小さい場合に複数の差分領域を統合する方が効率的と判定して判定結果を差分領域検出部112に通知する。また、圧縮画像生成部113は、領域統合判定部114により、複数の差分領域を統合した方が効率的と判定された場合に、複数の差分領域を統合した統合領域の画像を圧縮した圧縮画像を生成する。 The region integration determination unit 114 then determines whether the calculated hardware processing time is shorter than the software processing time; if so, it determines that integrating the plurality of difference regions is more efficient and notifies the difference region detection unit 112 of the determination result. When the region integration determination unit 114 determines that integrating the plurality of difference regions is more efficient, the compressed image generation unit 113 generates a compressed image by compressing the image of the integrated region into which the plurality of difference regions are integrated.
 以下に、領域統合判定部114による判定処理の詳細について図3を用いてさらに説明する。図3は、画像の画素数と伸張処理時間との関係の一例を示す図である。 Hereinafter, details of the determination processing by the region integration determination unit 114 will be further described with reference to FIG. FIG. 3 is a diagram illustrating an example of the relationship between the number of pixels of an image and the expansion processing time.
 同図の右上には、本体装置100の表示画面で2つの矩形領域α、βが更新すべき差分領域として検出された状態が示されている。なお、矩形領域301は、矩形領域α、βを1つに統合した統合領域を表す。矩形領域301は、例えば矩形領域α、βを含む最小の矩形領域として求めることができる。また、領域γは、矩形領域301内で実際には更新が発生していない矩形領域を示している。 In the upper right of the figure, a state where two rectangular areas α and β are detected as difference areas to be updated on the display screen of the main device 100 is shown. The rectangular area 301 represents an integrated area obtained by integrating the rectangular areas α and β into one. The rectangular area 301 can be obtained as a minimum rectangular area including rectangular areas α and β, for example. An area γ indicates a rectangular area in the rectangular area 301 that is not actually updated.
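The integration of the rectangular regions α and β into the minimal enclosing rectangle 301, and the size of the non-updated filler region γ, can be sketched as follows. The (x, y, width, height) tuple convention and the example coordinates are assumptions made for illustration.

```python
# Sketch: merge two difference rectangles into the minimal rectangle that
# contains both (rectangle 301 in FIG. 3). Rectangles are (x, y, w, h)
# tuples; the concrete coordinates below are illustrative.

def merge_rects(a, b):
    x0 = min(a[0], b[0])
    y0 = min(a[1], b[1])
    x1 = max(a[0] + a[2], b[0] + b[2])
    y1 = max(a[1] + a[3], b[1] + b[3])
    return (x0, y0, x1 - x0, y1 - y0)

def pixel_count(r):
    return r[2] * r[3]

alpha = (0, 0, 100, 50)    # hypothetical difference rectangle α
beta = (150, 80, 60, 40)   # hypothetical difference rectangle β
merged = merge_rects(alpha, beta)
# γ: pixels of the merged rectangle in which no update actually occurred
# (α and β do not overlap here, so plain subtraction is exact)
gamma = pixel_count(merged) - pixel_count(alpha) - pixel_count(beta)
print(merged)  # (0, 0, 210, 120)
print(gamma)   # 17800
```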
 また、同図では、横軸を画像の画素数、縦軸を表示端末200での圧縮画像の伸張処理時間とした場合の、ソフトウェア伸張処理による伸張処理時間と画素数との関係を表す関数(第1関数)と、ハードウェア伸張処理による伸張処理時間と画素数との関係を表す関数(第2関数)の一例が併せて示されている。 The figure also shows an example of a function (first function) representing the relationship between the number of pixels and the decompression time of software decompression, and a function (second function) representing the relationship between the number of pixels and the decompression time of hardware decompression, where the horizontal axis is the number of pixels of the image and the vertical axis is the decompression time of the compressed image on the display terminal 200.
 なお、同図では、ソフトウェア伸張処理による伸張処理時間と画素数との関係を表す関数として、1つの矩形領域を処理する場合の処理時間を表す関数式を実線(SW(1矩形))で、2つの矩形領域を別個に処理する場合の処理開始から完了までの総処理時間を表す関数式を点線(SW(2矩形))で示している。 In the figure, as functions representing the relationship between the software decompression time and the number of pixels, the solid line (SW (1 rectangle)) represents the processing time for processing one rectangular region, and the dotted line (SW (2 rectangles)) represents the total processing time from start to completion when two rectangular regions are processed separately.
 例えば、表示端末200が、矩形領域αとβをそれぞれソフトウェアで伸張処理する場合、SW(1矩形)の関数式における横軸「α」と「β」に該当する伸張処理時間の値それぞれが、各矩形領域に対する伸張処理時間となる。矩形領域αとβをそれぞれ別個に処理する場合は、関数の切片に該当するオーバーヘッドが処理ごとに必要となるため、同図のSW(2矩形)の関数式における横軸「α+β」に該当する伸張処理時間の値が、2つの矩形領域をソフトウェアで処理する場合の総処理時間となる。 For example, when the display terminal 200 decompresses the rectangular regions α and β separately by software, the decompression time values at the horizontal-axis positions "α" and "β" on the SW (1 rectangle) curve give the decompression time for each rectangular region. Because processing the rectangular regions α and β separately incurs the overhead corresponding to the intercept of the function once per processing, the decompression time value at the horizontal-axis position "α + β" on the SW (2 rectangles) curve gives the total processing time when the two rectangular regions are processed by software.
 一方、本体装置100は、矩形領域αとβに対して、更新が発生していない領域γを加えることによって、表示画面内の1つの矩形領域301に統合することができる。そして、本体装置100は、矩形領域301をハードウェアで伸張処理した場合の処理時間を、同図のHW(1矩形)の関数式を用いて算出することができる。 On the other hand, the main device 100 can integrate the rectangular regions α and β into one rectangular region 301 in the display screen by adding the region γ, in which no update has occurred. The main device 100 can then calculate the processing time when the rectangular region 301 is decompressed by hardware, using the HW (1 rectangle) function in the figure.
 同図では、HW(1矩形)の関数式における横軸「α+β+γ」に対応する伸張処理時間が、SW(2矩形)の関数式における横軸「α+β」に対応する伸張処理時間よりも短くなる場合が例示されている。このような場合は、領域統合判定部114は、2つの矩形領域を含む1つの矩形領域を表示画面から切り出して圧縮画像を生成する方が、当該複数の矩形領域の圧縮画像を別個に生成するよりも、表示端末200にとって効率的であると判定することができる。この結果、差分領域検出部112は、複数の矩形領域を含む1つの矩形領域を切り出し、圧縮画像を生成する対象領域として圧縮画像生成部113に通知する。 The figure illustrates a case where the decompression time corresponding to the horizontal-axis position "α + β + γ" on the HW (1 rectangle) curve is shorter than the decompression time corresponding to "α + β" on the SW (2 rectangles) curve. In such a case, the region integration determination unit 114 can determine that cutting out one rectangular region containing the two rectangular regions from the display screen and generating a single compressed image is more efficient for the display terminal 200 than generating compressed images of the plurality of rectangular regions separately. As a result, the difference region detection unit 112 cuts out one rectangular region containing the plurality of rectangular regions and notifies the compressed image generation unit 113 of it as the target region for generating a compressed image.
 なお、同図では、2つの矩形領域が差分領域として検出された場合の例を示したが、より多くの矩形領域が検出された場合は、検出された差分領域数分のオフセットを加算した切片情報を持つ関数式から各領域を別個にソフトウェアで伸張処理した場合の総処理時間を算出する。そして、算出した総処理時間と、1つの矩形領域内に統合してハードウェアで伸張処理した場合の処理時間とを比較し、ハードウェアによる伸張処理時間の方が小さければ、当該複数の矩形領域を含む1つの矩形領域を表示画面から切り出して圧縮画像を生成する。 Although the figure shows an example in which two rectangular regions are detected as difference regions, when more rectangular regions are detected, the total processing time for decompressing each region separately by software is calculated from a function whose intercept information is the per-region offset added up for the number of detected difference regions. The calculated total processing time is then compared with the processing time for decompressing, by hardware, the regions integrated into one rectangular region; if the hardware decompression time is shorter, one rectangular region containing the plurality of rectangular regions is cut out from the display screen and a compressed image is generated.
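The comparison described in the preceding paragraphs, generalized to any number of detected rectangles, can be sketched as follows. The linear coefficients are illustrative assumptions; only the comparison logic is taken from the description above.

```python
# Sketch of the integration decision: total software time for N separate
# rectangles (one intercept overhead per rectangle) versus hardware time
# for the single merged rectangle. Coefficients are illustrative.

def should_merge(rect_pixel_counts, merged_pixel_count,
                 sw_slope, sw_intercept, hw_slope, hw_intercept):
    # Software: each rectangle pays the per-call overhead (the intercept)
    sw_total = sum(sw_slope * p + sw_intercept for p in rect_pixel_counts)
    # Hardware: one decompression of the merged rectangle
    hw_total = hw_slope * merged_pixel_count + hw_intercept
    return hw_total < sw_total

# Large rectangles α, β plus non-updated filler γ: merging wins
print(should_merge([76800, 38400], 76800 + 38400 + 20000,
                   sw_slope=0.5, sw_intercept=100.0,
                   hw_slope=0.1, hw_intercept=2000.0))  # True
# Tiny rectangles padded into a large merged region: merging loses
print(should_merge([100, 100], 50000,
                   sw_slope=0.5, sw_intercept=100.0,
                   hw_slope=0.1, hw_intercept=2000.0))  # False
```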
 図2に戻り、メッセージ解析部116は、表示端末200から受信したメッセージを解析する。例えば、メッセージ解析部116は、圧縮画像の伸張処理に要する時間を含む時間情報を表示端末200から受信した場合に、時間情報から第1関数および第2関数を求め、条件記憶部122に保存する。このように、本実施の形態では、実際の伸張処理に要する時間を表示端末200から取得し、取得した時間を元に処理時間を算出するための関数を動的に得ることができる。なお、メッセージ解析部116による上記処理を行わず、事前に関数を求めて条件記憶部122に記憶するように構成してもよい。 Returning to FIG. 2, the message analysis unit 116 analyzes messages received from the display terminal 200. For example, when time information including the time required for decompression of a compressed image is received from the display terminal 200, the message analysis unit 116 derives the first function and the second function from the time information and stores them in the condition storage unit 122. In this way, in the present embodiment, the time required for actual decompression is acquired from the display terminal 200, and the functions for calculating the processing times can be obtained dynamically based on the acquired times. Alternatively, the functions may be obtained in advance and stored in the condition storage unit 122 without performing the above processing by the message analysis unit 116.
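One plausible way to derive the slope and intercept of each function from the reported (pixel count, processing time) measurements is an ordinary least-squares fit. The patent only states that the functions are obtained from the time information, so the fitting method and the sample values below are assumptions.

```python
# Sketch: fit time = slope * pixels + intercept to measured samples by
# ordinary least squares. Both the fitting method and the sample values
# are illustrative assumptions.

def fit_linear(samples):
    """samples: list of (pixel_count, processing_time) pairs."""
    n = len(samples)
    sx = sum(p for p, _ in samples)
    sy = sum(t for _, t in samples)
    sxx = sum(p * p for p, _ in samples)
    sxy = sum(p * t for p, t in samples)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Hypothetical software-decompression measurements: (pixels, milliseconds)
sw_samples = [(10000, 6.0), (20000, 11.0), (40000, 21.0)]
slope, intercept = fit_linear(sw_samples)
print(round(slope, 6), round(intercept, 3))  # slope ~ 0.0005, intercept ~ 1.0
```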
 通信処理部117は、表示端末200などの外部装置との間でメッセージを送受信する。通信処理部117は、メッセージを送信する送信部117aと、メッセージを受信する受信部117bとを備えている。例えば、送信部117aは、圧縮画像生成部113により生成された圧縮画像を含む送信画像メッセージを表示端末200に送信する。 The communication processing unit 117 transmits / receives a message to / from an external device such as the display terminal 200. The communication processing unit 117 includes a transmission unit 117a that transmits a message and a reception unit 117b that receives the message. For example, the transmission unit 117 a transmits a transmission image message including the compressed image generated by the compressed image generation unit 113 to the display terminal 200.
 送信画像メッセージは、更新する矩形領域の圧縮画像と矩形情報とを含む。圧縮画像は、JPEGなどの圧縮された静止画情報を示す。矩形情報は、更新する矩形領域の画像の描画位置を示す情報である。例えば、矩形情報は、(横方向始点座標、縦方向始点座標、横幅画素数、縦幅画素数)のような構成で画像の描画位置を示す。 The transmission image message includes a compressed image of the rectangular area to be updated and rectangular information. The compressed image indicates compressed still image information such as JPEG. The rectangular information is information indicating the drawing position of the image of the rectangular area to be updated. For example, the rectangular information indicates an image drawing position with a configuration such as (horizontal direction start point coordinates, vertical direction start point coordinates, horizontal width pixel number, vertical width pixel number).
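A minimal sketch of packing and parsing such a message follows. The big-endian 16-bit field layout is an assumption made for illustration only; the patent specifies the message contents (compressed image plus rectangle information) but not a binary encoding.

```python
# Sketch of a transmission image message: rectangle information
# (x_start, y_start, width, height) followed by the compressed (e.g. JPEG)
# payload. The big-endian 16-bit layout is an illustrative assumption.
import struct

def build_image_message(x, y, width, height, jpeg_bytes):
    return struct.pack(">HHHH", x, y, width, height) + jpeg_bytes

def parse_image_message(msg):
    rect = struct.unpack(">HHHH", msg[:8])
    return rect, msg[8:]

msg = build_image_message(100, 50, 320, 240, b"\xff\xd8jpeg\xff\xd9")
rect, payload = parse_image_message(msg)
print(rect)  # (100, 50, 320, 240)
```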
 なお、送信部117aは、後述するセッションマネージャ118により特定された表示端末200をあて先として、送信すべきメッセージを無線基地局300を介して送信する。 The transmission unit 117a transmits a message to be transmitted via the radio base station 300 with the display terminal 200 specified by the session manager 118 described later as a destination.
 また、受信部117bは、表示端末200からメッセージを受信してメッセージ解析部116に渡す。例えば、受信部117bは、圧縮画像の伸張処理に要する時間を含む時間情報を表示端末200から受信する。 Further, the receiving unit 117b receives a message from the display terminal 200 and passes it to the message analyzing unit 116. For example, the receiving unit 117b receives time information including the time required for decompression processing of the compressed image from the display terminal 200.
 セッションマネージャ118は、表示端末200との間で確立した通信(セッション)を管理する。例えば、セッションマネージャ118は、ある表示端末200とセッションを確立したときに、表示端末200のユーザのユーザ識別情報、セッションの状態情報、および伝送制御情報などを対応づけたセッション情報を生成してセッション情報記憶部123に記憶する。そして、セッションマネージャ118は、メッセージの通信先とする表示端末200をセッション情報を参照して特定し、特定した表示端末200を通信先として通信処理部117によりメッセージを送受信する。 The session manager 118 manages the communication (sessions) established with the display terminals 200. For example, when the session manager 118 establishes a session with a display terminal 200, it generates session information associating the user identification information of the user of the display terminal 200, the session status information, the transmission control information, and the like, and stores it in the session information storage unit 123. The session manager 118 then identifies the display terminal 200 to which a message is to be sent by referring to the session information, and transmits and receives messages to and from the identified display terminal 200 via the communication processing unit 117.
 次に、表示端末200の詳細な構成について図4を用いて説明する。図4は、本実施の形態にかかる表示端末200のブロック図である。同図に示すように、表示端末200は、ディスプレイ201と、入力デバイス202と、アンテナ203と、画像バッファ221と、生成時間記憶部222と、セッション情報記憶部223と、入出力インターフェース211と、ソフトウェア伸張部212と、ハードウェア伸張回路213と、画像生成部214と、生成時間計測部215と、メッセージ生成部216と、無線通信処理部217と、セッションマネージャ218と、を備えている。 Next, the detailed configuration of the display terminal 200 will be described with reference to FIG. FIG. 4 is a block diagram of the display terminal 200 according to the present embodiment. As shown in the figure, the display terminal 200 includes a display 201, an input device 202, an antenna 203, an image buffer 221, a generation time storage unit 222, a session information storage unit 223, an input / output interface 211, A software decompression unit 212, a hardware decompression circuit 213, an image generation unit 214, a generation time measurement unit 215, a message generation unit 216, a wireless communication processing unit 217, and a session manager 218 are provided.
 ディスプレイ201は、LCDなどで実現される表示装置である。入力デバイス202は、ディスプレイ201の画面に表示されたカーソルを移動操作するデジタイザやタッチスクリーンなどで実現される。入力デバイス202が取得した入力情報は、入出力インターフェース211(後述)に渡される。 The display 201 is a display device realized by an LCD or the like. The input device 202 is realized by a digitizer or a touch screen that moves a cursor displayed on the screen of the display 201. The input information acquired by the input device 202 is passed to the input / output interface 211 (described later).
 アンテナ203は、本体装置100などの外部装置との間で無線通信するための電波を送受信する。 The antenna 203 transmits and receives radio waves for wireless communication with an external device such as the main device 100.
 画像バッファ221は、画像を記憶する記憶部である。また、生成時間記憶部222は、後述する生成時間計測部215により計測された伸張画像の生成時間(伸張処理時間)を記憶する。具体的には、生成時間記憶部222は、画像の伸張処理に用いた方式(ハードウェアによる伸張処理かソフトウェアによる伸張処理か)、画像の画素数(圧縮画像の縦幅と横幅を掛け合わせた値)、および伸張処理に実際に要した時間(伸張処理時間)を対応づけて記憶する。 The image buffer 221 is a storage unit that stores images. The generation time storage unit 222 stores the generation time (decompression processing time) of decompressed images measured by the generation time measurement unit 215 described later. Specifically, the generation time storage unit 222 stores, in association with one another, the method used for decompressing the image (hardware decompression or software decompression), the number of pixels of the image (the product of the height and width of the compressed image), and the time actually required for the decompression (decompression processing time).
 The session information storage unit 223 stores session information representing information about the main device 100 with which a session is being established. For example, the session information storage unit 223 stores session information including session state information and transmission control information.
 The image buffer 221, the generation time storage unit 222, and the session information storage unit 223 can each be configured from any commonly used storage medium, such as an HDD, an optical disk, a memory card, or a RAM.
 The input/output interface 211 is an input/output interface for the display 201 and the input device 202, and is realized by an application program such as a GUI (Graphical User Interface).
 For example, the input/output interface 211 acquires image information from the image buffer 221 and displays it on the display 201. In addition to the function of acquiring image information sent from the main device 100 via the image generation unit 214 and writing it to the image buffer 221, the input/output interface 211 has the function of writing image information for a GUI generated independently within the display terminal 200 to the image buffer 221.
 The software decompression unit 212 decompresses, by software, the compressed image received from the main device 100 in accordance with an instruction from the image generation unit 214 (described later). The hardware decompression circuit 213 decompresses, by hardware, the compressed image received from the main device 100 in accordance with an instruction from the image generation unit 214.
 The image generation unit 214 decompresses the compressed image received from the main device 100 with the software decompression unit 212 or the hardware decompression circuit 213, and then writes the decompressed image information to the designated drawing position in the image buffer 221. That is, the image generation unit 214 displays, at the designated position on the display 201, the partial image generated by decompressing the compressed image transmitted from the main device 100 and received by the wireless communication processing unit 217.
 The image generation unit 214 switches between decompression by software (decompression by the software decompression unit 212) and decompression by hardware (decompression by the hardware decompression circuit 213) based on the number of pixels of the compressed image and a predetermined threshold. The image generation unit 214 can obtain the number of pixels of the compressed image from the rectangle information in the transmitted image message, or from the pixel-count (resolution) information in the header of the compressed image.
 For example, suppose the threshold is a rectangular area of 10,000 pixels, obtained by multiplying a height of 100 pixels by a width of 100 pixels. If the received compressed image is, say, 76,800 pixels (a height of 320 pixels multiplied by a width of 240 pixels), the pixel count exceeds the threshold, so the image generation unit 214 decompresses the compressed image by hardware using the hardware decompression circuit 213.
 If, on the other hand, the received compressed image is, say, 1,024 pixels (a height of 32 pixels multiplied by a width of 32 pixels), the pixel count is below the threshold, so the image generation unit 214 decompresses the compressed image by software using the software decompression unit 212.
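The switching rule in these two examples can be sketched as follows. This is a minimal illustration: the function and constant names are hypothetical, and the 100 × 100-pixel threshold is simply the example value given above.

```python
# Hypothetical sketch of the pixel-count-based switching performed by the
# image generation unit 214: small images are decompressed by software,
# large ones by the hardware decompression circuit.

PIXEL_THRESHOLD = 100 * 100  # 10,000 pixels, the example threshold above

def choose_decompressor(width_px, height_px):
    """Return which decompression path to use for a compressed image."""
    pixel_count = width_px * height_px
    if pixel_count < PIXEL_THRESHOLD:
        return "software"   # software decompression unit 212
    return "hardware"       # hardware decompression circuit 213

# The two examples from the text:
print(choose_decompressor(320, 240))  # 76,800 pixels -> "hardware"
print(choose_decompressor(32, 32))    # 1,024 pixels  -> "software"
```

Note that a pixel count exactly equal to the threshold takes the hardware path here, matching the "equal to or greater than the threshold" branch of step S902 described later.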
 Here, the method by which the image generation unit 214 switches decompression processing will be further described with reference to FIG. 5. FIG. 5 is a diagram illustrating an example of the relationship between the number of pixels of an image and the decompression processing time.
 The figure shows functional expressions with the resolution (number of pixels) on the horizontal axis and the decompression processing time on the vertical axis. "SW" represents the relationship between pixel count and decompression processing time when decompression is performed by software, and "HW" represents the same relationship when decompression is performed by hardware.
 When decompression is performed by hardware, there is a fixed overhead: the hardware initialization processing and the waiting time until the decompression-completion notification is handled, specifically the waiting time until the interrupt notification from the hardware device reaches the operating system. For this reason, the intercept of the function is large, as shown in the figure. When decompression is performed by software, on the other hand, no such overhead arises, so the intercept of the function is small.
 Moreover, for the portion of the processing devoted to actually decompressing the compressed image, hardware is faster than software. The slope of the function relating processing time to pixel count is therefore smaller for hardware than for software. Consequently, when the pixel count of an image is small it is more efficient to decompress by software, and when the pixel count is large it is more efficient to decompress by hardware.
 In view of this, the image generation unit 214 desirably uses the pixel count at the intersection of the two functional expressions as the threshold for switching between software decompression and hardware decompression.
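Under the linear model implied by FIG. 5 (processing time = slope × pixel count + intercept, where software has the smaller intercept but the larger slope), the threshold at the intersection of the two lines can be computed by equating them. The following is a sketch under that assumption; the coefficient values are invented for illustration.

```python
# Sketch: the threshold is the pixel count where the SW and HW cost lines
# intersect.  t_sw(n) = a_sw * n + b_sw,  t_hw(n) = a_hw * n + b_hw  (ms).

def crossover_pixels(a_sw, b_sw, a_hw, b_hw):
    """Pixel count at which hardware decompression becomes cheaper.

    Software has the smaller intercept (no device overhead) but the
    larger slope, so the two lines cross at one positive pixel count.
    """
    if a_sw == a_hw:
        raise ValueError("parallel lines never cross")
    return (b_hw - b_sw) / (a_sw - a_hw)

# Illustrative (invented) coefficients: hardware pays a 5 ms fixed
# overhead but processes pixels four times faster than software.
threshold = crossover_pixels(a_sw=0.002, b_sw=1.0, a_hw=0.0005, b_hw=5.0)
print(round(threshold))  # about 2667 pixels with these coefficients
```

At the returned pixel count the two model times are equal; below it software is cheaper, above it hardware is cheaper.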
 The information for deriving the functions shown in the figure can be obtained as statistics (the type of decompression processing, the pixel count of the decompressed image, the decompression processing time, and so on) while the display terminal 200 actually decompresses compressed images, and stored in the generation time storage unit 222. Alternatively, predetermined statistics may be stored in advance in the generation time storage unit 222 of the display terminal 200 or in the condition storage unit 122 of the main device 100.
 Returning to FIG. 2, the generation time measurement unit 215 measures the decompression processing time of the image generation unit 214 and stores it in the generation time storage unit 222. Specifically, the generation time measurement unit 215 stores, in the generation time storage unit 222, the method used by the image generation unit 214 to decompress the compressed image (the decompression method), the pixel count of the decompressed image, and the decompression processing time, in association with one another.
 The message generation unit 216 generates messages to be transmitted to an external device such as the main device 100. For example, from the information stored in the generation time storage unit 222, the message generation unit 216 generates, as a message to be transmitted to the main device 100, time information that associates the decompression method, the pixel count of the decompressed image, and the decompression processing time.
 FIG. 6 is a diagram illustrating an example of a transmission message for transmitting the time information. FIG. 6 shows a configuration example of a transmission message that includes the decompression processing times of compressed images actually measured by the display terminal 200. As shown in the figure, the time-information transmission message includes an IP (Internet Protocol) header, a UDP/TCP header, a message type, a decompression method, area information, and a decompression processing time.
 The IP header is used to identify the destination addresses of the transmitting and receiving terminals. The UDP/TCP header indicates transport-protocol information. The message type is used to identify the message as one containing time information generated by the display terminal 200. The decompression method is information identifying whether the decompression processing time is a software processing time or a hardware processing time. The area information includes the pixel count of the decompressed image. The decompression processing time field records the actually measured processing time of the decompression, for example in milliseconds. The fields from the decompression method through the decompression processing time are appended N times, once for each measurement made by the display terminal 200.
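The patent does not specify a wire encoding for these fields, but as a rough sketch, the variable part of the message (message type, then N records of decompression method, pixel count, and decompression processing time) might be serialized as follows; the field widths and byte order are assumptions made for illustration.

```python
import struct

# Assumed layout per measurement record: decompression method (1 byte:
# 0 = software, 1 = hardware), pixel count (4 bytes), decompression time
# in milliseconds (4 bytes), all big-endian.
RECORD = ">BII"

def pack_time_info(message_type, records):
    """Serialize message type plus N (method, pixels, time_ms) records."""
    body = struct.pack(">BH", message_type, len(records))
    for method, pixels, time_ms in records:
        body += struct.pack(RECORD, method, pixels, time_ms)
    return body

def unpack_time_info(data):
    message_type, n = struct.unpack_from(">BH", data, 0)
    offset, records = 3, []
    for _ in range(n):
        records.append(struct.unpack_from(RECORD, data, offset))
        offset += struct.calcsize(RECORD)
    return message_type, records

# Round trip with one software and one hardware measurement:
msgs = [(0, 1024, 3), (1, 76800, 12)]
data = pack_time_info(7, msgs)
assert unpack_time_info(data) == (7, msgs)
```

The IP and UDP/TCP headers of FIG. 6 would be supplied by the network stack rather than serialized by the application as above.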
 As described above, when the message analysis unit 116 of the main device 100 receives a transmission message containing time information as in FIG. 6, it derives, from the time information obtained by analyzing the message, the functional expressions relating pixel count to decompression processing time. That is, when the message configuration of FIG. 6 is used, the functional expressions are derived by the main device 100, not by the display terminal 200.
 Alternatively, the display terminal 200 may be configured to derive the functional expressions from the time information and transmit function information representing the derived expressions to the main device 100. FIG. 7 is a diagram illustrating an example of a transmission message for transmitting the function information from the display terminal 200 to the main device 100 in such a configuration.
 FIG. 7 shows a configuration example of a transmission message containing slope information and intercept information for deriving a first function representing the relationship between pixel count and software processing time, and slope information and intercept information for deriving a second function representing the relationship between pixel count and hardware processing time. The decompression method is information identifying whether the decompression processing time is a software processing time or a hardware processing time. The slope information indicates the slope of a functional expression relating pixel count to decompression processing time, as in FIG. 3, and the intercept information indicates its intercept. As shown in FIG. 7, the fields from the decompression method through the intercept information are appended twice, once for software processing and once for hardware processing.
 The message generation unit 216 transmits the generated time information (or function information) to the main device 100 via the wireless communication processing unit 217, for example when a communication session with the main device 100 is started, or at regular time intervals.
 Returning to FIG. 2, the wireless communication processing unit 217 transmits and receives signals to and from the wireless base station 300 via the antenna 203. The wireless communication processing unit 217 includes a transmission unit 217a that transmits messages and a reception unit 217b that receives messages, and is realized by a wireless LAN function compliant with IEEE 802.11 or the like.
 For example, the reception unit 217b of the wireless communication processing unit 217 demodulates a received wireless signal to reconstruct a packet, and passes the relevant data to the image generation unit 214 according to the message type of the packet. If the packet belongs to a transmitted image message containing a compressed image, for example, the compressed image and information such as its pixel count are extracted from the packet and passed to the image generation unit 214.
 When, for example, the input device 202 acquires input information from the user, the input/output interface 211 analyzes the input information to obtain coordinate information and the like, which is then transmitted to the main device 100 via the transmission unit 217a of the wireless communication processing unit 217. In this case, the main device 100 executes application processing based on the input information received from the display terminal 200. When the application processing causes an update within a screen area, the main device 100 acquires the image information to be updated, generates a transmitted image message, and transmits it to the display terminal 200. In this case too, when a plurality of difference areas are detected in the image, the main device 100 generates the transmitted image message after determining whether merging the plurality of difference areas into a single rectangular area would make the decompression processing on the display terminal 200 more efficient.
 The session manager 218 manages communication (sessions) established with the main device 100. For example, when a session with the main device 100 is established, the session manager 218 generates session information associating session state information, transmission control information, and the like, and stores it in the session information storage unit 223.
 Next, image transmission processing by the main device 100 according to the present embodiment configured as described above will be described with reference to FIG. 8. FIG. 8 is a flowchart showing the overall flow of the image transmission processing in the present embodiment.
 First, at predetermined time intervals, the difference area detection unit 112 detects difference areas between the image information stored in the image buffer 121 (the displayed image) and the new image information generated by the update image generation unit 111a (the updated image) (step S801).
 When difference areas are detected, the area integration determination unit 114 determines whether a plurality of difference areas have been detected (step S802). If a plurality of difference areas have been detected (step S802: YES), the processing time calculation unit 115 calculates the decompression processing time when each of the plurality of difference areas is decompressed by software (the software processing time) (step S803). For example, if two difference areas have been detected, the processing time calculation unit 115 calculates the software processing time corresponding to the pixel counts of the images of the difference areas using the SW (two-rectangle) functional expression of FIG. 3.
 Next, the processing time calculation unit 115 calculates the decompression processing time when a single integrated area obtained by merging the plurality of difference areas is decompressed by hardware (the hardware processing time) (step S804). For example, the processing time calculation unit 115 calculates the hardware processing time corresponding to the pixel count of the image of the integrated area using the HW (one-rectangle) functional expression of FIG. 3.
 Next, the area integration determination unit 114 determines whether the calculated hardware processing time is smaller than the calculated software processing time (step S805). If the hardware processing time is smaller than the software processing time (step S805: YES), the difference area detection unit 112 notifies the compressed image generation unit 113 of the integrated area obtained by merging the plurality of difference areas as the target area for which a compressed image is to be generated (step S806).
 Next, the compressed image generation unit 113 generates a compressed image by compressing the image of the integrated area (step S807). If a plurality of difference areas were not detected in step S802 (step S802: NO), the compressed image generation unit 113 generates a compressed image by compressing the image of the single detected difference area. If it is determined in step S805 that the hardware processing time is not smaller than the software processing time (step S805: NO), the compressed image generation unit 113 compresses each of the detected difference areas separately to generate a plurality of compressed images.
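Steps S802 through S806 can be sketched as follows, using the same linear cost model as FIG. 3. The cost coefficients and helper names are illustrative assumptions, and the integrated area is taken here to be the bounding rectangle of the difference areas.

```python
# Sketch of the integration decision (steps S802-S806): compare the total
# software time for decompressing each difference rectangle separately
# against the hardware time for one rectangle covering all of them.

def sw_time(pixels, a_sw=0.002, b_sw=1.0):   # illustrative coefficients
    return a_sw * pixels + b_sw               # per-rectangle SW cost (ms)

def hw_time(pixels, a_hw=0.0005, b_hw=5.0):
    return a_hw * pixels + b_hw               # one-rectangle HW cost (ms)

def should_integrate(rects):
    """rects: list of (x, y, width, height) difference areas."""
    total_sw = sum(sw_time(w * h) for _, _, w, h in rects)
    # Bounding rectangle of all difference areas (the integrated area).
    x0 = min(x for x, _, _, _ in rects)
    y0 = min(y for _, y, _, _ in rects)
    x1 = max(x + w for x, _, w, _ in rects)
    y1 = max(y + h for _, y, _, h in rects)
    return hw_time((x1 - x0) * (y1 - y0)) < total_sw

# Two nearby 100x100 rectangles: the 220x100 bounding box is decompressed
# faster by one hardware pass than by two software passes (True here).
print(should_integrate([(0, 0, 100, 100), (120, 0, 100, 100)]))
```

When the rectangles are small and far apart, the bounding box is dominated by unchanged pixels and the function returns False, matching the S805: NO branch in which the areas are compressed separately.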
 Next, the transmission unit 117a transmits to the display terminal 200 a transmitted image message including the compressed image generated by the compressed image generation unit 113 and the pixel count of the compressed image (step S808), and the image transmission processing ends.
 Next, image display processing by the display terminal 200 according to the present embodiment configured as described above will be described with reference to FIG. 9. FIG. 9 is a flowchart showing the overall flow of the image display processing in the present embodiment.
 First, the reception unit 217b of the wireless communication processing unit 217 receives a transmitted image message including a compressed image and its pixel count sent from the main device 100 (step S901). The reception unit 217b extracts the compressed image and the pixel count from the message and notifies the image generation unit 214 of them.
 Next, the image generation unit 214 determines whether the pixel count of the notified compressed image is smaller than a predetermined threshold (step S902). If the pixel count is smaller than the threshold (step S902: YES), the image generation unit 214 decompresses the compressed image by software using the software decompression unit 212 (step S903). If the pixel count is equal to or greater than the threshold (step S902: NO), the image generation unit 214 decompresses the compressed image by hardware using the hardware decompression circuit 213 (step S904).
 Next, the image generation unit 214 draws the image information generated by the decompression processing at the designated position in the image buffer 221 via the input/output interface 211 (step S905). The updated portion is thereby reflected on the display 201.
 Next, the flow of information communicated between the display terminal 200 and the main device 100 will be described with reference to the sequence diagram of FIG. 10. In the example of the figure, the message generation unit 216 of the display terminal 200 first transmits, at the start of a communication session, time information including the software and hardware decompression processing times of compressed images to the main device 100 (step S1001).
 On receiving the time information from the display terminal 200, the message analysis unit 116 of the main device 100 derives, from the received time information, the slope information and intercept information (function information) of the linear functions used to calculate the decompression processing times, and updates the function information stored in the condition storage unit 122.
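One way the main device 100 could recover a slope and intercept from the received (pixel count, processing time) samples is an ordinary least-squares line fit, computed separately for the software and hardware measurements. This is a sketch of the idea, not a method prescribed by the patent.

```python
# Sketch: fit time = slope * pixels + intercept from measured samples,
# as the message analysis unit 116 might do with received time information.

def fit_line(samples):
    """samples: list of (pixel_count, time_ms); returns (slope, intercept)."""
    n = len(samples)
    sx = sum(p for p, _ in samples)
    sy = sum(t for _, t in samples)
    sxx = sum(p * p for p, _ in samples)
    sxy = sum(p * t for p, t in samples)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Noise-free samples lying on the line t = 0.002 * n + 1.0 recover it:
slope, intercept = fit_line([(1000, 3.0), (10000, 21.0), (50000, 101.0)])
print(round(slope, 6), round(intercept, 6))  # 0.002 1.0
```

With real measurements the samples are noisy, which is one reason the text suggests accumulating statistics over many decompressions before deriving the functions.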
 Thereafter, the difference area detection unit 112 of the main device 100 detects the updated portions (difference areas) of the display screen resulting from the application processing (step S1002). When a plurality of difference areas are detected, the area integration determination unit 114 performs the integration determination for the plurality of difference areas (step S1003). Based on the result, the compressed image generation unit 113 generates a compressed image and transmits a transmitted image message including it (step S1004). A compressed image obtained by compressing the images of the difference areas is thereby transmitted to the display terminal 200 (step S1005). The transmission of time information may be performed not only at the start of a communication session but also at regular time intervals.
 As described above, according to the communication apparatus (main device 100) of the present embodiment, the main device 100 can generate and transmit compressed images of the size optimal for the display apparatus (display terminal 200), so the processing on the display terminal 200 side, which switches between software decompression and hardware decompression of compressed images, can be made more efficient.
 Next, the hardware configurations of the communication apparatus (main device 100) and the display apparatus (display terminal 200) according to the present embodiment will be described with reference to FIG. 11. FIG. 11 is an explanatory diagram showing the hardware configurations of the communication apparatus and the display apparatus according to the present embodiment.
 The communication apparatus and the display apparatus according to the present embodiment each include a control device such as a CPU (Central Processing Unit) 51, storage devices such as a ROM (Read Only Memory) 52 and a RAM 53, a communication I/F 54 that connects to a network for communication, external storage devices such as an HDD (Hard Disk Drive) and a CD (Compact Disc) drive, a display device such as a display, input devices such as a keyboard and a mouse, and a bus 61 connecting these units.
 The image transmission program executed by the communication apparatus and the communication program executed by the display apparatus according to the present embodiment are provided as files in an installable or executable format recorded on a computer-readable recording medium such as a CD-ROM (Compact Disk Read Only Memory), flexible disk (FD), CD-R (Compact Disk Recordable), or DVD (Digital Versatile Disk).
 Alternatively, the image transmission program executed by the communication apparatus and the communication program executed by the display apparatus according to the present embodiment may be stored on a computer connected to a network such as the Internet and provided by being downloaded via the network. They may also be provided or distributed via a network such as the Internet.
 The image transmission program and the communication program of the present embodiment may also be provided by being incorporated in advance in a ROM or the like.
 The image transmission program executed by the communication apparatus according to the present embodiment has a module configuration including the above-described units (event acquisition unit, difference area detection unit, compressed image generation unit, area integration determination unit, processing time calculation unit, message analysis unit, communication processing unit, and session manager). As actual hardware, the CPU 51 (processor) reads the image transmission program from the storage medium and executes it, whereby the above units are loaded onto and generated on the main storage device.
 Similarly, the communication program executed by the display apparatus according to the present embodiment has a module configuration including the above-described units (input/output interface, software decompression unit, hardware decompression circuit, image generation unit, generation time measurement unit, message generation unit, wireless communication processing unit, and session manager). As actual hardware, the CPU 51 (processor) reads the communication program from the storage medium and executes it, whereby the above units are loaded onto and generated on the main storage device.
 As described above, the apparatus, method, and program according to the present invention are suitable for apparatuses that realize a function of sharing an application screen between apparatuses.
 51 CPU
 52 ROM
 53 RAM
 54 通信I/F
 61 バス
 10 画面転送システム
 100 本体装置
 101 ディスプレイ
 102 入力デバイス
 111 イベント取得部
 111a 更新画像生成部
 112 差分領域検出部
 113 圧縮画像生成部
 114 領域統合判定部
 115 処理時間算出部
 116 メッセージ解析部
 117 通信処理部
 117a 送信部
 117b 受信部
 118 セッションマネージャ
 121 画像バッファ
 122 条件記憶部
 123 セッション情報記憶部
 200 表示端末
 201 ディスプレイ
 202 入力デバイス
 203 アンテナ
 211 入出力インターフェース
 212 ソフトウェア伸張部
 213 ハードウェア伸張回路
 214 画像生成部
 215 生成時間計測部
 216 メッセージ生成部
 217 無線通信処理部
 217a 送信部
 217b 受信部
 218 セッションマネージャ
 221 画像バッファ
 222 生成時間記憶部
 223 セッション情報記憶部
 300 無線基地局
 301 矩形領域
 400 ネットワーク
51 CPU
52 ROM
53 RAM
54 Communication I / F
61 Bus
10 Screen transfer system
100 Main device
101 Display
102 Input device
111 Event acquisition unit
111a Update image generation unit
112 Difference region detection unit
113 Compressed image generation unit
114 Region integration determination unit
115 Processing time calculation unit
116 Message analysis unit
117 Communication processing unit
117a Transmission unit
117b Reception unit
118 Session manager
121 Image buffer
122 Condition storage unit
123 Session information storage unit
200 Display terminal
201 Display
202 Input device
203 Antenna
211 Input/output interface
212 Software decompression unit
213 Hardware decompression circuit
214 Image generation unit
215 Generation time measurement unit
216 Message generation unit
217 Wireless communication processing unit
217a Transmission unit
217b Reception unit
218 Session manager
221 Image buffer
222 Generation time storage unit
223 Session information storage unit
300 Radio base station
301 Rectangular region
400 Network

Claims (8)

  1.  画像を表示可能な表示装置にネットワークを介して接続可能な通信装置であって、
     前記表示装置に表示する表示画像を記憶する画像記憶部と、
     前記表示画像を更新するための更新画像を生成する更新画像生成部と、
     前記更新画像と前記表示画像との間で画素情報が一致しない領域を示す差分領域を検出する検出部と、
     複数の前記差分領域が検出された場合に、複数の前記差分領域の画像を圧縮した各々の圧縮画像に対する伸張処理をソフトウェアにより実行したときの処理時間を表すソフトウェア処理時間と、複数の前記差分領域を含む1つの領域を表す統合領域の画像を圧縮した圧縮画像に対する伸張処理をハードウェアにより実行したときの処理時間を表すハードウェア処理時間とを算出する算出部と、
     前記ハードウェア処理時間が前記ソフトウェア処理時間より小さいか否かを判定する判定部と、
     前記ハードウェア処理時間が前記ソフトウェア処理時間より小さいと判定された場合に、前記統合領域の画像を圧縮した圧縮画像を生成する圧縮画像生成部と、
     生成された圧縮画像を前記表示装置に送信する送信部と、
     を備えたことを特徴とする通信装置。
    A communication device connectable to a display device capable of displaying an image via a network,
    An image storage unit for storing a display image to be displayed on the display device;
    An update image generation unit for generating an update image for updating the display image;
    A detection unit for detecting a difference region indicating a region where pixel information does not match between the updated image and the display image;
    A calculation unit that, when a plurality of the difference regions are detected, calculates a software processing time representing the processing time when decompression of each compressed image obtained by compressing the images of the plurality of difference regions is executed by software, and a hardware processing time representing the processing time when decompression of a compressed image obtained by compressing the image of an integrated region, which represents one region including the plurality of difference regions, is executed by hardware;
    A determination unit for determining whether the hardware processing time is smaller than the software processing time;
    A compressed image generation unit that generates a compressed image obtained by compressing the image of the integrated region when it is determined that the hardware processing time is smaller than the software processing time;
    A transmission unit for transmitting the generated compressed image to the display device;
    A communication apparatus comprising:
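The decision in claim 1 can be illustrated with a small sketch: the sum of predicted software decompression times for the individual difference regions is compared against the predicted hardware decompression time for one integrated (bounding) region, and the integrated region wins only if its hardware time is smaller. This is not the patent's implementation; the function names and the linear per-pixel cost models (with a fixed hardware setup overhead) are assumptions for illustration only.

```python
# Hypothetical sketch of the claim-1 decision. The linear cost models
# (time proportional to pixel count, plus a fixed hardware setup cost)
# are assumptions, not taken from the specification.

def bounding_box(regions):
    """Smallest rectangle (x, y, w, h) containing all difference regions."""
    x0 = min(r[0] for r in regions)
    y0 = min(r[1] for r in regions)
    x1 = max(r[0] + r[2] for r in regions)
    y1 = max(r[1] + r[3] for r in regions)
    return (x0, y0, x1 - x0, y1 - y0)

def choose_integration(regions, sw_time_per_pixel, hw_time_per_pixel, hw_setup):
    """Return True if sending one integrated region is predicted faster."""
    # Software path: each difference region is decompressed separately.
    sw_time = sum(sw_time_per_pixel * (r[2] * r[3]) for r in regions)
    # Hardware path: one decompression of the bounding box.
    bx = bounding_box(regions)
    hw_time = hw_setup + hw_time_per_pixel * (bx[2] * bx[3])
    return hw_time < sw_time

# Two 100x100 regions far apart: hardware must cover the gap, but is
# cheap enough per pixel here that integration still wins.
regions = [(0, 0, 100, 100), (200, 0, 100, 100)]  # (x, y, w, h)
print(choose_integration(regions, 5e-6, 1e-6, 0.001))  # → True
```

Note how the trade-off works: integrating distant regions inflates the pixel count of the bounding box, so integration only pays off when hardware decompression is sufficiently faster per pixel than software.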
  2.  画像の画素数から前記ソフトウェア処理時間および前記ハードウェア処理時間を算出するための予め定められた条件を記憶する条件記憶部をさらに備え、
     前記算出部は、複数の前記差分領域の画像それぞれの画素数から前記条件を用いて前記ソフトウェア処理時間を算出し、前記統合領域の画像の画素数から前記条件を用いて前記ハードウェア処理時間を算出すること、
     を特徴とする請求項1に記載の通信装置。
    A condition storage unit that stores a predetermined condition for calculating the software processing time and the hardware processing time from the number of pixels of an image;
    The calculation unit calculates the software processing time from the number of pixels of each of the images of the plurality of difference regions using the condition, and calculates the hardware processing time from the number of pixels of the image of the integrated region using the condition,
    The communication apparatus according to claim 1.
  3.  前記条件記憶部は、画像の画素数を入力した結果として前記ソフトウェア処理時間を出力する第1関数と、画像の画素数を入力した結果として前記ハードウェア処理時間を出力する第2関数とを前記条件として記憶し、
     前記算出部は、複数の前記差分領域の画像の画素数を前記第1関数に入力することによって前記ソフトウェア処理時間を算出し、前記統合領域の画像の画素数を前記第2関数に入力した結果として出力される前記ハードウェア処理時間を算出すること、
     を特徴とする請求項2に記載の通信装置。
    The condition storage unit stores, as the condition, a first function that outputs the software processing time when the number of pixels of an image is input, and a second function that outputs the hardware processing time when the number of pixels of an image is input,
    And the calculation unit calculates the software processing time by inputting the numbers of pixels of the images of the plurality of difference regions to the first function, and calculates the hardware processing time output as a result of inputting the number of pixels of the image of the integrated region to the second function,
    The communication device according to claim 2.
  4.  伸張処理をソフトウェアにより実行したかハードウェアにより実行したかを表す伸張方式と、伸張した画像の画素数と、伸張処理に要した処理時間とを対応づけた時間情報を前記表示装置から受信する受信部と、
     伸張処理をソフトウェアにより実行したことを表す前記伸張方式を含む前記時間情報から前記第1関数を生成し、伸張処理をハードウェアにより実行したことを表す前記伸張方式を含む前記時間情報から前記第2関数を生成し、生成した前記第1関数および前記第2関数を前記条件として前記条件記憶部に保存するメッセージ解析部と、をさらに備えたこと、
     を特徴とする請求項3に記載の通信装置。
    A reception unit that receives, from the display device, time information associating a decompression method indicating whether decompression was executed by software or by hardware, the number of pixels of the decompressed image, and the processing time required for the decompression; and
    A message analysis unit that generates the first function from the time information containing the decompression method indicating that decompression was executed by software, generates the second function from the time information containing the decompression method indicating that decompression was executed by hardware, and stores the generated first function and second function in the condition storage unit as the condition,
    The communication device according to claim 3.
  5.  前記第1関数および前記第2関数を前記表示装置から受信する受信部と、
     受信された前記第1関数および前記第2関数を前記条件として前記条件記憶部に保存するメッセージ解析部と、をさらに備えたこと、
     を特徴とする請求項4に記載の通信装置。
    A receiving unit for receiving the first function and the second function from the display device;
    A message analysis unit that stores the received first function and the second function as the conditions in the condition storage unit;
    The communication device according to claim 4.
  6.  前記検出部は、前記更新画像と前記表示画像との間で一致しない矩形領域を表す前記差分領域を検出し、
     前記算出部は、複数の前記差分領域が検出された場合に、前記ソフトウェア処理時間と、複数の前記差分領域を含む最小の矩形領域を表す前記統合領域に対する画像処理をハードウェアにより実行したときの前記ハードウェア処理時間とを算出すること、
     を特徴とする請求項5に記載の通信装置。
    The detection unit detects the difference area representing a rectangular area that does not match between the update image and the display image;
    And the calculation unit, when a plurality of the difference regions are detected, calculates the software processing time and the hardware processing time when image processing for the integrated region, which represents the smallest rectangular region including the plurality of difference regions, is executed by hardware,
    The communication device according to claim 5.
  7.  通信装置にネットワークを介して接続された表示装置であって、
     更新された領域の画像を圧縮した圧縮画像と更新された領域の画像の画素数とを前記通信装置から受信する受信部と、
     前記圧縮画像の伸張処理を実行可能な伸張回路と、
     前記圧縮画像の伸張処理をソフトウェアにより実行可能なソフトウェア伸張部と、
     前記画素数と予め定められた閾値とを比較し、前記画素数が前記閾値より小さい場合に前記ソフトウェア伸張部により前記圧縮画像を伸張して前記更新された領域に対応する更新画像を生成し、前記画素数が前記閾値以上の場合に前記伸張回路により前記圧縮画像を伸張して前記更新された領域に対応する表示画像を生成する画像生成部と、
     前記表示画像を表示する表示部と、
     前記ソフトウェア伸張部による伸張処理の処理時間を表すソフトウェア処理時間と前記伸張回路による伸張処理の処理時間を表すハードウェア処理時間とを計測する計測部と、
     伸張処理を前記ソフトウェア伸張部により実行したか前記伸張回路により実行したかを表す伸張方式と、伸張した画像の画素数と、前記ソフトウェア処理時間および前記ハードウェア処理時間のいずれかと、対応づけた時間情報を生成する情報生成部と、
     前記時間情報を前記通信装置に送信する送信部と、
     を備えたことを特徴とする表示装置。
    A display device connected to a communication device via a network,
    A receiving unit that receives the compressed image obtained by compressing the image of the updated region and the number of pixels of the image of the updated region from the communication device;
    A decompression circuit capable of executing decompression processing of the compressed image;
    A software decompression unit capable of executing decompression processing of the compressed image by software;
    An image generation unit that compares the number of pixels with a predetermined threshold, decompresses the compressed image with the software decompression unit to generate an updated image corresponding to the updated region when the number of pixels is smaller than the threshold, and decompresses the compressed image with the decompression circuit to generate a display image corresponding to the updated region when the number of pixels is equal to or greater than the threshold;
    A display unit for displaying the display image;
    A measurement unit that measures a software processing time that represents the processing time of the decompression process by the software decompression unit and a hardware processing time that represents the processing time of the decompression process by the decompression circuit;
    An information generation unit that generates time information associating a decompression method indicating whether decompression was executed by the software decompression unit or by the decompression circuit, the number of pixels of the decompressed image, and either the software processing time or the hardware processing time;
    A transmission unit for transmitting the time information to the communication device;
    A display device comprising:
  8.  通信装置にネットワークを介して接続された表示装置であって、
     更新された領域の画像を圧縮した圧縮画像と更新された領域の画像の画素数とを前記通信装置から受信する受信部と、
     前記圧縮画像の伸張処理を実行可能な伸張回路と、
     前記圧縮画像の伸張処理をソフトウェアにより実行可能なソフトウェア伸張部と、
     前記画素数と予め定められた閾値とを比較し、前記画素数が前記閾値より小さい場合に前記ソフトウェア伸張部により前記圧縮画像を伸張して前記更新された領域に対応する更新画像を生成し、前記画素数が前記閾値以上の場合に前記伸張回路により前記圧縮画像を伸張して前記更新された領域に対応する表示画像を生成する画像生成部と、
     前記表示画像を表示する表示部と、
     前記ソフトウェア伸張部による伸張処理の処理時間を表すソフトウェア処理時間と前記伸張回路による伸張処理の処理時間を表すハードウェア処理時間とを計測する計測部と、
     前記ソフトウェア伸張部により伸張した画像の画素数と前記ソフトウェア処理時間とに基づいて画素数から前記ソフトウェア処理時間を算出する第1関数を生成し、前記伸張回路により伸張した画像の画素数と前記ハードウェア処理時間とに基づいて画素数から前記ハードウェア処理時間を算出する第2関数を生成する情報生成部と、
     前記第1関数および前記第2関数を前記通信装置に送信する送信部と、
     を備えたことを特徴とする表示装置。
    A display device connected to a communication device via a network,
    A receiving unit that receives the compressed image obtained by compressing the image of the updated region and the number of pixels of the image of the updated region from the communication device;
    A decompression circuit capable of executing decompression processing of the compressed image;
    A software decompression unit capable of executing decompression processing of the compressed image by software;
    An image generation unit that compares the number of pixels with a predetermined threshold, decompresses the compressed image with the software decompression unit to generate an updated image corresponding to the updated region when the number of pixels is smaller than the threshold, and decompresses the compressed image with the decompression circuit to generate a display image corresponding to the updated region when the number of pixels is equal to or greater than the threshold;
    A display unit for displaying the display image;
    A measurement unit that measures a software processing time that represents the processing time of the decompression process by the software decompression unit and a hardware processing time that represents the processing time of the decompression process by the decompression circuit;
    An information generation unit that generates a first function for calculating the software processing time from the number of pixels based on the number of pixels of an image decompressed by the software decompression unit and the software processing time, and generates a second function for calculating the hardware processing time from the number of pixels based on the number of pixels of an image decompressed by the decompression circuit and the hardware processing time;
    A transmitter that transmits the first function and the second function to the communication device;
    A display device comprising:
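Claim 8's first and second functions map a pixel count to a predicted processing time, derived from measured (pixel count, processing time) pairs. One plausible way to derive such a function, sketched below under the assumption of a linear model time = a · pixels + b, is an ordinary least-squares fit; the sample measurements are invented for illustration and do not come from the specification.

```python
# Hypothetical sketch: fit time = a * pixels + b to measured samples,
# yielding a callable like the "first function" of claim 8. The linear
# model and the sample values are assumptions for illustration.

def fit_line(samples):
    """Least-squares fit of time = a * pixels + b over (pixels, time) pairs."""
    n = len(samples)
    sx = sum(p for p, _ in samples)
    sy = sum(t for _, t in samples)
    sxx = sum(p * p for p, _ in samples)
    sxy = sum(p * t for p, t in samples)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return lambda pixels: a * pixels + b

# Invented software-decompression measurements: (pixels, seconds).
sw_samples = [(1000, 0.006), (2000, 0.011), (4000, 0.021)]
first_function = fit_line(sw_samples)
print(round(first_function(3000), 3))  # → 0.016
```

The same fitting routine, applied to measurements taken on the hardware decompression circuit, would yield the second function; the communication device can then evaluate both functions per claim 3 without any further measurement traffic.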
PCT/JP2009/068699 2008-10-30 2009-10-30 Communication apparatus and display apparatus WO2010050593A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008279007A JP5159562B2 (en) 2008-10-30 2008-10-30 Communication device, image transmission method, image transmission program, display device, communication method, and communication program
JP2008-279007 2008-10-30

Publications (1)

Publication Number Publication Date
WO2010050593A1 true WO2010050593A1 (en) 2010-05-06

Family

ID=42128952

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/068699 WO2010050593A1 (en) 2008-10-30 2009-10-30 Communication apparatus and display apparatus

Country Status (2)

Country Link
JP (1) JP5159562B2 (en)
WO (1) WO2010050593A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5496130B2 (en) * 2011-02-28 2014-05-21 京セラドキュメントソリューションズ株式会社 Image forming apparatus
US9811292B2 (en) 2013-09-06 2017-11-07 Seiko Epson Corporation Using image difference data to reduce data processing
JP6318771B2 (en) * 2014-03-28 2018-05-09 セイコーエプソン株式会社 Control device, printing system, and control method of control device
JP6264790B2 (en) * 2013-09-06 2018-01-24 セイコーエプソン株式会社 Image processing apparatus, printing system, and image processing method

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000050084A (en) * 1998-07-28 2000-02-18 Canon Inc Picture processor, picture processing method and sotrage medium

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000050084A (en) * 1998-07-28 2000-02-18 Canon Inc Picture processor, picture processing method and sotrage medium

Also Published As

Publication number Publication date
JP5159562B2 (en) 2013-03-06
JP2010109637A (en) 2010-05-13

Similar Documents

Publication Publication Date Title
US8072631B2 (en) Image transmission apparatus, display apparatus and method
JP5678743B2 (en) Information processing apparatus, image transmission program, image transmission method, and image display method
US20160112259A1 (en) Home Cloud with Virtualized Input and Output Roaming over Network
US7983651B2 (en) Communication apparatus, communication method and communication system
WO2010055792A1 (en) Communication device, communication method, and communication program
JP5821610B2 (en) Information processing apparatus, information processing method, and program
WO2021169236A1 (en) Rendering method and apparatus
JPWO2012157014A1 (en) Remote operation communication device and navigation device
JP5159562B2 (en) Communication device, image transmission method, image transmission program, display device, communication method, and communication program
US20170272545A1 (en) Method and system for transmitting remote screen
JP6274067B2 (en) Information processing apparatus and information processing method
US20130002521A1 (en) Screen relay device, screen relay system, and computer -readable storage medium
CN111625211A (en) Screen projection method and device, android device and display device
WO2016016607A1 (en) Managing display data for display
JP4675944B2 (en) Image processing apparatus, image processing method, and image processing program
JP5200979B2 (en) Image transfer apparatus, method and program
JP2005033763A (en) Transmission apparatus, image processing system, image processing method, program, and recording medium
JP5401877B2 (en) Information processing apparatus, information processing system, power saving method, and program
US20150106733A1 (en) Terminal device, thin client system, display method, and recording medium
JP2004187062A (en) Remote control system and its image transferring method
JP6922344B2 (en) Information processing equipment, information processing system, and information processing method
JP2010098622A (en) Computer
JP5278048B2 (en) Image supply apparatus, image supply system, image supply method, and image supply program
JP2010119030A (en) Communication device, communication method, and communication program
CN109003313B (en) Method, device and system for transmitting webpage picture

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09823703

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09823703

Country of ref document: EP

Kind code of ref document: A1