US20040130568A1 - Display system, network interactive display device, terminal, and control program - Google Patents


Info

Publication number
US20040130568A1
US20040130568A1
Authority
US
Grant status
Application
Prior art keywords
display
terminal
screen
window
display device
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10623518
Inventor
Miki Nagano
Norihiro Yoshikuni
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Seiko Epson Corp
Original Assignee
Seiko Epson Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454: Digital output to display device involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06Q: DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES
    • G06Q 10/00: Administration; Management
    • G06Q 10/10: Office automation, e.g. computer aided management of electronic mail or groupware; Time management, e.g. calendars, reminders, meetings or time accounting
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television, VOD [Video On Demand]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network, synchronizing decoder's clock; Client middleware
    • H04N 21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312: Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/4314: Generation of visual interfaces involving specific graphical features for fitting data in a restricted space on the screen, e.g. EPG data in a rectangular grid
    • H04N 21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N 21/4402: Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N 21/60: Network structure or processes specifically adapted for video distribution between server and client or between remote clients; Control signaling specific to video distribution between clients, server and network components; Transmission of management data between server and client
    • H04N 21/63: Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients; Communication protocols; Addressing
    • H04N 21/647: Control signaling between network components and server or clients; Network processes for video distribution between server and clients, e.g. controlling the quality of the video stream, by dropping packets, protecting content from unauthorised alteration within the network, monitoring of network load, bridging between two different networks, e.g. between IP and wireless
    • H04N 21/64784: Data processing by the network
    • H04N 21/64792: Controlling the complexity of the content stream, e.g. by dropping packets
    • H04N 7/00: Television systems
    • H04N 7/14: Systems for two-way working
    • H04N 7/15: Conference systems

Abstract

A technique for presenting the display screens of a plurality of terminals connected to a network on a multi-window screen of a display device. A communication unit receives image data captured and then sent by each terminal, each of which has a screen capture function. A display control unit controls an image synthesizer to synthesize the captured image data into single screen multi-window format data. A multi-window screen is thus presented.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0001]
  • The present invention relates to a technique that presents the screens of a plurality of terminals connected to a network on a single screen of a display device in a multi-window presentation fashion. [0002]
  • 2. Description of the Related Art [0003]
  • FIG. 25 shows the structure of a conventional display system. Connected to a network 210 as shown are a projector 220 serving as a network interactive display device, a notebook computer 230, and desk-top computers 251, 252, 253, and 254. Further connected to the network 210 are tablets 261, 262, 263, and 264, each of which directly inputs drawings to the projector 220 and inserts data into the already projected drawings. A screen capture software program is installed on each of the notebook computer 230 and the desk-top computers 251, 252, 253, and 254. [0004]
  • In the conventional display system thus constructed, the screen capture software program captures the content displayed on the screen of the notebook computer 230, and the captured image data is then sent to the projector 220 through the network 210. The screen presented on the notebook computer 230 is thus projected and displayed by the projector 220. By operating a remote controller supplied with the projector 220, the user switches the projected image from the screen of the notebook computer 230 to the screen of the desk-top computer 251, for example. [0005]
  • The image presented by the projector 220 in the conventional display system is only one of the screen images of the notebook computer 230 and the desk-top computers 251, 252, 253, and 254. To compare the content of one of these screens with another, the user is forced to switch the screens one by one. There is thus a growing need for a function that displays the screens of a plurality of personal computers on a single screen of a display device. [0006]
  • Development efforts have been made to satisfy this need, but no display system satisfying it yet exists. One reason is that a large throughput is required of the projector (the network interactive display device) in such a display system, and the workload on the network also increases. [0007]
  • SUMMARY OF THE INVENTION
  • Accordingly, it is an object of the present invention to provide a network interactive display device and a display control program for presenting display screens of a plurality of terminals, connected to a network, on a screen of a display of the display device in a multi-window presentation fashion. [0008]
  • It is also another object of the present invention to provide a display system, a network interactive display device, a display control software program, a network interactive projector, a network interactive plasma display apparatus, a network interactive liquid-crystal display apparatus, a terminal, and a control software program, each for presenting display screens of a plurality of terminals, connected to a network, on a screen of a display of the network interactive display device in a multi-window presentation fashion without introducing an increase in workload on the network interactive display device and the network. [0009]
  • A display system in one aspect of the present invention includes a plurality of terminals, each terminal having a screen capture function, and sending image data, captured using the screen capture function, over a network, and a network interactive display device, including a display, and receiving the captured image data transmitted from the terminal through the network, and having a multi-window screen presentation function for synthesizing the captured image data into single screen multi-window format data to be displayed on a display screen of the display, wherein, as processes required to present the single screen multi-window format data on the display screen of the display of the network interactive display device, the terminal performs a size conversion process of an image size of the image data captured using the screen capture function and the network interactive display device acquires the captured image data subsequent to the size conversion thereof from the terminal, and synthesizes the received captured image data. [0010]
  • The size conversion process for the multi-window screen presentation function is performed on the terminal. The display system thus presents the screens of the plurality of terminals connected to the network on the display of the network interactive display device in a multi-window format without increasing the workload on the network interactive display device or the network. [0011]
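The division of labor described above can be sketched in a few lines of Python. The sketch below is illustrative only (the function names and the list-of-lists image model are assumptions, not the patent's implementation): the terminal scales its captured screen down to its assigned window size, so the display device only copies pre-scaled windows into a single framebuffer.

```python
# Illustrative sketch only: images are modeled as 2D lists of pixel values.

def scale_nearest(image, new_w, new_h):
    """Terminal side: nearest-neighbour size conversion of a captured screen."""
    old_h, old_w = len(image), len(image[0])
    return [[image[y * old_h // new_h][x * old_w // new_w]
             for x in range(new_w)]
            for y in range(new_h)]

def compose(windows, screen_w, screen_h, background=0):
    """Display-device side: blit already size-converted windows into one
    single-screen framebuffer. No scaling happens here, which is the point
    of offloading the size conversion process to each terminal."""
    frame = [[background] * screen_w for _ in range(screen_h)]
    for x0, y0, img in windows:
        for dy, row in enumerate(img):
            for dx, px in enumerate(row):
                frame[y0 + dy][x0 + dx] = px
    return frame
```

Because `compose` only copies pixels, the device's per-frame cost stays proportional to its own screen size regardless of how large each terminal's native screen is.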
  • In the display system of a preferred embodiment of the present invention, the network interactive display device may divide the display screen of the display into windows of the number equal to the number of terminals to be displayed, may determine a display size of the window assigned to each terminal to be displayed, and may send information of the display size to the terminal, and the terminal may perform the size conversion on the image size of the captured image data to the received display size when the terminal receives the display size. [0012]
  • In accordance with the preferred embodiment of the present invention, the terminal performs the size conversion process based on the display size determined by the network interactive display device. [0013]
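One plausible way for this display-size determining step to work is a near-square grid with one window per terminal. The sketch below is hypothetical (the grid policy and names are assumptions, not the patent's): it returns the rectangle, and hence the display size, assigned to each terminal.

```python
import math

def window_layout(screen_w, screen_h, n_terminals):
    """Divide the display screen into as many windows as there are terminals
    to be displayed, returning an (x, y, w, h) rectangle per terminal.
    The near-square grid policy is an assumption for illustration."""
    cols = math.ceil(math.sqrt(n_terminals))
    rows = math.ceil(n_terminals / cols)
    w, h = screen_w // cols, screen_h // rows
    return [((i % cols) * w, (i // cols) * h, w, h) for i in range(n_terminals)]
```

The device would then send each window's (w, h) to the corresponding terminal, which performs the size conversion before transmitting its capture.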
  • In the display system of a preferred embodiment of the present invention, the terminal may further perform a color conversion process on the captured image data in accordance with a color count of the display of the network interactive display device before sending the captured image data to the network interactive display device, in addition to the size conversion process on the image data captured using the screen capture function. [0014]
  • In the preferred embodiment of the present invention, the workload on the network interactive display device is further reduced because the terminal performs the color conversion process before sending the captured image data to the network interactive display device. [0015]
  • In the display system of a preferred embodiment of the present invention, the network interactive display device may also send the color count of its own display to the terminal when sending the display size to the terminal, while the terminal may perform the color conversion process in response to the color count received from the network interactive display device. [0016]
  • In accordance with the preferred embodiment of the present invention, the terminal performs the color conversion process in response to the color count designated by and received from the network interactive display device. [0017]
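As an example of such a color conversion, reducing 24-bit captures for a display with a 65,536-color count could use a 5-6-5 bit allocation per channel. The sketch below is hypothetical: the patent does not specify a quantization scheme, and only this one target color count is handled here.

```python
def quantize_channel(value, bits):
    """Reduce one 8-bit channel to `bits` bits, then expand back to 0-255."""
    levels = (1 << bits) - 1
    return round(round(value * levels / 255) * 255 / levels)

def to_color_count(pixel, color_count):
    """Terminal side: map a 24-bit (r, g, b) pixel toward the display's
    color count before sending the captured image over the network."""
    r, g, b = pixel
    if color_count >= 1 << 24:  # display shows full 24-bit color: no change
        return pixel
    # assume a 65,536-color display: RGB565-style 5-6-5 bit allocation
    return (quantize_channel(r, 5), quantize_channel(g, 6), quantize_channel(b, 5))
```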
  • In the display system of a preferred embodiment of the present invention, the network interactive display device may include a projector. [0018]
  • In accordance with the preferred embodiment of the present invention, the network interactive display device is the projector. [0019]
  • In the display system of a preferred embodiment of the present invention, the network interactive display device may include a plasma display. [0020]
  • In accordance with the preferred embodiment, a plasma display is used as the network interactive display device. [0021]
  • In the display system of a preferred embodiment of the present invention, the network interactive display device may include a liquid-crystal monitor. [0022]
  • In accordance with the preferred embodiment, a liquid-crystal monitor is used as the network interactive display device. [0023]
  • In the display system of a preferred embodiment of the present invention, the network interactive display device may include an organic EL display. [0024]
  • In accordance with the preferred embodiment, an organic EL (Electroluminescent) display is used as the network interactive display device. [0025]
  • In the display system of a preferred embodiment of the present invention, the terminal may include one of a personal computer and a PDA (Personal Digital Assistant). [0026]
  • In accordance with the preferred embodiment of the present invention, one of a personal computer and a PDA is used as the terminal. [0027]
  • A network interactive display device in another aspect of the present invention is connected to a plurality of terminals through a network, each terminal having a screen capture function, and includes a display, a communication unit for communicating in a two-way fashion with each of the terminals, and a display control unit, wherein the communication unit receives the image data which has been captured by each terminal through the screen capture function thereof, and which has been size converted to a predetermined image size by each terminal, and the display control unit has a multi-window screen presentation function for synthesizing the captured image data received by the communication unit into single screen multi-window format data to be displayed on a display screen of the display. [0028]
  • Since the network interactive display device receives the captured image data in the size converted form thereof from the terminal, and synthesizes the received image data, the workload of processing in the multi-window presentation is reduced in the network interactive display device. [0029]
  • In the network interactive display device of a preferred embodiment of the present invention, the display control unit may have an insertion function for inserting a new window into a current display screen to display the new window. [0030]
  • In accordance with the preferred embodiment, the network interactive display device has the insertion function. [0031]
  • In the network interactive display device of a preferred embodiment of the present invention, the user may select at will a terminal, which provides the captured image data to be displayed on the display screen of the display, from among the plurality of terminals connected to the network interactive display device. [0032]
  • In accordance with the preferred embodiment, the network interactive display device allows the user to select at will a terminal, which is to provide the captured image data to be displayed on the display screen of the display, from among the plurality of terminals connected to the network interactive display device. [0033]
  • In the network interactive display device of a preferred embodiment of the present invention, the terminal that provides the captured image data to be displayed on the display screen of the display may be selected in a two-way communication of the communication unit by one of the network interactive display device and the terminals. [0034]
  • In accordance with the preferred embodiment, the network interactive display device allows the terminal displaying the captured image data on the display screen to be selected in a two-way communication of the communication unit by one of the network interactive display device and the terminal. [0035]
  • In the network interactive display device of a preferred embodiment of the present invention, the display control unit may have an expansion display function for expanding a predetermined window from among a plurality of windows forming a multi-window screen displayed on the display screen of the display. [0036]
  • In accordance with the preferred embodiment, the network interactive display device has the expansion display function for expanding the predetermined window. [0037]
  • In the network interactive display device of a preferred embodiment of the present invention, the display control unit may have a single-window screen selection function for switching the display screen from a predetermined window from among a plurality of windows forming a multi-window screen displayed on the display screen to a single-window full screen. [0038]
  • In accordance with the preferred embodiment, the network interactive display device has the single-window screen selection function for switching the display screen from the predetermined window to the single-window full screen. [0039]
  • In the network interactive display device of a preferred embodiment of the present invention, the display control unit may have an erase function for erasing a predetermined window from among a plurality of windows forming a multi-window screen displayed on the display screen of the display. [0040]
  • In accordance with the preferred embodiment, the network interactive display device has the erase function for erasing the predetermined window. [0041]
  • In the network interactive display device of a preferred embodiment of the present invention, the predetermined window may be selected in response to an operation by the user. [0042]
  • In accordance with preferred embodiments, the network interactive display device allows the user to select the window to be expanded, the window to be switched to the single-window full screen, and the window to be erased. [0043]
  • In the network interactive display device of a preferred embodiment of the present invention, the predetermined window may be selected by one of the network interactive display device and the terminal in a two-way communication of the communication unit thereof. [0044]
  • In the preferred embodiment, the window to be expanded, the window to be switched to the single-window full screen, or the window to be erased is designated by one of the network interactive display device or the terminal. [0045]
  • In the network interactive display device of a preferred embodiment of the present invention, the captured image data received from the terminal may be obtained by designating the whole or a portion of the display screen of the terminal. [0046]
  • In accordance with the preferred embodiment, the network interactive display device displays the captured image data, which is obtained using full-screen capturing or partial-screen capturing. [0047]
  • In the network interactive display device of a preferred embodiment of the present invention, the captured image data received from the terminal may be obtained by detecting and capturing only a change on the display screen of the terminal. [0048]
  • In accordance with the preferred embodiment, the workload on the network is reduced by capturing only the change on the screen of the terminal. The network interactive display device then presents image data that is a combination of the previously received captured data and the changed component. [0049]
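A change-only capture of this kind could be implemented by differencing successive captures on the terminal and transmitting only the bounding box of the change. The function below is a hypothetical sketch, not the patent's method.

```python
def changed_region(prev, curr):
    """Compare the previous and current captures (2D pixel lists) and return
    the bounding box (x, y, w, h) of the changed area, or None when the
    screen is unchanged. Only the cropped region need be sent over the
    network; the display device pastes it over its existing window data."""
    ys = [y for y, (r0, r1) in enumerate(zip(prev, curr)) if r0 != r1]
    if not ys:
        return None
    xs = [x for r0, r1 in zip(prev, curr)
          for x, (a, b) in enumerate(zip(r0, r1)) if a != b]
    return (min(xs), min(ys), max(xs) - min(xs) + 1, max(ys) - min(ys) + 1)
```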
  • In a preferred embodiment of the present invention, the network interactive display device may include a display size determining unit that divides the display screen of the display into windows of the number equal to the number of terminals to be displayed, and determines a display size of the window to which the terminal to be displayed is assigned, and a controller that sends the display size determined by the display size determining unit to the corresponding terminal through the communication unit, wherein the controller receives, through the communication unit, the captured image data, having the converted size equal to the display size of the window assigned to the terminal, from the terminal to which the display size is sent, and controls the display control unit to synthesize the received captured image data into single screen multi-window format data to be displayed on the display screen of the display. [0050]
  • In accordance with the preferred embodiment, the display size converted by the terminal is set to the display size determined by the display size determining unit. [0051]
  • In the network interactive display device of a preferred embodiment of the present invention, an aspect ratio of the window assigned to the terminal to be displayed may be equalized to an aspect ratio of the display screen of the display of the terminal. [0052]
  • In accordance with the preferred embodiment of the present invention, the network interactive display device provides a display screen free of aspect-ratio distortion. [0053]
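The aspect-ratio rule in this embodiment amounts to fitting the terminal's screen into its assigned window at the largest scale that preserves the terminal's own aspect ratio. A minimal sketch (function name assumed for illustration):

```python
def fit_preserving_aspect(src_w, src_h, win_w, win_h):
    """Return the converted (w, h): the largest size that fits inside the
    assigned window while keeping the terminal screen's aspect ratio, so
    the displayed window is not distorted."""
    scale = min(win_w / src_w, win_h / src_h)
    return (int(src_w * scale), int(src_h * scale))
```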
  • In the network interactive display device of a preferred embodiment of the present invention, through the communication unit, the controller may also send a display color count of the display to the terminal when sending the display size to the terminal, may receive the captured image data having the converted size equal to the display size of the window assigned to the terminal and having the display color count converted to the display color count of the display of the network interactive display device, from the terminal to which the display size and the display color count have been sent, and may control the display control unit to synthesize the received captured image data into single screen multi-window format data to be displayed on the display screen of the display. [0054]
  • In accordance with the preferred embodiment, the network interactive display device receives, from the terminal, the captured image data the terminal has color converted in addition to the size conversion process for contraction and synthesizes the received image data. The workload of processing in the multi-window presentation is reduced in the network interactive display device. [0055]
  • In the network interactive display device of a preferred embodiment of the present invention, a communication protocol of the communication unit may include the TCP/IP protocol. [0056]
  • In accordance with the preferred embodiment, the widely used TCP/IP is used as the communication protocol of the communication unit. [0057]
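Since TCP is a byte stream, the two-way communication unit would need some message framing on top of it. A simple length-prefixed scheme (an assumption for illustration; the patent only specifies TCP/IP as the protocol) might look like:

```python
import socket
import struct

def send_frame(sock, payload):
    """Send one message (e.g. size-converted capture data, or a display-size
    notification) as a 4-byte big-endian length followed by the payload."""
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_frame(sock):
    """Receive one length-prefixed message from the peer."""
    header = b""
    while len(header) < 4:
        header += sock.recv(4 - len(header))
    (length,) = struct.unpack(">I", header)
    data = b""
    while len(data) < length:
        data += sock.recv(length - len(data))
    return data
```

Both the display device and the terminal could use the same pair of helpers, which suits the two-way communication the embodiments describe.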
  • In the network interactive display device of a preferred embodiment of the present invention, the network may include one of a LAN, a radio LAN, and a near-field communication radio LAN. [0058]
  • In accordance with the preferred embodiment, one of the LAN (Local-Area Network), the radio LAN, and the near-field communication radio LAN is used as the network. [0059]
  • A network interactive projector in yet another aspect of the present invention includes one of the above-referenced network interactive display devices. [0060]
  • In accordance with the above aspect, the projector has the above-referenced advantages of the network interactive display device. [0061]
  • In the network interactive projector of a preferred embodiment of the present invention, the display may include one of a liquid-crystal light valve, an LCOS light valve, and a DMD (Digital Micromirror Device) (Trademark of Texas Instruments). [0062]
  • In accordance with the preferred embodiment, the projector including one of the liquid-crystal light valve, the LCOS light valve, and the DMD has the above-referenced advantages of the network interactive display device. [0063]
  • A network interactive plasma display apparatus in yet another aspect of the present invention includes one of the above-referenced network interactive display devices, wherein the display includes a plasma display panel. [0064]
  • In accordance with the above aspect of the present invention, the plasma display apparatus provides the above-referenced advantages of the network interactive display device. [0065]
  • A network interactive liquid-crystal display apparatus in yet another aspect of the present invention includes one of the above-referenced network interactive display devices, wherein the display includes a liquid-crystal panel. [0066]
  • In accordance with the above aspect of the present invention, the liquid-crystal display apparatus has the above-referenced advantages of the network interactive display device. [0067]
  • A network interactive organic EL display apparatus in a further aspect of the present invention includes one of the above-referenced network interactive display devices, wherein the display includes an organic EL panel. [0068]
  • In accordance with the above aspect of the present invention, the network interactive organic EL display apparatus provides the above-referenced advantages of the network interactive display device. [0069]
  • The present invention in a further aspect relates to a display control software program of a CPU that constitutes the display control unit of one of the above-referenced network interactive display devices. [0070]
  • In accordance with the above aspect of the present invention, the display control program allows a display device to provide the above-referenced advantages of the network interactive display device. [0071]
  • In yet another aspect of the present invention, a terminal, connected to one of the above-referenced network interactive display devices includes a display, a communication unit that communicates in a two-way fashion with the network interactive display device, a screen capture processor that captures the content displayed on the display screen of the display, an image converter which converts the image data captured by the screen capture processor to image data having a predetermined image size, and a controller that sends the captured image data, size converted by the image converter, from the communication unit to the network interactive display device, wherein the terminal generates the captured image data that is to be displayed on one of the multi windows displayed on the display screen of the network interactive display device. [0072]
  • When the captured image data to be displayed on one of the windows of the screen of the network interactive display device is generated, a part of the process required for the multi-window presentation, i.e., the size conversion process is performed by the terminal. The terminal thus contributes to a reduction in the workload on the network interactive display device. [0073]
  • In the terminal of a preferred embodiment of the present invention, the display screen of the display of the network interactive display device may be divided into windows of the number equal to the number of terminals to be displayed, a display size of the window assigned to each terminal to be displayed may be determined, and the image converter may convert the image data captured by the screen capture processor to image data having the display size assigned to its own terminal. [0074]
  • In accordance with the preferred embodiment, the size conversion process is carried out based on the display size determined by the network interactive display device. [0075]
  • In the terminal of a preferred embodiment of the present invention, the image converter may perform a color conversion on the captured image data to match the display color count of the display of the network interactive display device in addition to the size conversion process, and the controller may send the captured image data, which has been subjected to the size conversion process and the color conversion process, from the communication unit to the network interactive display device. [0076]
  • In accordance with the preferred embodiment, the image converter performs the color conversion on the captured image data in addition to the size conversion process, thereby further reducing the workload on the network interactive display device. [0077]
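The terminal-side conversion steps described above (size conversion to the assigned window size, followed by color conversion to the display's color count) can be sketched as follows. This is a minimal illustration only: the function names, the row-major pixel-list representation, and the nearest-neighbour and bit-truncation methods are assumptions of this sketch, not details taken from the specification.

```python
def resize_nearest(pixels, src_w, src_h, dst_w, dst_h):
    """Nearest-neighbour size conversion of a row-major pixel list."""
    out = []
    for y in range(dst_h):
        sy = y * src_h // dst_h
        for x in range(dst_w):
            sx = x * src_w // dst_w
            out.append(pixels[sy * src_w + sx])
    return out

def quantize_color(pixels, bits_per_channel):
    """Reduce each RGB channel to the display's bit depth (illustrative
    stand-in for the color conversion to the display color count)."""
    shift = 8 - bits_per_channel
    return [tuple((c >> shift) << shift for c in px) for px in pixels]

def convert_for_display(pixels, src_size, window_size, bits_per_channel=8):
    """Apply the size conversion, then the color conversion, before the
    captured image data is sent to the display device."""
    sw, sh = src_size
    dw, dh = window_size
    resized = resize_nearest(pixels, sw, sh, dw, dh)
    return quantize_color(resized, bits_per_channel)
```

Performing both conversions on the terminal, as the embodiment describes, means the display device receives data it can composite directly, which is the source of the workload reduction claimed above.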
  • The present invention in a further aspect relates to a control software program of a CPU that constitutes each processor of one of the above-referenced terminals. [0078]
  • In accordance with the above aspect, the control program provides a terminal with the above-referenced advantages of the terminal. [0079]
  • The present invention in a further aspect relates to a network interactive display device connected to each of a plurality of terminals through a network, each terminal having a screen capture function, and includes a display, a communication unit for communicating in a two-way fashion with each of the terminals, and a display control unit, wherein the display control unit has a multi-window screen presentation function for synthesizing the captured image data, captured by each terminal through the screen capture function and received by the communication unit, into single screen multi-window format data to be displayed on a display screen of the display. [0080]
  • The network interactive display device presents the screens of the plurality of terminals connected to the network on the display screen of a display of the network interactive display device in a multi-window format. [0081]
  • In the network interactive display device of a preferred embodiment of the present invention, the display control unit may have an insertion function for inserting a new window into a current display screen to display the new window. [0082]
  • In accordance with the preferred embodiment, the network interactive display device has the insertion function. [0083]
  • In the network interactive display device of a preferred embodiment of the present invention, the user may select at will a terminal, which provides the captured image data to be displayed on the display screen of the display, from among the plurality of terminals connected to the network interactive display device. [0084]
  • In accordance with the preferred embodiment, the network interactive display device allows the user to select at will a terminal, which is to provide the captured image data to be displayed on the display screen of the display, from among the plurality of terminals connected to the network interactive display device. [0085]
  • In the network interactive display device of a preferred embodiment of the present invention, the terminal that provides the captured image data to be displayed on the display screen of the display may be selected in a two-way communication of the communication unit by one of the network interactive display device and the terminal. [0086]
  • In accordance with the preferred embodiment, the network interactive display device allows the terminal providing the captured image data on the display screen to be selected in a two-way communication of the communication unit by one of the network interactive display device and the terminal. [0087]
  • In the network interactive display device of a preferred embodiment of the present invention, the display control unit may have an expansion display function for expanding a predetermined window from among a plurality of windows forming a multi-window screen displayed on the display screen of the display. [0088]
  • In accordance with the preferred embodiment, the network interactive display device has the expansion display function for expanding the predetermined window. [0089]
  • In the network interactive display device of a preferred embodiment of the present invention, the display control unit may have a single-window screen selection function for switching the display screen from a predetermined window from among a plurality of windows forming a multi-window screen displayed on the display screen of the display to a single-window full screen. [0090]
  • In accordance with the preferred embodiment, the network interactive display device has the single-window screen selection function for switching the display screen from the predetermined window to the single-window full screen. [0091]
  • In the network interactive display device of a preferred embodiment of the present invention, the display control unit may have an erase function for erasing a predetermined window from among a plurality of windows forming a multi-window screen displayed on the display screen of the display. [0092]
  • In accordance with the preferred embodiment, the network interactive display device has the erase function for erasing the predetermined window. [0093]
  • In the network interactive display device of a preferred embodiment of the present invention, the predetermined window may be selected in response to an operation by the user. [0094]
  • In accordance with preferred embodiments, the network interactive display device allows the user to select the window to be expanded, the window to be switched to the single-window full screen, and the window to be erased. [0095]
  • In the network interactive display device of a preferred embodiment of the present invention, the predetermined window may be selected by one of the network interactive display device and the terminal in a two-way communication of the communication unit thereof. [0096]
  • In the preferred embodiment, any of the window to be expanded, the window to be switched to the full-screen mode, or the window to be erased is designated by one of the network interactive display device and the terminal. [0097]
  • In the network interactive display device of a preferred embodiment of the present invention, the captured image data received from the terminal may be obtained by designating the whole or a portion of the display screen of the terminal. [0098]
  • In accordance with the preferred embodiment, the network interactive display device displays the captured image data, which is obtained using full-screen capturing or partial-screen capturing. [0099]
  • In the network interactive display device of a preferred embodiment of the present invention, the captured image data received from the terminal may be obtained by detecting and capturing only a change on the display screen of the terminal. [0100]
  • In accordance with the preferred embodiment, the workload on the network is reduced by capturing only the change on the screen of the terminal. The network interactive display device thus presents image data which is a combination of existing captured data and the changed component of data. [0101]
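The change-only capture described above can be sketched as a frame comparison that extracts the changed region and its origin. Note this is a simplified stand-in: the specification manages up to two difference areas ("difference captured image size 1" and "2"), whereas this sketch finds a single bounding box of all changed pixels; the names and pixel representation are illustrative.

```python
def diff_capture(prev, curr, width, height):
    """Compare two row-major frames of equal size and return
    (origin, size, region_pixels) for the changed area, or None if the
    frames are identical. Only the changed region would be transmitted."""
    xs, ys = [], []
    for y in range(height):
        for x in range(width):
            if prev[y * width + x] != curr[y * width + x]:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # nothing changed; nothing to send
    x0, y0 = min(xs), min(ys)
    w = max(xs) - x0 + 1
    h = max(ys) - y0 + 1
    region = [curr[(y0 + dy) * width + (x0 + dx)]
              for dy in range(h) for dx in range(w)]
    return (x0, y0), (w, h), region
```

The display device would then overwrite only that region of its stored copy, combining the existing captured data with the changed component as paragraph [0101] describes.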
  • In the network interactive display device of a preferred embodiment of the present invention, the display control unit may include a window area information generator which divides the display screen of the display into windows of the number equal to the number of terminals to be displayed, and generates window area information containing a display size of the window to which the terminal to be displayed is assigned, and information identifying a display position of the window, an image synthesizer which synthesizes the captured image data from the terminals into single screen multi-window format data in accordance with the window area information generated by the window area information generator, thereby generating synthesized image data, and an image processor which processes the synthesized image data generated by the image synthesizer, thereby generating display image data and outputting the display image data to the display. [0102]
  • In the network interactive display device of a preferred embodiment of the present invention, the image synthesizer may synthesize the captured image data by contracting or expanding the captured image data from each terminal with an aspect ratio of the image size of the captured image data maintained. [0103]
  • In accordance with the preferred embodiment, the network interactive display device provides a display screen free from discordance. [0104]
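The aspect-ratio-preserving contraction or expansion described in paragraph [0103] amounts to scaling the captured image to the largest size that fits the assigned window without distortion. A minimal sketch (the function name and integer rounding are assumptions of this illustration):

```python
def fit_within(src_w, src_h, box_w, box_h):
    """Scale (src_w, src_h) to the largest size that fits inside
    (box_w, box_h) while keeping the original aspect ratio."""
    scale = min(box_w / src_w, box_h / src_h)
    return max(1, int(src_w * scale)), max(1, int(src_h * scale))
```

Because the smaller of the two scale factors is used, neither dimension overflows its window, which is why the resulting multi-window screen is "free from discordance".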
  • In the network interactive display device of a preferred embodiment of the present invention, a communication protocol of the communication unit may include the TCP/IP protocol. [0105]
  • In accordance with the preferred embodiment, the widely used TCP/IP is used as the communication protocol of the communication unit. [0106]
  • In the network interactive display device of a preferred embodiment of the present invention, the network may include one of a LAN, a radio LAN, and a near-field communication radio LAN. [0107]
  • In accordance with the preferred embodiment, one of the LAN, the radio LAN, and the near-field communication radio LAN is used as the network. [0108]
  • A network interactive projector in a further aspect of the present invention includes one of the above-referenced network interactive display devices. [0109]
  • In accordance with the above aspect of the present invention, the projector has the above-referenced advantages of the network interactive display device. [0110]
  • In the network interactive projector of a preferred embodiment of the present invention, the display may include one of a liquid-crystal light valve, an LCOS light valve, and a DMD. [0111]
  • In accordance with the preferred embodiment, the projector including one of the liquid-crystal light valve, the LCOS light valve, and the DMD has the above-referenced advantages of the network interactive display device. [0112]
  • A network interactive plasma display apparatus in a further aspect of the present invention includes one of the above-referenced network interactive display devices, wherein the display includes a plasma display panel. [0113]
  • In accordance with the above aspect, the plasma display apparatus has the above-referenced advantages of the network interactive display device. [0114]
  • A network interactive liquid-crystal display apparatus in a further aspect of the present invention includes one of the above-referenced network interactive display devices, wherein the display includes a liquid-crystal panel. [0115]
  • In accordance with the above aspect, the liquid-crystal display apparatus has the above-referenced advantages of the network interactive display device. [0116]
  • A network interactive organic EL display apparatus in a further aspect of the present invention includes one of the above-referenced network interactive display devices, wherein the display includes an organic EL panel. [0117]
  • In accordance with the above aspect, the network interactive organic EL display apparatus provides the above-referenced advantages of the network interactive display device. [0118]
  • The present invention in a further aspect relates to a display control software program of a CPU that constitutes the display control unit of one of the above-referenced network interactive display devices. [0119]
  • In accordance with the above aspect, the display control program provides a display device with the above-referenced advantages of one of the above network interactive display devices.[0120]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates a network of a display system including a network interactive display device in accordance with preferred embodiments of the present invention; [0121]
  • FIG. 2 is a block diagram illustrating the structure of a terminal in accordance with a first preferred embodiment of the present invention; [0122]
  • FIG. 3 is a block diagram illustrating the structure of a network interactive display device of the first preferred embodiment; [0123]
  • FIG. 4 is a functional diagram illustrating the function of the network interactive display device of the first preferred embodiment; [0124]
  • FIG. 5 illustrates items of terminal information managed in a display status management file; [0125]
  • FIG. 6 is a flow diagram illustrating the operation of the first preferred embodiment; [0126]
  • FIG. 7 is a continuation of the flow diagram of FIG. 6; [0127]
  • FIG. 8 illustrates the configuration of the display system in which display screens of four terminals 1a-1d are presented on a display screen of a display device; [0128]
  • FIG. 9 is a flow diagram illustrating the flow of a window area information generation process; [0129]
  • FIG. 10 illustrates a specific structure of the display status management file; [0130]
  • FIG. 11A illustrates a specific structure of a table held in a tentative window area setting file, and FIG. 11B illustrates a tentative window area based on the table of FIG. 11A; [0131]
  • FIG. 12 illustrates a true window area size and a true origin; [0132]
  • FIG. 13 illustrates one example of a window area information file; [0133]
  • FIG. 14 is a flow diagram of an operation of the terminal which has received a capture start command and a display status management file from the display device of the first preferred embodiment; [0134]
  • FIG. 15 diagrammatically illustrates the display system to explain an expansion display function; [0135]
  • FIG. 16A illustrates a tentative window area setting table which is referenced when a priority order is updated, and FIG. 16B illustrates a tentative window area based on the table of FIG. 16A; [0136]
  • FIG. 17 illustrates a true window area with the priority order modified; [0137]
  • FIG. 18 illustrates a window area information file that is produced when the priority order is modified; [0138]
  • FIG. 19 diagrammatically illustrates a single-window screen presentation function; [0139]
  • FIG. 20 diagrammatically illustrates an insertion function; [0140]
  • FIG. 21 diagrammatically illustrates a window erase function; [0141]
  • FIG. 22 is a flow diagram illustrating an operation of the terminal which has received a difference capture start command and a display status management file from the display device; [0142]
  • FIG. 23 illustrates a screen comparison process wherein a mouse pointer is moved; [0143]
  • FIG. 24 illustrates the screen comparison process of FIG. 23; [0144]
  • FIG. 25 illustrates a conventional display system; [0145]
  • FIG. 26 is a block diagram illustrating a structure of the terminal in accordance with a second preferred embodiment of the present invention; [0146]
  • FIG. 27 is a block diagram illustrating a structure of the network interactive display device of the second preferred embodiment; [0147]
  • FIG. 28 is a functional diagram of an operation of the network interactive display device of the second preferred embodiment of the present invention; [0148]
  • FIG. 29 is a flow diagram illustrating an operation of the display system of the second preferred embodiment of the present invention; and [0149]
  • FIG. 30 is a flow diagram illustrating the operation of the terminal which has received a capture start command and a display status management file from the display device of the second preferred embodiment of the present invention.[0150]
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • First Embodiment [0151]
  • FIG. 1 illustrates a network of a display system 100 including a network interactive display device 2 in accordance with preferred embodiments of the present invention. [0152]
  • The display system 100 includes a plurality of terminals (only four terminals 1a, 1b, 1c, and 1d are shown in FIG. 1), and the network interactive display device 2 (a projector here) having a multi-window screen presentation function as one of the major functions of the present invention. The plurality of terminals 1 are respectively connected to the network interactive display device 2 (hereinafter simply referred to as the display device 2) through a network 3 in a two-way communication based on the TCP/IP protocol. A unique name (hereinafter referred to as a terminal name) is provided beforehand to each terminal 1. The network 3 may be any of a LAN (Local Area Network), a radio LAN, and a near-field communication radio LAN such as Bluetooth (trade name of Bluetooth SIG Inc., U.S.A.). [0153]
  • The display system 100 allows screens presented on the plurality of terminals 1 to be concurrently presented on a multi-window display screen of the display device 2. Such a system 100 is useful in a conference or a presentation. The terminal 1 and the display device 2 will now be discussed in detail. [0154]
  • FIG. 2 is a block diagram illustrating the structure of the terminal 1 in accordance with a first preferred embodiment of the present invention. [0155]
  • The terminal 1 may be a personal computer or a PDA (Personal Digital Assistant). The terminal 1 includes a display 11 for presenting a diversity of information such as materials for presentation, a video memory 12 for storing the content to be presented on the display 11, an input section 13 including a tablet, a mouse, or a keyboard, a user interface 14 for detecting an operational input entered from the input section 13 and outputting the operational input to a controller (CPU) 16, a storage 15 for storing application software programs (such as a control program) for performing the processes of the present invention and a variety of pieces of data, the controller 16, and a communication unit 17. [0156]
  • The control program stored in the storage 15 is used to perform a terminal control function to perform a multi-window screen presentation function on the network interactive display device 2, a screen capture function to capture the whole or a part of the screen of the display 11, an image conversion function to convert captured image data acquired using the screen capture function into data in a format of a display 21 of the network interactive display device 2, and a function to detect a change on the screen of the display 11. The application software programs and the CPU constitute a data management processor 18, a screen capture processor 19, an image converter 19A, and a screen comparison processor 20. [0157]
  • The controller 16 receives a variety of requests, including a connection request, a display request, an expansion display request, a request to switch to a single-window full screen, and an erase request through the user interface 14 or the communication unit 17, and performs processes responsive to each request. Under the control of the controller 16, the image converter 19A converts the image data acquired by the screen capture processor 19, and the communication unit 17 sends the converted captured image data to the network interactive display device 2. [0158]
  • The communication unit 17 carries out a two-way communication with the network interactive display device 2. The communication protocol used here is TCP/IP. The communication unit 17 has a protocol processing function for ARP, ICMP, IP, TCP, UDP, etc. required for the TCP/IP connection. This protocol processing function is carried out under the control of an OS. [0159]
  • The conversion processes performed by the image converter 19A are required when the display device 2 to be discussed later performs a multi-window screen presentation. Specifically, the conversion processes include a size conversion process to convert the captured image data into data in a display size of a window assigned to the terminal 1 itself, and a color conversion process to convert the captured image data into data having a display color count of the display 21 of the display device 2. The terminal 1 performs the conversion processes, required to present a multi-window screen on the display device 2, on the captured image data acquired by the screen capture processor 19, and then sends the converted captured image data to the display device 2. [0160]
  • FIG. 3 is a block diagram illustrating the structure of the display device 2 of the first preferred embodiment. [0161]
  • The display device 2 includes the display 21, a display control unit 22 which has a multi-window screen presentation function, an expansion display function, a function to switch to the single-window full screen, an insertion function, and an erase function, and controls the display screen to be presented on the display 21, an input section 23 including a remote controller, a mouse, or a keyboard, a user interface 24 for detecting an operational input from the input section 23 and for outputting the operational input to a controller 27 to be discussed later, a program storage 25 for storing the display control program to perform the multi-window screen presentation function of the present invention, a data storage 26 for storing a variety of files and data required to carry out the display control program, a controller (CPU) 27 for generally controlling the display device 2, and a communication unit 28 for performing a two-way communication with each terminal 1. [0162]
  • The communication unit 28 carries out a two-way communication with the terminal 1. The communication protocol used here is TCP/IP. The communication unit 28 has a protocol processing function for ARP, ICMP, IP, TCP, UDP, etc. required for the TCP/IP connection. [0163]
  • The display device 2 may be a plasma display or a liquid-crystal display instead of the projector shown in FIG. 1. FIG. 3 shows only the major portions related to the context of the present invention, and does not show the other elements uniquely relating to the projector, the plasma display, and the liquid-crystal display because they are not closely related to the context of the present invention. If the elements shown in FIG. 3 are added to an existing projector, an existing plasma display, or an existing liquid-crystal display, they respectively become a network interactive projector, a network interactive plasma display, and a network interactive liquid-crystal display. The display 21 differs depending on the type of the display device 2. Specifically, in the projector, the display 21 is one of a liquid-crystal light valve, an LCOS light valve, and a DMD (Digital Micromirror Device) (trademark of Texas Instruments), and the display screen of the display 21 becomes a projecting screen. The display 21 is a plasma display panel in a plasma display device, a liquid-crystal panel in a liquid-crystal display device, or an organic EL (electroluminescent) panel in an organic EL display device. [0164]
  • Referring to FIG. 4, the variety of files stored in the data storage 26 will now be discussed. [0165]
  • The data storage 26 stores a display specification management file 30, a permitted connection management file 31, a permitted display management file 32, a connection status management file 33, a display status management file 34, a tentative window area setting file 35, and a window area information file 36. The data storage 26 further includes a captured image data memory 37 for storing the captured image data sent from each terminal 1. [0166]
  • The display specification management file 30 registers a screen size representing the number of pixels in the vertical and horizontal directions of the display screen of the display 21, and color count information representing a display color count of the display 21. In this preferred embodiment, the screen size is 1280×1024 (SXGA), and the color count is 16,777,216 colors. [0167]
  • The permitted connection management file 31 registers a terminal name of a terminal 1 which is permitted for connection. The permitted display management file 32 registers a terminal name of a terminal 1 which is permitted for screen display. [0168]
  • The connection status management file 33 registers a terminal name of a terminal 1 which is currently connected to the display device 2. [0169]
  • The display status management file 34 manages a display status of the current display 21. The display status management file 34 manages, in a table form, terminal information relating to the terminal 1 that is a source of the captured image data currently presented on the display screen of the display 21. The display status management file 34 is updated each time the display screen of the display 21 is modified. For example, if the display screen is switched from a four-window screen to a three-window screen, the terminal information of the terminal 1 corresponding to the erased window is deleted. If the display screen is switched from a four-window screen to a five-window screen, the terminal information of the terminal 1 corresponding to the added window is newly registered. [0170]
  • FIG. 5 illustrates the items of the terminal information managed in the display status management file 34. [0171]
  • The display status management file 34 contains, as items thereof, a "terminal name", an "IP address", a "screen size", "color count information", "priority", a "capture area management flag", a "difference capture management flag", a "captured image size", a "difference captured image size 1", a "difference captured image size 2", a "difference capture origin 1", and a "difference capture origin 2". [0172]
  • The "terminal name" is a name provided beforehand to the terminal 1. The "screen size" is the number of pixels in the vertical and horizontal directions of the display screen of the display 11. For example, an SXGA terminal has 1280×1024 pixels, and an XGA terminal has 1024×768 pixels. The "color count information" represents the number of display colors of the display 11, and may be 256 colors or 16,777,216 colors, for example. The terminal name, the IP address, the screen size, and the color count information are the items that must be stored in the display status management file 34 during a registration. The other items are set (updated) by the user as necessary. [0173]
  • The "priority" determines the display size of the window assigned to the terminal 1 that is identified by the terminal name. The priority takes "highest", "high", or "none". As will be discussed in detail, a higher priority order results in a larger display size. The "capture area management flag" manages whether the screen of the terminal 1 identified by the terminal name is captured in a full-screen capture mode or a partial-screen capture mode. The capture area management flag is "0" in the full-screen capture mode, which is the standard capture mode, and "1" in the partial-screen capture mode. [0174]
  • The "difference capture management flag" manages whether the screen of the terminal 1 identified by the terminal name is captured in a normal capture mode or a change capture mode (hereinafter referred to as a difference capture mode). The difference capture management flag is "0" in the normal capture mode, or "1" in the difference capture mode. [0175]
  • The “captured image size” is the size of the captured image data (the number of pixels in the vertical and horizontal directions) when the capture area management flag is “1”, i.e., in the partial-screen capture mode. [0176]
  • The “difference captured image size 1” and the “difference captured image size 2” represent the sizes of two different areas acquired in the difference capture when the difference capture management flag is “1”. The “difference capture origin 1”, and the “difference capture origin 2” are the origins of the two different areas acquired in the difference capture, and are the absolute coordinates within an area defined by the captured image size. [0177]
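The terminal-information items listed above map naturally onto one record per terminal. The following sketch mirrors those items as a Python dataclass; the field names, types, and defaults are illustrative choices of this sketch, not a format defined by the specification.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TerminalRecord:
    """One row of the display status management table ([0172]-[0177])."""
    terminal_name: str                    # mandatory at registration
    ip_address: str                       # mandatory at registration
    screen_size: Tuple[int, int]          # mandatory: pixels, e.g. (1280, 1024)
    color_count: int                      # mandatory: e.g. 256 or 16_777_216
    priority: str = "none"                # "highest", "high", or "none"
    capture_area_flag: int = 0            # 0 = full screen, 1 = partial screen
    difference_capture_flag: int = 0      # 0 = normal, 1 = difference capture
    captured_image_size: Optional[Tuple[int, int]] = None   # when flag is 1
    diff_image_size_1: Optional[Tuple[int, int]] = None
    diff_image_size_2: Optional[Tuple[int, int]] = None
    diff_origin_1: Optional[Tuple[int, int]] = None          # absolute coords
    diff_origin_2: Optional[Tuple[int, int]] = None          # absolute coords
```

The first four fields correspond to the items that must be stored at registration; the remaining fields default to the standard modes and are updated as necessary, matching the description above.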
  • The tentative window area setting file 35 is a file in which information identifying the tentative window area assigned to each terminal 1 is set beforehand. The tentative window area setting file 35 contains a plurality of tables, each table being prepared for the terminals. A table has the structure shown in FIG. 11A and FIG. 16A, and will be discussed in more detail later. The window area information file 36 will also be discussed later. [0178]
  • When a predetermined operation is performed on the input section 23, the display 21 displays the contents of the files 30, 31, 32, 33, 34, 35, and 36. The user can thus check and modify the data on the display screen at will. [0179]
  • Returning to FIG. 4, the display control unit 22 includes a window area information generator 41 as a display size determining unit, an image synthesizer 42, and an image processor 43. The controller 27 receives a variety of requests, such as a connection request, a display request, an insertion display request, an erase request, etc., through the user interface 24 or the communication unit 28. In response to these requests, under the control of the controller 27, the processors 41, 42, and 43 respectively perform the required processes while accessing the necessary files in the data storage 26. The controller 27 thus controls the display 21. A display control program, stored in the program storage 25, for providing a multi-window screen presentation function and the controller (CPU) 27 constitute the display control unit 22. [0180]
  • From the display status management file 34, the window area information generator 41 learns the number of terminals 1 to be presented, and the priority order and the screen size of each terminal 1. The window area information generator 41 splits the display screen of the display 21 in accordance with the number of terminals 1 to be presented, and the priority order and the screen size of each terminal 1. The window area information generator 41 generates the window area information containing the display size (hereinafter referred to as a window area size) of the window on the display 21 assigned to each terminal 1 to be displayed, and information identifying the display position of the window (the absolute coordinates of the top left corner of the window with respect to the display screen, hereinafter also referred to as an origin). The information is stored in the data storage 26 as the window area information file 36. [0181]
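As a rough illustration of splitting the display screen among the terminals to be presented: the specification derives window sizes and origins from priority-ordered tentative window area tables (FIGS. 11A and 16A), but a simplified equal-grid split conveys the shape of the output, i.e. a window area size and origin per terminal. All names and the grid heuristic here are assumptions of this sketch.

```python
import math

def split_grid(n):
    """Choose a rows x cols grid for n windows (simplified heuristic;
    not the patent's priority-based table lookup)."""
    cols = math.ceil(math.sqrt(n))
    rows = math.ceil(n / cols)
    return rows, cols

def window_areas(screen_w, screen_h, terminal_names):
    """Assign each terminal an equal grid cell: name -> (origin, size),
    with the origin given as absolute top-left coordinates."""
    rows, cols = split_grid(len(terminal_names))
    cell_w, cell_h = screen_w // cols, screen_h // rows
    areas = {}
    for i, name in enumerate(terminal_names):
        r, c = divmod(i, cols)
        areas[name] = ((c * cell_w, r * cell_h), (cell_w, cell_h))
    return areas
```

In the actual device, each entry of this mapping would correspond to a record in the window area information file 36.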
  • The controller 27 sends, from the communication unit 28 to each terminal 1 to be displayed, the window area size of the window assigned to that terminal 1 in the window area information file 36 generated by the window area information generator 41, together with the display color count of the display 21 held in the display specification management file 30. The captured image data memory 37 then stores the captured image data returned from each terminal 1 that has received these pieces of information, namely, the captured image data that has been subjected to the size conversion process and the color conversion process in accordance with the received window area size and display color count. The window area size referred to here is identical to the true window area size in the discussion that follows. [0182]
  • The image synthesizer 42 synthesizes the size-converted and color-converted captured image data from each terminal 1 stored in the captured image data memory 37 in accordance with the window area information file 36 generated by the window area information generator 41, thereby generating synthesized image data. [0183]
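The synthesis step above composites each terminal's already-converted captured image into its assigned window area on a single frame. A minimal sketch, assuming row-major pixel lists and a mapping from terminal name to (origin, size, pixels); these representations are illustrative, not from the specification.

```python
def synthesize(windows, screen_w, screen_h, background=(0, 0, 0)):
    """Compose per-terminal captured images into one row-major frame
    according to the window area information (origin and size per window)."""
    frame = [background] * (screen_w * screen_h)
    for (ox, oy), (w, h), pixels in windows.values():
        for dy in range(h):
            for dx in range(w):
                frame[(oy + dy) * screen_w + (ox + dx)] = pixels[dy * w + dx]
    return frame
```

Because each terminal has already performed the size and color conversions, this step reduces to a plain copy into the assigned region, which is the workload reduction the first embodiment emphasizes.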
  • The image processor 43 performs a scanning frequency conversion process on a variety of pieces of image data, such as the synthesized image data generated by the image synthesizer 42 and the display status management file 34 of the data storage 26 which is referenced using an OSD (on-screen display) function, thereby generating display image data and outputting the display image data to the display 21. The image processor 43 includes a scan converter, for example. [0184]
  • The operation of the first preferred embodiment of the present invention will now be discussed. FIGS. 6 and 7 are flow diagrams illustrating the operation of the first preferred embodiment. [0185]
  • A predetermined operational input is entered in the input section [0186] 23 in the display device 2 in a preliminary step for multi-window screen presentation. Upon detecting the operational input through the user interface 24, the controller 27 broadcasts a request to return a terminal name and an IP address together with the IP address of the display device 2 through the communication unit 28 and the network 3. When each terminal 1 receives the broadcast request to return the terminal name and the IP address, the terminal 1 returns its own terminal name and IP address to the display device 2.
  • The display device [0187] 2 receives a reply (the terminal name and the IP address) from each terminal 1 through the communication unit 28, and determines whether each terminal is a connection permitted terminal. Specifically, the display device 2 determines whether the returned terminal name agrees with a terminal name registered in the permitted connection management file 31. If it is determined that the returned terminal name agrees with the registered terminal name, the display device 2 handles the terminal 1 as a connection permitted terminal.
  • The terminal names and the IP addresses of the terminals [0188] 1 determined as connection permitted terminals are successively registered in the connection status management file 33. The connection status management file 33 allows the display device 2 to learn how many terminals 1 are currently connected. Since the determination of whether the connection is permitted or not is based on the terminal name, the system works even if the IP address, provided to the terminal 1 using the DHCP, becomes different each time connection is made.
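The name-based admission check described above can be sketched as follows. This is a minimal illustration: the set and dictionary below are stand-ins for the permitted connection management file 31 and the connection status management file 33, and all names and structures are our assumptions, not part of the embodiment.

```python
# Hypothetical sketch of the terminal-name-based admission check.
# PERMITTED_TERMINAL_NAMES stands in for the permitted connection
# management file 31; connection_status for the connection status
# management file 33. Field names are illustrative.

PERMITTED_TERMINAL_NAMES = {"PC-1", "PC-2", "PC-3", "PDA-1"}

connection_status = {}  # terminal name -> current IP address


def register_reply(terminal_name, ip_address):
    """Admit a terminal by its name, not by its IP address, so that a
    DHCP-assigned address may differ on each connection without
    affecting the permission decision."""
    if terminal_name not in PERMITTED_TERMINAL_NAMES:
        return False
    connection_status[terminal_name] = ip_address
    return True


register_reply("PC-1", "192.168.0.10")   # admitted: name is registered
register_reply("PC-9", "192.168.0.99")   # rejected: name not registered
```

Because the key is the terminal name, the same terminal remains a connection permitted terminal even when its DHCP-provided address changes between sessions.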
  • The display device [0189] 2 waits on standby for any request after the above preliminary step is complete. For example, the display screens of the four terminals 1 a-1 d, out of the terminals 1 operated by conference participants, are presented on a multi-window display screen 50 of the display device 2. As for the resolutions thereof, the terminal 1 a has an SXGA resolution (1280×1024 pixels), the terminal 1 b has an SVGA resolution (800×600 pixels), the terminal 1 c has an XGA resolution (1024×768 pixels), and the terminal 1 d has a resolution of 480×640 pixels.
  • Multi-Window Screen Presentation Function [0190]
  • FIG. 8 illustrates the configuration of the display system in which display screens of four terminals [0191] 1 a-1 d are presented on the display screen of the display device 2.
  • The user operates the remote controller in the input section [0192] 23 to enter an operational input to display the screens of the terminals 1 a-1 d. Through the user interface 24, the controller 27 is notified of the input information, namely, a request to display the screens of the terminals 1 a-1 d together with identification information of the terminals 1 a-1 d (step S1). When the request is placed, the priority order, the partial capture, and the difference capture may also be designated. Here, no particular designation is performed.
  • Upon receiving the display request, the controller [0193] 27 in the display device 2 performs processes in steps S3-S9 for each of the terminals 1 a-1 d to be displayed (step S2). More specifically, the controller 27 references the permitted connection management file 31 and the permitted display management file 32 according to the terminal name indicated by the identification information contained in the display request, thereby determining whether or not each terminal 1 is permitted for connection and whether or not each terminal 1 is permitted for display (step S3). If it is determined that each terminal 1 is permitted for both connection and display (step S4), the controller 27 requests, through the communication unit 28, each terminal 1 to send the terminal information (the terminal name, the IP address, the screen size, and the color count information) (step S5). The controller 27 receives the terminal information which has been sent in response to the request (step S6), and registers the terminal information in the display status management file 34 (step S7). If the priority order, the partial capture, and the difference capture are designated during the placement of the display request, the priority order, the capture area management flag, and the difference capture management flag are also registered in the registration in step S7.
  • The controller [0194] 27 sends the display status management file 34 and a screen capture start command to each terminal 1 which is permitted for connection and display (step S8). If a terminal 1 is not permitted for connection and display, the controller 27 sends a notification to that effect to that terminal 1 (step S9).
  • If all four terminals [0195] 1 a-1 d are permitted for connection and display, the terminal information from the terminals 1 a-1 d is registered in the display status management file 34 in the processes in steps S2-S9. At the same time, the screen capture start command is sent together with the display status management file 34 to each of the terminals 1 a-1 d through the communication unit 28.
  • Subsequent to the above processes, the controller [0196] 27 notifies the window area information generator 41 in the display control unit 22 of a window area split request. The display device 2 then enters a window area information generation process (step S10). The operation of the terminal 1 having received the screen capture start command will be discussed later. Discussed first is the window area information generation process performed by the window area information generator 41 in response to the window area split request.
  • FIG. 9 is a flow diagram illustrating the flow of a window area information generation process. The operation of the window area information generator [0197] 41 is specifically discussed on the assumption that the display status management file 34 is constructed as shown in FIG. 10. As shown in FIG. 10, terminal names PC-1, PC-2, PC-3, and PDA-1 correspond to the terminals 1 a, 1 b, 1 c and 1 d, respectively.
  • Upon receiving the window area split request from the controller [0198] 27, the window area information generator 41 learns the number of the terminals 1 to be displayed (here, four terminals 1) referencing the display status management file 34. The window area information generator 41 also learns the priority order of each of the terminals 1 a-1 d (step S21). The window area information generator 41 references the tentative window area setting file 35 according to the number of terminals 1 and the priority order of each of the terminals 1 a-1 d, and acquires a tentative size and a tentative origin of each tentative window area assigned to each of the terminals 1 a, 1 b, 1 c, and 1 d (step S22). As will be clarified later, the adjective “tentative” is used because the window area here assigned to the terminal 1 is updated in a later step to size-convert the captured image data.
  • As shown in FIG. 10, the four terminals [0199] 1 are to be displayed here and no priority order is set for any of the four terminals 1. A tentative area setting table in the tentative window area setting file 35 is organized as shown in FIG. 11A. Here, the display screen of the display 21 has a resolution of 1280×1024 (SXGA), and the tentative area setting table shown in FIG. 11A is organized based on this display screen. FIG. 11B shows the tentative window areas based on the tentative window area setting table shown in FIG. 11A.
  • The priority order shown in FIG. 11A is determined based on the “priority” item in the display status management file [0200] 34, and the terminals 1 are first, second, third, and fourth from the high order to the low order. The terminals 1 are assigned the “tentative size” and the “tentative origin” for the tentative window area in the lower table. All the terminals 1 a-1 d have “none” in the priority order row with no priority order set therefor (see FIG. 10). If no priority order is set, the order of assignment may be a predetermined one, or may be the order of registration to the display status management file 34. In the first preferred embodiment, the terminals 1 a, 1 b, 1 c, and 1 d (hereinafter referred to as the terminal names PC-1, PC-2, PC-3, and PDA-1 as appropriate) are assigned tentative window areas 50A, 50B, 50C, and 50D in that order.
  • The window area information generator [0201] 41 further acquires the screen sizes of the PC-1, PC-2, PC-3, and PDA-1 from the display status management file 34 (see FIG. 10), and determines the sizes and origins of true windows respectively assigned thereto based on the acquired screen sizes (step S23).
  • FIG. 12 illustrates the true window area size and the true origin. As shown, [0202] 51A, 51B, 51C, and 51D represent the true window areas assigned to the PC-1, PC-2, PC-3, and PDA-1, respectively. The captured image data to be displayed on each tentative window area is size converted with the aspect ratio thereof maintained. The true window areas are the display areas within the tentative window areas 50A, 50B, 50C, and 50D in which the converted images are respectively displayed, centered on the respective tentative window areas. The determination of the true window area size is now specifically discussed for the PC-2. The screen size of the PC-2 is 1024×768 pixels (see FIG. 10). The image data of this size is contracted with the aspect ratio thereof (namely, the aspect ratio of the display screen of the display 11) maintained so that the image data fits within the window area 50B having the size of 640×512 pixels assigned to the PC-2. The contracted size is thus the true window area size. The true origin is used to place the window of that size at the center of the window area 50B as shown in FIG. 12, and is represented in pixel coordinates at the top left corner of the window (the absolute coordinates with respect to the entire display screen).
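The aspect-preserving contraction and centering can be sketched as follows. This is an illustrative sketch: the function names are ours, and the tentative area origin used in the example is a hypothetical value, since the embodiment does not state the coordinates of the area 50B.

```python
def fit_preserving_aspect(src_w, src_h, area_w, area_h):
    """Contract a source screen into a tentative window area while
    maintaining the source aspect ratio; the result is the true
    window area size."""
    scale = min(area_w / src_w, area_h / src_h)
    return int(src_w * scale), int(src_h * scale)


def true_origin(area_x, area_y, area_w, area_h, win_w, win_h):
    """Centre the true window inside its tentative area. The true
    origin is the absolute pixel coordinate of the window's top left
    corner with respect to the entire display screen."""
    return area_x + (area_w - win_w) // 2, area_y + (area_h - win_h) // 2


# PC-2: a 1024x768 screen contracted into a 640x512 tentative area.
w, h = fit_preserving_aspect(1024, 768, 640, 512)   # -> (640, 480)

# Hypothetical tentative area origin (640, 0) used only to illustrate
# the centering; the window is offset vertically by (512 - 480) / 2.
ox, oy = true_origin(640, 0, 640, 512, w, h)        # -> (640, 16)
```

The same computation applies to the PDA-1's portrait 480×640 screen, which is limited by the height of its tentative area rather than the width.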
  • The window area information generator [0203] 41 determines the above-referenced true window area sizes and true origins for the PC-1, PC-2, PC-3, and PDA-1, and generates window area information containing the terminal name item, the window area item, and the origin item as shown in FIG. 13, and then stores the window area information in the data storage 26 as the window area information file 36 (step S24). The window area information generation process thus ends. The window area information file 36 is tagged with processing date (May 21, 2002, 17:00:32, for example).
  • Returning to FIG. 6, the controller [0204] 27 in the display device 2 performs processes in steps S12-S14 for each of the terminals 1 a, 1 b, 1 c, and 1 d to be displayed (step S11) when the window area information generator 41 completes the window area information generation process (step S10). More specifically, the controller 27 places a request to send the captured image data (step S12). The sent request to send the captured image data includes the true window area size assigned to the terminal 1 to which the request is sent, and the display color count of the display 21 of the display device 2 stored in the display specification management file 30. For example, the request to send the captured image data including the true window area size of 640×512 (see FIG. 13) and the display color count of 16,777,216 of the display device 2 is sent to the terminal 1 a (PC-1).
  • Upon receiving the capture start command sent from the display device [0205] 2 in step S8, the terminal 1 starts capturing the screen thereof. If the screen capture performed by the terminal 1 is a full-screen capture, the terminal 1 sends, to the display device 2, the captured image data which has been subjected to the size conversion process and the color conversion process in accordance with the true window area size and the display color count contained in the request to send the captured image data, in response to the request to send the captured image data sent in step S12. If the screen capture performed by the terminal 1 is a partial-screen capture, an image size of the partial-captured image is sent to the display device 2.
  • The display device [0206] 2 receives the reply from the terminal 1 (step S13). If the reply is the captured image data, the display device 2 determines that the screen capture performed by the terminal 1 is a full-screen capture (step S14), and the received captured image data is written onto the captured image data memory 37 (step S18).
  • If the reply from the terminal [0207] 1 received in step S13 is the image size, the controller 27 determines that the screen capture performed by the terminal 1 is a partial-screen capture (step S14). The captured image size in the display status management file 34 is updated with the received image size. The controller 27 regenerates the window area information (the true window area size and the true origin) based on the received image size (step S15). The controller 27 updates the required portion of the window area information file 36, and sends a request to send recaptured image data responsive to the regenerated true window area size and the display color count of the display 21 (step S16). The terminal 1 receives the request to send the recaptured image data. The terminal 1 returns, to the display device 2, the captured image data which has been subjected to the size conversion process and the color conversion process in accordance with the true window area size and the display color count of the display 21, contained in the request to send the recaptured image data. The display device 2 receives the reply from the terminal 1 (step S17), and writes the reply onto the captured image data memory 37 (step S18).
  • The above process is performed for each of the terminals [0208] 1 a-1 d. When the captured image data is received from all terminals 1 a-1 d, the controller 27 sends an image synthesis command to the image synthesizer 42. The display device 2 enters an image synthesis process (step S19).
  • Upon receiving the image synthesis command, the image synthesizer [0209] 42 identifies locations of synthesis of the size-converted captured image data and the color-converted captured image data stored in the captured image data memory 37 in accordance with the true origin of the window area information in the window area information file 36, and synthesizes the captured image data into a single screen image data, thereby generating the synthesized image data. The synthesized image data is then output to the image processor 43.
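The synthesis step can be sketched as follows. This is an illustrative sketch only: pixels are represented as plain integers, and the list-of-rows data layout is our assumption; the embodiment specifies only that each size-converted capture is placed at its true origin on a single screen image.

```python
def synthesize(canvas_w, canvas_h, windows):
    """Paste each size-converted, color-converted captured image onto
    one canvas at its true origin, as the image synthesizer does per
    the window area information file. `windows` is a list of
    (origin_x, origin_y, image) tuples; an image is a list of rows."""
    canvas = [[0] * canvas_w for _ in range(canvas_h)]
    for ox, oy, image in windows:
        for row, pixels in enumerate(image):
            for col, px in enumerate(pixels):
                canvas[oy + row][ox + col] = px
    return canvas


# Two 2x2 "captured images" placed at different true origins on a
# 6x4 canvas.
img = [[1, 1], [1, 1]]
out = synthesize(6, 4, [(0, 0, img), (4, 2, img)])
```

The result is a single screen image in which each window occupies its assigned true window area, with untouched canvas pixels left as background.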
  • The image processor [0210] 43 converts the synthesized image data from the image synthesizer 42 into display image data having a scanning frequency suited to the display 21. The display image data is then output to the display 21. As shown in FIG. 8, a multi-window screen is thus presented on the display screen 50 in which the captured image data (display screen) of the terminals 1 a, 1 b, 1 c, and 1 d is presented on the true window areas 51A, 51B, 51C, and 51D (hereinafter also referred to as window screens) (step S20).
  • The operation of the terminal [0211] 1 having received the capture start command and the display status management file 34 from the display device 2 is discussed below.
  • FIG. 14 is a flow diagram of the operation of the terminal [0212] 1 which has received the capture start command and the display status management file 34 from the display device 2. The terminal 1 is here 1 a (PC-1).
  • The controller [0213] 16 in the terminal 1 a receives, through the communication unit 17, the capture start command and the display status management file 34 sent from the display device 2 (step S31). The controller 16 references the capture area management flag in the terminal 1 a in the display status management file 34 (step S32). Since the capture area setting flag is “0” (step S33), the full-screen capture is determined to be activated. The controller 16 sends a full-screen capture command to the screen capture processor 19. In response to the full-screen capture command, the screen capture processor 19 stores the content of the video memory 12 (i.e., the content currently displayed on the display screen of the display 11) in the storage 15 in a bit-map format (a full-screen capture process) (step S34).
  • When the terminal [0214] 1 a receives, from the display device 2, the request to send the captured image data (step S35), the terminal 1 a performs the size conversion process on the captured image data acquired in the full-screen capture process in step S34 in accordance with the true window area size contained in the request, while performing the color conversion process on the captured image data in accordance with the display color count contained in the request (step S36). Since the terminal 1 a (PC-1) has a screen size of 1280×1024 pixels (see FIG. 10), the captured image data of this size is converted (in a contraction process) into data of the assigned true window area size of 640×512 (see FIG. 13). No color conversion is performed because the display 11 has the same color count of 16,777,216 as the display device 2. If the display 11 has a color count larger than that of the display device 2, the color count is down-converted to match that of the display device 2. The captured image data, size converted and color converted in this way, is sent through the communication unit 17 (step S37).
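The two terminal-side conversions can be sketched as follows. The embodiment does not specify the algorithms, so the nearest-neighbour contraction and the 24-bit to 16-bit (RGB888 to RGB565) colour down-conversion below are our assumptions chosen for brevity.

```python
def nearest_neighbour_resize(image, dst_w, dst_h):
    """One possible contraction (nearest neighbour); the embodiment
    only requires that the capture be scaled to the true window area
    size. `image` is a list of rows of pixel values."""
    src_h, src_w = len(image), len(image[0])
    return [[image[r * src_h // dst_h][c * src_w // dst_w]
             for c in range(dst_w)]
            for r in range(dst_h)]


def rgb888_to_rgb565(px):
    """One possible colour down-conversion, applied only when the
    terminal's colour count exceeds the display device's: a 24-bit
    pixel (16,777,216 colours) reduced to 16 bits (65,536 colours)."""
    r, g, b = (px >> 16) & 0xFF, (px >> 8) & 0xFF, px & 0xFF
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)


# A toy 4x4 "screen" contracted to 2x2.
screen = [[r * 4 + c for c in range(4)] for r in range(4)]
small = nearest_neighbour_resize(screen, 2, 2)   # -> [[0, 2], [8, 10]]
```

Performing both conversions on the terminal, before transmission, is what shifts this workload off the display device and off the network.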
  • If the capture area setting flag is determined to be “1” in step S[0215] 33, a partial capture is determined to be activated. A capture area designation screen indicating a message saying “designate a capture area” is presented on the display 11 (step S38), and a partial capture command is sent to the screen capture processor 19. When the user, who reads the message on the capture area designation screen, selects a window or encloses a desired area using the input section 13, the screen capture processor 19 recognizes the user operation through the user interface 14. The image data on the video memory 12 corresponding to the designated area and the image size is stored in the storage 15 (a partial capture process) (step S39).
  • Upon receiving the request to send the captured image data from the display device [0216] 2 (step S40), the terminal 1 a returns the size of the partial captured image captured in the partial capture process in step S39 (step S41). The reply is received by the display device 2 as already discussed. The display device 2 regenerates the true window area size based on the image size in the partial capture process, and sends, to the terminal 1 a, the request to send the recaptured image data containing the regenerated true window area size and the display color count of the display 21 of the display device 2. Upon receiving the request, the terminal 1 a performs the size conversion process on the partial captured image data stored in the storage 15 in step S39, based on the true window area size contained in the request to send the recaptured image data, while performing the color conversion process on the partial captured image data in accordance with the color count contained in the request to send the recaptured image data (step S36). The terminal 1 a then sends the captured image data, which has been size converted and color converted in this way, to the display device 2 through the communication unit 17 (step S37).
  • The above-referenced process is similarly performed on each of the remaining terminals [0217] 1 b, 1 c, and 1 d in addition to the terminal 1 a. As a result, the display device 2 receives, from each of the terminals 1 a, 1 b, 1 c, and 1 d, the captured image data which has been size converted into the true window area size assigned thereto, and which has been color converted to the display color count of the display 21 of the display device 2.
  • Each terminal [0218] 1 receives the display status management file 34 from the display device 2. The display status management file 34 is used to check the status of the capture area setting flag of the terminal's own entry. Furthermore, the content of the file may be displayed on the display 11 by performing a predetermined operation on the input section 13. The permitted connection management file 31, the permitted display management file 32, and the connection status management file 33 may also be acquired from the display device 2 as necessary to be presented on the display 11. In this way, the user may learn which terminals other than the user's own terminal are displayed, and the range of authority granted to the user's own terminal 1.
  • Expansion Display Function [0219]
  • An expansion display function for expanding any one of a plurality of currently presented windows is discussed below. The expansion display function is performed by updating the priority order in the above-referenced arrangement. [0220]
  • FIG. 15 diagrammatically illustrates the display system to explain the expansion display function. The screen of the terminal [0221] 1 a is expanded by heightening the priority order of the terminal 1 a.
  • The screen of the terminal [0222] 1 a is designated by operating the remote controller. If the terminal 1 a is assigned any key in the remote controller, the user designates the terminal 1 a by pressing that key. If no particular key is assigned, the user may operate the remote controller to select the terminal 1 a on a menu screen on the display 21, or may click the screen of the terminal 1 a with a pointer on the display screen using the remote controller.
  • Designation information input in this way, i.e., an expansion display request containing the identification information of the terminal [0223] 1 a, is sent to the controller 27 through the user interface 24. The controller 27 identifies the terminal 1 a based on the identification information contained in the expansion display request. The controller 27 sets the priority of the terminal 1 a (PC-1) in the display status management file 34 to be “highest”, and sends a window area split request to the window area information generator 41.
  • The window area information generator [0224] 41 generates the window area information file 36 as already described. The tentative window area setting table, which is referenced in the generation of the window area information file 36, is organized as illustrated in FIG. 16A. FIG. 16B shows tentative window areas based on the tentative window area setting table. The priority order illustrated in FIG. 16B is determined based on the priority order in the display status management file 34. The priority order of the terminal 1 a is higher than those of the remaining terminals 1 b, 1 c, and 1 d. The terminal 1 a is thus assigned the tentative window area 52A (the window area for the first terminal) having the first priority in FIG. 16B. The assignment of the window areas to the remaining terminals 1 b, 1 c, and 1 d having no priority order set therefor is arbitrary. For example, the terminals 1 b, 1 c, and 1 d are now assigned tentative window areas 52B, 52C, and 52D, respectively. As already described, the true window areas to be assigned to the terminals 1 b, 1 c, and 1 d are determined.
  • FIG. 17 illustrates the true window areas. The terminals [0225] 1 a, 1 b, 1 c, and 1 d are assigned the true window areas 53A, 53B, 53C, and 53D, respectively. The window area information file 36 is then organized as illustrated in FIG. 18.
  • As already described, the captured image data (display screen) of the terminals [0226] 1 a, 1 b, 1 c, and 1 d appears on the true window areas 53A, 53B, 53C, and 53D, respectively. The designated window 53A is shown in an expanded state on the multi-window screen 50.
  • Function to Switch to Single-Window Screen [0227]
  • Any one of the plurality of windows may be shown on a single-window screen as shown in FIG. 19. The user may operate the remote controller to enter an operational input for the function to switch to a single-window screen. More specifically, the controller [0228] 27 is notified of the input information, namely, the single-window display request containing the identification information of the terminal 1 c corresponding to the window 51C to be displayed on the single-window screen. The controller 27 identifies the terminal 1 c based on the identification information contained in the single-window display request. The controller 27 sets the priority order item of the terminal 1 c (PC-3) in the display status management file 34 to be the highest, and sends a window area split request to the window area information generator 41.
  • As a result, the multi-window screen is replaced with the single-window screen as shown in FIG. 19. The user may return to the multi-window screen by performing a predetermined operation on the remote controller to set the priority order to “none”. [0229]
  • The switching to the single-window screen allows the user to recognize details, which are not visible in the contracted scale on the window. The ease of use is assured because the predetermined operation quickly returns the screen to the multi-window screen. [0230]
  • Insertion Function [0231]
  • As shown in FIG. 20, a new window may be inserted into a currently presented multi-window screen. Such an insertion corresponds to a display request subsequent to the display of a multi-window screen. The screen insertion is thus performed in the same process as in the display request process. [0232]
  • Erase Function [0233]
  • As shown in FIG. 21, one of the plurality of currently presented windows may be erased. The user operates the remote controller to enter an operational input for the erase function. More specifically, the input information, namely, an erase request containing the identification information of the terminal [0234] 1 d corresponding to the window 51D to be erased is sent to the controller 27 through the user interface 24. The controller 27 identifies the terminal 1 d based on the identification information contained in the erase request. The controller 27 deletes the terminal information of the terminal 1 d from the display status management file 34, and sends a window area split request to the window area information generator 41. The subsequent process remains identical to the one already discussed. The window 51D designated for erasure is erased as shown in FIG. 21. The multi-window screen is reorganized so that a plurality of windows corresponding to the number of terminals subsequent to the erasure are presented. Alternatively, the window 51D designated for erasure may be merely erased without reorganizing the remaining windows.
  • The first preferred embodiment of the present invention provides the display system [0235] 100 having the multi-window screen presentation function in which the screens of the plurality of terminals 1 connected to the network are presented on the plurality of windows on the display screen of the display device 2. The processes required to perform the multi-window screen presentation function, i.e., the size conversion process and the color conversion process, are performed on the terminal 1. This arrangement dramatically reduces the workload on the display device 2, in comparison with the case in which the display device 2 performs the same processes on the captured image data sent from the terminal 1. Since the size conversion process and the color conversion process are performed by the terminal 1, the effect of an increase in the number of terminals 1 on processes to be performed by the display device 2 is minimized in the display system 100.
  • The workload on the network [0236] 3 is also reduced because the terminal 1 performs the size conversion process on the captured image data before the transmission of the captured image data to the display device 2 over the network 3.
  • The ease of use of the system [0237] 100 is assured because the expansion display function, the switching function to the single-window screen, the insertion function, and the erase function are available in addition to the multi-window screen presentation function.
  • The captured image data acquired in the terminal [0238] 1 is presented on the corresponding window on the display screen of the display device with the aspect ratio thereof maintained. The display screen of the display device is thus free from discordance.
  • Rather than updating the display screen each time the conference participant (the user) places the display request through the input section [0239] 23, the display device 2 itself updates the display screen thereof every three seconds, for example. In this case, the controller 27 in the display device 2 controls timings, thereby performing subsequent processes as described above at regular intervals. The constantly updated display screen of each terminal 1 is viewed on the display of the display device 2.
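The periodic refresh can be sketched as a timing loop. This is illustrative only: `update_screen` is a hypothetical stand-in for re-running steps S11-S20 (requesting captures, synthesizing, and displaying), and the 3-second default mirrors the example interval above.

```python
import time


def refresh_loop(update_screen, cycles, interval_s=3.0):
    """Illustrative timing control: the display device itself re-runs
    the capture/synthesize/display pipeline at a fixed interval rather
    than waiting for a user display request. `update_screen` stands in
    for steps S11-S20; `cycles` bounds the loop for this sketch."""
    for _ in range(cycles):
        started = time.monotonic()
        update_screen()
        # Sleep out the remainder of the interval, if any time is left
        # after the update itself.
        time.sleep(max(0.0, interval_s - (time.monotonic() - started)))
```

Subtracting the elapsed update time before sleeping keeps the refresh period close to the nominal interval even when capture and synthesis take a noticeable fraction of it.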
  • Difference Capture Function [0240]
  • The full-screen capture method and the partial-screen capture method have been described. The display system [0241] 100 also provides a method in which a change in the display screen on the terminal 1 is detected and image data obtained by capturing the change only is sent.
  • The user operates the remote controller of the input section [0242] 23 to enter an operational input to perform a difference capture on each of the terminals 1 a-1 d. More specifically, the difference capture management flag in the display status management file 34 is set to be “1”.
  • For a first cycle of process immediately subsequent to the setting of the difference capture management flag to “1”, steps S[0243] 1-S20 are performed as described above to present the multi-window screen.
  • After the completion of the multi-window presentation, the controller [0244] 27 in the display device 2 sends a difference capture start command together with the display status management file 34 to the terminal 1 with the difference capture management flag thereof set to “1” through the communication unit 28 and the network 3.
  • The operation of the terminal [0245] 1 having received the difference capture start command and the display status management file 34 from the display device 2 will be discussed below.
  • FIG. 22 is a flow diagram illustrating the operation of the terminal 1 which has received the difference capture start command and the display status management file [0246] 34 from the display device 2. Here, the terminal of interest is the terminal 1 a (PC-1).
  • The data management processor [0247] 18 in the terminal 1 a receives, through the communication unit 17, the difference capture start command and the display status management file 34 sent by the display device 2 (step S41). The terminal 1 a then performs a full-screen capture process. More specifically, the screen capture processor 19 stores the content of the video memory 12 in the storage 15 (step S42). The image data obtained here is referred to as the pre-full-screen data.
  • The screen comparison processor [0248] 20 references the difference capture management flag in the display status management file 34, received by the data management processor 18, at set regular intervals (once every 0.5 seconds, for example) (step S43). If the difference capture management flag is “1” (step S44), a subsequent difference capture operation is performed.
  • To quit the difference capture operation, the user simply sets the difference capture management flag to “0” by modifying the display status management file [0249] 34 in the same way. The screen comparison processor 20 determines whether or not the capture operation is to be suspended by referencing the difference capture management flag.
  • After performing the full-screen capture process, the screen capture processor [0250] 19 stores the content of the video memory 12 in the storage 15 (step S45). The image data obtained here is referred to as post-full-screen data. The screen comparison processor 20 in the controller 16 compares the pre-full-screen data with the post-full-screen data (step S46).
• A screen comparison process is discussed in which a mouse pointer is moved (see FIG. 23). The mouse pointer changes its position between pre-full-screen data 60 and post-full-screen data 61. As shown in FIG. 24, two areas 70 and 71 are recognized as having changed. The screen comparison processor 20 detects the change. If it is determined that there has been a change in the screen (step S47), the screen comparison processor 20 acquires image data of each area that has undergone a change, the size of the image data (the number of pixels in the vertical and horizontal directions), and the coordinates of the origin of the image data (absolute coordinates within the area defined by the captured image size in the display status management file 34). [0251]
• In this case, the screen comparison processor 20 acquires, with respect to the area 70, captured image data of the area 70 (hereinafter referred to as difference captured image data 1), a difference captured image size 1, and a difference capture origin 1, and, with respect to the area 71, captured image data of the area 71 (hereinafter referred to as difference captured image data 2), a difference captured image size 2, and a difference capture origin 2. The screen comparison processor 20 then stores these pieces of information in the storage 15 while notifying the data management processor 18 that the difference captured image data has been acquired (step S48). The difference captured image size 1, the difference captured image size 2, the difference capture origin 1, and the difference capture origin 2 are written into the respective items of the display status management file 34. [0252]
• The data management processor 18 sends these pieces of data to the display device 2 through the communication unit 17 and the network 3 (step S49). The difference capture function differs from the full-screen capture and the partial-screen capture in that it is performed not in response to a request to send received from the display device 2, but in response to a change in the screen detected by the screen comparison processor 20, whereupon the transmission of the captured image data is performed. [0253]
• The post-full-screen data is set as the pre-full-screen data for a subsequent screen comparison process (step S50). [0254]
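The comparison in steps S46-S48 can be sketched as follows. This is a minimal illustration, assuming each frame is held as a 2D grid of pixel values and that each changed area is one connected region of differing pixels; the function name and data representation are illustrative, not taken from the patent.

```python
from collections import deque

def changed_regions(pre, post):
    """Return (origin, size) for each connected changed area between two
    equally sized frames, mirroring the "difference capture origin" and
    "difference captured image size" items: origin = (x, y) top-left
    absolute coordinates, size = (width, height) in pixels."""
    h, w = len(pre), len(pre[0])
    changed = [[pre[y][x] != post[y][x] for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if changed[y][x] and not seen[y][x]:
                # Flood-fill one connected changed area, tracking its bounding box.
                q = deque([(x, y)])
                seen[y][x] = True
                x0 = x1 = x
                y0 = y1 = y
                while q:
                    cx, cy = q.popleft()
                    x0, x1 = min(x0, cx), max(x1, cx)
                    y0, y1 = min(y0, cy), max(y1, cy)
                    for nx, ny in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
                        if 0 <= nx < w and 0 <= ny < h and changed[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((nx, ny))
                regions.append(((x0, y0), (x1 - x0 + 1, y1 - y0 + 1)))
    return regions
```

For the mouse-pointer example of FIG. 23, such a routine would report two regions, corresponding to the areas 70 and 71.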
• The display device 2 then receives the difference captured image data 1, the difference captured image data 2, and the display status management file 34 from the terminal 1a. The operation of the display device 2 subsequent to the reception of these pieces of data will now be discussed. [0255]
• Upon receiving the difference captured image data 1, the difference captured image data 2, and the display status management file 34 from the terminal 1a at the communication unit 28, the controller 27 transfers an image synthesis command to the image synthesizer 42. The display device 2 enters the image synthesis process of the difference capture function. [0256]
• Upon receiving the image synthesis command, the image synthesizer 42 rewrites the portions of the captured image data stored in the captured image data memory 37 corresponding to the difference captured image data 1 and the difference captured image data 2, based on the window area information file 36 corresponding to the image displayed on the display 21 and the display status management file 34 (the difference captured image size 1, the difference captured image size 2, the difference capture origin 1, and the difference capture origin 2) received from the terminal 1a. The processes subsequent to this operation are identical to those in steps S19 and S20. [0257]
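Rewriting only the changed portions of the stored captured image amounts to overwriting a sub-rectangle at the reported origin. A minimal sketch, assuming the stored captured image is a 2D pixel grid; the function name is hypothetical:

```python
def apply_difference(captured, patch, origin):
    """Overwrite the sub-rectangle of `captured` (a 2D pixel grid) whose
    top-left corner is at `origin` = (x, y) with the pixels of `patch`,
    as the image synthesizer does with each difference captured image."""
    ox, oy = origin
    for dy, row in enumerate(patch):
        for dx, pixel in enumerate(row):
            captured[oy + dy][ox + dx] = pixel
    return captured
```

Applying this once per received difference area (here twice, for the data 1 and data 2) brings the stored full image up to date without retransmitting the whole screen.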
• The amount of image data transmitted over the network 3 is smaller in the difference capture process than in the full-screen capture (or partial-screen capture) process. The workload on the network 3 is thus reduced, and the user can constantly monitor an up-to-date image of the terminal 1. [0258]
• The above description of the difference capture process is based on the assumption that a dedicated program installed on the terminal 1 is used to capture the screen. If a driver for directly detecting a difference in the content of the video memory is available on the operating system (OS), such a driver may be used. [0259]
• The image data may be exchanged in a compression standard format (such as JPEG) between the terminal 1 and the display device 2 to reduce the workload on the network 3. [0260]
• In the above description, a variety of requests, such as the display request and the single-window screen display request, are placed by operating the input section 23 on the display device 2. In other words, the display device 2 has the initiative in the organization of the screen. Alternatively, each terminal 1 may have the initiative. The conference participant (the user) enters a desired operational input by operating the input section 13 on his or her own terminal 1. The input information is transferred to the controller 16 through the user interface 14. The controller 16 in turn sends the request responsive to the input information, including the terminal name and the IP address of the own terminal 1, to the display device 2 via the communication unit 17 and the network 3. The request is received by the communication unit 28 in the display device 2 through the network 3 and is then transferred to the controller 27. The operations subsequent thereto remain the same as those already discussed. In this way, a variety of requests may be placed using the terminal 1. [0261]
• The terminal 1 may designate the capture area by including, in the variety of requests, information which designates the full-screen capture or the partial-screen capture. [0262]
• The priority order may be designated by the user as necessary, as described above. Alternatively, a plurality of terminals 1 which are scheduled to be connected to the display device 2 may be assigned a priority order beforehand. If the terminals 1 are assigned the priority order beforehand, the priority assigned to each terminal 1 is automatically set in the priority item of the display status management file 34 when the terminal information is registered in the display status management file 34. [0263]
• In the above description, the tentative window area is determined based on the tentative window area setting file 35. Alternatively, the tentative window area may be determined through calculation each time. [0264]
• In the partial capture, the terminal 1 returns the image size to the display device 2, and the display device 2 regenerates the true window area in response to the received image size. The regeneration of the true window area may instead be performed by the terminal 1. [0265]
  • Second Embodiment [0266]
• FIG. 1 illustrates a network of a display system 100 including a network interactive display device 2 in accordance with preferred embodiments of the present invention. [0267]
• The display system 100 includes a plurality of terminals (only four terminals 1a, 1b, 1c, and 1d are shown in FIG. 1), and the network interactive display device 2 (a projector here) having a multi-window screen presentation function as one of the major functions of the present invention. The plurality of terminals 1 are respectively connected to the network interactive display device 2 through a network 3 in two-way communication based on the TCP/IP protocol. A unique name is provided beforehand to each terminal 1 (hereinafter referred to as a terminal name). The network 3 may be any of a LAN (Local Area Network), a radio LAN, and a near-field communication radio LAN such as Bluetooth (Tradename of Bluetooth SIG Inc., U.S.A.). [0268]
• The display system 100 allows screens presented on the plurality of terminals 1 to be concurrently presented on a multi-window display screen of the network interactive display device 2. Such a system 100 is useful in a conference or a presentation. The terminal 1 and the network interactive display device 2 will now be discussed in detail. [0269]
• FIG. 26 is a block diagram illustrating the structure of the terminal 1 in accordance with a second preferred embodiment of the present invention. [0270]
• The terminal 1 may be a personal computer or a PDA (Personal Digital Assistant). The terminal 1 includes a display 11 for presenting a diversity of information such as materials for presentation, a video memory 12 for storing the content to be presented on the display 11, an input section 13 including a tablet, a mouse, or a keyboard, a user interface 14 for detecting an operational input from the input section 13 and outputting the operational input to an arithmetic unit (CPU) 16, a storage 15 for storing application software programs (such as a control program) for performing the processes of the present invention, the arithmetic unit 16, and a communication unit 17. [0271]
• The control program stored in the storage 15 is used to perform a terminal control function to achieve a multi-window screen presentation function on the network interactive display device 2, a screen capture function to capture the whole or a part of the screen of the display 11, an image conversion function to convert captured image data acquired by the screen capture function into data in a format of a display 21 of the network interactive display device 2, and a function to detect a change on the screen of the display 11. The application software programs and the CPU constitute a data management processor 18, a screen capture processor 19, and a screen comparison processor 20. [0272]
• The data management processor 18 receives a variety of requests, including a connection request, a display request, an expansion display request, a request to switch to a single-window full screen, and an erase request, through the user interface 14 or the communication unit 17, and performs processes responsive to each request. The data management processor 18 sends the captured image data, acquired by the screen capture processor 19, to the display device 2 through the communication unit 17. [0273]
• The communication unit 17 carries out two-way communication with the network interactive display device 2. The communication protocol used here is TCP/IP. The communication unit 17 has a protocol processing function for ARP, ICMP, IP, TCP, UDP, etc. required for the TCP/IP connection. This protocol processing function is carried out under the control of an OS. [0274]
• FIG. 27 is a block diagram illustrating the structure of the network interactive display device 2 of the second preferred embodiment. [0275]
• The display device 2 includes the display 21; a display control unit 22 which has a multi-window screen presentation function, an expansion display function, a function to switch to the single-window full screen, an insertion function, and an erase function, and which controls the display screen to be presented on the display 21; an input section 23 including a remote controller, a mouse, or a keyboard; a user interface 24 for detecting an operational input from the input section 23 and for outputting the operational input to an arithmetic unit 27 to be discussed later; a program storage 25 for storing the display control program to provide the multi-window screen presentation function of the present invention; a data storage 26 for storing a variety of files and data required to carry out the control program; the arithmetic unit (CPU) 27 for generally controlling the display device 2; and a communication unit 28 for performing two-way communication with each terminal 1. [0276]
• The communication unit 28 carries out two-way communication with the terminal 1. The communication protocol used here is TCP/IP. The communication unit 28 has a protocol processing function for ARP, ICMP, IP, TCP, UDP, etc. required for the TCP/IP connection. [0277]
• The display device 2 may be a plasma display or a liquid-crystal display instead of the projector shown in FIG. 1. The display 21 differs depending on the type of the display device 2. Specifically, in the projector, the display 21 is a liquid-crystal light valve, an LCOS light valve, or a DMD (Digital Micromirror Device) (Trademark of Texas Instruments), and the display screen becomes a projection screen. The display 21 is a plasma display panel in the plasma display device, a liquid-crystal panel in the liquid-crystal display device, or an organic EL (Electroluminescent) panel in the organic EL display device. [0278]
• Referring to FIG. 28, a variety of files stored in the data storage 26 are discussed. [0279]
• The data storage 26 stores a permitted connection management file 31, a permitted display management file 32, a connection status management file 33, a display status management file 34, a tentative window area setting file 35, and a window area information file 36. The data storage 26 further includes a captured image data memory 37 for storing the captured image data sent from each terminal 1. [0280]
• The permitted connection management file 31 registers the terminal name of each terminal 1 which is permitted for connection. The permitted display management file 32 registers the terminal name of each terminal 1 which is permitted for screen display. [0281]
• The connection status management file 33 registers the terminal name of each terminal 1 which is currently connected to the display device 2. [0282]
• The display status management file 34 manages the display status of the current display 21. The display status management file 34 manages, in table form, terminal information relating to each terminal 1 that is a source of the captured image data currently presented on the display screen of the display 21. The display status management file 34 is updated each time the display screen of the display 21 is modified. For example, if the display screen is switched from a four-window screen to a three-window screen, the terminal information of the terminal 1 corresponding to the erased window is deleted. If the display screen is switched from a four-window screen to a five-window screen, terminal information of the terminal 1 corresponding to the added window is newly registered. [0283]
• FIG. 5 illustrates items of terminal information managed in the display status management file 34. [0284]
• The display status management file 34 contains, as items thereof, a “terminal name”, an “IP address”, a “screen size”, “color count information”, “priority”, a “capture area management flag”, a “difference capture management flag”, a “captured image size”, a “difference captured image size 1”, a “difference captured image size 2”, a “difference capture origin 1”, and a “difference capture origin 2”. [0285]
• The “terminal name” is a name provided beforehand to the terminal 1. The “screen size” is the number of pixels in the vertical and horizontal directions of the display screen of the display 11. For example, an SXGA terminal has 1280×1024 pixels, and an XGA terminal has 1024×768 pixels. The “color count information” represents the number of display colors of the display 11, and may be 256 colors or 16,777,216 colors, for example. The terminal name, the IP address, the screen size, and the color count information are the items that must be stored in the display status management file 34 during registration. The other items are set (updated) by the user as necessary. [0286]
• The “priority” determines the display size of the window assigned to the terminal 1 identified by the terminal name. The priority order takes the value “highest”, “high”, or “none”. As will be discussed in detail, a window having a high priority order becomes large. The “capture area management flag” manages whether the screen of the terminal 1 identified by the terminal name is captured in a full-screen capture mode or a partial-screen capture mode. The capture area management flag is “0” in the full-screen capture mode, which is the standard mode, or “1” in the partial-screen capture mode. [0287]
• The “difference capture management flag” manages whether the screen of the terminal 1 identified by the terminal name is captured in a normal capture mode or a change capture mode (hereinafter referred to as a difference capture mode) in which only a change on the display screen is captured. The difference capture management flag is “0” in the normal capture mode, or “1” in the difference capture mode. [0288]
  • The “captured image size” is the size of the captured image data (the number of pixels in the vertical and horizontal directions) when the capture area management flag is “1”, i.e., in the partial-screen capture mode. [0289]
• The “difference captured image size 1” and the “difference captured image size 2” represent the sizes of the two different areas acquired in the difference capture when the difference capture management flag is “1”. The “difference capture origin 1” and the “difference capture origin 2” are the origins of the two different areas acquired in the difference capture, and are the absolute coordinates within an area defined by the captured image size. [0290]
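One row of the display status management file 34 could be modeled as the following record; the field names, types, and defaults are paraphrases of the items above, assumed for illustration rather than taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TerminalEntry:
    """One terminal's row in the display status management file 34."""
    terminal_name: str                     # name provided beforehand, e.g. "PC-1"
    ip_address: str
    screen_size: Tuple[int, int]           # (width, height) of the display 11 in pixels
    color_count: int                       # e.g. 256 or 16777216
    priority: str = "none"                 # "highest", "high", or "none"
    capture_area_flag: int = 0             # 0 = full-screen capture, 1 = partial-screen capture
    difference_capture_flag: int = 0       # 0 = normal capture, 1 = difference capture
    captured_image_size: Optional[Tuple[int, int]] = None   # set when capture_area_flag is 1
    diff_image_size_1: Optional[Tuple[int, int]] = None
    diff_image_size_2: Optional[Tuple[int, int]] = None
    diff_origin_1: Optional[Tuple[int, int]] = None
    diff_origin_2: Optional[Tuple[int, int]] = None
```

The first four fields correspond to the items that must be stored at registration; the remaining fields default to the standard modes and are updated as the flags and difference data change.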
• The tentative window area setting file 35 is a file in which information identifying the tentative window areas assigned to the terminals 1 is set beforehand. The tentative window area setting file 35 contains a plurality of tables, one prepared for each number of terminals. Each table has a structure as shown in FIG. 11A and FIG. 16A, and will be discussed later. The window area information file 36 will also be discussed later. [0291]
• When a predetermined operation is performed on the input section 23, the display 21 displays the contents of the files 31, 32, 33, 34, 35, and 36. The user can thus check and modify the data on the display screen at will. [0292]
• Returning to FIG. 28, the display control unit 22 includes a window area information generator 41, an image synthesizer 42, and an image processor 43. The arithmetic unit 27 receives a variety of requests, such as the connection request, the display request, the insertion display request, and the erase request, through the user interface 24 or the communication unit 28. In response to these requests, under the control of the arithmetic unit 27, the processors 41, 42, and 43 respectively perform the required processes while accessing the necessary files in the data storage 26. The arithmetic unit 27 thus controls the display 21. The display control program, stored in the program storage 25, for providing the multi-window screen presentation function and the arithmetic unit (CPU) 27 constitute the display control unit 22. [0293]
• From the display status management file 34, the window area information generator 41 learns the number of terminals 1 to be presented, and the priority order and the screen size of each terminal 1. The window area information generator 41 splits the display screen of the display 21 in accordance with the number of terminals 1 to be presented, and the priority order and the screen size of each terminal 1. The window area information generator 41 generates window area information containing the display size (hereinafter referred to as a window area size) of the window on the display 21 assigned to each terminal 1 to be displayed, and information identifying the display position of the window (the absolute coordinates at the top left corner of the window with respect to the display screen, hereinafter also referred to as an origin). The information is stored in the data storage 26 as the window area information file 36. [0294]
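For the no-priority case, the split of the display screen into window areas might look like the following sketch. The device itself reads precomputed tables from the tentative window area setting file 35, so this equal-grid calculation is only an assumed stand-in, and the function name is illustrative.

```python
import math

def tentative_areas(display_size, n_terminals):
    """Split the display screen into an equal grid of window areas for
    `n_terminals` terminals with no priority order set. Returns a list of
    (origin, size) pairs: origin = (x, y) top-left absolute coordinates,
    size = (width, height) in pixels."""
    cols = math.ceil(math.sqrt(n_terminals))
    rows = math.ceil(n_terminals / cols)
    w, h = display_size[0] // cols, display_size[1] // rows
    return [((c * w, r * h), (w, h))
            for r in range(rows) for c in range(cols)][:n_terminals]
```

For four terminals on the 1280×1024 (SXGA) display used later in this embodiment, this yields four 640×512 areas in a 2×2 grid, matching the quarter-screen layout of FIG. 11B.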
• The image synthesizer 42 performs various processes, including a contraction process, an expansion process, and a color conversion process, on the captured image data stored in the captured image data memory 37, and then synthesizes the captured image data in accordance with the window area information file 36 generated by the window area information generator 41. [0295]
• The image processor 43 performs a scanning frequency conversion process on a variety of pieces of image data, such as the synthesized image data generated by the image synthesizer 42 and the display status management file 34 of the data storage 26, which is referenced using an OSD (on-screen display) function, thereby generating display image data and outputting the display image data to the display 21. The image processor 43 includes a scan converter, for example. [0296]
  • The operation of the second preferred embodiment of the present invention will now be discussed. FIG. 29 is a flow diagram illustrating the operation of the second preferred embodiment. [0297]
• A predetermined operational input is entered in the input section 23 of the display device 2 as a preliminary step for multi-window screen presentation. Upon detecting the operational input through the user interface 24, the arithmetic unit 27 broadcasts a request to return a terminal name and an IP address, together with the IP address of the display device 2, through the communication unit 28 and the network 3. When each terminal 1 receives the broadcast request to return the terminal name and the IP address, the terminal 1 returns its own terminal name and IP address to the display device 2. [0298]
• The display device 2 receives a reply (the terminal name and the IP address) from each terminal 1 through the communication unit 28, and determines whether each terminal is a connection permitted terminal. Specifically, the display device 2 determines whether the returned terminal name agrees with a terminal name registered in the permitted connection management file 31. If it is determined that the returned terminal name agrees with a registered terminal name, the display device 2 handles the terminal 1 as a connection permitted terminal. [0299]
• The terminal names and the IP addresses of the terminals 1 determined to be connection permitted terminals are successively registered in the connection status management file 33. The connection status management file 33 allows the display device 2 to learn how many terminals 1 are currently connected. Since the determination of whether the connection is permitted or not is based on the terminal name, the system works even if the IP address, provided to the terminal 1 using DHCP, is different each time a connection is made. [0300]
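The name-based admission step above can be sketched as follows, assuming the permitted connection management file 31 is loaded as a set of names and the connection status management file 33 as a name-to-address mapping; the function name and structures are illustrative.

```python
def register_if_permitted(reply, permitted_names, connection_status):
    """Check a terminal's reply (terminal name, IP address) against the
    permitted names and, if the name is registered, record the terminal
    in the connection status table. Keying on the terminal name keeps
    this robust against DHCP handing out a different IP each time."""
    name, ip = reply
    if name in permitted_names:
        connection_status[name] = ip
        return True
    return False
```

Because the stored IP address is simply overwritten on each successful check, a terminal that reconnects with a new DHCP-assigned address is still recognized by its name.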
• The display device 2 waits on standby for any request after the above preliminary step is complete. The display screens of the four terminals 1a-1d, out of the terminals 1 operated by conference participants, are presented on a multi-window display screen 50 of the display device 2. As for the resolutions thereof, the terminal 1a has an SXGA resolution (1280×1024 pixels), the terminal 1b has an SVGA resolution (800×600 pixels), the terminal 1c has an XGA resolution (1024×768 pixels), and the terminal 1d has a resolution of 480×640 pixels. [0301]
  • Multi-Window Screen Presentation Function [0302]
• FIG. 8 illustrates the configuration of the display system in which the display screens of the four terminals 1a-1d are presented on the display screen of the display device 2. [0303]
• The user operates the remote controller in the input section 23 to enter an operational input to display the screens of the terminals 1a-1d. Through the user interface 24, the arithmetic unit 27 is notified of the input information, namely, a request to display the screens of the terminals 1a-1d together with identification information of the terminals 1a-1d (step S1). When the request is placed, the priority order, the partial capture, and the difference capture may be designated. Here, no particular designation is performed. [0304]
• Upon receiving the display request, the arithmetic unit 27 in the display device 2 performs the processes of steps S3-S9 for each of the terminals 1a-1d to be displayed (step S2). The arithmetic unit 27 references the permitted connection management file 31 and the permitted display management file 32 according to the terminal name indicated by the identification information contained in the display request, thereby determining whether or not each terminal 1 is permitted for connection and whether or not each terminal 1 is permitted for display (step S3). If it is determined that a terminal 1 is permitted for both connection and display (step S4), the arithmetic unit 27 requests, through the communication unit 28, that the terminal 1 send its terminal information (the terminal name, the IP address, the screen size, and the color count information) (step S5). The arithmetic unit 27 receives the terminal information sent in response to the request (step S6), and registers the terminal information in the display status management file 34 (step S7). If the priority order, the partial capture, and the difference capture are designated during the placement of the display request, the priority order, the capture area management flag, and the difference capture management flag are also registered in step S7. [0305]
• The arithmetic unit 27 sends the display status management file 34 and a screen capture start command to each terminal 1 which is permitted for connection and display (step S8). When a terminal 1 is not permitted for connection and display, the arithmetic unit 27 sends a notification to that effect to the terminal 1 (step S9). [0306]
• If all four terminals 1a-1d are permitted for connection and display, the terminal information from the terminals 1a-1d is registered in the display status management file 34 in the processes of steps S2-S9. At the same time, the screen capture start command is sent together with the display status management file 34 to each of the terminals 1a-1d through the communication unit 28. [0307]
• Subsequent to the above processes, the arithmetic unit 27 notifies the window area information generator 41 in the display control unit 22 of a window area split request. The display device 2 enters a window area information generation process (step S10). The operation of the terminal 1 having received the screen capture start command will be discussed later. Discussed first is the window area information generation process performed by the window area information generator 41 in response to the window area split request. [0308]
• FIG. 9 is a flow diagram illustrating the flow of the window area information generation process. The operation of the window area information generator 41 is specifically discussed on the assumption that the display status management file 34 is constructed as shown in FIG. 10. As shown in FIG. 10, the terminal names PC-1, PC-2, PC-3, and PDA-1 correspond to the terminals 1a, 1b, 1c, and 1d, respectively. [0309]
• Upon receiving the window area split request from the arithmetic unit 27, the window area information generator 41 learns the number of terminals 1 to be displayed (here, four terminals 1) by referencing the display status management file 34, and also acquires the priority order of each of the terminals 1a-1d (step S21). The window area information generator 41 references the tentative window area setting file 35 according to the number of terminals 1 and the priority order of each of the terminals 1a-1d, and acquires a tentative size and a tentative origin for the tentative window area assigned to each of the terminals 1a, 1b, 1c, and 1d (step S22). As will be clarified later, the adjective “tentative” is used because the window area assigned here to the terminal 1 is updated in a later step to size convert the captured image data. [0310]
• As shown in FIG. 10, the four terminals 1 are to be displayed here, and no priority order is set for any of the four terminals 1. A tentative area setting table in the tentative window area setting file 35 is organized as shown in FIG. 11A. Here, the display screen of the display 21 has a resolution of 1280×1024 (SXGA), and the tentative area setting table shown in FIG. 11A is organized based on this display screen. FIG. 11B shows the tentative window areas based on the tentative window area setting table shown in FIG. 11A. [0311]
• The priority order shown in FIG. 11A is determined based on the “priority” item in the display status management file 34, and the terminals 1 are ranked first, second, third, and fourth from the high order to the low order. The terminals 1 are assigned the “tentative size” and the “tentative origin” for the tentative window area in the lower table. All the terminals 1a-1d have “none” in the priority order row, with no priority order set therefor (see FIG. 10). If no priority order is set, the order of assignment may be a predetermined one, or may be the order of registration in the display status management file 34. In the second preferred embodiment, the terminals 1a, 1b, 1c, and 1d (hereinafter referred to by the terminal names PC-1, PC-2, PC-3, and PDA-1 as appropriate) are assigned tentative window areas 50A, 50B, 50C, and 50D in that order. [0312]
• The window area information generator 41 further acquires the screen sizes of the PC-1, PC-2, PC-3, and PDA-1 from the display status management file 34 (see FIG. 10), and determines the sizes and origins of the true window areas respectively assigned thereto based on the acquired screen sizes (step S23). [0313]
• FIG. 12 illustrates the true window area sizes and the true origins. As shown, 51A, 51B, 51C, and 51D represent the true window areas assigned to the PC-1, PC-2, PC-3, and PDA-1, respectively. The captured image data to be displayed in a tentative window area is size converted with the aspect ratio thereof maintained. The true window areas are the display areas within the tentative window areas 50A, 50B, 50C, and 50D in which the converted images are respectively displayed, centered on the respective tentative window areas. The PC-2 is now specifically discussed to illustrate how the true window area is determined. The screen size of the PC-2 is 1024×768 pixels (see FIG. 10). The image data of this size is contracted with the aspect ratio thereof (namely, the aspect ratio of the display screen of the display 11) maintained, so that the image data is displayed within the window area 50B having the size of 640×512 pixels assigned to the PC-2. The contracted size is the true window area size. The true origin is used to place the window of that size at the center of the window area 50B as shown in FIG. 12, and is represented by the pixel coordinates of the top left corner of the window (the absolute coordinates with respect to the entire display screen). [0314]
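The aspect-ratio-preserving contraction and centering described for the PC-2 can be reproduced with a short calculation. The sketch below is illustrative, not the patent's implementation, and assumes for the example that the window area 50B has its top left corner at (640, 0).

```python
def true_window(screen_size, tentative_origin, tentative_size):
    """Scale a terminal screen to fit its tentative window area with the
    aspect ratio preserved, then centre it within that area. Returns the
    true window area size (width, height) and the true origin, i.e. the
    absolute top-left coordinates on the display screen."""
    sw, sh = screen_size
    tw, th = tentative_size
    scale = min(tw / sw, th / sh)        # largest uniform scale that still fits
    w, h = int(sw * scale), int(sh * scale)
    ox, oy = tentative_origin
    return (w, h), (ox + (tw - w) // 2, oy + (th - h) // 2)
```

For the PC-2 (1024×768) in a 640×512 tentative area, the scale factor is 0.625, giving a 640×480 true window area centered vertically 16 pixels below the top of the area.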
• The window area information generator 41 determines the above-referenced true window area sizes and true origins for the PC-1, PC-2, PC-3, and PDA-1, generates window area information containing the terminal name item, the window area item, and the origin item as shown in FIG. 13, and then stores the window area information in the data storage 26 as the window area information file 36 (step S24). The window area information generation process thus ends. The window area information file 36 is tagged with a processing date (May 21, 2002, 17:00:32, for example). [0315]
  • Returning to FIG. 29, the arithmetic unit [0316] 27 in the display device 2 performs processes in steps S12-S14 to each of the terminals 1 a, 1 b, 1 c, and 1 d to be displayed (step S11) when the window area information generator 41 completes the window area information generation process (step S10). More specifically, the arithmetic unit 27 places a request to send the captured image data (step S12). Each of the terminals 1 a, 1 b, 1 c, and 1 d has already started the image capture process after receiving the capture start command transmitted from the display device 2 in step S8. The captured image data acquired in the screen capture process and the image size (in the partial capture process) are sent to the display device 2 in response to the request to send the captured image data in step S11.
  • The display device [0317] 2 receives the reply from the terminal 1 (step S13). If the reply is the captured image data, the display device 2 determines that the screen capture performed by the terminal 1 is the full-screen capture (step S14), and the received captured image data is written onto the captured image data memory 37 (step S16).
  • If the reply from the terminal [0318] 1 received in step S13 contains the captured image data and the image size, the arithmetic unit 27 determines that the screen capture performed by the terminal 1 is a partial-screen capture (step S14). The captured image size in the display status management file 34 is updated with the received image size. The arithmetic unit 27 regenerates the window area information (the true window area size and the true origin) based on the received image size (step S15). The arithmetic unit 27 writes the received captured image data together with the image size on the captured image data memory 37 (step S16).
  • The above process is performed for each of the terminals [0319] 1 a-1 d. When the captured image data is received from the terminals 1 a-1 d, the arithmetic unit 27 sends an image synthesis command to the image synthesizer 42. The display device 2 enters the image synthesis process (step S17).
  • Upon receiving the image synthesis command, the image synthesizer [0320] 42 performs the size conversion process and the color conversion process on the captured image data stored in the captured image data memory 37 based on the window area information in the window area information file 36 and the color count information in the display status management file 34. The image synthesizer 42 then synthesizes the captured image data into single screen image data, thereby generating the synthesized image data. The synthesized image data is then output to the image processor 43.
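The pasting step of the image synthesizer can be sketched as follows, modeling image data as lists of pixel rows; the names and data layout are illustrative assumptions, not the patent's implementation.

```python
def synthesize(display_w, display_h, windows, bg=0):
    """Compose size-converted captured images into single screen image
    data: start from a blank display-sized frame and paste each image
    at its true origin. windows is a list of (image_rows, (ox, oy))
    tuples, where (ox, oy) is the true origin of that window."""
    frame = [[bg] * display_w for _ in range(display_h)]
    for rows, (ox, oy) in windows:
        for dy, row in enumerate(rows):
            # Copy one row of the captured image into the frame.
            frame[oy + dy][ox:ox + len(row)] = row
    return frame
```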
  • The image processor [0321] 43 converts the synthesized data from the image synthesizer 42 into display image data having the scanning frequency of the display 21. The display image data is then output to the display 21. As shown in FIG. 8, a multi-window display is thus presented on the display screen 50 in which the captured image data (display screen) of the terminals 1 a, 1 b, 1 c, and 1 d is presented on the true window areas (hereinafter also referred to as window screens) 51A, 51B, 51C, and 51D (step S18).
  • The operation of the terminal [0322] 1 having received the capture start command and the display status management file 34 from the display device 2 is discussed below.
  • FIG. 30 is a flow diagram of an operation of the terminal [0323] 1 which has received the capture start command and the display status management file 34 from the display device 2. The terminal 1 is here 1 a (PC-1).
  • The data management processor [0324] 18 in the terminal 1 a receives, through the communication unit 17, the capture start command and the display status management file 34 sent from the display device 2 (step S31). The data management processor 18 references the capture area setting flag in the terminal 1 a in the display status management file 34 (step S32). Since the capture area setting flag is “0” (step S33), the full-screen capture is determined to be activated. The data management processor 18 sends a full-screen capture command to the screen capture processor 19. In response to the full-screen capture command, the screen capture processor 19 stores the content of the video memory 12 (i.e., the content currently displayed on the display screen of the display 11) in the storage 15 in a bit-map format (the full-screen capture process) (step S34). The screen capture processor 19 notifies the data management processor 18 of the completion of the screen capture process.
  • If the capture area setting flag is “1” in step S[0325] 33, a partial capture is determined to be activated. A capture area designation screen indicating a message saying “designate a capture area” is presented on the display 11 (step S35), and a partial capture command is sent to the screen capture processor 19. When the user, who reads the message on the capture area designation screen, selects a window or encloses a desired area using the input section 13, the screen capture processor 19 recognizes the user operation through the user interface 14. The image data on the video memory 12 corresponding to the designated area and the image size are stored in the storage 15 (the partial capture process) (step S36). The screen capture processor 19 notifies the data management processor 18 of the completion of the screen capture process.
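The flag-driven choice between the full-screen and partial capture paths might look like the following sketch; all callables are hypothetical stand-ins for the screen capture processor and the user interface, not names from the patent.

```python
def start_capture(flag, grab_full, grab_region, ask_user_for_region):
    """Dispatch on the capture area setting flag: '0' selects the
    full-screen capture, '1' selects a partial capture of a
    user-designated area. Returns (captured data, image size);
    the image size accompanies only a partial capture."""
    if flag == "0":
        # Full-screen capture: the display size is already known,
        # so no image size needs to be reported.
        return grab_full(), None
    # Partial capture: the user selects a window or encloses an area.
    region = ask_user_for_region()     # (x, y, width, height)
    data = grab_region(region)
    size = (region[2], region[3])      # width and height of the area
    return data, size
```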
  • Upon receiving the capture start command, the terminal [0326] 1 a performs the screen capture process. Meanwhile, the display device 2 performs the window area information generation process as already described. The terminal 1 a sends the captured image data and the image size (in the partial capture) to the display device 2 through the communication unit 17 (step S38) after the screen capture processor 19 completes the screen capture process and the request to send the captured image data is received from the display device 2 (step S37).
  • The above-referenced process is similarly performed on each of the remaining terminals [0327] 1 b, 1 c, and 1 d in addition to the terminal 1 a. As a result, the display device 2 receives, from each of the terminals 1 a, 1 b, 1 c, and 1 d, the captured image data and the image size (in the partial capture process).
  • Each terminal [0328] 1 receives the display status management file 34 from the display device 2. The display status management file 34 is used to check the status of the capture area setting flag of own terminal 1. Furthermore, the content of the file may be displayed on the display 11 by performing a predetermined operation on the input section 13. The permitted connection management file 31, the permitted display management file 32, and the connection status management file 33 may also be acquired from the display device 2 as necessary to be presented on the display 11. In this way, the user may learn which terminals other than the user's own terminal are displayed and the range of authority granted to own terminal 1.
  • Expansion Display Function [0329]
  • The expansion display function for expanding any one of a plurality of currently presented windows is discussed below. The expansion display function is performed by updating the priority order in the above-referenced arrangement. [0330]
  • FIG. 15 diagrammatically illustrates the display system to explain the expansion display function. The screen of the terminal [0331] 1 a is expanded by heightening the priority order of the terminal 1 a.
  • The screen of the terminal [0332] 1 a is designated by operating the remote controller. If any key on the remote controller is assigned to the terminal 1 a, the user designates the terminal 1 a by pressing that key. If no particular key is assigned, the user may operate the remote controller to select the terminal 1 a on a menu screen on the display 21, or may click the screen of the terminal 1 a with a pointer on the display screen using the remote controller.
  • Designation information input in this way, i.e., the expansion display request containing the identification information of the terminal [0333] 1 a, is sent to the arithmetic unit 27 through the user interface 24. The arithmetic unit 27 identifies the terminal 1 a based on the identification information contained in the expansion display request. The arithmetic unit 27 sets the priority order of the terminal 1 a (PC-1) in the display status management file 34 to be “highest”, and sends a window area split request to the window area information generator 41.
  • The window area information generator [0334] 41 generates the window area information file 36 as already described. The tentative window area setting table, which is referenced in the generation of the window area information file 36, is organized as illustrated in FIG. 16A. FIG. 16B shows tentative window areas based on the tentative window area setting table. The priority order illustrated in FIG. 16B is determined based on the priority order in the display status management file 34. The priority order of the terminal 1 a is higher than those of the remaining terminals 1 b, 1 c, and 1 d. The terminal 1 a is thus assigned the tentative window area 52A (the window area for the first terminal) having the first priority in FIG. 16B. The assignment of the window areas to the remaining terminals 1 b, 1 c, and 1 d having no priority order set therefor is arbitrary. For example, the terminals 1 b, 1 c, and 1 d are now assigned the window areas 52B, 52C, and 52D, respectively. As already described, the true window areas to be assigned to the terminals 1 b, 1 c, and 1 d are determined.
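The priority-based assignment of tentative window areas described above can be sketched as follows; the dictionary layout and names are illustrative assumptions, and, as in the description, the unprioritized terminals are assigned in an arbitrary (here, registration) order.

```python
def assign_tentative_areas(terminals, areas):
    """Assign tentative window areas so that terminals whose priority
    order is 'highest' come first; Python's stable sort keeps the
    registration order among unprioritized terminals. terminals is a
    list of {'name': ..., 'priority': ...} records; areas is a list of
    tentative window area labels ordered by priority."""
    ordered = sorted(terminals,
                     key=lambda t: 0 if t["priority"] == "highest" else 1)
    return {t["name"]: area for t, area in zip(ordered, areas)}
```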
  • FIG. 17 illustrates the true window areas. The terminals [0335] 1 a, 1 b, 1 c, and 1 d are assigned the true window areas 53A, 53B, 53C, and 53D, respectively. The window area information file 36 is then organized as illustrated in FIG. 18.
  • As already described, the captured image data (display screen) of the terminals [0336] 1 a, 1 b, 1 c, and 1 d appears on the true window areas 53A, 53B, 53C, and 53D, respectively. The designated window 53A is shown in the expanded state on the multi-window screen 50 as shown in FIG. 15.
  • Function to Switch to Single-Window Screen [0337]
  • Any one of the plurality of windows on the multi-window screen may be shown on a single-window screen as shown in FIG. 19. The user operates the remote controller to enter an operational input for the function to switch to a single-window screen. More specifically, the arithmetic unit [0338] 27 is notified of the input information, namely, the single-window display request containing the identification information of the terminal 1 c corresponding to the window 51C to be displayed on the single-window screen, through the user interface 24. The arithmetic unit 27 identifies the terminal 1 c based on the identification information contained in the single-window display request. The arithmetic unit 27 sets the priority order item of the terminal 1 c (PC-3) in the display status management file 34 to be the highest, and sends a window area split request to the window area information generator 41.
  • As a result, the multi-window screen is replaced with the single-window screen as shown in FIG. 19. The user may return to the multi-window screen by performing a predetermined operation on the remote controller to set the priority order to “none”. [0339]
  • The switching to the single-window screen allows the user to recognize details which are not visible at the contracted scale of the window. The ease of use is assured because the predetermined operation quickly returns the screen to the multi-window screen.
  • Insertion Function [0341]
  • As shown in FIG. 20, a new screen may be inserted into a currently presented multi-window screen. Such an insertion corresponds to a display request subsequent to the display of a multi-window screen. The screen insertion is thus performed in the same process as in the display request process. [0342]
  • Erase Function [0343]
  • As shown in FIG. 21, one of the plurality of currently presented windows may be erased. The user operates the remote controller to enter an operational input for the erase function. More specifically, the input information, namely, an erase request containing the identification information of the terminal [0344] 1 d corresponding to the window 51D to be erased is sent to the arithmetic unit 27 through the user interface 24. The arithmetic unit 27 identifies the terminal 1 d based on the identification information contained in the erase request. The arithmetic unit 27 deletes the terminal information of the terminal 1 d from the display status management file 34, and sends a window area split request to the window area information generator 41. The subsequent process remains identical to the one already discussed. The window 51D designated for erasure is erased as shown in FIG. 21. The multi-window screen is reorganized so that a plurality of windows corresponding to the number of terminals remaining after the erasure are presented. Alternatively, the window 51D designated for erasure may be merely erased.
  • In accordance with the second preferred embodiment, the screens respectively presented on the plurality of terminals [0345] 1 connected to the network 3 are presented on the display of the display device 2 having the multi-window screen presentation function.
  • Since the expansion display function, the switching function to the single-window screen, the insertion function, and the erase function are available in addition to the multi-window screen presentation function, a sophisticated display device [0346] 2 is provided.
  • The size conversion is performed with the aspect ratio of the captured image data maintained when a multi-window screen is presented. The display device [0347] 2 thus presents a display screen free from discordance.
  • Rather than updating the display screen each time the conference participant (the user) places the display request through the input section [0348] 23, the display device 2 itself updates the display screen thereof every three seconds, for example. In this case, the arithmetic unit 27 in the display device 2 controls timings, thereby performing subsequent processes as described above at regular intervals. The display screen of each terminal 1 is constantly updated.
  • Difference Capture Function [0349]
  • The full-screen capture method and the partial-screen capture method have been described. The display system [0350] 100 also provides a method in which a change in the display screen on the terminal 1 is detected and image data obtained by capturing the change only is sent.
  • The user operates the remote controller of the input section [0351] 23 to enter an operational input to perform a difference capture on each of the terminals 1 a-1 d. More specifically, the difference capture management flag in the display status management file 34 is set to be “1”.
  • For a first cycle of process subsequent to the setting of the difference capture management flag to “1”, steps S[0352] 1-S18 are performed as described above to present the multi-window screen.
  • After the completion of the multi-window presentation, the arithmetic unit [0353] 27 in the display device 2 sends a difference capture start command together with the display status management file 34 to the terminal 1, having the difference capture management flag set to “1”, through the communication unit 28 and the network 3.
  • The operation of the terminal [0354] 1 having received the difference capture start command and the display status management file 34 from the display device 2 will be discussed below.
  • FIG. 22 is a flow diagram illustrating the operation of the terminal 1 which has received the difference capture start command and the display status management file [0355] 34 from the display device 2. Here, the terminal is the terminal 1 a (PC-1).
  • The data management processor [0356] 18 in the terminal 1 a receives, through the communication unit 17, the difference capture start command and the display status management file 34 sent by the display device 2 (step S41). The terminal 1 a thus performs a subsequent full-screen capture process. More specifically, the screen capture processor 19 stores the content of the video memory 12 in the storage 15 (step S42). The image data obtained here is referred to as pre-full-screen data.
  • The screen comparison processor [0357] 20 references the difference capture management flag in the display status management file 34, received by the data management processor 18, at regular intervals (once every 0.5 seconds, for example) (step S43). If the difference capture management flag is “1” (step S44), a subsequent difference capture operation is performed.
  • To quit the difference capture operation, the user simply sets the difference capture management flag to “0” by modifying the display status management file [0358] 34 in the same way. The screen comparison processor 20 determines whether or not the capture operation is suspended by referencing the difference capture management flag.
  • After performing the full-screen capture process again, the screen capture processor [0359] 19 stores the content of the video memory 12 in the storage 15 (step S45). The image data obtained here is referred to as post-full-screen data. The screen comparison processor 20 in the controller 16 compares the pre-full-screen data with the post-full-screen data (step S46).
  • A screen comparison process is discussed in which a mouse pointer is moved (see FIG. 23). The mouse pointer changes position between the pre-full-screen data [0360] 60 and the post-full-screen data 61. As shown in FIG. 24, two areas 70 and 71 are recognized as being changed. The screen comparison processor 20 detects the change. If it is determined that there has been a change in the screen (step S47), the screen comparison processor 20 acquires image data of an area that has undergone a change, a size of the image data (the number of pixels in vertical and horizontal directions), and coordinates of an origin of the image data (absolute coordinates within the area defined by the captured image size in the display status management file 34).
  • In this case, the screen comparison processor [0361] 20 acquires, with respect to the area 70, captured image data of the area 70 (hereinafter referred to as difference captured image data 1), a difference captured image size 1, and a difference capture origin 1, and with respect to the area 71, captured image data of the area 71 (hereinafter referred to as difference captured image data 2), a difference captured image size 2, and a difference capture origin 2. The screen comparison processor 20 then stores these pieces of information in the storage 15 while notifying the data management processor 18 that the difference captured image data has been acquired (step S48). The difference captured image size 1, the difference captured image size 2, the difference capture origin 1, and the difference capture origin 2 are written on the respective portions thereof in the display status management file 34.
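A minimal sketch of the screen comparison step, assuming frames are lists of rows of comparable pixel values: it returns the bounding rectangle of all changed pixels, whereas the described system extracts each disjoint changed area (such as the areas 70 and 71) separately. The names are illustrative assumptions.

```python
def diff_region(pre, post):
    """Compare pre-full-screen data with post-full-screen data and
    return ((origin_x, origin_y), (width, height)) bounding all
    changed pixels, or None if the screen has not changed."""
    changed = [(x, y)
               for y, (r0, r1) in enumerate(zip(pre, post))
               for x, (p0, p1) in enumerate(zip(r0, r1))
               if p0 != p1]
    if not changed:
        return None  # no change detected in step S47
    xs = [x for x, _ in changed]
    ys = [y for _, y in changed]
    origin = (min(xs), min(ys))
    size = (max(xs) - min(xs) + 1, max(ys) - min(ys) + 1)
    return origin, size
```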
  • The data management processor [0362] 18 sends these pieces of data to the display device 2 through the communication unit 17 and the network 3 (step S49). The difference capture function differs from the full-screen capture and the partial-screen capture in that the transmission of the captured image data is performed not in response to a request to send received from the display device 2 but in response to a change in the screen detected by the screen comparison processor 20.
  • The post-full-screen data is set to be pre-full-screen data for a subsequent screen comparison process (step S[0363] 50).
  • The display device [0364] 2 then receives the difference captured image data 1, the difference captured image data 2, and the display status management file 34 from the terminal 1 a. The operation of the display device 2 subsequent to the reception of these pieces of data will now be discussed.
  • Upon receiving the difference captured image data 1, the difference captured image data 2, and the display status management file [0365] 34 from the terminal 1 a at the communication unit 28, the arithmetic unit 27 transfers an image synthesis command to the image synthesizer 42. The display device 2 enters the image synthesis process in the difference capture function.
  • Upon receiving the image synthesis command, the image synthesizer [0366] 42 rewrites portions of the captured image data stored in the captured image data memory 37 corresponding to the difference captured image data 1 and the difference captured image data 2, based on the window area information file 36 corresponding to the image displayed on the display 21 and the display status management file 34 (the difference captured image size 1, the difference captured image size 2, the difference capture origin 1, and the difference capture origin 2) received from the terminal 1 a. The processes subsequent to this operation are identical to those in steps S17 and S18.
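Rewriting only the changed portion of the stored captured image data can be sketched as follows; the row-of-pixels layout and names are illustrative assumptions.

```python
def apply_difference(frame, patch, origin, size):
    """Copy difference captured image data (patch) into the stored
    captured image data (frame) at the difference capture origin,
    leaving the rest of the frame untouched. frame and patch are
    lists of pixel rows; origin is (ox, oy), size is (width, height)."""
    ox, oy = origin
    w, h = size
    for dy in range(h):
        # Overwrite only the w-pixel span of each affected row.
        frame[oy + dy][ox:ox + w] = patch[dy][:w]
    return frame
```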
  • The amount of image data transmitted over the network [0367] 3 is smaller in the difference capture process than in the full-screen capture (or the partial-screen capture) process. The workload on the network 3 is thus reduced, and the user can constantly monitor an updated image of the terminal 1.
  • The above description of the difference capture process is based on the assumption that a dedicated program installed on the terminal [0368] 1 is used to capture the screen. If a driver for directly detecting a difference in the content of the video memory is available on the operating system (OS), such a driver may be used.
  • The image data may be exchanged in a compression standard format (such as JPEG) between the terminal [0369] 1 and the display device 2 to reduce the workload on the network 3.
  • In the above description, a variety of requests such as the display request and the single-window screen presentation request are made by operating the input section [0370] 23 on the display device 2. In other words, the display device 2 has the initiative in the organization of the screen. Alternatively, each terminal 1 may have the initiative. The conference participant (the user) enters a desired operational input by operating the input section 13 on own terminal 1. The input information is transferred to the data management processor 18 through the user interface 14. The data management processor 18 in turn sends the request, including the terminal name and the IP address of own terminal 1, responsive to the input information to the display device 2 via the communication unit 17 and the network 3. The request is received by the communication unit 28 in the display device 2 through the network 3. The request is then transferred to the arithmetic unit 27. The operations subsequent thereto remain the same as those already discussed. In this way, a variety of requests may be placed from the terminal 1.
  • The terminal [0371] 1 may designate the capture area by including, in the variety of requests, information which designates the full-screen capture or the partial-screen capture.
  • The priority order may be designated by the user as necessary as described above. Alternatively, a plurality of terminals [0372] 1 which are scheduled to be connected to the display device 2 may be assigned a priority order beforehand. If the terminals 1 are assigned the priority order beforehand, the priority assigned to each terminal 1 is automatically set in the priority item of the display status management file 34 when the terminal information is registered in the display status management file 34.
  • In the second preferred embodiment, the tentative window area is determined based on the tentative window area setting file [0373] 35. Alternatively, the tentative window area may be determined through calculation each time.

Claims (30)

    What is claimed is:
  1. A display system comprising:
    a plurality of terminals, each terminal having a screen capture function, and sending image data, captured using the screen capture function, over a network; and
    a network interactive display device, including a display, receiving the captured image data transmitted from the terminal through the network, and having a multi-window screen presentation function for synthesizing the captured image data into single screen multi-window format data to be displayed on a display screen of the display,
    wherein, as processes required to present the single screen multi-window format data on the display screen of the display of the network interactive display device, the terminal performs a size conversion process of an image size of the image data captured using the screen capture function and the network interactive display device acquires the captured image data subsequent to the size conversion thereof from the terminal, and synthesizes the received captured image data.
  2. A display system according to claim 1, wherein the network interactive display device divides the display screen of the display into windows of the number equal to the number of terminals to be displayed, determines a display size of the window assigned to each terminal to be displayed, and sends information of the display size to the terminal, and wherein the terminal performs the size conversion process on the image size of the captured image data to the received display size when the terminal receives the display size.
  3. A display system according to claim 1, wherein, in addition to the size conversion process on the image data captured using the screen capture function, the terminal further performs a color conversion process on the captured image data in accordance with a color count of the display of the network interactive display device before sending the captured image data to the network interactive display device.
  4. A display system according to claim 3, wherein the network interactive display device also sends the color count of own display to the terminal when sending the display size to the terminal, while the terminal performs the color conversion process in response to the color count received from the network interactive display device.
  5. A network interactive display device connected to each of a plurality of terminals through a network, each terminal having a screen capture function, the network interactive display device comprising:
    a display;
    a communication unit for communicating in a two-way fashion with each of the terminals; and
    a display control unit,
    wherein the communication unit receives the image data which has been captured by each terminal through the screen capture function thereof, and which has been size converted to a predetermined image size by each terminal, and the display control unit has a multi-window screen presentation function for synthesizing the captured image data received by the communication unit into single screen multi-window format data to be displayed on a display screen of the display.
  6. A network interactive display device according to claim 5, wherein the display control unit has an insertion function for inserting a new window into a current display screen to display the new window.
  7. A network interactive display device according to claim 5, wherein the terminal that provides the captured image data to be displayed on the display screen of the display is selected in a two-way communication of the communication unit by one of the network interactive display device and the terminal.
  8. A network interactive display device according to claim 5, wherein the display control unit has an expansion display function for expanding a predetermined window from among a plurality of windows forming a multi-window screen displayed on the display screen of the display.
  9. A network interactive display device according to claim 5, wherein the display control unit has a single-window screen selection function for switching the display screen from a predetermined window from among a plurality of windows forming a multi-window screen displayed on the display screen of the display to a single-window full screen.
  10. A network interactive display device according to claim 5, wherein the display control unit has an erase function for erasing a predetermined window from among a plurality of windows forming a multi-window screen displayed on the display screen of the display.
  11. A network interactive display device according to claim 10, wherein the predetermined window is selected by one of the network interactive display device and the terminal in a two-way communication of the communication unit thereof.
  12. A network interactive display device according to claim 5, wherein the captured image data received from the terminal is obtained by designating the whole or a portion of the display screen of the terminal.
  13. A network interactive display device according to claim 5, wherein the captured image data received from the terminal is obtained by detecting and capturing only a change on the display screen of the terminal.
  14. A network interactive display device according to claim 5, further comprising a display size determining unit that divides the display screen of the display into windows of the number equal to the number of terminals to be displayed, and determines a display size of the window to which the terminal to be displayed is assigned, and a controller that sends the display size determined by the display size determining unit to the corresponding terminal through the communication unit, wherein the controller receives, through the communication unit, the captured image data, having the converted size equal to the display size of the window assigned to the terminal, from the terminal to which the display size is sent, and controls the display control unit to synthesize the received captured image data into single screen multi-window format data to be displayed on the display screen of the display.
  15. A network interactive display device according to claim 14, wherein an aspect ratio of the window assigned to the terminal to be displayed is equalized to an aspect ratio of the display screen of the display of the terminal.
  16. A network interactive display device according to claim 5, wherein, through the communication unit, the controller also sends a display color count of the display to the terminal when sending the display size to the terminal, and receives the captured image data having the converted size equal to the display size of the window assigned to the terminal and having the display color count converted to the display color count of the display of the network interactive display device, from the terminal to which the display size and the display color count have been sent, and controls the display control unit to synthesize the received captured image data into single screen multi-window format data to be displayed on the display screen of the display.
  17. A terminal connected to a network interactive display device according to claim 5 through a network, the terminal comprising:
    a display;
    a communication unit that communicates in a two-way fashion with the network interactive display device;
    a screen capture processor that captures the content displayed on the display screen of the display;
    an image converter which converts the image data captured by the screen capture processor to data of a predetermined image size; and
    a controller that sends the captured image data, size converted by the image converter, from the communication unit to the network interactive display device,
    wherein the terminal generates the captured image data that is to be displayed on one of the multi windows displayed on the display screen of a display of the network interactive display device.
  18. 18. A terminal according to claim 17, wherein the display screen of the display of the network interactive display device is divided into windows of the number equal to the number of terminals to be displayed, a display size of the window assigned to each terminal to be displayed is determined, and the image converter converts the image data captured by the screen capture processor to data of the display size assigned to own terminal.
  19. 19. A terminal according to claim 17, wherein the image converter performs a color conversion process on the captured image data to match the display color count of the display of the network interactive display device in addition to the size conversion process, and the controller sends the captured image data, which has been subjected to the size conversion process and the color conversion process, from the communication unit to the network interactive display device.
  20. A network interactive display device connected to each of a plurality of terminals through a network, each terminal having a screen capture function, the network interactive display device comprising:
    a display;
    a communication unit for communicating in a two-way fashion with each of the terminals; and
    a display control unit,
    wherein the display control unit has a multi-window screen presentation function for synthesizing the captured image data, captured by each terminal through the screen capture function and received by the communication unit, into single screen multi-window format data to be displayed on the display screen of the display of the network interactive display device.
  21. A network interactive display device according to claim 20, wherein the display control unit has an insertion function for inserting a new window into a current display screen to display the new window.
  22. A network interactive display device according to claim 20, wherein the terminal that provides the captured image data to be displayed on the display screen of the display is selected in a two-way communication of the communication unit by one of the network interactive display device and the terminal.
  23. A network interactive display device according to claim 20, wherein the display control unit has an expansion display function for expanding a predetermined window from among a plurality of windows forming a multi-window screen displayed on the display screen of the display.
  24. A network interactive display device according to claim 20, wherein the display control unit has a single-window screen selection function for switching the display screen from a predetermined window from among a plurality of windows forming a multi-window screen displayed on the display screen of the display to a single-window full screen.
  25. A network interactive display device according to claim 20, wherein the display control unit has an erase function for erasing a predetermined window from among a plurality of windows forming a multi-window screen displayed on the display screen of the display.
  26. A network interactive display device according to claim 25, wherein the predetermined window is selected by one of the network interactive display device and the terminal in a two-way communication of the communication unit thereof.
  27. A network interactive display device according to claim 20, wherein the captured image data received from the terminal is obtained by designating the whole or a portion of the display screen of the terminal.
  28. A network interactive display device according to claim 20, wherein the captured image data received from the terminal is obtained by detecting and capturing only a change on the display screen of the terminal.
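Claim 28 covers capturing only what changed on the terminal's screen. One common way to realize such change detection (an illustrative choice; the claim does not prescribe a method) is to diff the previous and current captures and transmit only the bounding box of the changed pixels:

```python
# Hypothetical sketch of claim 28's change detection: compare two captures
# and return the bounding box of the pixels that differ, so only that
# region needs to be captured and sent. Images are lists of pixel rows.

def changed_region(prev, curr):
    """Return (x, y, w, h) of the changed area, or None if nothing changed."""
    ys = [y for y, (r0, r1) in enumerate(zip(prev, curr)) if r0 != r1]
    if not ys:
        return None  # screens identical: nothing to capture or transmit
    xs = [x for row0, row1 in zip(prev, curr)
          for x, (p0, p1) in enumerate(zip(row0, row1)) if p0 != p1]
    x0, x1, y0, y1 = min(xs), max(xs), min(ys), max(ys)
    return x0, y0, x1 - x0 + 1, y1 - y0 + 1
```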
  29. A network interactive display device according to claim 20, wherein the display control unit comprises:
    a window area information generator which divides the display screen of the display into windows of the number equal to the number of terminals to be displayed, and generates window area information containing a display size of the window to which the terminal to be displayed is assigned, and information identifying a display position of the window;
    an image synthesizer which synthesizes the captured image data from the terminals into single screen multi-window format data in accordance with the window area information generated by the window area information generator, thereby generating synthesized image data; and
    an image processor which processes the synthesized image data generated by the image synthesizer, thereby generating display image data and outputting the display image data to the display.
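The window area information generator of claim 29 divides the display screen into as many windows as there are terminals to be displayed, producing each window's display size and position. A simple grid division is one way to sketch this (the claim does not mandate any particular layout; all names below are illustrative):

```python
import math

def window_areas(screen_w, screen_h, n_terminals):
    """Hypothetical window area information generator (claim 29): divide
    the screen into a near-square grid with one window per terminal and
    return each window's display size and display position."""
    cols = math.ceil(math.sqrt(n_terminals))
    rows = math.ceil(n_terminals / cols)
    win_w, win_h = screen_w // cols, screen_h // rows
    return [
        {"terminal": i,
         "size": (win_w, win_h),
         "position": ((i % cols) * win_w, (i // cols) * win_h)}
        for i in range(n_terminals)
    ]
```

The image synthesizer would then paste each terminal's converted capture at its window's position to build the single screen multi-window format data.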
  30. A network interactive display device according to claim 29, wherein the image synthesizer synthesizes the captured image data by contracting or expanding the captured image data from each terminal with an aspect ratio of the image size of the captured image data maintained.
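Claim 30's aspect-ratio-preserving contraction or expansion amounts to scaling by the smaller of the two axis ratios so the image fits its window without distortion. A minimal sketch, with an illustrative function name not taken from the patent:

```python
def fit_preserving_aspect(src_w, src_h, win_w, win_h):
    """Compute the contracted or expanded size of a captured image so it
    fits a window while keeping the source aspect ratio (claim 30).
    Any leftover window area would simply be left unfilled."""
    scale = min(win_w / src_w, win_h / src_h)
    return max(1, round(src_w * scale)), max(1, round(src_h * scale))
```

A 1600x900 capture placed in a 512x384 window, for example, scales by 0.32 on both axes to 512x288 rather than being stretched to fill the window.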
US10623518 2002-07-23 2003-07-22 Display system, network interactive display device, terminal, and control program Abandoned US20040130568A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2002214405A JP4010198B2 (en) 2002-07-23 2002-07-23 Network-enabled display devices, network-enabled projector and the display control program
JP2002-214405 2002-07-23
JP2002-214406 2002-07-23
JP2002214406A JP4010199B2 (en) 2002-07-23 2002-07-23 Display system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12588714 US8656302B2 (en) 2002-07-23 2009-10-26 Display system, network interactive display device, terminal, and control program
US14139369 US20140115528A1 (en) 2002-07-23 2013-12-23 Display system, network interactive display device, terminal, and control program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12588714 Continuation US8656302B2 (en) 2002-07-23 2009-10-26 Display system, network interactive display device, terminal, and control program

Publications (1)

Publication Number Publication Date
US20040130568A1 (en) 2004-07-08

Family

ID=30002385

Family Applications (3)

Application Number Title Priority Date Filing Date
US10623518 Abandoned US20040130568A1 (en) 2002-07-23 2003-07-22 Display system, network interactive display device, terminal, and control program
US12588714 Active 2025-06-10 US8656302B2 (en) 2002-07-23 2009-10-26 Display system, network interactive display device, terminal, and control program
US14139369 Abandoned US20140115528A1 (en) 2002-07-23 2013-12-23 Display system, network interactive display device, terminal, and control program

Family Applications After (2)

Application Number Title Priority Date Filing Date
US12588714 Active 2025-06-10 US8656302B2 (en) 2002-07-23 2009-10-26 Display system, network interactive display device, terminal, and control program
US14139369 Abandoned US20140115528A1 (en) 2002-07-23 2013-12-23 Display system, network interactive display device, terminal, and control program

Country Status (4)

Country Link
US (3) US20040130568A1 (en)
EP (1) EP1385336B1 (en)
CN (1) CN1268122C (en)
DE (2) DE60316388D1 (en)

Cited By (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050160479A1 (en) * 2004-01-21 2005-07-21 Seiko Epson Corporation Network system of projector
US20060218583A1 (en) * 2005-03-25 2006-09-28 Alcatel Interactive displaying system
US20070050778A1 (en) * 2005-08-30 2007-03-01 Si-Hyoung Lee User interface method, system, and device in multitasking environment
US20070055941A1 (en) * 2005-09-08 2007-03-08 Bhakta Dharmesh N Method and apparatus to selectively display portions of a shared desktop in a collaborative environment
US20070085846A1 (en) * 2005-10-07 2007-04-19 Benq Corporation Projector and method for issuing display authority token to computers from the same
US20070098353A1 (en) * 2005-11-01 2007-05-03 Lite-On It Corp. Dvd recorder with surveillance function
US20070186174A1 (en) * 2006-02-07 2007-08-09 Kazunori Horikiri Electronic conference system, electronic conference assistance method and conference control terminal device
US20070257927A1 (en) * 2004-03-10 2007-11-08 Yasuaki Sakanishi Image Transmission System and Image Transmission Method
US20080022291A1 (en) * 2004-12-01 2008-01-24 Tong Shao Device And Method For Computer Display Synthesis
US20080079740A1 (en) * 2006-09-29 2008-04-03 Bruce Aaron Tankleff Intelligent display
US20080148331A1 (en) * 2006-12-19 2008-06-19 At&T Knowledge Ventures, Lp System and apparatus for managing media content
US20080158438A1 (en) * 2004-03-10 2008-07-03 Tsuyoshi Maeda Image Transmission System and Image Transmission Method
US20090044116A1 (en) * 2007-08-07 2009-02-12 Seiko Epson Corporation Graphical user interface device
US20090064016A1 (en) * 2007-08-31 2009-03-05 Hong Fu Jin Precision Industry(Shenzhen) Co., Ltd. Displaying device with user-defined display regions and method thereof
US20090207321A1 (en) * 2008-02-15 2009-08-20 Seiko Epson Corporation Image transfer device, image display apparatus, and image data transfer method
WO2009114232A2 (en) * 2008-03-14 2009-09-17 Microsoft Corporation Multi-monitor remote desktop environment user interface
US20090276707A1 (en) * 2008-05-01 2009-11-05 Hamilton Ii Rick A Directed communication in a virtual environment
US20090284667A1 (en) * 2003-03-24 2009-11-19 Seiko Epson Corporation Image-display method, projector, image-display system, projector-control method, image-display program, and projector-control program
US20100100847A1 (en) * 2002-05-27 2010-04-22 Seiko Epson Corporation Image data transmission system, process and program, image data output device and image display device
US20100174992A1 (en) * 2009-01-04 2010-07-08 Leon Portman System and method for screen recording
US20100257586A1 (en) * 2001-08-28 2010-10-07 Seiko Epson Corporation Projector projecting password
US20100289806A1 (en) * 2009-05-18 2010-11-18 Apple Inc. Memory management based on automatic full-screen detection
EP2023630A3 (en) * 2007-08-07 2010-12-01 Seiko Epson Corporation Conferencing system, server, image display method, and computer program product
US20100302130A1 (en) * 2009-05-29 2010-12-02 Seiko Epson Corporation Image display system, image display device, and image display method
US20110169836A1 (en) * 2008-09-03 2011-07-14 Hitachi High-Technologies Corporation Automatic analyzer
US20110209063A1 (en) * 2008-11-17 2011-08-25 Shenzhen Tcl New Technology Ltd. Apparatus and method for portable media player notification
US20110221763A1 (en) * 2010-03-15 2011-09-15 Seiko Epson Corporation Display device, terminal device, display system, display method, and image alteration method
US20120030594A1 (en) * 2010-07-29 2012-02-02 Seiko Epson Corporation Information storage medium, terminal device, display system, and image generating method
US20120026189A1 (en) * 2010-07-29 2012-02-02 Seiko Epson Corporation Display device, display system and display method
US20120075332A1 (en) * 2010-09-24 2012-03-29 Walton Advanced Engineering Inc. Portable storage device and its operating method
US20120137222A1 (en) * 2010-11-30 2012-05-31 Satoshi Ozaki Program synthesizing device and program synthesizing method
US8296572B2 (en) 2006-04-04 2012-10-23 Seiko Epson Corporation Projector system
US20130145315A1 (en) * 2011-12-05 2013-06-06 Hai-Bo Zhou Electronic device with multi-window displaying function and multi-window displaying method thereof
US20130293667A1 (en) * 2012-05-07 2013-11-07 Cellco Partnership D/B/A Verizon Wireless Method and apparatus for dynamic sharing of desktop content
US20130328779A1 (en) * 2012-06-08 2013-12-12 Microsoft Corporation Remote session control using multi-touch inputs
US20140168168A1 (en) * 2012-12-18 2014-06-19 Seiko Epson Corporation Display device, and method of controlling display device
US20140245185A1 (en) * 2013-02-28 2014-08-28 Ricoh Company, Ltd. Electronic Information Collaboration System
US20140244720A1 (en) * 2013-02-28 2014-08-28 Ricoh Company, Ltd. Electronic Information Collaboration System
US20150042561A1 (en) * 2013-08-12 2015-02-12 Seiko Epson Corporation Information processing device, information processing method, and recording medium
US20150054715A1 (en) * 2008-12-19 2015-02-26 Canon Kabushiki Kaisha Display controlling apparatus and image processing apparatus
US20150109400A1 (en) * 2012-06-05 2015-04-23 Huawei Technologies Co., Ltd. Method, Apparatus and System for Controlling Multipicture Display
US20150363089A1 (en) * 2014-06-17 2015-12-17 Sony Corporation Information acquiring apparatus and method, and electronic device
US20160065718A1 (en) * 2014-09-02 2016-03-03 Ricoh Company, Ltd. Information processing system, information processing apparatus, device control method, and medium
US20160072925A1 (en) * 2014-09-10 2016-03-10 Ricoh Company, Ltd. Information processing system, information processing device, and device control method
US20160313807A1 (en) * 2008-01-07 2016-10-27 Samsung Electronics Co., Ltd. Method for providing area of image displayed on display apparatus in gui form using electronic apparatus, and electronic apparatus applying the same
US20170024031A1 (en) * 2014-04-18 2017-01-26 Seiko Epson Corporation Display system, display device, and display control method
US9641570B2 (en) 2013-02-28 2017-05-02 Ricoh Company, Ltd. Electronic information collaboration system
US20170195378A1 (en) * 2012-09-28 2017-07-06 Intel Corporation Multiple-device screen capture
US9756398B2 (en) * 2013-10-23 2017-09-05 Lg Electronics Inc. TV and operating method thereof
US20170300285A1 (en) * 2016-04-13 2017-10-19 Seiko Epson Corporation Display system, display device, and method of controlling display system

Families Citing this family (49)

Publication number Priority date Publication date Assignee Title
JP4281593B2 (en) 2004-03-24 2009-06-17 セイコーエプソン株式会社 Control of the projector
JP2005284195A (en) 2004-03-31 2005-10-13 Seiko Epson Corp Image display system
DE102005009105A1 (en) * 2005-02-28 2006-09-07 Siemens Ag Process and manage a display device
CN100396078C (en) 2005-07-13 2008-06-18 圆展科技股份有限公司 Camera capable of outputting integrated multiple pieces of image in use for brief report
CN1941884A (en) * 2005-09-27 2007-04-04 联想(北京)有限公司 Method and device for wireless transmitting display signal
US9198084B2 (en) 2006-05-26 2015-11-24 Qualcomm Incorporated Wireless architecture for a traditional wire-based protocol
KR100788698B1 (en) 2006-07-13 2007-12-26 삼성전자주식회사 Display service method and network device capable of performing the method, and storage medium thereof
CN101272295B (en) 2007-03-21 2012-01-25 联想(北京)有限公司 Virtual network projection system and method supporting multi-projection source
US8667144B2 (en) 2007-07-25 2014-03-04 Qualcomm Incorporated Wireless architecture for traditional wire based protocol
CN101426126B (en) 2007-11-01 2011-02-09 上海宝信软件股份有限公司 Projection wall window regulation method for large screen monitoring system
US8811294B2 (en) 2008-04-04 2014-08-19 Qualcomm Incorporated Apparatus and methods for establishing client-host associations within a wireless network
US9398089B2 (en) 2008-12-11 2016-07-19 Qualcomm Incorporated Dynamic resource sharing among multiple wireless devices
US9264248B2 (en) 2009-07-02 2016-02-16 Qualcomm Incorporated System and method for avoiding and resolving conflicts in a wireless mobile display digital interface multicast environment
CN101742221B (en) 2009-11-09 2012-06-13 中兴通讯股份有限公司 Method and device for synthesizing multiple pictures in video conference system
US9582238B2 (en) 2009-12-14 2017-02-28 Qualcomm Incorporated Decomposed multi-stream (DMS) techniques for video display systems
CN101860715A (en) * 2010-05-14 2010-10-13 中兴通讯股份有限公司 Multi-picture synthesis method and system and media processing device
JP2012014640A (en) * 2010-07-05 2012-01-19 Sony Computer Entertainment Inc Screen output device, screen output system, and screen output method
JP2012029218A (en) * 2010-07-27 2012-02-09 Toshiba Corp Electronic device and input signal switching method
JP5677034B2 (en) * 2010-11-04 2015-02-25 キヤノン株式会社 Display apparatus and a control method thereof, an information processing apparatus and a control method thereof, an image display system, the program
US9065876B2 (en) 2011-01-21 2015-06-23 Qualcomm Incorporated User input back channel from a wireless sink device to a wireless source device for multi-touch gesture wireless displays
US9787725B2 (en) 2011-01-21 2017-10-10 Qualcomm Incorporated User input back channel for wireless displays
US9582239B2 (en) 2011-01-21 2017-02-28 Qualcomm Incorporated User input back channel for wireless displays
US8964783B2 (en) 2011-01-21 2015-02-24 Qualcomm Incorporated User input back channel for wireless displays
US9413803B2 (en) 2011-01-21 2016-08-09 Qualcomm Incorporated User input back channel for wireless displays
US10108386B2 (en) 2011-02-04 2018-10-23 Qualcomm Incorporated Content provisioning for wireless back channel
US8674957B2 (en) 2011-02-04 2014-03-18 Qualcomm Incorporated User input device for wireless back channel
US9503771B2 (en) 2011-02-04 2016-11-22 Qualcomm Incorporated Low latency wireless display for graphics
JP6180072B2 (en) * 2011-08-24 2017-08-16 サターン ライセンシング エルエルシーSaturn Licensing LLC Display device, a display system, and a display method
US10050800B2 (en) 2011-09-14 2018-08-14 Barco N.V. Electronic tool and methods for meetings for providing connection to a communications network
JP6051521B2 (en) * 2011-12-27 2016-12-27 株式会社リコー Image synthesis system
US9525998B2 (en) 2012-01-06 2016-12-20 Qualcomm Incorporated Wireless display with multiscreen service
US20130227457A1 (en) * 2012-02-24 2013-08-29 Samsung Electronics Co. Ltd. Method and device for generating captured image for display windows
CN102638654B (en) * 2012-03-28 2015-03-25 华为技术有限公司 Method, device and equipment for outputting multi-pictures
GB201206841D0 (en) 2012-04-18 2012-05-30 Barco Nv Electonic tool and method for meetings
US9176703B2 (en) * 2012-06-29 2015-11-03 Lg Electronics Inc. Mobile terminal and method of controlling the same for screen capture
KR101875744B1 (en) 2012-07-13 2018-07-06 엘지전자 주식회사 Electonic device and method for controlling of the same
CN102904926A (en) * 2012-08-31 2013-01-30 苏州佳世达光电有限公司 Method and system for sharing and editing file
US9250792B2 (en) 2012-11-29 2016-02-02 International Business Machines Corporation Method, apparatus and computer program to designate content retrieval on an interactive display
CN103856809A (en) * 2012-12-03 2014-06-11 中国移动通信集团公司 Method, system and terminal equipment for multipoint at the same screen
KR20140112850A (en) * 2013-03-14 2014-09-24 엘지전자 주식회사 Video display apparatus and method of controlliing thereof
CN104661085A (en) * 2013-11-22 2015-05-27 中兴通讯股份有限公司 Multi-way wireless display method and device
CN104978157A (en) * 2014-04-10 2015-10-14 富泰华工业(深圳)有限公司 Display device and image display method of display device
CN104049750B (en) * 2014-05-08 2017-11-07 苏州佳世达光电有限公司 And a display system using a display method thereof
JP2016031411A (en) * 2014-07-28 2016-03-07 株式会社リコー Radio communication system, display device and display method
DE102014013259A1 (en) * 2014-09-08 2016-03-10 Abb Technology Ag Means for management and configuration of field devices of an automation system
JP2017058811A (en) * 2015-09-15 2017-03-23 株式会社リコー Display device, a display system, and program
CN105592287A (en) * 2015-12-29 2016-05-18 太仓美宅姬娱乐传媒有限公司 Intelligent multimedia conferencing control system
CN105898342A (en) * 2015-12-30 2016-08-24 乐视致新电子科技(天津)有限公司 Video multipoint co-screen play method and system
KR20170138132A (en) * 2016-06-07 2017-12-15 삼성전자주식회사 Display apparatus and cotrolling method thereof

Citations (11)

Publication number Priority date Publication date Assignee Title
US6008809A (en) * 1997-09-22 1999-12-28 International Business Machines Corporation Apparatus and method for viewing multiple windows within a dynamic window
US20010050679A1 (en) * 2000-06-09 2001-12-13 Kazuyuki Shigeta Display control system for displaying image information on multiple areas on a display screen
US6388654B1 (en) * 1997-10-03 2002-05-14 Tegrity, Inc. Method and apparatus for processing, displaying and communicating images
US6473088B1 (en) * 1998-06-16 2002-10-29 Canon Kabushiki Kaisha System for displaying multiple images and display method therefor
US20030017846A1 (en) * 2001-06-12 2003-01-23 Estevez Leonardo W. Wireless display
US20030110244A1 (en) * 2001-12-10 2003-06-12 American Megatrends, Inc. Systems and methods for capturing screen displays from a host computing system for display at a remote terminal
US20030120849A1 (en) * 2001-06-11 2003-06-26 Roslak Thomas K. PDA presentation system
US20030117587A1 (en) * 2001-12-26 2003-06-26 Olson Jorell A. Image-rendering device
US6600500B1 (en) * 1999-05-18 2003-07-29 Nec Corporation Multi-window display system and method for displaying and erasing window
US6977661B1 (en) * 2000-02-25 2005-12-20 Microsoft Corporation System and method for applying color management on captured images
US20060028584A1 (en) * 2001-02-28 2006-02-09 Yamaha Corporation Video mixer apparatus

Family Cites Families (31)

Publication number Priority date Publication date Assignee Title
JP2603347B2 (en) * 1989-12-19 1997-04-23 キヤノン株式会社 The information processing apparatus and a display apparatus using the same
USRE43462E1 (en) * 1993-04-21 2012-06-12 Kinya (Ken) Washino Video monitoring and conferencing system
JP2863428B2 (en) * 1993-05-18 1999-03-03 富士通株式会社 Interactive graphics system
JPH0779424A (en) * 1993-09-06 1995-03-20 Hitachi Ltd Multi-point video communication equipment
US7185054B1 (en) * 1993-10-01 2007-02-27 Collaboration Properties, Inc. Participant display and selection in video conference calls
GB2319138B (en) * 1993-10-01 1998-06-24 Vicor Inc Teleconferencing system
US5826035A (en) * 1994-06-10 1998-10-20 Hitachi, Ltd. Image display apparatus
US6008803A (en) * 1994-11-29 1999-12-28 Microsoft Corporation System for displaying programming information
US6137485A (en) * 1995-03-20 2000-10-24 Canon Kabushiki Kaisha Image transmission method and apparatus, and image transmission system including the apparatus
US7720672B1 (en) * 1995-12-29 2010-05-18 Wyse Technology Inc. Method and apparatus for display of windowing application programs on a terminal
US8861707B2 (en) * 1996-05-31 2014-10-14 Verint Americas Inc. Method and apparatus for simultaneously monitoring computer user screen and telephone activity from a remote location
US5929850A (en) * 1996-07-01 1999-07-27 Thomson Consumer Electronices, Inc. Interactive television system and method having on-demand web-like navigational capabilities for displaying requested hyperlinked web-like still images associated with television content
US6333750B1 (en) * 1997-03-12 2001-12-25 Cybex Computer Products Corporation Multi-sourced video distribution hub
JPH10304187A (en) * 1997-04-28 1998-11-13 Canon Inc Device and method for print control and storage medium
US6384868B1 (en) * 1997-07-09 2002-05-07 Kabushiki Kaisha Toshiba Multi-screen display apparatus and video switching processing apparatus
JP3753207B2 (en) * 1997-08-11 2006-03-08 富士ゼロックス株式会社 Joint work support system and co-operation support method
US6390371B1 (en) * 1998-02-13 2002-05-21 Micron Technology, Inc. Method and system for displaying information uniformly on tethered and remote input devices
US6522352B1 (en) * 1998-06-22 2003-02-18 Motorola, Inc. Self-contained wireless camera device, wireless camera system and method
US6560637B1 (en) * 1998-12-02 2003-05-06 Polycom, Inc. Web-enabled presentation device and methods of use thereof
JP3617371B2 (en) * 1999-05-07 2005-02-02 セイコーエプソン株式会社 Projector and information storage medium
JP4688996B2 (en) * 2000-01-31 2011-05-25 キヤノン株式会社 Video display apparatus, a control method and a storage medium
US6674799B2 (en) * 2000-02-28 2004-01-06 Lg Electronics Inc. Apparatus for converting screen aspect ratio
US20030121027A1 (en) * 2000-06-23 2003-06-26 Hines Kenneth J. Behavioral abstractions for debugging coordination-centric software designs
US20020097322A1 (en) * 2000-11-29 2002-07-25 Monroe David A. Multiple video display configurations and remote control of multiple video signals transmitted to a monitoring station over a network
JP4672856B2 (en) * 2000-12-01 2011-04-20 キヤノン株式会社 Multi-screen display device and the multi-screen display method
US6431711B1 (en) * 2000-12-06 2002-08-13 International Business Machines Corporation Multiple-surface display projector with interactive input capability
US6943845B2 (en) * 2000-12-15 2005-09-13 Canon Kabushiki Kaisha Apparatus and method for data processing, and storage medium
JP4757389B2 (en) * 2001-01-15 2011-08-24 三菱電機株式会社 Multi-vision for the projector apparatus, and a multi-vision using the same
US20020149617A1 (en) * 2001-03-30 2002-10-17 Becker David F. Remote collaboration technology design and methodology
JP2002358065A (en) * 2001-06-01 2002-12-13 Seiko Epson Corp Display service providing system and video display device
JP4412701B2 (en) * 2003-01-24 2010-02-10 日本電気株式会社 Screen information display method, system and computer program

Cited By (92)

Publication number Priority date Publication date Assignee Title
US20100257586A1 (en) * 2001-08-28 2010-10-07 Seiko Epson Corporation Projector projecting password
US8272035B2 (en) 2001-08-28 2012-09-18 Seiko Epson Corporation Projector projecting password
US8806571B2 (en) 2001-08-28 2014-08-12 Seiko Epson Corporation Projector projecting password
US20100100847A1 (en) * 2002-05-27 2010-04-22 Seiko Epson Corporation Image data transmission system, process and program, image data output device and image display device
US8875053B2 (en) 2002-05-27 2014-10-28 Seiko Epson Corporation Secure connection protocol for image projecting unit, process and program
US9305188B2 (en) 2003-03-24 2016-04-05 Seiko Epson Corporation Image-display method, projector, image-display system, projector-control method, image-display program, and projector-control program
US8793771B2 (en) 2003-03-24 2014-07-29 Seiko Epson Corporation Image-display method, projector, image-display system, projector-control method, image-display program, and projector-control program
US8230000B2 (en) 2003-03-24 2012-07-24 Seiko Epson Corporation Image-display method, projector, image-display system, projector-control method, image-display program, and projector-control program
US20090284667A1 (en) * 2003-03-24 2009-11-19 Seiko Epson Corporation Image-display method, projector, image-display system, projector-control method, image-display program, and projector-control program
US8640196B2 (en) 2004-01-21 2014-01-28 Seiko Epson Corporation Network system of projector
US8646036B2 (en) 2004-01-21 2014-02-04 Seiko Epson Corporation Network system of projector
US20050160479A1 (en) * 2004-01-21 2005-07-21 Seiko Epson Corporation Network system of projector
US7865932B2 (en) * 2004-01-21 2011-01-04 Seiko Epson Corporation Network system of projector
US7551175B2 (en) 2004-03-10 2009-06-23 Panasonic Corporation Image transmission system and image transmission method
US20070257927A1 (en) * 2004-03-10 2007-11-08 Yasuaki Sakanishi Image Transmission System and Image Transmission Method
US20080158438A1 (en) * 2004-03-10 2008-07-03 Tsuyoshi Maeda Image Transmission System and Image Transmission Method
US7682028B2 (en) * 2004-03-10 2010-03-23 Panasonic Corporation Image transmission system and image transmission method
US20080022291A1 (en) * 2004-12-01 2008-01-24 Tong Shao Device And Method For Computer Display Synthesis
US20060218583A1 (en) * 2005-03-25 2006-09-28 Alcatel Interactive displaying system
US20070050778A1 (en) * 2005-08-30 2007-03-01 Si-Hyoung Lee User interface method, system, and device in multitasking environment
US9258514B2 (en) * 2005-08-30 2016-02-09 Samsung Electronics Co., Ltd. User interface method, system, and device in multitasking environment
US20070055941A1 (en) * 2005-09-08 2007-03-08 Bhakta Dharmesh N Method and apparatus to selectively display portions of a shared desktop in a collaborative environment
US20070085846A1 (en) * 2005-10-07 2007-04-19 Benq Corporation Projector and method for issuing display authority token to computers from the same
US20070098353A1 (en) * 2005-11-01 2007-05-03 Lite-On It Corp. Dvd recorder with surveillance function
US20070186174A1 (en) * 2006-02-07 2007-08-09 Kazunori Horikiri Electronic conference system, electronic conference assistance method and conference control terminal device
US8296572B2 (en) 2006-04-04 2012-10-23 Seiko Epson Corporation Projector system
US8892898B2 (en) 2006-04-04 2014-11-18 Seiko Epson Corporation Projector system
US8780125B2 (en) * 2006-09-29 2014-07-15 Hewlett-Packard Development Company, L.P. Intelligent display
US20080079740A1 (en) * 2006-09-29 2008-04-03 Bruce Aaron Tankleff Intelligent display
US8584164B2 (en) * 2006-12-19 2013-11-12 At&T Intellectual Property I, Lp System and apparatus for managing media content
US20080148331A1 (en) * 2006-12-19 2008-06-19 At&T Knowledge Ventures, Lp System and apparatus for managing media content
US8479230B2 (en) * 2006-12-19 2013-07-02 At&T Intellectual Property I, Lp System and apparatus for managing media content
US9298412B2 (en) 2007-08-07 2016-03-29 Seiko Epson Corporation Conferencing system, server, image display method, and computer program product
US8984061B2 (en) 2007-08-07 2015-03-17 Seiko Epson Corporation Conferencing system, server, image display method, and computer program product
EP2023630A3 (en) * 2007-08-07 2010-12-01 Seiko Epson Corporation Conferencing system, server, image display method, and computer program product
US8726156B2 (en) * 2007-08-07 2014-05-13 Seiko Epson Corporation Graphical user interface device
US20140218624A1 (en) * 2007-08-07 2014-08-07 Seiko Epson Corporation Graphical user interface device
US20090044116A1 (en) * 2007-08-07 2009-02-12 Seiko Epson Corporation Graphical user interface device
US8065622B2 (en) * 2007-08-31 2011-11-22 Hong Fu Jin Precision Industry (Shenzhen) Co., Ltd. Displaying device with user-defined display regions and method thereof
US20090064016A1 (en) * 2007-08-31 2009-03-05 Hong Fu Jin Precision Industry(Shenzhen) Co., Ltd. Displaying device with user-defined display regions and method thereof
US20160313807A1 (en) * 2008-01-07 2016-10-27 Samsung Electronics Co., Ltd. Method for providing area of image displayed on display apparatus in gui form using electronic apparatus, and electronic apparatus applying the same
US9972279B2 (en) * 2008-01-07 2018-05-15 Samsung Electronics Co., Ltd. Method for providing area of image displayed on display apparatus in GUI form using electronic apparatus, and electronic apparatus applying the same
US20090207321A1 (en) * 2008-02-15 2009-08-20 Seiko Epson Corporation Image transfer device, image display apparatus, and image data transfer method
WO2009114232A2 (en) * 2008-03-14 2009-09-17 Microsoft Corporation Multi-monitor remote desktop environment user interface
US20090235177A1 (en) * 2008-03-14 2009-09-17 Microsoft Corporation Multi-monitor remote desktop environment user interface
WO2009114232A3 (en) * 2008-03-14 2009-11-12 Microsoft Corporation Multi-monitor remote desktop environment user interface
US8875026B2 (en) * 2008-05-01 2014-10-28 International Business Machines Corporation Directed communication in a virtual environment
US20090276707A1 (en) * 2008-05-01 2009-11-05 Hamilton Ii Rick A Directed communication in a virtual environment
US9592451B2 (en) 2008-05-01 2017-03-14 International Business Machines Corporation Directed communication in a virtual environment
US9797827B2 (en) 2008-09-03 2017-10-24 Hitachi High-Technologies Corporation Automatic analyzer
US20110169836A1 (en) * 2008-09-03 2011-07-14 Hitachi High-Technologies Corporation Automatic analyzer
US9164112B2 (en) * 2008-09-03 2015-10-20 Hitachi High-Technologies Corporation Automatic analyzer
US20110209063A1 (en) * 2008-11-17 2011-08-25 Shenzhen Tcl New Technology Ltd. Apparatus and method for portable media player notification
US9311042B2 (en) * 2008-12-19 2016-04-12 Canon Kabushiki Kaisha Display controlling apparatus and image processing apparatus
US20150054715A1 (en) * 2008-12-19 2015-02-26 Canon Kabushiki Kaisha Display controlling apparatus and image processing apparatus
US20100174992A1 (en) * 2009-01-04 2010-07-08 Leon Portman System and method for screen recording
US20100289806A1 (en) * 2009-05-18 2010-11-18 Apple Inc. Memory management based on automatic full-screen detection
US8368707B2 (en) * 2009-05-18 2013-02-05 Apple Inc. Memory management based on automatic full-screen detection
US20100302130A1 (en) * 2009-05-29 2010-12-02 Seiko Epson Corporation Image display system, image display device, and image display method
US8791877B2 (en) * 2009-05-29 2014-07-29 Seiko Epson Corporation Image display system, image display device, and image display method
US9002947B2 (en) * 2010-03-15 2015-04-07 Seiko Epson Corporation Display device, terminal device, display system, display method, and image alteration method
US20110221763A1 (en) * 2010-03-15 2011-09-15 Seiko Epson Corporation Display device, terminal device, display system, display method, and image alteration method
US9170767B2 (en) * 2010-07-29 2015-10-27 Seiko Epson Corporation Information storage medium, terminal device, display system, and image generating method
US20120030594A1 (en) * 2010-07-29 2012-02-02 Seiko Epson Corporation Information storage medium, terminal device, display system, and image generating method
US20120026189A1 (en) * 2010-07-29 2012-02-02 Seiko Epson Corporation Display device, display system and display method
US20120075332A1 (en) * 2010-09-24 2012-03-29 Walton Advanced Engineering Inc. Portable storage device and its operating method
US20120137222A1 (en) * 2010-11-30 2012-05-31 Satoshi Ozaki Program synthesizing device and program synthesizing method
US8201091B1 (en) * 2010-11-30 2012-06-12 Kabushiki Kaisha Toshiba Program synthesizing device and program synthesizing method
US20130145315A1 (en) * 2011-12-05 2013-06-06 Hai-Bo Zhou Electronic device with multi-window displaying function and multi-window displaying method thereof
US20130293667A1 (en) * 2012-05-07 2013-11-07 Cellco Partnership D/B/A Verizon Wireless Method and apparatus for dynamic sharing of desktop content
US9092186B2 (en) * 2012-05-07 2015-07-28 Cellco Partnership Method and apparatus for dynamic sharing of desktop content
US20150109400A1 (en) * 2012-06-05 2015-04-23 Huawei Technologies Co., Ltd. Method, Apparatus and System for Controlling Multipicture Display
US9542020B2 (en) 2012-06-08 2017-01-10 Microsoft Technology Licensing, Llc Remote session control using multi-touch inputs
US20130328779A1 (en) * 2012-06-08 2013-12-12 Microsoft Corporation Remote session control using multi-touch inputs
US8970492B2 (en) * 2012-06-08 2015-03-03 Microsoft Technology Licensing, Llc Remote session control using multi-touch inputs
US20170195378A1 (en) * 2012-09-28 2017-07-06 Intel Corporation Multiple-device screen capture
US9645678B2 (en) * 2012-12-18 2017-05-09 Seiko Epson Corporation Display device, and method of controlling display device
US20140168168A1 (en) * 2012-12-18 2014-06-19 Seiko Epson Corporation Display device, and method of controlling display device
US20140245185A1 (en) * 2013-02-28 2014-08-28 Ricoh Company, Ltd. Electronic Information Collaboration System
US9641570B2 (en) 2013-02-28 2017-05-02 Ricoh Company, Ltd. Electronic information collaboration system
US20140244720A1 (en) * 2013-02-28 2014-08-28 Ricoh Company, Ltd. Electronic Information Collaboration System
US9645781B2 (en) * 2013-08-12 2017-05-09 Seiko Epson Corporation Information processing device, information processing method, and recording medium
US9864564B2 (en) 2013-08-12 2018-01-09 Seiko Epson Corporation Information processing device, information processing method, and recording medium
US20150042561A1 (en) * 2013-08-12 2015-02-12 Seiko Epson Corporation Information processing device, information processing method, and recording medium
US9756398B2 (en) * 2013-10-23 2017-09-05 Lg Electronics Inc. TV and operating method thereof
US20170024031A1 (en) * 2014-04-18 2017-01-26 Seiko Epson Corporation Display system, display device, and display control method
US20150363089A1 (en) * 2014-06-17 2015-12-17 Sony Corporation Information acquiring apparatus and method, and electronic device
US9854083B2 (en) * 2014-09-02 2017-12-26 Ricoh Company, Ltd. Information processing system, information processing apparatus, device control method, and medium
US20160065718A1 (en) * 2014-09-02 2016-03-03 Ricoh Company, Ltd. Information processing system, information processing apparatus, device control method, and medium
US20160072925A1 (en) * 2014-09-10 2016-03-10 Ricoh Company, Ltd. Information processing system, information processing device, and device control method
US9838386B2 (en) * 2014-09-10 2017-12-05 Ricoh Company, Ltd. Information processing system, information processing device, and device control method
US20170300285A1 (en) * 2016-04-13 2017-10-19 Seiko Epson Corporation Display system, display device, and method of controlling display system

Also Published As

Publication number Publication date Type
US20140115528A1 (en) 2014-04-24 application
EP1385336B1 (en) 2007-09-19 grant
US8656302B2 (en) 2014-02-18 grant
DE60316388D1 (en) 2007-10-31 grant
EP1385336A3 (en) 2004-03-24 application
US20100095241A1 (en) 2010-04-15 application
DE60316388T2 (en) 2008-06-12 grant
CN1476242A (en) 2004-02-18 application
EP1385336A2 (en) 2004-01-28 application
CN1268122C (en) 2006-08-02 grant

Similar Documents

Publication Publication Date Title
US6473088B1 (en) System for displaying multiple images and display method therefor
US6266082B1 (en) Communication apparatus, image processing apparatus, communication method and image processing method
US7397476B2 (en) Projector, projection display system, and corresponding method and recording medium
US6697687B1 (en) Image display apparatus having audio output control means in accordance with image signal type
US20030217186A1 (en) Apparatus for and method of seamless wireless multimedia download path to peer networked appliances
US20040056985A1 (en) Apparatus and method for displaying a television video signal in a mobile terminal
US6864921B2 (en) Display control system for controlling a display screen formed of multiple display units
US7222356B1 (en) Communication apparatus, storage medium, camera and processing method
US6493008B1 (en) Multi-screen display system and method
US7483960B2 (en) System and method for providing a service to a terminal having data format specifications
US20030065806A1 (en) Audio and/or visual system, method and components
US6020863A (en) Multi-media processing system with wireless communication to a remote display and method using same
US20020089518A1 (en) Image processing system, image display method, recording medium and image display apparatus
US20010017630A1 (en) Image display device and method for displaying an image on the basis of a plurality of image signals
US20060079214A1 (en) Method and apparatus for showing wireless mobile device data content on an external viewer
JP2006031359A (en) Screen sharing method and conference support system
US20020122158A1 (en) Projector
JP2004069996A (en) Projector system, and information processor and projector
CN102981793A (en) Screen synchronization method and device
US6538675B2 (en) Display control apparatus and display control system for switching control of two position indication marks
US20090044116A1 (en) Graphical user interface device
US20050117121A1 (en) Multiple image projection system and method for projecting multiple selected images adjacent each other
US20030234749A1 (en) System and method for communicating graphics image data over a communication network for display on a single logical screen
US5867154A (en) Method and apparatus to select a display area within a data processing system
US20090184924A1 (en) Projection Device, Computer Readable Recording Medium Which Records Program, Projection Method and Projection System

Legal Events

Date Code Title Description
AS Assignment

Owner name: SEIKO EPSON CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NAGANO, MIKI;YOSHIKUNI, NORIHIRO;REEL/FRAME:014260/0070;SIGNING DATES FROM 20030919 TO 20030926