WO2010058591A1 - Image display device, video conference device, and image display method - Google Patents

Image display device, video conference device, and image display method

Info

Publication number
WO2010058591A1
WO2010058591A1 (PCT/JP2009/006255)
Authority
WO
WIPO (PCT)
Prior art keywords
screen
display
image display
video
unit
Prior art date
Application number
PCT/JP2009/006255
Other languages
English (en)
Japanese (ja)
Inventor
岡田晋
中西淑人
Original Assignee
パナソニック株式会社 (Panasonic Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2008296986A (published as JP2010124317A)
Priority claimed from JP2008316742A (published as JP2010141662A)
Priority claimed from JP2009002878A (published as JP2010161662A)
Application filed by パナソニック株式会社 (Panasonic Corporation)
Priority to US 13/129,878 (published as US 2011/0222676 A1)
Publication of WO2010058591A1

Classifications

    • (All within H — ELECTRICITY; H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION)
    • H04N 7/147 — Systems for two-way working between two video terminals, e.g. videophone: communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • H04N 7/15 — Conference systems
    • H04N 21/42203 — Input-only peripherals connected to specially adapted client devices: sound input device, e.g. microphone
    • H04N 21/4223 — Input-only peripherals connected to specially adapted client devices: cameras
    • H04N 21/4312 — Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/4314 — Generation of visual interfaces for fitting data in a restricted space on the screen, e.g. EPG data in a rectangular grid
    • H04N 21/4788 — Supplemental services communicating with other users, e.g. chatting

Definitions

  • The present invention relates to an image display device, a video conference device, and an image display method.
  • In particular, the present invention relates to a video conference device for communicating with a person at a remote site by mutually transmitting, receiving, and displaying images captured by cameras, and to an image display device capable of presenting the screen configuration of the remote party.
  • FIG. 15 is a conceptual diagram showing a situation in which two geographically distant persons communicate remotely using video conference devices.
  • Video conference devices 10A and 10B and cameras 11A and 11B are installed at points A and B, which are geographically separated from each other.
  • The video conference devices 10A and 10B are connected to each other via the network 13.
  • The video of point A captured by the camera 11A is transmitted from the video conference device 10A to the video conference device 10B via the network 13 and displayed there.
  • Likewise, the video of point B captured by the camera 11B is transmitted from the video conference device 10B to the video conference device 10A via the network 13 and displayed there.
  • When video conference devices are installed at multiple sites and used together as a video conference system, each device can display the video of the partner site on its display screen, allowing a video conference to be conducted.
  • In addition, the video conference device at a given site can display the image of its own site together with the image of the partner site.
  • Suppose a video conference is held between the video conference device at site A and the video conference device at site B.
  • An image of site A (self-image) 1502 can be superimposed on part of the image of site B (partner image) 1501 displayed on the screen 1500 at site A, so that users can check whether they appear within the camera's view.
  • Similarly, suppose a video conference is held between the video conference devices at sites A and B.
  • The image of site A (self-image) 2012 can be superimposed on part of the image of site B (partner image) 2011 displayed on screen A 2010 at site A, so that users can check whether they appear within the camera's view.
  • A screen display method is also known in which the camera video of the local site is shown as a small sub-screen on part of the larger partner screen (see, for example, Patent Document 2). The sub-screen can be moved with an input device such as a mouse and displayed at a position desired by the user.
  • Some video conference devices provide a function for displaying a GUI-based function setting screen, image content acquired from an external device, and the like, superimposed on the video transmitted from another video conference device.
  • For example, the GUI screen 21 is displayed at the upper right of the screen of the video conference device 10B, superimposed on the video transmitted from the video conference device 10A.
  • On the screen of the video conference device 10B, of the two people at point A, the person 31 on the right side is almost hidden by the GUI screen 21.
  • However, such information about the display state is not sent to the video conference device 10A. For this reason, the people at point A have no way of knowing which portion of the video captured by the camera 11A and transmitted to the video conference device 10B is not displayed on its screen, or even that such a hidden portion exists.
  • Patent Document 1 discloses an imaging range projection display device that indicates the imaging range of an imaging device by projecting a light beam in a frame shape surrounding a predetermined area.
  • If this imaging range projection display device were applied to the example shown in FIG. 15, it would project onto the people at point A a light beam indicating the area displayed on the screen of the video conference device 10B.
  • However, the projected light beam would not indicate the area occupied by the GUI screen 21 within the screen of the video conference device 10B.
  • The image 1502 of the local site can be displayed in a small size at the lower right of the partner image 1501, allowing users to confirm that they appear on camera.
  • At site B shown in FIG. 20B, however, part of the image around the face of the conference participant 1511 at the local site is hidden by the sub-screen 1513 displaying the image of the partner site.
  • As a result, the desired video data may not be displayed on the partner screen.
  • In this example, the conference participants 1511 and 1512 at the local site both appear on the local camera, but the video data of the conference participant 1511 does not appear on the partner screen (the display screen at site A). This problem applies not only to the conference participants themselves but also to anything the participants want to show to the partner site, such as document video data shared with the partner site during the conference.
  • Similarly, the image 2012 of the local site is displayed in a small size at the lower right of the partner image 2011, allowing users to confirm that they appear on camera.
  • At site B shown in FIG. 28B, part of the image around the face of the conference participant 2021 at the local site is hidden by the sub-screen 2023 displaying the image of the partner site.
  • As a result, the desired image may not be displayed on the partner screen.
  • The conference participant 2021 is hidden by the sub-screen 2023 displayed superimposed on the display screen.
  • Again, this problem applies not only to the conference participants themselves but also to anything the participants want to show to the partner site, such as document video shared with the partner site during the conference.
  • The present invention has been made in view of the above circumstances, and an object of the present invention is to provide an image display device, a video conference device, and an image display method that allow a user at the local site to recognize the position of the sub display area on the display screen of the remote site.
  • It is another object of the present invention to provide an image display device, a video conference device, and an image display method capable of grasping what kind of display is being performed on the display screen of the partner site. It is a further object of the present invention to provide an image display device, a video conference device, and an image display method capable of preventing the sub-screen from overlapping a predetermined subject on the display screen of the partner site.
  • The present invention provides an image display device capable of configuring a plurality of display areas on a display screen, comprising: means for receiving, via a communication line, other-screen configuration information for configuring a plurality of display areas on the display screen of another image display device; and means for controlling presentation of the position of the sub display area, among the plurality of display areas, on the other image display device, based on the received other-screen configuration information.
  • This configuration makes it possible for the user at the local site to recognize the position of the sub display area on the display screen of the remote site.
  • The present invention also provides a video conference device used in a video conference system in which video captured by cameras is mutually transmitted, received, and displayed, comprising: an information acquisition unit that acquires shooting area information about the shooting area of the camera of the video conference device, and content display area information about the display area of content displayed on the display screen of another video conference device that receives the camera video transmitted by the video conference device; an actual display area determination unit that determines, based on the shooting area information and the content display area information, the actual display area in which the captured video is displayed on the display screen of the other video conference device; and a presentation control unit that controls a presentation device so as to present to the user of the video conference device the actual display area within the shooting area of the camera, based on the actual display area determined by the actual display area determination unit.
  • The present invention also provides an image display device capable of configuring a plurality of display areas on a display screen, comprising: a layout receiving unit that receives, via a communication line, other-screen configuration information for configuring a plurality of display areas on the display screen of another image display device; a self-screen configuration information setting unit that sets self-screen configuration information for configuring the display areas on the display screen of the image display device; a layout determination unit that determines, based on the other-screen configuration information and the self-screen configuration information, the position on the display screen of the image display device of a reproduction area having a plurality of reproduction display areas corresponding to the plurality of display areas on the display screen of the other image display device; and a display unit that displays the reproduction of the display areas of the display screen of the other image display device.
  • This configuration makes it possible to grasp what kind of display is being performed on the display screen of the partner site. Specifically, the screen configuration information for configuring the display screen of the partner site is acquired, the screen configuration for the display screen of the local site is set, and the display screen of the partner site is reproduced on the display screen of the local site. By checking this reproduced display, the user can grasp the display form of the partner site.
  • The present invention also provides an image display device capable of configuring a plurality of display areas on a display screen, comprising: a photographing unit that photographs a subject at the local site where the image display device is located; a subject detection unit that detects the subject included in the video data captured by the photographing unit; a layout receiving unit that receives, via a communication line, other-screen configuration information for configuring a plurality of display areas on the other display screen, that is, the display screen of another image display device; a determination unit that determines, based on the other-screen configuration information and the detected position of the subject, whether the subject included in the video data displayed in the main display area of the other display screen overlaps the sub display area superimposed on the main display area; and a photographing state control unit that controls the photographing state of the photographing unit based on the determination result of the determination unit.
  • This configuration allows the user at the local site to recognize the position of the sub display area on the display screen of the partner site.
  • According to the present invention, the area actually displayed, within the captured video shown on the display screen of the video conference device that received the camera video, can be presented to the user of the video conference device that transmitted the captured video.
  • Block diagram showing the configuration of the video conference system according to the first embodiment of the present invention
  • Diagram schematically showing the positional relationship, viewed from above, between the video captured by the camera and the content displayed on it in the first embodiment of the present invention
  • Diagram schematically showing the same positional relationship viewed from the side in the first embodiment of the present invention
  • Schematic configuration diagram of the light irradiation device 150 in the first embodiment of the present invention
  • Diagram showing the timing relationship between the oscillating surface and each video frame captured by the camera 100 in the first embodiment of the present invention
  • Schematic diagram showing an example of the video conference system in the first embodiment of the present invention
  • Conceptual diagram showing a situation in which two geographically distant persons communicate remotely using video conference devices
  • (a)–(c) Diagrams showing examples of the video data displayed on the display screen of the video conference device in the second embodiment of the present invention
  • Block diagram showing an example of the configuration of the video conference device in the second embodiment of the present invention
  • (a)–(e) Schematic diagrams showing examples of the screen configuration of the video conference device in the second embodiment of the present invention
  • Flowchart showing an example of the operation of the video conference device in the second embodiment of the present invention
  • (a), (b) Diagrams showing the video data displayed on the display screen of a conventional video conference device
  • Block diagram showing an example of the configuration of the video conference device in the third embodiment of the present invention
  • Schematic diagram showing an example of the screen configuration of the video conference device in the third embodiment of the present invention
  • Flowchart showing an example of the main operation of the video conference device in the third embodiment of the present invention
  • Diagram showing an example of the positional relationship between the sub-screen and the face area when the video data is displayed
  • (a) Diagram showing an example of the captured video
  • FIG. 1 is a block diagram showing the configuration of the video conference system of the present embodiment.
  • the video conference system shown in FIG. 1 includes a camera 100 that captures a person, a video transmission device 110, a light irradiation device 150, a network 120, a video reception device 130, and a display 140.
  • the video conference system is an example of an image display device.
  • The video transmission device 110 transmits the video captured by the camera 100 to the video reception device 130 via the network 120, and the transmitted video is displayed on the display 140. In the configuration shown, the video signal therefore flows in only one direction, from the video transmission device 110 to the video reception device 130.
  • However, by providing the video reception device 130 side with a configuration similar to the video transmission device 110, and the video transmission device 110 side with a configuration similar to the video reception device 130, the video conference system according to the present embodiment can communicate video signals bidirectionally.
  • the video transmission device 110 includes a video acquisition unit 111, a video transmission unit 112, an imaging region acquisition unit 113, a display region reception unit 114, an actual display region determination unit 115, and a light irradiation control unit 116.
  • the video acquisition unit 111 acquires video captured by the camera 100.
  • the video acquisition unit 111 sends a synchronization signal of the camera 100 to the light irradiation control unit 116.
  • the video transmission unit 112 transmits the video acquired by the video acquisition unit 111 to the video reception device 130 via the network 120.
  • the imaging area acquisition unit 113 acquires information related to the imaging area of the camera 100 (hereinafter referred to as “imaging area information”) and sends it to the actual display area determination unit 115.
  • The imaging area information includes the horizontal and vertical rotation angles of the camera 100 in a coordinate system in which the front direction of the camera 100 is 0 degrees, the angle of view, and the distance from the plane representing the shooting area of the camera 100 to the camera 100.
  • The shooting area acquisition unit 113 may also acquire the numbers of pixels in the vertical and horizontal directions of the video shot by the camera 100 and include them in the shooting area information.
  • The shooting area acquisition unit 113 may send the shooting area information to the actual display area determination unit 115 once zooming is completed.
  • the display area receiving unit 114 receives information related to the display area of the content displayed on the display 140 (hereinafter referred to as “content display area information”) transmitted from the video receiving device 130.
  • the content is a function setting screen using a GUI of the video receiving device 130, a video taken by a camera (not shown) provided in the video receiving device 130, a video or an image stored in the video receiving device 130, and the like.
  • the content is a video or an image sent from an external device connected to the video receiving device 130.
  • the content display screen displayed on the display 140 is rectangular.
  • the content display area information is indicated by information on the number of pixels in the vertical and horizontal directions of the video displayed on the display 140 and, for example, the positions of the upper left vertex and the lower right vertex of the content on the two-dimensional coordinates of the video. Location information.
  • the content display area information is an example of other screen configuration information.
  • The actual display area determination unit 115 determines the area in which the video is displayed on the display 140, based on the shooting area information acquired by the shooting area acquisition unit 113 and the content display area information received by the display area receiving unit 114.
  • This area displayed on the display 140 is hereinafter referred to as the "actual display area". The method by which the actual display area determination unit 115 determines the actual display area is described in detail later.
  • the light irradiation control unit 116 controls the light irradiation device 150 based on the information related to the actual display area determined by the actual display area determination unit 115 and the synchronization signal of the camera 100 sent from the video acquisition unit 111. Details of the light irradiation device 150 will be described later.
  • the video reception device 130 includes a video reception unit 131, a video display processing unit 132, a content acquisition unit 136, a content display processing unit 133, a display area acquisition unit 134, a display area transmission unit 135, and an operation reception unit 137.
  • the video receiving unit 131 receives the video transmitted from the video transmission device 110 via the network 120.
  • the video display processing unit 132 performs processing to display the video received by the video receiving unit 131 on the display 140.
  • the video display processing unit 132 sends information regarding the number of pixels in the vertical and horizontal directions of the video displayed on the display 140 to the display area acquisition unit 134.
  • the content acquisition unit 136 acquires content recorded on a recording medium (not shown) included in the video reception device 130 and content sent from an external device (not shown) connected to the video reception device 130.
  • the content acquisition unit 136 may acquire content from a server or the like connected via the network 120.
  • the content display processing unit 133 performs processing so that the content acquired by the content acquisition unit 136 is displayed on the display 140.
  • The content display processing unit 133 sends, to the display area acquisition unit 134, position information of the content indicated by the positions of the upper left vertex and the lower right vertex of the rectangular content on the two-dimensional coordinates of the video displayed on the display 140.
  • the display area acquisition unit 134 acquires information regarding the number of pixels in the vertical and horizontal directions of the video displayed on the display 140 from the video display processing unit 132.
  • the display area acquisition unit 134 acquires the position information of the content superimposed on the video on the display 140 from the content display processing unit 133.
  • the display area acquisition unit 134 sends these pieces of information as content display area information to the display area transmission unit 135.
  • the display area transmission unit 135 transmits the content display area information sent from the display area acquisition unit 134 to the video transmission device 110 via the network 120.
  • the operation accepting unit 137 accepts an operation for designating the content displayed on the display 140 or the size or position of the content.
  • the operation reception unit 137 instructs the content display processing unit 133 according to the received operation.
  • FIGS. 2 and 3 are diagrams schematically showing the positional relationship between the video 201, which is captured by the camera 100 and displayed on the display 140, and the content 202 displayed superimposed on the video 201.
  • The positional relationship between the video 201, the content 202, and the camera 100 shown in FIG. 2 is the relationship when the camera 100 is viewed from above, and the relationship shown in FIG. 3 is that when the camera 100 is viewed from the side.
  • the screen area of the content 202 is an example of a sub display area.
  • The positional relationship between the video 201 and the content 202 shown in FIGS. 2 and 3 is described below.
  • the two-dimensional coordinates of the video 201 are represented by using the upper left vertex of the video 201 as the origin and the number of vertical and horizontal pixels as a unit of each axis.
  • the example shown in FIGS. 2 and 3 shows a case where the horizontal pixel number of the video 201 is X pixels and the vertical pixel number is Y pixels.
  • the content display processing unit 133 sends position information indicating the positions of the upper left vertex (x1, y1) and the lower right vertex (x2, y2) of the content 202 on the two-dimensional coordinates to the display area acquisition unit 134.
  • the video display processing unit 132 sends information (X, Y) regarding the number of pixels in the vertical and horizontal directions of the video 201 to the display area acquisition unit 134.
  • The content display processing unit 133 assigns an identification number to each content item.
  • The position information of each content item includes this identification number. For example, when two content items are superimposed on the video, the position information of the first content item is expressed as (1, x1, y1) and (1, x2, y2),
  • and the position information of the second content item is expressed as (2, x1′, y1′) and (2, x2′, y2′).
  • The content display processing unit 133 also gives each content item a display identifier. For example, suppose there are three displays A, B, and C, and the content display processing unit 133 shows the first content item described above on display A and the second on display C. In this case, the position information of the first content item is expressed as (1, A, x1, y1) and (1, A, x2, y2), and that of the second content item as (2, C, x1′, y1′) and (2, C, x2′, y2′).
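The position-information tuples above can be modeled, for illustration only (the patent specifies the notation but no concrete data format; all names here are invented), as small records carrying the content identification number, an optional display identifier, and the two vertices:

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass(frozen=True)
class ContentArea:
    """Display area of one content item, as two vertex records.

    Mirrors the patent's notation: (id, x1, y1) / (id, x2, y2) for a
    single display, or (id, display, x1, y1) / (id, display, x2, y2)
    when a display identifier is attached.
    """
    content_id: int
    top_left: Tuple[int, int]      # (x1, y1), origin at the upper left
    bottom_right: Tuple[int, int]  # (x2, y2)
    display: Optional[str] = None  # e.g. "A", "C"; None if one display

    def as_tuples(self):
        # Emit the two vertex tuples in the patent's notation.
        if self.display is None:
            return ((self.content_id, *self.top_left),
                    (self.content_id, *self.bottom_right))
        return ((self.content_id, self.display, *self.top_left),
                (self.content_id, self.display, *self.bottom_right))


first = ContentArea(1, (480, 0), (640, 120), display="A")
print(first.as_tuples())  # ((1, 'A', 480, 0), (1, 'A', 640, 120))
```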
  • The actual display area determination unit 115 of the video transmission device 110 converts the position information of the content 202 into angle information based on the installation position and shooting direction of the camera 100.
  • The angle θX shown in FIG. 2 is half of the horizontal angle of view of the camera 100, measured from the shooting direction of the camera 100.
  • The angle θx1 shown in the same figure is the horizontal angle, relative to the shooting direction of the camera 100, of the upper left vertex (x1, y1) of the content 202 in the video 201.
  • Similarly, the angle θY shown in FIG. 3 is half of the vertical angle of view of the camera 100, measured from the shooting direction of the camera 100, and the angle θy1 shown in the same figure is the vertical angle, relative to the shooting direction of the camera 100, of the upper left vertex (x1, y1) of the content 202 in the video 201.
  • Based on the shooting area information obtained from the shooting area acquisition unit 113 and the content display area information received by the display area receiving unit 114, the actual display area determination unit 115 calculates the angle θx1 shown in FIG. 2 and the angle θy1 shown in FIG. 3 using the following equations. The variable L in these equations is the distance from the plane representing the imaging area of the camera 100 to the camera 100.
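The equations themselves appear only as images in the published patent and are not reproduced in this text. One reconstruction consistent with the stated geometry (a frame of X×Y pixels spanning a plane at distance L, with half angles of view θX and θY; signs depend on the coordinate convention, which only the figures fix) is:

```latex
% Horizontal extent of the imaged plane: W = 2L\tan\theta_X, so pixel
% x_1 lies at offset \left(\tfrac{2x_1}{X}-1\right) L\tan\theta_X from the optical axis.
\theta_{x1} = \arctan\!\left(\left(\frac{2x_1}{X} - 1\right)\tan\theta_X\right),
\qquad
\theta_{y1} = \arctan\!\left(\left(\frac{2y_1}{Y} - 1\right)\tan\theta_Y\right)
```

Note that L cancels out of these particular expressions; it matters when the offsets on the imaged plane, rather than the angles, are needed.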
  • the area of the content 202 in the video 201 photographed by the camera 100 is represented by an angle based on the installation position and the photographing direction of the camera 100.
  • the actual display area determination unit 115 determines an area obtained by removing the area of the content 202 from the video 201 as an actual display area displayed on the display 140.
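The two steps above, converting content pixel coordinates into camera-relative angles and removing the content area from the video, can be sketched as follows. The published application gives its equations only as images, so this assumes a simple pinhole model in which the distance L cancels out of the normalized form; the function names are hypothetical:

```python
import math

def pixel_to_angles(x, y, width, height, half_angle_x, half_angle_y):
    """Convert a pixel position in the video to horizontal/vertical angles
    relative to the camera's shooting direction (pinhole-camera sketch)."""
    # Offset from the image center, normalized to [-1, 1].
    nx = (x - width / 2) / (width / 2)
    ny = (y - height / 2) / (height / 2)
    # On the plane at distance L the half-width is L*tan(half_angle),
    # so the distance L cancels out of the ratio.
    theta_x = math.atan(nx * math.tan(half_angle_x))
    theta_y = math.atan(ny * math.tan(half_angle_y))
    return theta_x, theta_y

def remove_content_area(video_w, video_h, content):
    """Split the video area minus the content rectangle
    content=(x1, y1, x2, y2) into up to four rectangles
    (top, bottom, left, right strips): the actual display area."""
    x1, y1, x2, y2 = content
    parts = [
        (0, 0, video_w, y1),        # strip above the content
        (0, y2, video_w, video_h),  # strip below the content
        (0, y1, x1, y2),            # strip left of the content
        (x2, y1, video_w, y2),      # strip right of the content
    ]
    # Drop empty strips (content touching an edge of the video).
    return [(l, t, r, b) for (l, t, r, b) in parts if r > l and b > t]

# A pixel at the right edge of the image maps to the full half field angle.
tx, _ = pixel_to_angles(1920, 540, 1920, 1080,
                        math.radians(30), math.radians(20))
```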
  • the light irradiation device 150 presents, to the user of the video transmission device 110, the video region captured by the camera 100 and displayed on the display 140 with light.
  • FIG. 4 is a schematic configuration diagram of the light irradiation device 150.
  • the light irradiation device 150 includes a two-dimensional scan mirror 300 that can be displaced about two orthogonal axes, a light source 151 that irradiates light onto the mirror unit 309 of the two-dimensional scan mirror, and a condenser lens 155.
  • the light source 151 is an LED that emits coherent light, for example, red light.
  • the light irradiation device 150 irradiates the area displayed on the display 140 with red light.
  • the condenser lens 155 is provided between the light source 151 and the two-dimensional scan mirror 300, and prevents light diffusion.
  • FIG. 5 shows a top view (a) and a cross-sectional view along line A-A (b) of the two-dimensional scan mirror 300.
  • FIG. 6 shows an α-axis sectional view (a) and a β-axis sectional view (b) of the two-dimensional scan mirror 300 shown in FIG. 5.
  • the two-dimensional scan mirror 300 includes a substrate 501, a post portion 401, β-axis excitation electrodes 403, α-axis excitation electrodes 405, a fixed frame 301, a β-axis coupling portion 303, a β-axis oscillating surface 305, an α-axis coupling portion 307, and a mirror unit 309.
  • the mirror unit 309 is swingable about two axes (the α axis and the β axis).
  • the mirror unit 309 is made of metal or silicon that reflects light.
  • a square post portion 401 (503 in FIG. 6) is provided on the substrate 501. Further, a square-shaped fixing frame 301 is supported on the post portion 401.
  • a β-axis swinging surface 305 is connected to the fixed frame 301 via the β-axis coupling portion 303. The β-axis swinging surface 305 can swing with respect to the fixed frame 301 about the β-axis coupling portion 303.
  • a mirror portion 309 is connected to the β-axis swinging surface 305 via the α-axis coupling portion 307. The mirror unit 309 can swing with respect to the β-axis swinging surface 305 about the α-axis coupling portion 307. Therefore, the mirror unit 309 can be displaced independently about the α axis and the β axis; that is, the mirror unit 309 can be displaced in any two-dimensional direction.
  • the two α-axis excitation electrodes 405 are arranged on the substrate 501 directly below the mirror unit 309, symmetrically about the α axis.
  • two β-axis excitation electrodes 403 are disposed on the substrate 501 directly below the β-axis swinging surface 305, symmetrically about the β axis.
  • when a control signal sent from the light irradiation control unit 116 of the video transmission device 110 is supplied to each excitation electrode of the two-dimensional scan mirror 300, the mirror unit 309 is displaced about the α axis, and the β-axis swinging surface 305 is displaced about the β axis.
  • FIG. 7 is a diagram showing the relationship between the electrostatic forces Fon and Foff and the spring force kZ of the swing surface.
  • the forces acting on the rocking surface 601 are the electrostatic forces Fon and Foff applied from the excitation electrodes 603A and 603B disposed immediately below the two ends of the rocking surface 601, and the spring force kZ of the rocking surface 601 itself.
  • the equation of motion for the rocking surface 601 is described below, where m is the mass of the mirror, b is the damping coefficient, Z(t) is the amount of displacement at time t, g is the gap between the excitation electrodes 603A and 603B and the oscillating surface 601, and k is the spring constant.
  • the electrostatic force Fon applied from the excitation electrode 603A is obtained by the following equation.
  • in this equation, ε0 is the dielectric constant of vacuum, w is the mirror width, g0 is the initial gap between the oscillating surface 601 and the excitation electrode 603A, td is the thickness of the dielectric on the mirror surface, εr is the relative dielectric constant of the dielectric, and Z is the amount of displacement.
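The equation of motion and the expression for Fon are printed only as images in the published application. For reference, the standard parallel-plate electrostatic actuator model consistent with the variables listed above takes the following form (an assumed reconstruction, not the patent's exact formulas):

```latex
% Standard parallel-plate actuator relations (assumed reconstruction).
m\,\ddot{Z}(t) + b\,\dot{Z}(t) + k\,Z(t) = F_{\mathrm{on}} - F_{\mathrm{off}},
\qquad
F_{\mathrm{on}} = \frac{\varepsilon_0\,A\,V^2}
                       {2\left(g_0 - Z + t_d/\varepsilon_r\right)^2}
```

Here A is the effective electrode area (on the order of the mirror width w times the electrode length) and V is the applied control voltage; the dielectric layer of thickness td contributes an equivalent air gap of td/εr.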
  • when a constant voltage is applied to the excitation electrode 603A, a constant electrostatic force Fon acts on the oscillating surface 601.
  • the oscillating surface 601 is displaced toward the excitation electrode 603A by the electrostatic force Fon, and at the same time the spring force kZ increases according to the amount of displacement. As shown in FIG. 8, the oscillating surface 601 reaches force balance at the position where the spring force kZ equals the electrostatic force Fon, and stops in that state.
  • the electrostatic force is inversely proportional to the square of the gap between the oscillating surface and the excitation electrode, whereas the spring force is proportional to the displacement. For this reason, when the displacement exceeds a predetermined position, the balance point becomes unstable and a so-called pull-in phenomenon occurs in which the oscillating surface is drawn into the excitation electrode. The displacement of the oscillating surface is therefore controlled so that it stays within about 1/3 of the initial gap between the excitation electrode and the oscillating surface.
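The 1/3-gap limit above matches the classic parallel-plate pull-in result: the equilibrium becomes marginal at a displacement of one third of the initial gap, reached at the so-called pull-in voltage. A sketch of that textbook model (the function names and example values are hypothetical, not the patent's own analysis):

```python
import math

EPS0 = 8.854e-12  # permittivity of free space [F/m]

def pull_in_voltage(k, g0, area):
    """Textbook pull-in voltage of a parallel-plate actuator:
    V_pi = sqrt(8 k g0^3 / (27 eps0 A)). Beyond this voltage the
    displacement passes g0/3 and no stable force balance exists."""
    return math.sqrt(8.0 * k * g0 ** 3 / (27.0 * EPS0 * area))

def is_stable(z, k, g0, area, v):
    """An equilibrium at displacement z is stable while the mechanical
    stiffness k exceeds the electrostatic stiffness eps0*A*v^2/(g0-z)^3."""
    return k > EPS0 * area * v ** 2 / (g0 - z) ** 3
```

For example, with k = 1 N/m, g0 = 3 µm, and A = 10^-8 m², a displacement past g0/3 = 1 µm at the pull-in voltage is unstable, while smaller displacements at lower voltages remain stable.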
  • FIG. 9 is a diagram illustrating the relationship between the control voltage and the swing angle of the swing surface.
  • the swing angle of the swing surface is uniquely determined with respect to the voltage of the control signal. That is, in order to displace the swing surface by a predetermined angle, a control signal having a predetermined voltage may be supplied to the excitation electrode.
  • when a predetermined control signal is supplied only to the right α-axis excitation electrode 405 shown in FIG. 5, the right side of the mirror unit 309 is displaced toward the substrate 501 about the α axis. At this time, the control signal is not supplied to the left α-axis excitation electrode 405 or to the β-axis excitation electrodes 403, which are kept at the same potential as the mirror unit 309.
  • when a predetermined control signal is supplied to the upper β-axis excitation electrode 403 shown in FIG. 5, the β-axis oscillating surface 305 is displaced toward the substrate 501 about the β axis. Since the β-axis oscillating surface 305 is connected to the mirror portion 309 by the α-axis coupling portion 307, when the β-axis oscillating surface 305 is displaced, the mirror portion 309 is also displaced in the same direction.
  • the light irradiation control unit 116 outputs a control signal of a predetermined voltage supplied to the excitation electrode described above.
  • the light irradiation control unit 116 determines the voltage of the control signal supplied to each excitation electrode based on the information determined by the actual display region determination unit 115. Further, the light irradiation control unit 116 controls the supply of control signals to the excitation electrodes independently for the α axis and the β axis. Therefore, the mirror unit 309 of the light irradiation device 150 can be inclined at an arbitrary two-dimensional angle with respect to the fixed frame 301. In other words, light in one direction emitted from the light source 151 can be reflected by the mirror unit 309 in any two-dimensional direction.
  • as for the control of the mirror unit 309 by the control signal, if the frequency of the control signal supplied to the α-axis excitation electrode 405 matches the natural frequency of the mirror unit 309 in the α-axis rotation direction, the mirror unit 309 can be resonantly excited with a low-voltage control signal.
  • similarly, if the frequency of the control signal supplied to the β-axis excitation electrode 403 matches the natural frequency of the β-axis oscillating surface 305, the β-axis oscillating surface 305 can be resonantly excited with a low-voltage control signal.
  • the timing at which the light source 151 of the light irradiation device 150 emits light is set to the interval between video frames captured by the camera 100.
  • FIG. 10 is a diagram illustrating a temporal relationship between each video frame when the camera 100 captures an image and a timing at which the light irradiation device 150 irradiates light. As shown in FIG. 10, the light irradiation device 150 irradiates light on the imaging region of the camera 100 during the time between video frames based on the synchronization signal of the camera 100 sent from the video acquisition unit 111. In other words, the light irradiation device 150 does not emit light at the timing when the camera 100 performs photographing.
  • the light of the light irradiation device 150 is not reflected in the video shot by the camera 100 and displayed on the display 140.
  • a natural color image is displayed on the display 140 regardless of the light irradiation by the light irradiation device 150.
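The frame-gap timing described above can be sketched as follows. For illustration, this assumes that each frame's exposure starts at the frame boundary; in the embodiment the actual windows are derived from the synchronization signal of the camera 100, and the function name is hypothetical:

```python
def irradiation_windows(fps, exposure_s, n_frames):
    """Compute the time windows (start, end) in which the light source may
    emit: the gaps between one frame's exposure and the next frame's start,
    so the irradiated light never appears in the captured video."""
    period = 1.0 / fps
    windows = []
    for i in range(n_frames):
        start = i * period + exposure_s  # exposure of frame i ends here
        end = (i + 1) * period           # exposure of frame i+1 begins here
        windows.append((start, end))
    return windows
```

For a 30 fps camera with a 20 ms exposure, each gap is roughly 13 ms long, which bounds how long the scan mirror has to sweep the irradiation pattern.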
  • FIG. 11 is a flowchart showing the operation of the video conference system of the present embodiment.
  • the user of the video reception device 130 instructs display of content (step S101).
  • the display area acquisition unit 134 of the video reception device 130 acquires information on the number of pixels in the vertical and horizontal directions of the video displayed on the display 140 and the position information of the content on the two-dimensional coordinates of the video, and outputs the data to the display area transmission unit 135.
  • the display area transmission unit 135 transmits the acquired information to the video transmission device 110 (step S103).
  • the actual display area determination unit 115 of the video transmission device 110 converts the position information of the content into angle information based on the installation position and the shooting direction of the camera 100, using the shooting area information and the content display area information (step S105). Further, the actual display area determination unit 115 determines an area obtained by removing the content area from the video captured by the camera 100 as an actual display area displayed on the display 140 (step S107). The light irradiation control unit 116 of the video transmission device 110 controls the light irradiation device 150 based on the determined information on the actual display area (step S109).
  • the light irradiation device 150 irradiates light onto the imaging region of the camera 100 corresponding to the region obtained by excluding the region of the content 202 from the image 201 displayed on the display 140. Therefore, the user of the video transmission device 110 can intuitively know the region 901 that is actually displayed on the display 140 of the video reception device 130 among the regions captured by the camera 100. In other words, the user can know the area 903 that is photographed by the camera 100 but is not actually displayed because of the content 202 superimposed on the display 140.
  • the light irradiation device 150 may include two light sources of colors having different wavelengths. In this case, as shown in FIG. 13, the light irradiation device 150 may irradiate the area 901 actually displayed on the display 140 with light of one color, and the area 903 that is not actually displayed because of the content 202 with light of the other color. At this time, since a user in the imaging region of the camera 100 can always visually recognize the light, the display status of the display 140 can be recognized more accurately.
  • the light irradiation device 150 may irradiate a predetermined region 905 that surrounds the outside of the region 901 with the same light as the light irradiated to the region 903. There may be some deviation between the light irradiation range of the light irradiation device 150 and the shooting range of the camera 100. However, if the generated shift is in the form shown in FIG. 14, it can be seen that “shift” occurs if the light irradiated to the region 905 is captured by the camera 100. At this time, by adjusting the region 901 and the region 905, the light irradiation range and the photographing range of the camera 100 can be made to exactly match.
  • the light irradiation device 150 may include three or more light sources that emit colors having different wavelengths.
  • the light irradiation device 150 may increase the amount of light when irradiating the outer periphery of the imaging region of the camera 100 and decrease the amount of light when irradiating the center of the imaging region.
  • a user who performs a video conference is often located near the center of the shooting area of the camera 100. Therefore, the conference participant does not feel the light of the light irradiation device 150 dazzlingly.
  • the light irradiation device 150 may increase the light amount when irradiating light to the corner of the room, and decrease the light amount when irradiating light to the center of the room, according to the size of the room. Since the light irradiated to the corner of the room becomes indirect light, the user does not feel the light of the light irradiation device 150 dazzlingly.
  • even when the video reception device 130 extracts a part of the received video captured by the camera 100 and displays it enlarged on the display 140, the video reception device 130 can use the same method to present, to the user on the video transmission device 110 side, the area not displayed on the display 140.
  • the gist of the present embodiment is to allow the user of the video conference device at the local site to grasp intuitively, through a reproduction on the display screen of the local device, the screen configuration of the display screen of the video conference device at the other site.
  • a video conference apparatus will be described as an example of an image display apparatus.
  • a plurality of video conference apparatuses are arranged at a plurality of bases, and the plurality of arranged video conference apparatuses constitute a video conference system.
  • Each video conference device communicates audio data and video data via a predetermined communication line.
  • the present invention is not limited to this.
  • FIG. 16 is a diagram showing an example of the screen configuration of the display screen of the video conference apparatus arranged at each site.
  • the example of FIG. 16 shows the screen configuration of each base when communication is performed using the video conference apparatus at two bases, the A base and the B base.
  • FIG. 16A shows a screen configuration of the video conference apparatus arranged at the A base
  • FIG. 16B shows a screen configuration of the video conference apparatus arranged at the B base.
  • a conference participant at the A site and a conference participant at the B site perform a video conference using their respective video conference devices.
  • the present embodiment is not limited to a video conference, and other communication may be performed using video data transmitted / received by a video conference device, such as a remote lecture or telemedicine.
  • the A base and the B base are not necessarily at a long distance.
  • the display screen 1100 of the A site shown in FIG. 16A has a main screen 1105, a sub-screen 1102, and a grandchild screen 1103.
  • the sub-screen 1102 is an example of a sub display area.
  • the main screen 1105 is a first display area in which video data such as the person 1101 at the B base photographed by the camera at the B base is displayed.
  • the sub-screen 1102 is a second display area in which the video data displayed on the main screen 1115 of the display screen 1110 at the B site is superimposed on a predetermined area in the main screen 1105.
  • the grandchild screen 1103 is a third display area in which the video data displayed on the child screen 1113 of the display screen 1110 at the B site is superimposed on a predetermined area in the child screen 1102.
  • the display screen 1110 of the B base shown in FIG. 16B similarly has a main screen 1115, a child screen 1113, and a grandchild screen 1114.
  • the main screen 1115 shown in FIG. 16B is a first display area in which video data such as the persons 1111 and 1112 at the A site photographed by the camera at the A site is displayed.
  • the sub-screen 1113 is a second display area in which the video data displayed on the main screen 1105 of the display screen 1100 at the A site is superimposed on a predetermined area in the main screen 1115.
  • the grandchild screen 1114 is a third display area in which the video data displayed on the child screen 1102 of the display screen 1100 at the A site is superimposed on a predetermined region in the child screen 1113.
  • in this way, the video data of the partner site is displayed on the main screen, and the display data displayed on the display screen of the partner site (hereinafter also referred to as partner screen display data) is displayed on the sub-screen.
  • the partner screen display data includes the display data of the partner's main screen and the display data of the partner's child screen. Therefore, in FIGS. 16A and 16B, when the display data of the main screen of the partner site is displayed on the sub-screen, the sub-screen of the partner site incidentally appears in the form of a grandchild screen.
  • each video data may be displayed on a screen different from the screen described above.
  • the screen configuration of the A site shown in FIG. 16C displays the video data displayed on the main screen 1115 of the display screen 1110 of the B site on the main screen 1105.
  • the sub-screen 1102 displays video data of the person 1101 at the B site and the like taken by the camera at the B site.
  • the child screen 1113 of the display screen 1110 at the B site is superimposed on the main screen 1105 on which the main screen 1115 at the B site is displayed. Display data other than video data may be displayed on each screen.
  • a child screen 1113 is displayed in the upper left area of the display screen 1110, and the person 1111 of the A site displayed on the main screen 1115 of the B site is hidden behind the child screen 1113.
  • a person participating in the video conference at the A site looks at the child screen 1102 displayed in the display screen 1100 shown in FIG. 16A, and can intuitively understand that the person 1111 is hidden by the child screen and is not displayed on the screen at the B site. Accordingly, the person 1111 at the A site can move so as to be displayed on the screen at the B site as needed, without being verbally pointed out by a person at the B site. Further, when showing a subject to the other party, a person at the A site can arrange the subject so as to be always displayed on the partner screen, that is, the display screen 1110 at the B site, while viewing the sub-screen 1102.
  • FIG. 17 is a block diagram illustrating an example of a main configuration of each video conference apparatus according to the present embodiment.
  • the video conference apparatus arranged at the point A is assumed to be 1001A
  • the video conference apparatus arranged at the point B is assumed to be 1001B.
  • the video conference apparatuses 1001A and 1001B shown in FIG. 17 each include a camera 1200 that captures a subject at its own site, a video transmission/reception device 1210 that transmits and receives video data of the camera 1200, a display 1230 that displays received video data, and an input device 1240.
  • the video transmission / reception device 1210 transmits the video data of the camera 1200 to the video conference device at the partner site via the network 1220 and receives video data from the video conference device at the partner site via the network 1220.
  • the display 1230 displays video data received by the video transmission / reception device 1210.
  • the input device 1240 is a mouse, a remote controller, or the like that designates the configuration of a display screen displayed on the display 1230 by the video transmission / reception device 1210 in accordance with a user instruction.
  • the video transmission / reception device 1210 will be described in detail.
  • the video conference apparatus 1001A will be described for easy understanding, but the video conference apparatus 1001B is a device that is paired with the video conference apparatus 1001A and has the same components and functions. Also in FIG. 17, detailed components are omitted for the video transmission / reception device 1210 of the video conference device 1001B.
  • the video transmission/reception device 1210 includes a video acquisition unit 1211, a video transmission unit 1212, a video reception unit 1213, a video display unit 1214, an operation unit 1215, a layout transmission unit 1216, a layout reception unit 1217, a layout determination unit 1218, and a partner screen configuration display unit 1219.
  • the video acquisition unit 1211 acquires video captured by the camera 1200 (video including shooting targets such as people 1111 and 1112) as video data.
  • the acquired video data is used, for example, as video data displayed on the sub-screen 1102 in the video conference apparatus 1001A at site A, which is its own site. Further, the acquired video data is used as video displayed on the main screen 1115 and grandchild screen 1114 in the video conference apparatus 1001B at the B site which is the partner site.
  • the video transmission unit 1212 encodes the video data acquired by the video acquisition unit 1211 (hereinafter, the encoded video data is referred to as encoded video data), and converts it into a data format that can be transmitted to the network 1220. Next, the video transmission unit 1212 transmits the encoded video data to the video conference apparatus 1001B via the network 1220. Note that the encoded video data transmitted by the video conference apparatus 1001A is received by the video reception unit 1213 of the video conference apparatus 1001B.
  • the video receiving unit 1213 receives encoded video data including a subject to be photographed such as the person 1101 transmitted from the video conference apparatus 1001B via the network 1220, and converts it into a format that can be displayed on the display 1230.
  • This video data is used, for example, as video data displayed on the main screen 1105 and grandchild screen 1103 in the video conference apparatus 1001A and as video data displayed on the sub-screen 1113 in the video conference apparatus 1001B.
  • the video data received by the video conference apparatus 1001A is encoded video data transmitted by the video transmission unit 1212 of the video conference apparatus 1001B.
  • the video receiving unit 1213 may receive display data other than the encoded video data.
  • the layout receiving unit 1217 receives the partner screen configuration information transmitted from the video conference apparatus 1001B via the network 1220.
  • the partner screen configuration information is screen configuration information for configuring each screen, such as the main screen 1115 and the sub-screen 1113, on the display screen 1110 of the video conference apparatus arranged at the B site, which is the partner site. Details of the partner screen configuration information will be described later.
  • the partner screen configuration information received by the video conference apparatus 1001A is the own screen configuration information transmitted by the layout transmission unit 1216 of the video conference apparatus 1001B.
  • the operation unit 1215 acquires information specifying whether or not the video display unit 1214 displays the child screen 1102 via the input device 1240 in accordance with an instruction from the user of the video conference device 1001A. Alternatively, the operation unit 1215 similarly acquires information that specifies the position of the sub-screen 1102 with respect to the display screen 1100.
  • this designation information is, for example, pattern selection information for selecting one of a plurality of predetermined patterns stored in a storage unit (not shown) of the video conference apparatus 1001A, or coordinate information specifying an area such as the upper left area or the lower right area.
  • Each screen such as the main screen and the sub-screen does not necessarily have a rectangular shape, and may have another shape such as a circular shape.
  • FIG. 18 shows an example of pattern selection information for selecting designation information.
  • the pattern selection information is information for displaying the child screen 1102, on which the second display data 1402 is displayed, superimposed on a specific display area of the main screen 1105, on which the first display data 1401 is displayed. Specific display areas are, for example, the lower right (FIG. 18A), the upper right (FIG. 18B), the upper left (FIG. 18C), and the lower left (FIG. 18D). In the example shown in FIG. 18(e), the display contents are reversed from those in FIGS. 18(a) to (d): the sub-screen 1102 on which the first display data 1401 is displayed may be set at the lower right of the main screen 1105 on which the second display data 1402 is displayed. Which display form is used is determined by the operation unit 1215 via the input device 1240.
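The pattern selection of FIG. 18 can be sketched as a mapping from a selected pattern to the child-screen rectangle on the display screen. The pattern names, the scale factor, and the function name below are hypothetical illustrations, not values from the embodiment:

```python
def child_screen_rect(pattern, screen_w, screen_h, scale=0.25):
    """Map a pattern selection (cf. FIG. 18 (a)-(d)) to the child-screen
    rectangle (x2, y2, x2', y2') on a screen_w x screen_h display."""
    w, h = int(screen_w * scale), int(screen_h * scale)
    corners = {
        "lower_right": (screen_w - w, screen_h - h),  # FIG. 18A
        "upper_right": (screen_w - w, 0),             # FIG. 18B
        "upper_left":  (0, 0),                        # FIG. 18C
        "lower_left":  (0, screen_h - h),             # FIG. 18D
    }
    x, y = corners[pattern]
    return (x, y, x + w, y + h)
```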
  • the operation unit 1215 may also acquire, via the input device 1240 in accordance with an instruction from the user of the video conference apparatus 1001A, information designating whether the partner screen display data is displayed on the main screen 1105 or the sub-screen 1102.
  • the operation unit 1215 generates own screen configuration information based on these pieces of information.
  • the self-screen configuration information is screen configuration information for configuring each screen such as the main screen 1105 and the sub-screen 1102 on the display screen 1100 of the video conference apparatus disposed at the A site that is the home site. Details of the self-screen configuration information will be described later.
  • the layout determination unit 1218 determines, based on the partner screen configuration information from the layout receiving unit 1217 and the own screen configuration information from the operation unit 1215, a reproduction area for reproducing and displaying the partner screen display data on the display screen of the local site. This reproduction area is set in the child screen 1102 or the main screen 1105 in the display screen 1100. For example, when the partner screen display data is displayed on the child screen 1102, a predetermined area in the child screen 1102 becomes the reproduction area, and a grandchild screen 1103 is additionally superimposed on that predetermined area (see, for example, FIG. 16A).
  • similarly, when the partner screen display data is displayed on the main screen 1105, a predetermined area in the main screen 1105 becomes the reproduction area, and a grandchild screen 1103 is additionally superimposed on that predetermined area (see, for example, FIG. 16C).
  • the layout transmission unit 1216 transmits the self-screen configuration information from the operation unit 1215 to the video conference device 1001B via the network 1220.
  • the layout transmission unit 1216 may transmit the self-screen configuration information at the timing when the child screen 1102 is displayed on the display screen, for example, according to an instruction from the input device 1240. Note that the self-screen configuration information transmitted by the video conference apparatus 1001A is received as the partner screen configuration information by the layout receiving unit 1217 of the video conference apparatus 1001B.
  • the video display unit 1214 displays data other than the partner screen display data on the main screen 1105 and the like.
  • the video display unit 1214 displays video data including a shooting target such as the person 1101 from the video receiving unit 1213.
  • the video display unit 1214 relays the video data when the partner screen configuration display unit 1219 displays the video data received by the video reception unit 1213.
  • the partner screen configuration display unit 1219 displays the partner screen display data on the child screen 1102 or the like based on the display position determined by the layout determination unit 1218.
  • the partner screen configuration display unit 1219 displays, in the child screen 1102, the video data including shooting targets such as the person 1101 from the video display unit 1214, and, in the grandchild screen 1103, the video data including shooting targets such as the persons 1111 and 1112 from the video acquisition unit 1211.
  • the self-screen configuration information generated by the operation unit 1215 includes first self-screen configuration information for displaying the main screen 1105 and second self-screen configuration information for displaying the sub-screen 1102.
  • the self-screen configuration information includes display position information of the main screen 1105 on the display screen 1100, display position information of the sub-screen 1102 on the display screen 1100, and resolution information of the display screen 1100.
  • the resolution information is information indicating a unit as a pixel, for example.
  • the self-screen configuration information may include information (display order information) indicating the number of the main screen 1105 or the sub-screen 1102 displayed on the display screen 1100 from the front side of the screen.
  • for example, the first self-screen configuration information is generated as (x1, y1, x1′, y1′, N1), and the second self-screen configuration information as (x2, y2, x2′, y2′, N2).
  • x1 and y1 indicate the upper left coordinates of the rectangle of the main screen 1105.
  • x1′ and y1′ indicate the lower right coordinates of the rectangle of the main screen 1105.
  • x2 and y2 indicate the upper left coordinates of the rectangle of the child screen 1102.
  • x2 ′ and y2 ′ indicate the lower right coordinates of the rectangle of the child screen 1102.
  • N1 and N2 are values indicating the display order of the main screen 1105 and the sub-screen 1102, and a larger value is displayed on the front side of the display screen 1100.
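The self-screen configuration tuples above can be sketched in code: drawing screens back to front in ascending N reproduces the rule that a larger N appears nearer the front of the display screen 1100. The names and example coordinates are illustrative, not from the embodiment:

```python
def draw_order(screens):
    """Return screen-configuration tuples (x, y, x', y', N) sorted
    back-to-front, so that a screen with larger N is painted last
    and therefore appears nearer the front."""
    return sorted(screens, key=lambda s: s[4])

# Example: a full-screen main screen behind a lower-right child screen.
main_screen = (0, 0, 1920, 1080, 1)        # first self-screen configuration
child_screen = (1440, 810, 1920, 1080, 2)  # second self-screen configuration
```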
  • the resolution information is omitted here; it is unnecessary when the layout determination unit 1218 determines the display position of each screen and the like, but may be necessary when the information is transmitted from the layout transmission unit 1216 to the video conference apparatus at the partner site.
  • the partner screen configuration information received by the layout receiving unit 1217 includes the first partner screen configuration information used by the video conference apparatus 1001B to display the main screen 1115 and the second partner screen configuration information used to display the sub-screen 1113.
  • the partner screen configuration information includes display position information on the display screen 1110 of the main screen 1115 of the video conference apparatus 1001B, display position information on the display screen 1110 of the sub-screen 1113, and resolution information on the display screen 1110. Further, the partner screen configuration information includes information (display order information) indicating what number the main screen 1115 or the sub-screen 1113 is displayed on the display screen 1110 from the front side of the screen.
  • the first partner screen configuration information (x3, y3, x3′, y3′, N3) and the second partner screen configuration information (x4, y4, x4′, y4′, N4) are generated, and (X, Y) is included as resolution information.
  • x3 and y3 indicate the upper left coordinates of the rectangle of the main screen 1115.
  • x3 ′ and y3 ′ indicate the lower right coordinates of the rectangle of the main screen 1115.
  • x4 and y4 indicate the upper left coordinates of the rectangle of the child screen 1113.
  • x4 ′ and y4 ′ indicate the lower right coordinates of the rectangle of the child screen 1113.
  • N3 and N4 are values indicating the display order of the main screen 1115 and the sub-screen 1113, and the larger the value, the closer to the display screen 1110 is displayed.
  • X represents the horizontal resolution of the display screen 1110.
  • Y represents the vertical resolution of the display screen 1110.
  • FIG. 19 is a flowchart showing an example of the operation when the layout determining unit 1218 determines the screen configuration of the display screen.
  • the display position of each screen, the position of the reproduction area, the display order, etc. are determined.
  • the display position of each screen is determined from the coordinate information.
  • the video conference device at its own site is 1001A and the video conference device at the other site is 1001B.
  • the partner screen display data is displayed on the child screen 1102.
  • the layout determining unit 1218 acquires the partner screen configuration information from the layout receiving unit 1217 (step S1011).
  • the layout determining unit 1218 acquires the display position information of the main screen 1115, the display position information of the sub-screen 1113, and the resolution information of the display screen 1110 included in the acquired partner screen configuration information (step S1012).
  • the layout determination unit 1218 includes first self-screen configuration information, second self-screen configuration information from the operation unit 1215, first partner screen configuration information, second partner screen configuration information, partner screen from the layout reception unit 1217. Get resolution information for. Subsequently, the layout determination unit 1218 determines the screen configuration on the display screen 1100 of the video conference device 1001A based on the acquired information. At this time, the layout determination unit 1218 also calculates the drawing position of the reproduction area where the partner screen display data is reproduced and displayed (step S1013).
  • the layout determining unit 1218 sets a region corresponding to the main screen 1115 of the video conference apparatus 1001B displayed in the display region of the sub-screen 1102 as a main screen corresponding region 1115A. Furthermore, the layout determination unit 1218 sets a region corresponding to the child screen 1113 of the video conference apparatus 1001B displayed in the display region of the child screen 1102 as a child screen corresponding region 1113A.
  • These corresponding areas 1115A and 1113A are examples of reproduction display areas indicating a plurality of display areas in the reproduction area.
  • the layout determining unit 1218 generates first corresponding screen configuration information for displaying the main screen corresponding area 1115A and second corresponding screen configuration information for displaying the child screen corresponding area 1113A.
  • the first corresponding screen configuration information includes display position information with respect to the display screen 1100 in the main screen corresponding area 1115A and information (display order information) indicating what number is displayed from the front side of the screen in the display screen 1100.
  • the second corresponding screen configuration information includes display position information with respect to the display screen 1100 of the sub-screen corresponding area 1113A and information (display order information) indicating what number is displayed from the front side of the screen in the display screen 1100. .
  • information of (x5, y5, x5 ′, y5 ′, N5) is generated as the first corresponding screen configuration information
  • information (x6, y6, x6 ′, y6 ′, N6) is generated as the second corresponding screen configuration information.
  • x5 and y5 indicate the upper left coordinates of the rectangle of the main screen corresponding area 1115A.
  • x5 'and y5' indicate the lower right coordinates of the rectangle of the main screen corresponding area 1115A.
  • x6 and y6 indicate the upper left coordinates of the rectangle of the small screen corresponding area 1113A.
  • x6 'and y6' indicate the lower right coordinates of the rectangle of the sub-screen corresponding area 1113A.
  • N5 and N6 are values indicating the display order of the main screen corresponding area 1115A and the sub-screen corresponding area 1113A; a larger value is displayed closer to the front of the display screen 1100.
  • the sub-screen corresponding area 1113A thus corresponds to the grandchild screen 1103.
  • x5, y5, x5 ', y5', x6, y6, x6 ', y6' can be expressed by the following (Formula 5) based on the own screen configuration information and the partner screen configuration information.
  • the layout determining unit 1218 can determine the position of the reproduction area on the display screen 1100.
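The body of (Formula 5) is not reproduced in this text. Under the assumption that the partner display screen 1110 is scaled linearly into the rectangle of the sub-screen 1102, the corresponding areas 1115A and 1113A could be computed as in the following sketch (coordinate values are illustrative):

```python
def map_partner_rect(px1, py1, px2, py2, sub, X, Y):
    """Map a rectangle given in partner-display coordinates (resolution X x Y)
    into the sub-screen rectangle `sub` = (sx1, sy1, sx2, sy2) on the local
    display, assuming a linear scaling of the whole partner screen into the
    sub-screen (one plausible reading of Formula 5, which is not reproduced
    in the text)."""
    sx1, sy1, sx2, sy2 = sub
    w, h = sx2 - sx1, sy2 - sy1
    return (sx1 + px1 * w / X, sy1 + py1 * h / Y,
            sx1 + px2 * w / X, sy1 + py2 * h / Y)

# Sub-screen 1102 on the local display (illustrative coordinates).
sub_1102 = (960, 540, 1280, 720)
# Main screen corresponding area 1115A: the partner main screen 1115 fills
# the whole partner display, so it maps onto the whole sub-screen 1102.
area_1115A = map_partner_rect(0, 0, 1920, 1080, sub_1102, 1920, 1080)
```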
  • if, as a result, the horizontal length of the reproduction area is shorter than the horizontal length of the sub-screen 1102, the remaining portion may be displayed in monochrome black.
  • the layout determining unit 1218 determines the display order for the main screen 1105, the main screen corresponding area 1115A of the reproduction area, and the sub screen corresponding area 1113A (step S1014).
  • the display order is determined by letting Nmax be the largest display order information value (here, among N3 and N4) included in the partner screen configuration information; N5 and N6 can then be expressed by the following (Formula 6).
  • the layout determining unit 1218 sends the corresponding screen configuration information (x5, y5, x5′, y5′, N5) and (x6, y6, x6′, y6′, N6), determined as described above, to the partner screen configuration display unit 1219. Further, the layout determining unit 1218 sends the first self-screen configuration information (x1, y1, x1′, y1′, N1) to the video display unit 1214 (step S1015).
  • the partner screen configuration display unit 1219 and the video display unit 1214 compare the superimposition-order values N1, N5, and N6, and display the screen with the larger value closer to the front of the display screen 1100. Specifically, display data corresponding to the main screen 1105, the main screen corresponding area 1115A in the sub-screen 1102, and the sub-screen corresponding area 1113A corresponding to the grandchild screen 1103 are displayed. In other words, the grandchild screen 1103, whose display order information is largest, is displayed in the forefront.
  • within the sub-screen 1102, the partner screen configuration display unit 1219 displays the main screen corresponding area 1115A at the position indicated by the first corresponding screen configuration information (x5, y5, x5′, y5′, N5), reducing or enlarging it as appropriate.
  • likewise, within the sub-screen 1102, the partner screen configuration display unit 1219 displays the sub-screen corresponding area 1113A at the position indicated by the second corresponding screen configuration information (x6, y6, x6′, y6′, N6), reducing or enlarging it as appropriate.
  • the processing shown in FIG. 19 is performed to display the video data (including the sub screen) displayed on the display screen of the video conference device at the partner site on the display screen of the video conference device at the local site.
  • the user at the local site can intuitively grasp whether or not the video data of the local site displayed on the display screen at the partner site is hidden behind the sub-screen.
  • a screen is configured from video data already held at the local site, based on the partner screen configuration information, which is layout information acquired from the partner site. This embodiment can therefore be realized with only the minimal amount of information, the partner screen configuration information, without transmitting or receiving extra video data.
  • the layout receiving unit 1217 receives the partner screen configuration information including the third partner screen configuration information, so that the layout determining unit 1218 can determine the display position and display order of the document data and display the document data in the reproduction area.
  • the layout determination unit 1218 may generate self-screen configuration information including information on the document data, and the layout transmission unit 1216 may transmit the information to the video conference device at the partner site.
  • the area display is similarly performed.
  • information called fourth partner screen configuration information (x8, y8, x8′, y8′, N8), different from the first to third partner screen configuration information described above, is prepared.
  • the layout receiving unit 1217 receives the partner screen configuration information including the fourth partner screen configuration information, so that the layout determining unit 1218 can determine the display position and display order of the GUI and the like and display them in the reproduction area.
  • the layout determining unit 1218 may generate self-screen configuration information including information such as GUI and the like, and the layout transmitting unit 1216 may transmit the information to the video conference device at the partner site.
  • instead of displaying the video data of the local site and the remote site, an icon or the like indicating that the window or GUI is displayed may be shown on the sub-screen of the display screen.
  • an item V of information indicating the type of image displayed on the main screen and the sub screen of the video conference device at the partner base is added.
  • information indicating the type of image is assigned as 1 for the self-portrait, 2 for the counterpart image, 3 for the material video data, and the like.
  • the video display unit 1214 and the partner screen configuration display unit 1219 can identify what type of video data is displayed on the child screen 1102 and the grandchild screen 1103.
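A minimal sketch of how such a type item V might be attached and decoded; the tuple layout and the label strings are assumptions for illustration, not part of this description:

```python
# Hypothetical encoding of the image-type item V: 1 = self-portrait,
# 2 = counterpart image, 3 = material video data.
IMAGE_TYPE_LABELS = {
    1: "self-portrait",
    2: "counterpart image",
    3: "material video data",
}

def image_type(config):
    """config: (x, y, x', y', N, V) -- the usual rectangle and display
    order extended with the type item V; return a human-readable label."""
    *_, v = config
    return IMAGE_TYPE_LABELS.get(v, "unknown")
```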
  • a video conference apparatus will be described as an example of an image display apparatus.
  • a plurality of video conference apparatuses are arranged at a plurality of bases, and the plurality of arranged video conference apparatuses constitute a video conference system.
  • Each video conference system performs communication of audio data and video data via a predetermined communication line.
  • FIG. 21 is a block diagram illustrating an example of a main configuration of each video conference apparatus according to the present embodiment.
  • the video conference apparatus arranged at the point A is 2001A
  • the video conference apparatus arranged at the point B is 2001B.
  • the video conference apparatus 2001B is an example of another image display apparatus.
  • in FIG. 21, it is assumed that the conference participant at site A and the conference participant at site B conduct a video conference using their respective video conference apparatuses.
  • the configuration of FIG. 21 is not limited to the video conference, and other communication may be performed using video and audio transmitted and received by the video conference device.
  • the A base and the B base are not necessarily at a long distance.
  • a case where two video conference apparatuses are arranged at two locations will be described, but the present invention is not limited to this.
  • each video conference apparatus in FIG. 21 includes a camera 2100, a video transmission / reception device 2110, a display 2140, and an input device 2150.
  • the camera 2100 captures a subject such as a person at the base.
  • the video transmission / reception device 2110 acquires the video data of the camera 2100, transmits the video data to the video conference device at the partner site via the network 2130, and receives the video from the video conference device at the partner site via the network 2130.
  • the display 2140 displays video data received by the video transmission / reception device 2110.
  • the input device 2150 includes a mouse, a remote controller, and the like that perform various operation inputs in accordance with user instructions.
  • the video transmission / reception device 2110 will be described in detail.
  • the video conference apparatus 2001A will be described for easy understanding.
  • the video conference apparatus 2001B is a device paired with the video conference apparatus 2001A, and has the same components and functions. Also in FIG. 21, detailed components are omitted for the video transmission / reception device 2110 of the video conference device 2001B.
  • a screen display example will be described with reference to FIG.
  • the video transmission / reception device 2110 includes a video acquisition unit 2111, a video transmission unit 2112, a video reception unit 2113, a video display unit 2114, an operation unit 2115, a layout transmission unit 2116, a layout reception unit 2117, a self-screen display unit 2118, a determination unit 2119, A camera control unit 2120, a subject detection unit 2121, and a layout determination unit 2122 are provided.
  • the video acquisition unit 2111 acquires video captured by the camera 2100 (video including subjects such as people 2021 and 2022) as video data.
  • the video data is displayed on the sub-screen 2012 as the sub display area in the video conference apparatus 2001A at site A, the local site, and on the main screen 2025 as the main display area in the video conference apparatus 2001B at site B, the partner site.
  • the video transmission unit 2112 encodes the video data acquired by the video acquisition unit 2111 and converts it into a data format that can be transmitted to the network (hereinafter, the encoded video data is referred to as encoded video data). Next, the video transmission unit 2112 transmits the encoded video data to the video conference apparatus 2001B via the network 2130. Note that the encoded video data transmitted by the video conference apparatus 2001A is received by the video reception unit 2113 of the video conference apparatus 2001B.
  • the video reception unit 2113 receives encoded video data including a subject such as the person 2011 transmitted from the video conference apparatus 2001B via the network 2130, and converts it into a format that can be displayed on the display 2140.
  • the video data is displayed on the main screen 2015 in the video conference apparatus 2001A, and is displayed on the sub-screen 2023 in the video conference apparatus 2001B.
  • the encoded video data received by the video conference apparatus 2001A is encoded video data transmitted by the video transmission unit 2112 of the video conference apparatus 2001B.
  • the video display unit 2114 displays video data including a subject such as the person 2011 from the video receiving unit 2113, shared data shared with the video conference apparatus 2001B at the partner site, or the like on the main screen 2015 or the sub-screen 2012. Note that the video display unit 2114 displays data other than the video data captured by the camera 2100 at the site A (that is, other than the self-portrait) on the main screen 2015 or the sub-screen 2012. In addition, the video display unit 2114 sends the resolution information of the display screen 2010 when displaying on the main screen 2015 or the sub-screen 2012 to the layout determination unit 2122.
  • the layout receiving unit 2117 receives the partner screen configuration information transmitted from the video conference apparatus 2001B via the network 2130.
  • the partner screen configuration information includes screen configuration information for configuring each screen such as the main screen 2025 and the sub screen 2023 on the display screen 2020 of the video conference apparatus 2001B arranged at the base B which is the partner base. Details of the partner screen configuration information will be described later.
  • the display screen 2020 is an example of another display screen, and the partner screen configuration information is an example of other screen configuration information.
  • the other party screen configuration information received by the video conference apparatus 2001A is the own screen configuration information transmitted by the layout transmission unit 2116 of the video conference apparatus 2001B.
  • the operation unit 2115 receives, from the input device 2150, information specifying the display of the sub-screen 2012 on the self-screen display unit 2118 and information specifying the position of the sub-screen 2012 with respect to the display screen 2010.
  • This designation information is, for example, pattern selection information for selecting one from a plurality of predetermined patterns stored in a storage unit (not shown) of the video conference apparatus 2001A.
  • the designation information is, for example, coordinate information designating the upper left and lower right corners of the sub-screen 2012 when it is rectangular.
  • Each screen such as the main screen and the sub-screen does not necessarily have a rectangular shape, and may have another shape such as a circular shape.
  • FIG. 22 shows an example of pattern selection information.
  • the pattern selection information is information for displaying the sub-screen 2012, on which the second display data 2202 is displayed, superimposed on the display area of the main screen 2015, on which the first display data 2201 is displayed.
  • the pattern selection information indicates the arrangement of the sub-screen 2012 displayed on the main screen 2015: lower right (FIG. 22(a)), upper right (FIG. 22(b)), upper left (FIG. 22(c)), lower left (FIG. 22(d)), and the like.
  • Which display form is used is determined by the operation unit 2115 via the input device 2150.
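The four placement patterns of FIG. 22 could be translated into sub-screen coordinates as in this sketch; the size fraction and margin are illustrative parameters, not values from this description:

```python
def sub_screen_rect(pattern, X, Y, frac=0.25, margin=16):
    """Return (x1, y1, x2, y2) for the sub-screen 2012 for one of the four
    placement patterns of FIG. 22. `frac` (sub-screen size as a fraction of
    the display resolution X x Y) and `margin` are illustrative parameters."""
    w, h = int(X * frac), int(Y * frac)
    positions = {
        "lower_right": (X - margin - w, Y - margin - h),
        "upper_right": (X - margin - w, margin),
        "upper_left": (margin, margin),
        "lower_left": (margin, Y - margin - h),
    }
    x1, y1 = positions[pattern]
    return (x1, y1, x1 + w, y1 + h)
```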
  • the contents displayed on the main screen 2015 and the sub-screen 2012 are, for example, the video data of the local site acquired by the video acquisition unit 2111, the video data of the partner site received by the video receiving unit 2113, content data shared by both sites, or any other content data.
  • the operation unit 2115 acquires information for designating whether or not to display the self-portrait in response to a user instruction from the video conference apparatus 2001A. In addition, the operation unit 2115 sends the designation information to the layout determination unit 2122.
  • the layout determination unit 2122 generates self-screen configuration information based on the designation information from the operation unit 2115 and the resolution information of the display screen 2010 from the video display unit 2114.
  • the self-screen configuration information is screen configuration information for configuring each screen such as the main screen 2015 and the sub-screen 2012 on the display screen 2010 of the video conference apparatus arranged at the A site that is the home site. Details of the self-screen configuration information will be described later.
  • the layout transmission unit 2116 transmits the self-screen configuration information from the layout determination unit 2122 to the video conference apparatus 2001B via the network 2130.
  • the layout transmission unit 2116 may transmit the self-screen configuration information at the timing when the child screen 2012 is displayed on the display screen, for example, in accordance with the input of the input device 2150. Note that the self-screen configuration information transmitted by the video conference apparatus 2001A is received as the partner screen configuration information by the layout receiving unit 2117 of the video conference apparatus 2001B.
  • the self screen display unit 2118 displays the video data (that is, the self image) acquired by the video acquisition unit 2111 on the main screen 2015 or the sub screen 2012 based on the self screen configuration information from the layout determination unit 2122. At this time, the self-screen display unit 2118 appropriately performs reduction or enlargement according to the size of the sub-screen 2012 designated by the operation unit 2115. Note that the sub-screen 2012 is not limited to one.
  • the subject detection unit 2121 detects various subjects such as a face, a person, and materials included in the video data from the video acquisition unit 2111.
  • examples of subject detection methods include a background difference method, in which a background image without a subject is acquired in advance and the difference from the current video is used to check for the presence of a subject, and face detection and person detection methods, which detect face portions or persons based on feature extraction of the subject.
  • the subject detection unit 2121 may use a moving object detection method that detects the motion of the detection target of the video data as the subject detection method.
  • the subject detection unit 2121 detects in which region of the video data the detected subject is located. This detection position can be expressed, for example, as position coordinates of a face area described in FIG.
  • the determination unit 2119 determines whether or not the child screen and the predetermined subject overlap on the display screen of the partner site. Next, the determination unit 2119 determines a control method (camera rotation direction, zoom magnification, etc.) of the camera 2100 arranged at its own site based on the determination result. That is, the determination unit 2119 gives an instruction to control the shooting state of the camera 2100 based on whether or not the sub-screen and the predetermined subject overlap on the display screen of the partner site.
  • the camera control unit 2120 generates a camera control command for controlling the camera 2100 based on the control method (camera rotation direction, zoom magnification, etc.) of the camera 2100 by the determination unit 2119.
  • the camera control command is a command that can be recognized by the camera 2100 and is, for example, a character string such as an ASCII code. In response to this command, the camera 2100 is controlled. That is, the camera control unit 2120 actually controls the shooting state of the camera 2100.
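The text only states that the command is a character string the camera can recognize, such as an ASCII code. A hypothetical sketch of building such a command (the PAN/TILT/ZOOM keyword syntax is invented for illustration and is not the actual command set of any camera):

```python
def make_camera_command(action, value):
    """Build an ASCII camera-control command string of the kind the camera
    control unit 2120 might emit. The keyword syntax used here is purely
    illustrative; the description only requires a recognizable character
    string such as an ASCII code."""
    if action not in ("PAN", "TILT", "ZOOM"):
        raise ValueError(f"unknown action: {action}")
    # Signed fixed-point value, terminated like a typical serial command.
    return f"{action} {value:+.1f}\r\n"
```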
  • the self-screen configuration information generated by the layout determining unit 2122 includes information on the upper left and lower right coordinates of the rectangle of the child screen 2012, the horizontal resolution and the vertical resolution of the display screen 2010.
  • the resolution information is, for example, expressed in units of pixels. Note that the resolution information is not needed when determining the display position of the sub-screen 2012, but is needed when transmitting from the layout transmission unit 2116 to the video conference device at the partner site.
  • the self-screen configuration information may include information regarding each data displayed on the child screen 2012 and the main screen 2015.
  • the partner screen configuration information received by the layout receiving unit 2117 includes partner screen configuration information used by the video conference apparatus 2001B to display the sub-screen 2023.
  • the counterpart screen configuration information includes information on the upper left and lower right coordinates of the rectangle of the sub-screen 2023, and the horizontal and vertical resolution of the display screen 2020. Further, the counterpart screen configuration information may include information regarding each data displayed on the sub-screen 2023 and the main screen 2025.
  • FIG. 23 is a flowchart illustrating an example of the operation of the determination unit 2119 in the present embodiment.
  • the video conference apparatus placed at the local site is 2001A
  • the video conference apparatus placed at the partner site is 2001B.
  • video data captured by the camera 2100 at the local site is displayed on the main screen 2025 of the video conference apparatus 2001B
  • the sub-screen 2023 is displayed at any position on the main screen 2025.
  • the number of participants in the video conference at the local site is assumed to be two.
  • the subject detection unit 2121 performs face detection as subject detection.
  • the camera at the local site will be described as the camera 2100A.
  • the determination unit 2119 instructs the subject detection unit 2121 to perform a predetermined face detection process on video data as an input image from the video acquisition unit 2111.
  • This face detection process uses a known technique (step S2011).
  • the determination unit 2119 acquires the partner screen configuration information from the layout receiving unit 2117. Then, the determination unit 2119 acquires the display position information of the child screen 2023 and the resolution information of the display screen 2020 of the video conference apparatus 2001B from the partner screen configuration information (step S2012).
  • the determination unit 2119 acquires area information of the face area detected by the subject detection unit 2121. In the present embodiment, it is assumed that two face regions are detected. The determination unit 2119 determines whether or not the child screen 2023 overlaps at least one of the two face areas detected by the subject detection unit 2121 on the display screen 2020 of the video conference apparatus 2001B (step S2013).
  • information (x1, y1, x2, y2) is included as display position information of the child screen 2023 included in the partner screen configuration information.
  • x1 and y1 indicate the upper left coordinates of the rectangle of the child screen 2023 on the display screen 2020.
  • x2 and y2 indicate the lower right coordinates of the rectangle of the child screen 2023 on the display screen 2020.
  • the position of the first face area is represented as (1, xf1, yf1), (1, xf2, yf2), and the position of the second face area is (2, xf1, yf1), (2, xf2, yf2).
  • “1” and “2” are identification information of the face area
  • xf1 and yf1 are the upper left coordinates of the rectangle of each face area on the display screen 2020.
  • xf2 and yf2 are the lower right coordinates of the rectangle of each face area on the display screen 2020.
  • (1, xf1, yf1) indicates the upper left coordinates of the rectangle of the first face area
  • (1, xf2, yf2) indicates the lower right coordinates of the rectangle of the first face area
  • (2, xf1, yf1) indicates the upper left coordinates of the rectangle of the second face area
  • (2, xf2, yf2) indicates the lower right coordinates of the rectangle of the second face area.
  • coordinates may be represented as (face area identification information, face area coordinate information) for explanation.
  • (2, xf2) represents the rectangular lower right x coordinate xf2 of the second face area.
  • the resolution of the display screen 2020 of the video conference apparatus 2001B is represented as (X, Y).
  • X represents the horizontal resolution of the display screen 2020
  • Y represents the vertical resolution of the display screen 2020.
  • in step S2013, specifically, when any of the following expressions (7) to (10) is satisfied for each face area, it is determined that the child screen 2023 does not overlap that face area on the display screen 2020.
  • xf1 < x1 and xf2 < x1 (7)
  • x2 < xf1 and x2 < xf2 (8)
  • yf1 < y1 and yf2 < y1 (9)
  • y2 < yf1 and y2 < yf2 (10)
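The non-overlap conditions (7) to (10) amount to a rectangle-separation test: each face area must lie entirely to the left of, right of, above, or below the sub-screen. A sketch in Python, with rectangles given as (x1, y1, x2, y2) tuples:

```python
def sub_screen_clear_of_faces(sub, faces):
    """Return True if the sub-screen rectangle `sub` = (x1, y1, x2, y2)
    overlaps none of the face rectangles, per conditions (7)-(10):
    each face lies entirely left of, right of, above, or below the
    sub-screen."""
    x1, y1, x2, y2 = sub
    for xf1, yf1, xf2, yf2 in faces:
        left = xf1 < x1 and xf2 < x1    # expression (7)
        right = x2 < xf1 and x2 < xf2   # expression (8)
        above = yf1 < y1 and yf2 < y1   # expression (9)
        below = y2 < yf1 and y2 < yf2   # expression (10)
        if not (left or right or above or below):
            return False
    return True
```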
  • in step S2013, when the child screen 2023 on the display screen 2020 overlaps a face area, the determination unit 2119 determines whether the overlap with each face area can be avoided by rotating the camera 2100A left or right (step S2014). At this time, if either of the following expressions (11) and (12) is satisfied, it is estimated that the camera 2100A can be panned left or right so that the child screen 2023 does not overlap any face area without the face areas falling outside the display screen 2020.
  (2, xf2) - (1, xf1) < x1 (11)
  (2, xf2) - (1, xf1) < X - x2 (12)
  • FIG. 24 is a diagram illustrating an example of the positional relationship between the child screen 2023 and the face area when the captured video data is displayed on the display screen 2020 by the camera 2100A.
  • in the state shown in FIG. 24, where expressions (11) and (12) estimate that overlap can be avoided (that is, when (2, xf2) > x1), rotating the camera 2100A to the right by the angle θ shown in the following equation (13) makes the face areas no longer overlap the child screen 2023.
  • L shown in FIG. 24 represents the distance from the camera position to the virtual imaging plane.
  • ⁇ x indicates an angle formed by the reference shooting direction perpendicular to the virtual shooting plane from the camera 2100A and the shooting target end in the horizontal direction.
  • θ represents the rotation angle, measured from the reference photographing direction of the camera 2100A, by which the camera is actually to be rotated.
  • ⁇ y represents an angle formed between the reference photographing direction of the camera 2100A and the photographing target end in the vertical direction.
  • FIG. 25 is a diagram showing an example of the positional relationship between the child screen 2023 and the face area when the captured video data is displayed on the display screen 2020 by the camera 2100A.
  • in the state shown in FIG. 25, if the camera 2100A is rotated to the left by the angle θ shown in the following equation (14), no face area overlaps the sub-screen 2023.
  • the determination unit 2119 sends determination information to the camera control unit 2120 (step S2015).
  • the determination information is information instructing the camera 2100A to pan left or right by the angle θ shown in expression (13) or expression (14).
  • if it is determined in step S2014 that the child screen 2023 overlaps at least one face area even when the camera 2100A is rotated to the left or right, the determination unit 2119 performs the following process.
  • the determination unit 2119 determines whether the overlap between the child screen 2023 and every face area can be avoided by rotating the camera 2100A up or down (step S2016). At this time, if the following expression (15) or (16) is satisfied, it is estimated that the camera 2100A can be tilted up or down so that the child screen 2023 does not overlap any face area without the face areas falling outside the display screen 2020.
  • (2, yf2) - (1, yf1) < y1 (15)
  • (2, yf2) - (1, yf1) < Y - y2 (16)
  • FIG. 26 is a diagram showing an example of the positional relationship between the child screen 2023 and the face area when the captured video data is displayed on the display screen 2020 by the camera 2100A.
  • when it is estimated from expressions (15) and (16) that overlap can be avoided, in the state shown in FIG. 26 the determination unit 2119 lets yfmin be the smaller of (1, yf1) and (2, yf1). Next, when yfmin < y2, the determination unit 2119 estimates that no face area overlaps the child screen 2023 when the camera 2100A is rotated upward by the angle θ shown in the following equation (17).
  • FIG. 27 is a diagram showing another example of the positional relationship between the child screen 2023 and the face areas when video data captured by the camera 2100A is displayed on the display screen 2020.
  • When expressions (15) and (16) indicate that the overlap can be eliminated, the determination unit 2119 denotes the larger of yf1(1) and yf1(2) in the state shown in FIG. 27 as yfmax.
  • the determination unit 2119 estimates that no face area overlaps the child screen 2023 when the camera 2100A is rotated downward by the angle θ given by the following equation (18).
  • the determination unit 2119 performs the following processing.
  • the determination unit 2119 sends, to the camera control unit 2120, determination information instructing the camera 2100A to tilt up or down by the angle θ given by expression (17) or expression (18) (step S2017).
  • If it is determined in step S2016 that the child screen 2023 overlaps at least one face area even when the camera 2100A is rotated up or down, the determination unit 2119 performs the following processing.
  • the determination unit 2119 sends determination information instructing to zoom out by a certain amount (that is, to reduce the shooting magnification) to the camera control unit 2120 (step S2018).
  • After executing step S2018, the determination unit 2119 returns to step S2011. The determination unit 2119 therefore repeats the above determination, zooming out by a fixed amount each time, until the child screen 2023 no longer overlaps any face area.
  • Note that the determination unit 2119 may end the processing illustrated in FIG. 23 once a predetermined time has elapsed from the start of determination, even if the overlap between the child screen 2023 and the face areas has not yet been eliminated.
  • This makes it possible to avoid a situation in which a predetermined subject overlaps the sub-screen and the display data hidden behind the sub-screen cannot be viewed.
  • In the above description, the face of a video conference participant is the subject, but the participant as a whole may also be the subject.
  • Alternatively, material data such as a document video shared between the local site and the partner site may be used as the subject.
  • In that case, the document data is detected as the subject, and camera control is performed so that the child screen and the document data do not overlap.
  • two or more face areas are detected, and camera control is performed so that the child screen does not overlap with any face area.
  • the present invention is useful as an image display device or the like that allows a user at the local site to recognize the position of the sub display area on the display screen at the other site.
  • The present invention is also useful as a video conference device or the like that presents, to the user of the video conference device that transmitted captured video, the area of that video actually displayed on the display screen of the video conference device that received it.
  • the present invention is useful for a screen display device, a video conference device, or the like that can grasp what kind of display is being performed on the display screen of the partner site.
  • the present invention is useful for a screen display device, a video conference device, and the like that can prevent the child screen and the predetermined subject from overlapping on the display screen of the partner site.
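The geometry behind the overlap determination above can be sketched in a few lines. This is an illustrative reconstruction, not the patent's implementation: rectangles are assumed to be `(x1, y1, x2, y2)` tuples with y increasing downward, and the names `faces`, `sub_y2` (the sub-screen's bottom edge y2), and `display_h` (the screen height Y) are assumptions introduced here. `vertical_shift_feasible` mirrors the idea of expressions (15)/(16): a pure tilt can clear the sub-screen only if the combined vertical extent of the face areas fits in the space below the sub-screen.

```python
def overlaps(a, b):
    """True if axis-aligned rectangles a and b, given as (x1, y1, x2, y2),
    intersect."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def vertical_shift_feasible(faces, sub_y2, display_h):
    """Sketch of the expressions (15)/(16) test: tilting the camera can move
    every face area below the sub-screen only if the combined vertical extent
    of the face areas fits into the space below the sub-screen's bottom edge,
    i.e. into (Y - y2)."""
    top = min(f[1] for f in faces)       # topmost face edge (smallest yf1)
    bottom = max(f[3] for f in faces)    # bottommost face edge (largest yf2)
    return (bottom - top) <= (display_h - sub_y2)
```

For example, two faces spanning rows 10 to 60 fit below a sub-screen ending at y2 = 40 on a 100-pixel-high display, because their 50-pixel extent is no larger than the 60 pixels of free space.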
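The decision order of steps S2012 to S2018 (do nothing, else pan, else tilt, else zoom out and retry) can also be sketched as a pure decision function. The feasibility tests and names below are simplified assumptions for illustration; they stand in for the patent's equations (13) to (18) rather than reproducing them.

```python
def choose_action(faces, sub, display_w, display_h):
    """Return which camera adjustment is estimated to clear the sub-screen
    of face areas, following the order of steps S2012-S2018. Rectangles are
    (x1, y1, x2, y2); `sub` is the sub-screen region (hypothetical names)."""
    def overlaps(a, b):
        return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

    if not any(overlaps(f, sub) for f in faces):
        return "none"        # S2012: no overlap, nothing to do
    width = max(f[2] for f in faces) - min(f[0] for f in faces)
    height = max(f[3] for f in faces) - min(f[1] for f in faces)
    # S2014: a horizontal shift works if all faces fit left or right of sub
    if width <= sub[0] or width <= display_w - sub[2]:
        return "pan"         # S2015: pan left/right by the computed angle
    # S2016: a vertical shift works if all faces fit above or below sub
    if height <= sub[1] or height <= display_h - sub[3]:
        return "tilt"        # S2017: tilt up/down by the computed angle
    return "zoom_out"        # S2018: zoom out by a fixed amount and retry
```

A caller would invoke this after each camera move, matching the loop back to step S2011: keep applying the returned adjustment until `"none"` is reported or a timeout expires.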

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention relates to an image display apparatus that allows a user at the apparatus's own site to be informed of the position of a sub display region on a display screen at another site. The image display apparatus can configure multiple display regions on a display screen. The apparatus receives content display region information for configuring multiple display regions (201, 202) on a display unit (140) of another image display apparatus on a network (120), and performs control for presenting the position of the display region (202), among the multiple display regions (201, 202) on the other image display apparatus, on the basis of the received content display region information.
PCT/JP2009/006255 2008-11-20 2009-11-19 Dispositif d'affichage d'image, dispositif de téléconférence et procédé d'affichage d'image WO2010058591A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/129,878 US20110222676A1 (en) 2008-11-20 2009-11-19 Image display apparatus, teleconferencing device, and image display method

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2008296986A JP2010124317A (ja) 2008-11-20 2008-11-20 テレビ会議装置及び実際表示領域提示処理方法
JP2008-296986 2008-11-20
JP2008316742A JP2010141662A (ja) 2008-12-12 2008-12-12 画像表示装置、テレビ会議装置、および画像表示方法
JP2008-316742 2008-12-12
JP2009-002878 2009-01-08
JP2009002878A JP2010161662A (ja) 2009-01-08 2009-01-08 画像表示装置、テレビ会議装置、および画像表示方法

Publications (1)

Publication Number Publication Date
WO2010058591A1 true WO2010058591A1 (fr) 2010-05-27

Family

ID=42198034

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/006255 WO2010058591A1 (fr) 2008-11-20 2009-11-19 Dispositif d'affichage d'image, dispositif de téléconférence et procédé d'affichage d'image

Country Status (2)

Country Link
US (1) US20110222676A1 (fr)
WO (1) WO2010058591A1 (fr)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8872878B2 (en) * 2011-07-20 2014-10-28 Cisco Technology, Inc. Adaptation of video for use with different number of cameras and displays at endpoints
JP2013242357A (ja) * 2012-05-18 2013-12-05 Ricoh Co Ltd 情報処理装置、情報処理方法、およびプログラム
KR101959820B1 (ko) * 2012-10-12 2019-03-20 삼성전자주식회사 멀티미디어 통신 시스템에서 구성 정보 송수신 방법 및 장치
JP2014235534A (ja) * 2013-05-31 2014-12-15 株式会社東芝 情報処理装置および表示制御方法
CN104243528B (zh) * 2013-06-20 2018-06-05 腾讯科技(深圳)有限公司 一种在多个终端设备间同步复制内容的方法和系统
JP2015126457A (ja) 2013-12-27 2015-07-06 ブラザー工業株式会社 サーバ装置のプログラム、サーバ装置及び遠隔会議方法
JP2015126456A (ja) * 2013-12-27 2015-07-06 ブラザー工業株式会社 通信端末装置のプログラム、通信端末装置、サーバ装置のプログラム、サーバ装置及び遠隔会議システム
JP2016126432A (ja) * 2014-12-26 2016-07-11 ブラザー工業株式会社 遠隔会議プログラム、端末装置、および遠隔会議方法
US11068129B2 (en) * 2019-08-20 2021-07-20 Lenovo (Singapore) Pte. Ltd. Method and device for augmenting a communal display device
CN112866774A (zh) * 2021-01-07 2021-05-28 福建捷联电子有限公司 一种具有多功能相机的显示器及其控制方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10200873A (ja) * 1997-01-16 1998-07-31 Sharp Corp テレビ電話装置
JP2002374508A (ja) * 2001-06-14 2002-12-26 Canon Inc 通信装置、通信システム、映像通信処理方法、記憶媒体、及びプログラム
JP2004101708A (ja) * 2002-09-06 2004-04-02 Sony Corp 画像表示制御装置および方法、並びにプログラム
JP2005244314A (ja) * 2004-02-24 2005-09-08 Kddi Technology Corp マルチメディア配信サーバ
JP2008258779A (ja) * 2007-04-02 2008-10-23 Sony Corp テレビ会議装置、制御方法、およびプログラム

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5959622A (en) * 1996-05-31 1999-09-28 Intel Corporation Still image capture under computer control in response to user-instructed trigger
US6025871A (en) * 1998-12-31 2000-02-15 Intel Corporation User interface for a video conferencing system
KR100580174B1 (ko) * 2003-08-21 2006-05-16 삼성전자주식회사 회전 가능한 디스플레이 장치 및 화면 조정 방법
JP4363199B2 (ja) * 2004-01-28 2009-11-11 ソニー株式会社 情報信号処理装置および情報信号処理方法
US8081684B2 (en) * 2005-08-19 2011-12-20 Qualcomm Incorporated Picture-in-picture processing for video telephony
EP1763243A3 (fr) * 2005-09-09 2008-03-26 LG Electronics Inc. Methode et système de capture et d'affichage d'image
US8244068B2 (en) * 2007-03-28 2012-08-14 Sony Ericsson Mobile Communications Ab Device and method for adjusting orientation of a data representation displayed on a display


Also Published As

Publication number Publication date
US20110222676A1 (en) 2011-09-15

Similar Documents

Publication Publication Date Title
WO2010058591A1 (fr) Dispositif d'affichage d'image, dispositif de téléconférence et procédé d'affichage d'image
US9860486B2 (en) Communication apparatus, communication method, and communication system
US10297005B2 (en) Method for generating panoramic image
US10027871B2 (en) Imaging control system, control apparatus, control method, and storage medium
CN101771840B (zh) 用于控制投影仪设备的控制设备和方法
JP6551155B2 (ja) 通信システム、通信装置、通信方法およびプログラム
US10951859B2 (en) Videoconferencing device and method
JP4760896B2 (ja) カメラ制御装置及びカメラ制御方法
JP7346830B2 (ja) 通信端末、プログラム、表示方法、記録媒体、システム
JP2017083661A (ja) 通信システム、通信装置、通信方法およびプログラム
JP2023164525A (ja) 通信端末、プログラム、表示方法、記録媒体
JP2010011307A (ja) カメラ情報表示装置及びカメラ情報表示方法
CN114631323A (zh) 区划适应性视频生成
JP2010124317A (ja) テレビ会議装置及び実際表示領域提示処理方法
KR20180089639A (ko) 수술 영상촬영 및 처리 시스템
JP2010141662A (ja) 画像表示装置、テレビ会議装置、および画像表示方法
JP2010161662A (ja) 画像表示装置、テレビ会議装置、および画像表示方法
WO2014208169A1 (fr) Dispositif de traitement d'informations, procédé de commande, programme et support d'enregistrement
JP2007214803A (ja) 撮影制御装置及び撮影制御方法
Lazewatsky et al. A panorama interface for telepresence robots
JP2004147307A (ja) 表示装置、端末装置及び双方向対話型システム
JP2004112824A (ja) テレビカメラ通信装置
JP4485933B2 (ja) 遠隔コミュニケーションシステムとこのシステムで使用される方向提示方法及び方向提示プログラム
JP2024065683A (ja) 撮像装置、撮像装置の制御方法、プログラム
JP2023079276A (ja) 個人用表示装置、表示システム、および表示システムの制御方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09827372

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 13129878

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09827372

Country of ref document: EP

Kind code of ref document: A1