JP2012142697A - Video conference system and video conference program - Google Patents

Video conference system and video conference program Download PDF

Info

Publication number
JP2012142697A
JP2012142697A (application number JP2010292524A)
Authority
JP
Japan
Prior art keywords
image
display
object image
display object
means
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2010292524A
Other languages
Japanese (ja)
Inventor
Takahiro Shimazu
宝浩 島津
Original Assignee
Brother Ind Ltd
ブラザー工業株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Brother Ind Ltd, ブラザー工業株式会社 filed Critical Brother Ind Ltd
Priority to JP2010292524A priority Critical patent/JP2012142697A/en
Publication of JP2012142697A publication Critical patent/JP2012142697A/en
Pending legal-status Critical Current

Links

Images

Abstract

PROBLEM TO BE SOLVED: To provide a video conference system and video conference program capable of displaying the content of information that was displayed before a shielding object overlapped it, even when the acquired image shows the shielding object superimposed on the object on which the information is displayed.

SOLUTION: An acquired image 50 is displayed in an acquired image display area 281 on a display 28. When a person image 80 does not overlap a display object image 61 in the acquired image 50, the display object image 61 is stored; the stored copy is a stored image 51. When the person image 80 overlaps the display object image 61 in the acquired image 50, the stored image 51 saved immediately beforehand is displayed in a stored image display area 282 on the display 28. Information that cannot be seen in the acquired image 50 because it is shielded by the person image 80 can thus be seen in the stored image 51.

Description

  The present invention relates to a video conference system and a video conference program for acquiring and displaying an image.

Apparatuses that acquire images are conventionally known. For example, in the meeting-recording system described in Patent Document 1, content written on a whiteboard is recorded using a camera. The recorded whiteboard content serves as a visual index, allowing the user to browse the contents of the recorded meeting efficiently.

Japanese Patent No. 4499380

However, when an image is displayed on the display as it is acquired, an object such as a person standing in front of the whiteboard blocks the information the user has written on it. When the whiteboard is shielded by such an object, the content of the information displayed on it is difficult to see.

An object of the present invention is to provide a video conference system and a video conference program capable of displaying the content of information that was displayed before a shielding object overlapped it, even when an image is acquired in which the shielding object overlaps the object on which the information is displayed.

A video conference system according to a first aspect of the present invention includes a communication device that transmits and receives images to and from other devices via a network. The communication device includes: image acquisition means for acquiring an image; first display control means for causing display means to display the acquired image, which is the image acquired by the image acquisition means; specifying means for specifying a display object image, which is an image, included in the acquired image, of an information display object on which information is displayed; overlap determination means for determining whether, in the acquired image, a shielding object image, which is an image of an object that shields the information display object, overlaps the display object image specified by the specifying means; image storage control means for storing the display object image specified by the specifying means in a first storage device; and second display control means for, when the overlap determination means determines that the shielding object image overlaps the display object image, causing the display means to display the display object image stored in the first storage device by the image storage control means immediately before the overlap determination means determined that the shielding object image overlaps the display object image.

In this configuration, the acquired image is displayed by the first display control means. When the shielding object image overlaps the display object image included in the acquired image, the shielding object image hides the content of the information displayed on the information display object. In that case, the second display control means displays the display object image that was stored in the first storage device immediately before the shielding object image overlapped the display object image. Therefore, even while the shielding object image overlaps the display object image, the content of the information displayed on the information display object remains visible.
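As a rough sketch, the store-and-fall-back behavior of the image storage control means and the second display control means can be expressed as follows (the class and method names here are illustrative, not part of the patent):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class BoardSnapshotBuffer:
    """Keeps the most recent unoccluded view of the display object
    (e.g. the whiteboard) so it can be shown while a shielding object
    blocks it. An illustrative sketch of the patent's logic only."""
    stored_image: Optional[object] = None  # last unoccluded board image

    def update(self, board_image, occluded: bool) -> None:
        """Call once per frame after the board region and the
        occlusion state have been determined."""
        if not occluded:
            # No shielding object: remember this view of the board.
            self.stored_image = board_image

    def image_to_show(self, board_image, occluded: bool):
        """Return the image to present: the live board when visible,
        otherwise the last stored (unoccluded) one."""
        if occluded and self.stored_image is not None:
            return self.stored_image
        return board_image
```

A caller would invoke `update` and then `image_to_show` on every received frame; the stored image is only ever replaced while the board is unobstructed.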

In the video conference system, the other apparatus may include image capturing means for capturing an image and image transmission means for transmitting the captured image to the network, and the image acquisition means may acquire the image transmitted by the image transmission means by receiving it via the network. In this case, an image transmitted from the other device over the network is acquired, and it is determined whether the display object image and the shielding object image overlap. When they overlap, the display object image stored in the first storage device immediately before the overlap is displayed. Because the other device does not need to transmit that earlier display object image to the communication device over the network, the load on the network bandwidth can be reduced, which helps prevent bandwidth congestion from corrupting the image or slowing the display.

The video conference system may further include setting detection means for detecting an instruction to change display settings, which are settings relating to the display on the display means of the acquired image shown by the first display control means and of the display object image shown by the second display control means, and display setting storage control means for storing the changed display settings in a storage device based on the instruction detected by the setting detection means. The first display control means then adjusts the acquired image based on the changed display settings stored by the display setting storage control means before displaying it, and the second display control means likewise adjusts the display object image based on the stored changed display settings before displaying it. In this case, the settings governing how the acquired image and the display object image transmitted from another device are displayed can be changed. For example, the display size and position of the acquired image and the display object image can be changed, and display or non-display of the display object image by the second display control means can be set. The images therefore need not be displayed exactly as received; their display mode can be changed.

The video conference system may further include update determination means for determining, when the overlap determination means determines that the shielding object image does not overlap the display object image, whether the display object image specified by the specifying means differs from the display object image most recently stored in the first storage device by the image storage control means. The image storage control means may then store the display object image specified by the specifying means when the update determination means determines that it differs from the stored display object image. In this case, the display object image is newly stored only when it differs from the one already stored, that is, when the content of the information displayed on the information display object has been updated. Because the same display object image is not stored repeatedly, the processing load on the communication device can be reduced.
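The update determination above can be illustrated with a simple per-pixel comparison. This is a sketch only; the patent does not prescribe a concrete difference test, so the threshold below is an assumed example value:

```python
import numpy as np


def board_changed(new_img: np.ndarray, stored_img: np.ndarray,
                  threshold: float = 0.01) -> bool:
    """Return True when the newly specified display object image
    differs from the stored one, i.e. the board content was updated
    and should be re-stored. The 1% changed-pixel threshold is an
    assumption, not a value given in the patent."""
    if new_img.shape != stored_img.shape:
        return True
    # Fraction of pixels whose values differ between the two images.
    diff_ratio = np.mean(new_img != stored_img)
    return bool(diff_ratio > threshold)
```

A real implementation would likely compare only the board region and tolerate camera noise, e.g. by blurring before comparison.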

The video conference system may further include write determination means for determining whether display of information has started on the information display object indicated by the display object image specified by the specifying means, and the overlap determination means may begin determining whether the shielding object image overlaps the specified display object image only after the write determination means determines that display of information has started. In this case, once information begins to be displayed on the information display object, the overlap determination means starts determining whether the shielding object image overlaps the display object image; if they do not overlap, the display object image is stored by the image storage control means, and if they do overlap, the display object image stored in the first storage device is displayed on the display means by the second display control means. In other words, until writing occurs, the determination by the overlap determination means, the storage by the image storage control means, the display of the display object image by the second display control means, and so on are not performed, so the processing load on the communication device can be reduced.

The video conference system may further include time acquisition means for acquiring time information. When the overlap determination means determines that the shielding object image does not overlap the display object image, the image storage control means stores in the first storage device the display object image specified by the specifying means together with the current time information acquired by the time acquisition means. When the overlap determination means determines that the shielding object image overlaps the display object image, the second display control means causes the display means to display the display object image stored in the first storage device together with an image of the time based on the stored time information. In this case, the display object image and the time at which it was stored are displayed on the display means, so the user can confirm at which point in time the display object image shown by the second display control means was captured.

The video conference system may further include object contour extraction means for extracting object contours, which are the contours of the images of objects included in the acquired image acquired by the image acquisition means, and shielding object contour identification means for identifying, from among the extracted object contours, a shielding object contour that is the contour of the shielding object image. The specifying means may specify the display object image by identifying, from among the extracted object contours, a display object contour that is the contour of the display object image, and the overlap determination means may determine whether the shielding object image overlaps the display object image by determining whether the shielding object contour identified by the shielding object contour identification means overlaps the display object contour identified by the specifying means.

In this case, the object contours are first extracted by the contour extraction means, and then the shielding object contour and the display object contour are identified. Extracting the object contours makes the shapes of the objects easier to identify, so the shielding object contour and the display object contour can be identified more accurately. Therefore, the overlap determination means can determine more accurately whether the shielding object image overlaps the display object image.

In the video conference system, the overlap determination means may determine whether the shielding object image overlaps the display object image by determining whether the ratio of the region where the shielding object image overlaps the display object image, relative to the region of the display means in which the display object image is displayed, is equal to or greater than a predetermined value. In this case, the display object image stored in the first storage device is displayed only when that ratio is equal to or greater than the predetermined value. For example, if the predetermined value is set to a level below which the visibility of the information on the display object image is not impaired, the stored display object image need not be displayed while visibility remains acceptable.
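The ratio test can be sketched as follows, assuming the board and the shielding object are available as boolean masks over the acquired image (the default threshold of 0.1 is an assumed example; the patent does not fix a number):

```python
import numpy as np


def occlusion_ratio(board_mask: np.ndarray, person_mask: np.ndarray) -> float:
    """Fraction of the display object region covered by the shielding
    object. Both arguments are boolean masks of the same shape."""
    board_area = board_mask.sum()
    if board_area == 0:
        return 0.0
    overlap = np.logical_and(board_mask, person_mask).sum()
    return float(overlap) / float(board_area)


def is_occluded(board_mask: np.ndarray, person_mask: np.ndarray,
                predetermined_value: float = 0.1) -> bool:
    """Treat the board as shielded only when the overlap ratio
    reaches the predetermined value."""
    return occlusion_ratio(board_mask, person_mask) >= predetermined_value
```

With this scheme, a person grazing the edge of the board does not trigger the fallback display, while one standing squarely in front of it does.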

A video conference program according to a second aspect of the present invention is executed in a video conference system including a communication device that transmits and receives images to and from other devices via a network. The program causes the controller of the communication device to execute: an image acquisition step of acquiring an image; a first display control step of causing display means to display the acquired image, which is the image acquired in the image acquisition step; a specifying step of specifying a display object image, which is an image, included in the acquired image, of an information display object on which information is displayed; an overlap determination step of determining whether, in the acquired image, a shielding object image, which is an image of an object that shields the information display object, overlaps the display object image specified in the specifying step; a storage control step of storing the specified display object image in a first storage device; and a second display control step of, when the overlap determination step determines that the shielding object image overlaps the display object image, causing the display means to display the display object image stored in the first storage device by the storage control step immediately before the overlap was determined. In this case, the display object image stored immediately before the shielding object image overlapped the display object image is displayed, so even when the shielding object image overlaps the display object image, the content of the information displayed on the information display object remains visible.

FIG. 1 is a diagram illustrating the configuration of a video conference system 1. FIG. 2 is a block diagram showing the hardware configuration of a communication device 4. FIG. 3 is a diagram illustrating an example of an image displayed on a display 28. FIG. 4 is a diagram illustrating an example of an image displayed on the display 28. FIG. 5 is a flowchart showing a transmission-side main process. FIG. 6 is a flowchart showing a reception-side main process. FIG. 7 is a flowchart showing an image recognition process. FIG. 8 is a diagram illustrating an example of an acquired image 50. FIG. 9 is a diagram illustrating an example of the acquired image 50 from which a contour 90 has been extracted. FIG. 10 is a diagram illustrating an example of the acquired image 50 when a person image 80 overlaps a display object image 61. FIG. 11 is a diagram illustrating an example of the acquired image 50 from which the contour 90 has been extracted. FIG. 12 is a diagram illustrating an example of the acquired image 50 when written characters have been added. FIG. 13 is a flowchart showing a transmission-side modified process. FIG. 14 is a flowchart showing a reception-side modified process.

Embodiments of the present invention will be described below with reference to the drawings. The drawings are used to explain technical features that the present invention can adopt; the configuration of the communication device and the flowcharts of the various processes described are merely illustrative examples and are not limiting.

The configuration of the video conference system 1 including the communication devices 3, 4, and 5 will be described with reference to FIG. 1. The video conference system 1 includes a network 2 and communication devices 3, 4, and 5, which are connected to one another via the network 2. In the video conference system 1, a video conference is held between the communication devices 3, 4, and 5 by transmitting and receiving images and sound to and from one another via the network 2. In the present embodiment, a whiteboard 6 supported by support legs 7 is installed at the site where the communication device 3 is installed; the communication device 3 captures an image including the whiteboard 6 and transmits it to the communication devices 4 and 5. In FIG. 1, three communication devices 3, 4, and 5 are connected to the network 2, but the number is not limited; for example, ten devices may be connected to the network 2.

The communication devices 3, 4, and 5 have the same hardware configuration, and the same software is installed on each. In the present embodiment, the communication device 3 transmits image data to the communication devices 4 and 5, and the communication devices 4 and 5 receive the image data transmitted from the communication device 3 and display it on the display 28 (see FIG. 2). Any of the communication devices connected to the network 2 can function as the communication device that transmits image data, and the other communication devices can function as the communication devices that receive it.

  The hardware configuration of the communication devices 3, 4, and 5 will be described with reference to FIG. Since the hardware configurations of the communication devices 3, 4, and 5 are the same, the hardware configuration of the communication device 4 will be described here.

The communication device 4 is provided with a CPU 20 as a controller that controls the communication device 4. Connected to the CPU 20 are a ROM 21 that stores a BIOS, a RAM 22 that temporarily stores various data, and an I/O interface 30 that mediates data transfer. The RAM 22 includes at least a display object image storage area 221 and a display setting storage area 222. The display object image storage area 221 stores a display object image 61, described later, and the display setting storage area 222 stores display settings, described later.

The I/O interface 30 is connected to a hard disk drive 31 (hereinafter, HDD 31) that has various storage areas. The HDD 31 stores programs for causing the CPU 20 to execute various processes. Also connected to the I/O interface 30 are a communication interface 25 for communicating with the network 2, a mouse 27, a video controller 23, a key controller 24, a camera 34 for capturing images, a microphone 35 for capturing the user's voice, and a CD-ROM drive 26. Further connected to the I/O interface 30 are an encoder 36 for compressing and encoding images, a decoder 37 for decompressing and decoding encoded images, and a timer 38 for keeping time and measuring elapsed time. A display 28 is connected to the video controller 23, and a keyboard 29 is connected to the key controller 24.

A CD-ROM 114 inserted into the CD-ROM drive 26 stores, for example, the various programs of the communication devices 3, 4, and 5 in the present embodiment. When the CD-ROM 114 is inserted, these programs are set up from the CD-ROM 114 onto the HDD 31.

An example of an image displayed on the display 28 in the present embodiment will be described. In the following description, the upper, lower, right, and left sides of FIG. 3 are defined as the upper, lower, right, and left sides of the display 28, respectively. FIG. 3 shows an example of a screen captured by the communication device 3, transmitted from the communication device 3 to the communication devices 4 and 5, and displayed on the display 28 by the communication devices 4 and 5. As shown in FIG. 3, the display 28 is provided with an acquired image display area 281, an area in which images transmitted from the communication device 3 and received by the communication devices 4 and 5 are displayed. Hereinafter, an image received by the communication devices 4 and 5 is referred to as the acquired image 50.

  The acquired image 50 includes a display object image 61. The display object image 61 is an image of an information display object (in this embodiment, the whiteboard 6) that displays information. As shown in FIG. 3, the character “ABC” is written by the user in the upper left portion of the display object image 61, and the character “DEF” is written by the user in the lower left portion. In the case of this embodiment, these written characters are information displayed on the information display object. That is, in this embodiment, when it is described that writing is performed on the whiteboard 6, this means that information is displayed on the whiteboard 6.

A support leg image 71, which is an image of the support legs 7 (see FIG. 1), is displayed below the display object image 61. In the upper right of the acquired image display area 281, a display setting button 2811 is displayed: an on-screen button with which the user, operating the mouse 27 and the keyboard 29, sets the size and the display/non-display of the acquired image display area 281.

When a person stands in front of the whiteboard 6 and the display object image 61 and the person image 80, which is an image of the person, overlap in the acquired image display area 281, the image shown in FIG. 4 is displayed on the display 28. As shown in FIG. 4, the acquired image display area 281 is displayed on the left side of the display 28. In the acquired image display area 281, the display object image 61 is shielded by the person image 80, and the lower-left characters "DEF" cannot be seen.

A stored image display area 282 is provided to the right of the acquired image display area 281, and the stored image 51 is displayed in it. The stored image 51 is a display object image 61 that was previously stored in the display object image storage area 221. By displaying the stored image 51, the characters "DEF" that can no longer be seen in the acquired image display area 281 are made visible. The time "12:10" is displayed in the lower right of the stored image display area 282; this is the time at which the displayed stored image 51 was stored. In the upper right of the stored image display area 282, a display setting button 2821 is displayed: an on-screen button with which the user, operating the mouse 27 and the keyboard 29, sets the size and the display/non-display of the stored image display area 282.

With reference to FIG. 5, the transmission-side main process executed by the CPU 20 of the communication device 3 will be described. The transmission-side main process starts when the user inputs an instruction to start a video conference. Audio data for the video conference is also transmitted and received between the communication devices 3, 4, and 5, but the audio transmission and reception processes are omitted from the following description for clarity.

As shown in FIG. 5, in the transmission-side main process, an image is first captured with the camera 34 (S11), thereby acquiring an image. Next, the image acquired in S11 is encoded to create image data (S12). The image data encoded in S12 is then transmitted to the communication devices 4 and 5 (S13); the transmitted image data is received in S21 of the communication devices 4 and 5 (see FIG. 6). Next, it is determined whether the user has input an instruction to end the video conference via the mouse 27 or the keyboard 29 (S14). If no end instruction has been input (S14: NO), the process returns to S11. When an end instruction is input (S14: YES), the transmission-side main process ends.
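The S11-S14 loop can be sketched as follows. The injected callables are hypothetical stand-ins for the camera 34, the encoder 36, the network transmission, and the end-instruction check; none of these names appear in the patent:

```python
def sender_main_loop(capture, encode, send, should_end):
    """Sketch of the transmission-side main process (S11-S14).

    capture()    -- acquire one frame (camera 34)
    encode(f)    -- compress/encode the frame into image data (encoder 36)
    send(d)      -- transmit the image data to the receivers
    should_end() -- True when the user has input an end instruction
    """
    while True:
        frame = capture()      # S11: capture an image
        data = encode(frame)   # S12: encode into image data
        send(data)             # S13: transmit to communication devices 4 and 5
        if should_end():       # S14: end instruction input?
            break              # S14: YES -> end the main process
        # S14: NO -> return to S11
```

Injecting the four operations keeps the control flow testable without real camera or network hardware.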

  With reference to FIG. 6, the reception side main process executed by the CPU 20 of the communication devices 4 and 5 will be described. The reception-side main process is started when a user inputs an instruction to start a video conference.

As shown in FIG. 6, in the reception-side main process, it is first determined whether the image data transmitted in S13 of the communication device 3 (see FIG. 5) has been received (S21). The image based on the image data received in S21 is the acquired image 50; that is, the acquired image 50 is acquired in S21. If no image data has been received (S21: NO), it is determined whether an instruction to end the video conference has been input via the mouse 27 or the keyboard 29 (S41). If no end instruction has been input (S41: NO), the process returns to S21. When image data is received (S21: YES), the image data received in S21 is decoded (S22). Next, the image recognition process is executed (S23).

The image recognition process will be described with reference to FIG. 7. The image recognition process extracts the characteristic portions included in the acquired image 50 and specifies the display object image 61. As shown in FIG. 7, in the image recognition process, the image is first corrected (S51); in S51, for example, noise is removed using a digital filter.

Next, the characteristic portions of the acquired image 50 are extracted (S52). In the present embodiment, as an example, the characteristic portions are the contours of objects. The contours of objects in the acquired image 50 are extracted by edge extraction, for which well-known methods such as second-order differentiation or the Hough transform can be used.
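A minimal example of edge extraction by second-order differentiation, using a discrete Laplacian in NumPy (a sketch only; a real implementation might instead use OpenCV's `cv2.Laplacian` or a Hough transform, and the threshold is an assumed value):

```python
import numpy as np


def laplacian_edges(gray: np.ndarray, thresh: float = 1.0) -> np.ndarray:
    """Edge extraction by second-order differentiation (S52).
    Returns a boolean mask that is True where the Laplacian magnitude
    exceeds `thresh`, i.e. along object contours."""
    g = gray.astype(np.float64)
    # Discrete Laplacian: sum of the four neighbours minus 4x the centre.
    lap = (np.roll(g, 1, 0) + np.roll(g, -1, 0) +
           np.roll(g, 1, 1) + np.roll(g, -1, 1) - 4.0 * g)
    edges = np.abs(lap) > thresh
    # np.roll wraps around the image border; discard those rows/columns.
    edges[0, :] = edges[-1, :] = edges[:, 0] = edges[:, -1] = False
    return edges
```

Applied to an acquired image like FIG. 8, this produces a contour map analogous to the contour 90 of FIG. 9.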

  For example, it is assumed that the acquired image 50 including the display object image 61 illustrated in FIG. 8 is received. In this case, the contour 90 is extracted by the processing of S52 as shown in FIG. The contour 90 includes a contour 611 of the display object image 61 and a contour 711 of the support leg image 71. In the case of FIG. 10 in which the person image 80 is displayed, the contour 90 includes a contour 611, a contour 711, and a contour 801 of the person image 80 as illustrated in FIG. 11.

Next, the contour 611 of the display object image 61 is identified from the contours 90 extracted in S52 (S53). In S53, for example, the contour 611 is identified by pattern matching, a well-known method.

Next, the contour 801 of the person image 80 is identified from the contours 90 extracted in S52, as shown in FIG. 11 (S54). In S54, for example, the contour 801 is identified by the same pattern matching method.
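One crude way to separate a board-like contour from a person-like contour, standing in for the pattern matching of S53 and S54, is a rectangularity score: the board contour 611 fills its bounding box almost completely, while a person contour 801 does not. This is an illustrative heuristic, not the patent's method, and `min_score` is an assumed constant:

```python
import numpy as np


def rectangularity(mask: np.ndarray) -> float:
    """Ratio of a region's filled area to the area of its bounding
    box: near 1.0 for a rectangular whiteboard, lower for a person."""
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return 0.0
    box_area = (ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1)
    return float(len(ys)) / float(box_area)


def pick_board_contour(masks, min_score: float = 0.9):
    """Return the index of the most rectangular region among the
    extracted contour regions, or None if none is board-like enough."""
    if not masks:
        return None
    scores = [rectangularity(m) for m in masks]
    best = int(np.argmax(scores))
    return best if scores[best] >= min_score else None
```

Production systems would more likely match contour shapes directly (e.g. OpenCV's `cv2.matchShapes`), but the selection principle is the same.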

Next, using the contour 611 of the display object image 61 identified in S53 (see FIGS. 9 and 11), the display object image 61 is specified (S55). For example, in the case of FIG. 9, the region inside the contour 611 is specified as the display object image 61. The image recognition process then ends, and the process returns to the main process (see FIG. 6).

Next, the acquired image 50 acquired in S21 is displayed in the acquired image display area 281 of the display 28 (S24). In S24, for example, the size and position of the acquired image 50 are adjusted based on the display settings, the display-related settings stored in the display setting storage area 222. The display settings are changed by S39 and S40, described later; when S40 has not yet been performed, the acquired image 50 is displayed based on preset display settings. By the process of S24, an image including the display object image 61 is displayed, as shown for example in FIG. 3. Next, it is determined whether the display object image 61 was specified in S55 (see FIG. 7) (S25), that is, whether the display object image 61 is included in the acquired image 50.

If the display object image 61 has not been specified (S25: NO), the process proceeds to S41. When the display object image 61 has been specified (S25: YES), it is determined whether writing of characters or the like has started on the whiteboard 6 indicated by the display object image 61 specified in S55 (see FIG. 7) (S26). In S26, whether writing has started is determined, for example, by determining whether there is a portion with a predetermined luminance within the region of the display object image 61. Because the display object image 61 is an image of the whiteboard 6, the surface on which writing occurs is close to white, and writing in any color other than white has lower luminance than white. By setting the luminance of a non-white color as the predetermined luminance, it can be determined in S26 whether a portion with the predetermined luminance exists, that is, whether writing of characters or the like has started.
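The luminance test of S26 can be sketched as follows; the luminance threshold and minimum pixel count below are assumed example values, not figures from the patent:

```python
import numpy as np


def writing_started(board_gray: np.ndarray, luminance_threshold: int = 200,
                    min_pixels: int = 10) -> bool:
    """S26-style check: the whiteboard surface is near white, so any
    writing appears as pixels darker than a luminance threshold.
    Requiring a minimum pixel count guards against isolated noise."""
    dark = board_gray < luminance_threshold
    return int(dark.sum()) >= min_pixels
```

Here `board_gray` is the grayscale region inside the contour 611, i.e. the specified display object image.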

If writing has not started (S26: NO), the process proceeds to S41; that is, S27 to S40 are not executed until writing occurs. For example, in FIG. 8, the characters "ABC" and "DEF" have been written, so it is determined that writing has started. When writing has started (S26: YES), it is determined whether the person image 80 overlaps the display object image 61 specified in S55 (S27). In S27, this is determined by checking whether the contour 801 of the person image 80 overlaps the contour 611 of the display object image 61 identified in S53. For example, in FIG. 9, no person image 80 exists, so it is determined that the person image 80 does not overlap the display object image 61.

  When the person image 80 does not overlap the display object image 61 (S27: NO), it is determined whether or not the display object image 61 is stored in the display object image storage area 221 (S28). The display object image 61 is stored in the display object image storage area 221 by the process of S29 or S35 described later. The display object image 61 stored in the process of S29 or S35 is the stored image 51. In other words, in S28, it is determined whether or not there is a stored image 51.

  When the display object image 61 is not stored in the display object image storage area 221 (S28: NO), the display object image 61 specified in S55 (see FIG. 7) is stored in the display object image storage area 221 (S29). Thereby, the stored image 51 is generated. For example, the display object image 61 in which the characters “ABC” and “DEF” shown in FIG. 8 are written is stored in the display object image storage area 221. Next, the current time is stored in the display object image storage area 221 (S30). For example, when the current time is 12:10, the time “12:10” is stored. Next, the process proceeds to S41.

  Hereinafter, a case where a person stands in front of the whiteboard 6 will be described. In this case, as shown in FIG. 10, the display object image 61 and the person image 80 overlap in the acquired image 50. In this case, as shown in FIG. 11, a contour 90 which is a characteristic portion is extracted (S52, see FIG. 7). As described above, the contour 90 illustrated in FIG. 11 includes the contour 611 of the display object image 61, the contour 711 of the support leg image 71, and the contour 801 of the person image 80. Then, the outline 611 of the display object image 61 is identified (S53), and the outline 801 of the person image 80 is identified (S54).

  Then, it is determined that the outline 611 of the display object image 61 and the outline 801 of the person image 80 overlap (S27: YES), the display object image storage area 221 is referred to, and it is determined whether or not the stored image 51 is stored (S37). In other words, it is determined whether or not the display object image 61 has been stored in S29 or S35 (described later) (S37).

  If the stored image 51 is not stored (S37: NO), the process proceeds to S41. When the stored image 51 is stored (S37: YES), the stored image 51 is displayed (S38). That is, the display object image 61 from immediately before the person image 80 overlapped the display object image 61 is displayed. When the stored image 51 is displayed in S38, the time stored in the display object image storage area 221 is also displayed. As a result, the image displayed as shown in FIG. 3 is updated as shown in FIG. 4. In FIG. 4, the acquired image 50 is displayed in the acquired image display area 281, and the stored image 51 is displayed in the stored image display area 282. In the lower right part of the stored image 51, the time “12:10” stored in the display object image storage area 221 is displayed. As shown in FIG. 4, in the acquired image 50, the person image 80 overlaps and shields the display object image 61. For this reason, in the acquired image 50, the characters “DEF” are hidden by the person image 80 and cannot be visually recognized. However, since the stored display object image 61 (that is, the stored image 51) is displayed, the characters “DEF” are visible.

  Next, it is determined whether or not an instruction to change the display settings has been input via the mouse 27 or the keyboard 29 (S39). The user inputs an instruction to change the display settings by pressing the display setting buttons 2811, 2821, and the like. Thereby, for example, the sizes of the acquired image display area 281 and the stored image display area 282, and whether each area is displayed or hidden, are set. When an instruction to change the display settings is input (S39: YES), the changed display settings are stored in the display setting storage area 222 (S40). Thus, the images displayed in S24 and S38 are displayed based on the changed display settings. If there is no instruction to change the display settings (S39: NO), the process proceeds to S41.

  In the following, a case in which the state where a person stands in front of the whiteboard 6 changes to a state where no person stands in front of the whiteboard 6 will be described as an example. That is, this is the case where the acquired image 50 changes from the state shown in FIG. 10 to the state shown in FIG. 9.

  In this case, it is determined that the person image 80 does not overlap the display object image 61 (S27: NO). If it is determined that the display object image 61 is stored in the display object image storage area 221 (S28: YES), it is determined whether or not the stored image 51 is being displayed (S31). When the stored image 51 is not being displayed (S31: NO), S33 described later is executed. When the stored image 51 is being displayed (S31: YES), the stored image display area 282 displaying the stored image 51 is deleted from the display 28 (S32). As a result, the image shown in FIG. 4 is updated to the image shown in FIG. 3. That is, an image in which only the acquired image 50 is displayed is displayed.

  Next, it is determined whether or not the display object image 61 (that is, the stored image 51) stored in the display object image storage area 221 in S29 or S35 differs from the display object image 61 newly specified in S55 (see FIG. 7) (S33). The display object image 61 stored in the display object image storage area 221 differs from the display object image 61 newly specified in S55 when the contents, such as characters written on the whiteboard 6, have been updated.

  In S33, for example, the total luminance value of all pixels of the display object image 61 stored in the display object image storage area 221 and the total luminance value of all pixels of the display object image 61 newly specified in S55 are calculated. Then, the total luminance values are compared. If the total luminance values are the same, it is determined that the display object image 61 stored in the display object image storage area 221 and the display object image 61 newly specified in S55 are not different (S34: NO), and the process proceeds to S41.
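The S33/S34 comparison can be sketched as follows. This is a minimal reading of the luminance-sum test with assumed toy pixel grids; a real board image would of course be much larger, and two different images can in principle share the same total, which the patent's example does not address.

```python
# Sketch of the S33/S34 update check: the stored display object image and
# the newly specified one are treated as different when the sums of their
# pixel luminance values differ (new dark strokes lower the total).

def total_luminance(pixels):
    # Sum the luminance of every pixel in the display-object region
    return sum(sum(row) for row in pixels)

def board_updated(stored, current):
    return total_luminance(stored) != total_luminance(current)

stored = [[255, 255], [255, 40]]    # board with existing strokes (toy data)
unchanged = [[255, 255], [255, 40]]
with_ghi = [[255, 35], [255, 40]]   # an extra dark stroke lowers the total
print(board_updated(stored, unchanged))  # False: nothing new written
print(board_updated(stored, with_ghi))   # True: stored image 51 is updated
```

When `board_updated` returns True, S35 and S36 re-store the image and the current time, so the same display object image is never stored twice in a row.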

  For example, when the acquired image 50 shown in FIG. 8 changes to the acquired image 50 shown in FIG. 12 because the characters “GHI” have been written on the whiteboard 6, the luminance values of the portion where “GHI” is written become lower. For this reason, the total luminance value decreases. Therefore, it is determined that the display object image 61 stored in the display object image storage area 221 differs from the display object image 61 newly specified in S55 (S34: YES). Then, the display object image 61 newly specified in S55 is stored in the display object image storage area 221, updating the stored image (S35). Next, the current time stored in the display object image storage area 221 is updated (S36). Next, the process proceeds to S41. If it is determined in S41 that an instruction to end the conference has been input (S41: YES), the reception-side main process is ended.

  As described above, the processing in this embodiment is executed. When the person image 80 overlaps the display object image 61 included in the acquired image 50, the display object image 61 is shielded, and the contents written on the whiteboard 6 become difficult to visually recognize. In the present embodiment, when the person image 80 overlaps the display object image 61 included in the acquired image 50 (S27: YES), the display object image 61 stored in the display object image storage area 221 immediately before the person image 80 overlapped the display object image 61 (that is, the stored image 51) is displayed (S38). Therefore, even when the person image 80 overlaps the display object image 61, the user can visually recognize the contents written on the whiteboard 6 (that is, the contents of the information displayed on the whiteboard 6).

  Further, an image transmitted from the communication device 3 via the network is acquired (S21: YES), and it is determined whether or not the display object image 61 and the person image 80 are overlapped (S27). If they overlap (S27: YES), the stored image 51 stored in the display object image storage area 221 immediately before the person image 80 overlaps the display object image 61 is displayed (S38). In this case, it is not necessary for the communication device 3 to transmit the display object image 61 immediately before the person image 80 overlaps the display object image 61 to the communication devices 4 and 5 via the network 2. For this reason, the load on the bandwidth of the network 2 can be reduced. Therefore, it is possible to prevent a load on the bandwidth of the network 2 from disturbing the display of the image or slowing down the display speed of the image.

  In addition, the display settings on the display 28 for the acquired image 50 and the display object image 61 can be changed (S39: YES, S40). For this reason, instead of displaying the acquired image 50 or the stored image 51 received from the communication device 3 via the network as they are, the display mode can be changed on the receiving side, that is, on the communication device 4 or 5.

  Further, when the display object image 61 stored in the display object image storage area 221 differs from the display object image 61 newly specified in S55 (S34: YES), that is, when the characters or the like written on the whiteboard 6 have been updated, the display object image 61 is newly stored (S35). That is, the stored image 51 is updated. Since the display object image 61 is stored only when the contents written on the whiteboard 6 have been updated, it is not necessary to store the same display object image 61 repeatedly, and the processing load of the communication devices 4 and 5 can be reduced.

  Further, when writing on the whiteboard 6 has started (S26: YES), it is determined whether or not the person image 80 overlaps the display object image 61 (S27). If they overlap (S27: YES), the stored image 51 is displayed (S38). In other words, when writing on the whiteboard 6 has not started (S26: NO), the processing of S27 to S40 is not executed. Therefore, the processing load of the communication devices 4 and 5 can be reduced. In this case, when the display object image 61 and the person image 80 do not overlap, the user can visually recognize the writing on the whiteboard 6 in the acquired image 50.

  As shown in FIG. 4, the time “12:10” at which the stored image 51 was stored is displayed together with the stored image 51. For this reason, the user can confirm at which time the displayed stored image 51 shows the display object image 61.

  Further, after the contours 90 of the objects are extracted in S52 of FIG. 7, the contour 611 of the display object image 61 and the contour 801 of the person image 80 are identified in S53 and S54. By extracting the contours 90 in S52, image information other than the contours 90 is reduced, so that the shapes of the objects can be identified more easily. For this reason, the contour 611 and the contour 801 can be identified more accurately. Therefore, whether or not the person image 80 overlaps the display object image 61 can be determined more accurately in S27 of FIG. 6.
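One simple way to realize the contour extraction of S52 is luminance-gradient thresholding. The following sketch is an assumed illustration only (the patent does not specify the extraction algorithm): a pixel is marked as contour when its luminance differs sharply from its right or lower neighbor, and both the grid and the threshold are invented values.

```python
# Illustrative sketch of S52-style contour extraction: a pixel belongs to
# a contour 90 when its luminance jumps sharply to the right or downward.

EDGE_THRESHOLD = 50  # assumed minimum luminance jump that marks an edge

def extract_contour(gray):
    h, w = len(gray), len(gray[0])
    edges = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            right = abs(gray[y][x] - gray[y][x + 1]) if x + 1 < w else 0
            down = abs(gray[y][x] - gray[y + 1][x]) if y + 1 < h else 0
            if max(right, down) >= EDGE_THRESHOLD:
                edges[y][x] = True
    return edges

# A dark square on a light background yields edges only around its border;
# the square's interior pixels are suppressed, as the text describes.
img = [[255, 255, 255, 255],
       [255,  30,  30, 255],
       [255,  30,  30, 255],
       [255, 255, 255, 255]]
edges = extract_contour(img)
print(sum(row.count(True) for row in edges))  # 7 boundary pixels flagged
```

Because everything except the contour survives as False, later steps such as S53 and S54 only have to classify the remaining edge shapes, which is what makes the identification of the contours 611 and 801 more reliable.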

  In the above embodiment, the CPU 20 of the communication devices 4 and 5 performing the process of S21 in FIG. 6 corresponds to the “image acquisition means” of the present invention, and the display 28 of the communication devices 4 and 5 corresponds to the “display means” of the present invention. The CPU 20 of the communication devices 4 and 5 performing the process of S24 in FIG. 6 corresponds to the “first display control means” of the present invention, and the whiteboard 6 corresponds to the “information display object” of the present invention. The display object image 61 and the stored image 51 correspond to the “display object image” of the present invention, and the CPU 20 of the communication devices 4 and 5 performing the processes of S53 and S55 in FIG. 7 corresponds to the “specifying means” of the present invention. The person image 80 corresponds to the “shielding object image” of the present invention, and the CPU 20 of the communication devices 4 and 5 performing the process of S27 in FIG. 6 corresponds to the “overlap determination means” of the present invention. The RAM 22 of the communication devices 4 and 5 corresponds to the “first storage device” and the “second storage device” of the present invention, and the CPU 20 of the communication devices 4 and 5 performing the processes of S29, S35, S30, and S36 in FIG. 6 corresponds to the “image storage control means” of the present invention.

  The CPU 20 of the communication devices 4 and 5 performing the process of S38 in FIG. 6 corresponds to the “second display control means” of the present invention, and the CPU 20 of the communication devices 4 and 5 performing the process of S39 in FIG. 6 corresponds to the “setting detection means” of the present invention. The CPU 20 of the communication devices 4 and 5 performing the process of S40 in FIG. 6 corresponds to the “display setting storage control means” of the present invention, and the CPU 20 of the communication devices 4 and 5 performing the process of S34 in FIG. 6 corresponds to the “update determination means” of the present invention. The CPU 20 of the communication devices 4 and 5 performing the process of S26 in FIG. 6 corresponds to the “write determination means” of the present invention, and the timer 38 corresponds to the “time counting means” of the present invention. The contours 90 extracted in the process of S52 of FIG. 7 correspond to the “object contours” of the present invention, and the CPU 20 of the communication devices 4 and 5 performing the process of S52 of FIG. 7 corresponds to the “object contour extracting means” of the present invention. The contour 611 corresponds to the “display object contour” of the present invention, and the contour 801 corresponds to the “shielding object contour” of the present invention. The CPU 20 of the communication devices 4 and 5 that performs the process of S54 in FIG.

  The process of S21 of FIG. 6 corresponds to the “image acquisition step” of the present invention, and the process of S24 of FIG. 6 corresponds to the “first display control step” of the present invention. The process of S55 of FIG. 7 corresponds to the “specifying step” of the present invention, and the process of S27 of FIG. 6 corresponds to the “overlap determination step” of the present invention. The processes of S29 and S35 correspond to the “image storage control step” of the present invention, and the process of S38 of FIG. 6 corresponds to the “second display control step” of the present invention.

  The present invention is not limited to the above embodiment, and various modifications are possible. For example, although the information display object is the whiteboard 6, it is not limited to this. A display object that displays information, such as a blackboard, paper, a screen projected by a projector, or a liquid crystal display, may be used instead.

  In S27, it may be determined that the person image 80 overlaps the display object image 61 when, for example, the ratio of the area of the person image 80 overlapping the display object image 61 to the area of the display 28 on which the display object image 61 is displayed is equal to or greater than a predetermined value. As a specific example, first, the number of pixels in the area of the display object image 61 (see FIG. 9) specified in S55 is calculated. Then, the number of pixels of the portion where the person image 80 identified in S54 overlaps the display object image 61 is calculated. Suppose that the ratio of the overlapped portion to the number of pixels of the display object image 61 is 5%, and that the predetermined value is 10%, a ratio assumed to cause no problem in the visibility of the information displayed in the display object image 61. In this case, since the ratio does not exceed the predetermined value, it is determined that the person image 80 does not overlap the display object image 61 (S27: NO). That is, when there is no problem in the visibility of the characters or the like written in the display object image 61, the stored image 51 can be prevented from being displayed.
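The ratio-based variant of S27 described above can be sketched directly. The pixel counts below reuse the 5% and 10% figures from the text; the absolute numbers are assumptions.

```python
# Sketch of the modified S27 criterion: the person image counts as
# overlapping only when the overlapped pixels reach a predetermined
# fraction of the display object image's pixels (10% in the text).

PREDETERMINED_RATIO = 0.10  # below this, the writing is assumed legible

def overlap_is_significant(display_pixels, overlapped_pixels):
    return overlapped_pixels / display_pixels >= PREDETERMINED_RATIO

# With the 5% example from the text, the stored image 51 is not shown:
print(overlap_is_significant(120000, 6000))   # 5%  -> False
print(overlap_is_significant(120000, 18000))  # 15% -> True
```

This avoids switching to the stored image 51 for negligible occlusions, at the cost of choosing a threshold that genuinely leaves the written information readable.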

  Moreover, although the case where the display object image 61 and the person image 80 overlap has been described as an example, the present invention is not limited to this. For example, instead of the person image 80, a shielding object image that is an image of any other object that shields the information display object may be used.

  The communication devices 3, 4 and 5 all have the same hardware configuration and are provided with the same software. However, the present invention is not limited to this. For example, the hardware configuration and software of the communication device 3 that is an image transmission device and the communication devices 4 and 5 that are image reception devices may be different.

  Further, when it is determined that the person image 80 overlaps the display object image 61 (S27: YES), the stored image 51 is always displayed. However, the present invention is not limited to this. For example, even when it is determined that the person image 80 is overlapped with the display object image 61, the stored image 51 may not be displayed when characters or the like are not displayed at the overlapped portion.

  When it is determined that the person image 80 overlaps the display object image 61 (S27: YES), the display setting can be changed in S39 and S40, but the present invention is not limited to this. For example, the received display object image 61 and the stored stored image 51 may be displayed as they are without changing the display setting.

  Moreover, although the image captured in S11 (see FIG. 5) is not displayed on the display 28 of the communication device 3, the present invention is not limited to this. For example, the image captured in S11 may be displayed on the display 28 of the communication device 3.

  In the present embodiment, the display object image 61 is stored in the communication devices 4 and 5 that receive the image transmitted from the communication device 3 (S29 and S35), and the stored image 51 is displayed when the person image 80 overlaps the display object image 61. However, the present invention is not limited to this. For example, the communication device 3 capturing the image may store the display object image 61 and, when the person image 80 overlaps the display object image 61, transmit the stored display object image 61 to the communication devices 4 and 5, which then display it. Hereinafter, this modification will be described in detail with reference to FIGS. 13 and 14.

  FIG. 13 shows the transmission-side modification process, which is a modification of the transmission-side main process (see FIG. 5), and FIG. 14 shows the reception-side modification process, which is a modification of the reception-side main process (see FIG. 6). In the following description, processing identical to that of the transmission-side and reception-side main processes is denoted by the same reference numerals, and detailed description thereof is omitted.

  First, with reference to FIG. 13, the transmission-side modification process performed by the CPU 20 of the communication device 3 will be described. As shown in FIG. 13, first, an image is acquired by capturing with the camera included in the communication device 3 (S11). In this modification, the image acquired in S11 is the acquired image 50. Then, the acquired image 50 is encoded and its image data is transmitted to the communication devices 4 and 5 (S12 and S13). The image data of the acquired image 50 transmitted in S13 is received in S21 (see FIG. 14) of the communication devices 4 and 5, and the acquired image 50 is displayed on the display 28 of the communication devices 4 and 5 (S24). That is, the process of S13 causes the acquired image 50 to be displayed on the display 28 of the communication devices 4 and 5.

  Next, image recognition processing (see FIG. 7) is performed using the acquired image 50 (S23). After S23 is executed, the processes of S25 to S27 are executed. When the person image 80 does not overlap the display object image 61 (S27: NO), S28 is executed. When the display object image 61 is not stored (S28: NO), S29 and S30 are executed, and the process proceeds to S41. When the display object image 61 is stored (S28: YES), the processes of S33 to S36 are performed, and the process proceeds to S41.

  When the person image 80 overlaps the display object image 61 (S27: YES), it is determined whether or not the stored image 51 is stored (S37). If the stored image 51 is not stored (S37: NO), the process proceeds to S41. When the stored image 51 is stored (S37: YES), the stored image data, which is the data of the stored image 51, is transmitted to the communication devices 4 and 5 (S61). The transmitted stored image data is received in S71 of the communication devices 4 and 5, and the stored image 51 is displayed on the display 28 (S73). That is, S61 is a process that causes the stored image 51 stored immediately before the overlap to be displayed on the display 28 of the communication devices 4 and 5 when the person image 80 overlaps the display object image 61. After S61 is executed, the process of S41 is executed. When an instruction to end the video conference has not been input (S41: NO), the process returns to S11. When an instruction to end the video conference is input (S41: YES), the transmission-side modification process is ended.

  With reference to FIG. 14, the reception-side modification process performed by the CPU 20 of the communication devices 4 and 5 will be described. As shown in FIG. 14, it is first determined whether or not the image data of the acquired image 50 transmitted in S13 (see FIG. 13) of the communication device 3 has been received (S21). If the image data of the acquired image 50 has not been received (S21: NO), it is determined whether or not the stored image data transmitted in S61 (see FIG. 13) of the communication device 3 has been received (S71). If the stored image data has not been received (S71: NO), the process proceeds to S14. When an instruction to end the video conference has not been input (S14: NO), the process returns to S21.

  When the image data of the acquired image 50 is received (S21: YES), the received image data is decoded (S22), and the acquired image 50 is displayed (S24). By the process of S24, the acquired image 50 is displayed in the acquired image display area 281 as shown in FIG. 3 or FIG. 4. Next, it is determined whether or not the stored image 51 is being displayed on the display 28 (S74). If the stored image 51 is not displayed, as in FIG. 3 (S74: NO), the process proceeds to S14. When the stored image 51 is being displayed, as in FIG. 4 (S74: YES), it is determined whether or not stored image data has been received in S71 within a predetermined time (for example, 5 seconds) (S75). The stored image data is not received for the predetermined time when S61 (see FIG. 13) is not executed, that is, when the person image 80 does not overlap the display object image 61 in S27 (see FIG. 13).

  If stored image data has been received within the predetermined time (S75: YES), the process returns to S14. That is, the state in which the stored image 51 is displayed as shown in FIG. 4 continues. When no stored image data has been received within the predetermined time (S75: NO), the stored image display area 282 is deleted (S76). Accordingly, the state in which the stored image 51 is displayed as shown in FIG. 4 changes to the state in which the stored image 51 is not displayed, as shown in FIG. 3. Next, the process proceeds to S14.
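The S74 to S76 timeout behavior can be sketched as a small state object. The class name, the explicit `now` timestamps, and the 5-second constant taken from the text are illustrative assumptions; a real receiver would drive `tick` from its event loop clock.

```python
# Sketch of the S74-S76 timeout: the stored-image display area 282 is
# cleared once no stored image data has arrived for the predetermined
# interval (5 seconds in the text).

STALE_AFTER = 5.0  # seconds; the "predetermined time" of S75 (assumed unit)

class StoredImageView:
    def __init__(self):
        self.visible = False
        self.last_received = None

    def on_stored_image(self, now):
        # S71/S73: stored image data arrived, show the stored image 51
        self.visible = True
        self.last_received = now

    def tick(self, now):
        # S75/S76: hide the stored image once data stops arriving,
        # i.e. once the person no longer shields the whiteboard
        if self.visible and now - self.last_received >= STALE_AFTER:
            self.visible = False

view = StoredImageView()
view.on_stored_image(now=0.0)
view.tick(now=3.0)
print(view.visible)  # True: still within the 5 s window
view.tick(now=6.0)
print(view.visible)  # False: no data for 5 s, area 282 is cleared
```

Since the transmitting device 3 keeps sending stored image data only while the overlap of S27 persists, the absence of that data is what signals the receiver to fall back to showing only the acquired image 50.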

  When the stored image data is received (S71: YES), the received stored image data is decoded (S72). Then, the stored image 51 is displayed in the stored image display area 282 (S73). In S73, for example, the size and position of the stored image 51 are adjusted based on the display settings, which are settings related to display stored in the display setting storage area 222, and the adjusted image is displayed.

  Next, it is determined whether or not there is an instruction to change the display settings (S39). If there is no instruction to change the display settings (S39: NO), the process proceeds to S14. When there is an instruction to change the display settings (S39: YES), the changed display settings are stored in the display setting storage area 222 (S40). Thus, the images displayed in S24 and S73 are displayed based on the changed display settings. Next, the process proceeds to S14. When an instruction to end the video conference is input (S14: YES), the reception-side modification process is ended.

  As described above, the processing in the modification is performed. In this modification, the communication device 3 that captures the image stores the display object image 61 (S29 and S35, see FIG. 13), and when the person image 80 overlaps the display object image 61 (S27: YES), the stored display object image 61 (that is, the stored image 51) is transmitted to the communication devices 4 and 5 (S61) and displayed on the communication devices 4 and 5 (S73, see FIG. 14). For this reason, even when the person image 80 overlaps the display object image 61, the users of the communication devices 4 and 5 can visually recognize the contents written on the whiteboard 6.

  Further, since the processing of S23 to S30 and S33 to S37 in FIG. 13 need not be performed in the communication devices 4 and 5, the processing load of the communication devices 4 and 5 can be reduced.

  In the above modification, the CPU 20 of the communication device 3 that performs the process of S11 in FIG. 13 corresponds to the “image acquisition means” of the present invention, and the CPU 20 of the communication device 3 that performs the process of S13 in FIG. 13 corresponds to the “first display control means” of the present invention. The CPU 20 of the communication device 3 that performs the processes of S53 and S55 in FIG. 7 corresponds to the “specifying means” of the present invention, and the CPU 20 of the communication device 3 that performs the process of S27 in FIG. 13 corresponds to the “overlap determination means” of the present invention. The CPU 20 of the communication device 3 that performs the processes of S29, S35, S30, and S36 in FIG. 13 corresponds to the “image storage control means” of the present invention, and the CPU 20 of the communication device 3 that performs the process of S61 in FIG. 13 corresponds to the “second display control means” of the present invention. The CPU 20 of the communication devices 4 and 5 performing the process of S39 in FIG. 14 corresponds to the “setting detection means” of the present invention, and the CPU 20 of the communication devices 4 and 5 performing the process of S40 in FIG. 14 corresponds to the “display setting storage control means” of the present invention. The CPU 20 of the communication device 3 that performs the process of S34 in FIG. 13 corresponds to the “update determination means” of the present invention, and the CPU 20 of the communication device 3 that performs the process of S26 in FIG. 13 corresponds to the “write determination means” of the present invention. The RAM 22 of the communication device 3 corresponds to the “first storage device” of the present invention, and the RAM 22 of the communication devices 4 and 5 corresponds to the “second storage device” of the present invention. The CPU 20 of the communication device 3 that performs the process of S52 in FIG. 7 corresponds to the “object contour extracting means” of the present invention, and so does the CPU 20 of the communication device 3 that performs the process of S54 of FIG.

  The process of S11 of FIG. 13 corresponds to the “image acquisition step” of the present invention, and the process of S13 of FIG. 13 corresponds to the “first display control step” of the present invention. The process of S55 of FIG. 7 corresponds to the “specifying step” of the present invention, and the process of S27 of FIG. 13 corresponds to the “overlap determination step” of the present invention. The processes of S29, S35, S30, and S36 in FIG. 13 correspond to the “image storage control step” of the present invention, and the process of S61 in FIG. 13 corresponds to the “second display control step” of the present invention.

  Although the acquired image 50 and the stored image 51 are not displayed on the display 28 of the communication device 3 in the above modification, the present invention is not limited to this. For example, the acquired image 50 and the stored image 51 may be displayed on the display 28 of the communication device 3. In this case, a process of displaying the images may be performed after S13 and S61 in FIG. 13.

DESCRIPTION OF SYMBOLS: 1 video conference system; 2 network; 3, 4, 5 communication device; 6 whiteboard; 28 display; 34 camera; 38 timer; 50 acquired image; 51 stored image; 61 display object image; 80 person image; 221 display object image storage area; 222 display setting storage area; 281 acquired image display area; 282 stored image display area; 2811, 2821 display setting button

Claims (9)

  1. In a video conference system including a communication device that transmits and receives images to and from other devices via a network,
    The communication device
    Image acquisition means for acquiring images;
    First display control means for displaying an acquired image, which is an image acquired by the image acquisition means, on a display means for displaying an image;
    A specifying means for specifying a display object image that is an image of an information display object for displaying information included in the acquired image;
    overlap determination means for determining whether or not, in the acquired image, a shielding object image that is an image of an object that shields the information display object overlaps the display object image specified by the specifying means;
    Image storage control means for storing the display object image specified by the specifying means in a first storage device;
    and second display control means for causing the display means to display, when the overlap determination means determines that the shielding object image overlaps the display object image, the display object image stored in the first storage device by the image storage control means immediately before the overlap determination means determined that the shielding object image overlaps the display object image.
  2.   The video conference system according to claim 1, wherein the other device comprises image imaging means for capturing an image and image transmission means for transmitting the image captured by the image imaging means to the network, and the image acquisition means acquires the image by receiving, via the network, the image transmitted by the image transmission means.
  3. The video conference system according to claim 1, further comprising:
    setting detection means for detecting an instruction to change a display setting, which is a setting relating to the display, on the display means, of the acquired image displayed by the first display control means and of the display object image displayed by the second display control means; and
    display setting storage control means for storing the changed display setting in a second storage device based on the instruction to change the display setting detected by the setting detection means,
    wherein the first display control means adjusts the acquired image based on the changed display setting stored by the display setting storage control means and causes the display means to display the adjusted acquired image, and
    the second display control means adjusts the display object image based on the changed display setting stored by the display setting storage control means and causes the display means to display the adjusted display object image.
  4. The video conference system according to claim 1, further comprising update determination means for determining, when the overlap determining means determines that the shielding object image does not overlap the display object image, whether or not the display object image specified by the specifying means differs from the display object image most recently stored in the first storage device by the image storage control means,
    wherein the image storage control means stores the display object image specified by the specifying means in the first storage device when the update determination means determines that the specified display object image differs from the stored display object image.
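The update determination of claim 4 avoids re-storing an unchanged snapshot. The specification does not say how the two images are compared; the sketch below uses a content hash as a stand-in difference measure (the function name and the choice of SHA-256 are assumptions for illustration only).

```python
import hashlib


def should_update(stored_digest, new_image_bytes):
    """Update determination sketch: re-store the display object image
    only when it differs from the previously stored one.

    Returns (changed, digest) so the caller can keep the digest of
    whatever it stores in the first storage device.
    """
    new_digest = hashlib.sha256(new_image_bytes).hexdigest()
    return new_digest != stored_digest, new_digest
```

A real system would more likely use a perceptual difference tolerant of camera noise; the hash merely illustrates the gate in front of the storage step.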
  5. The video conference system according to claim 1, further comprising information display determination means for determining whether or not the information display object indicated by the display object image specified by the specifying means has started displaying information,
    wherein the overlap determining means begins determining whether or not the shielding object image overlaps the display object image specified by the specifying means when the information display determination means determines that the information display object has started displaying information.
  6. The video conference system according to claim 1, wherein the image storage control means, when the overlap determining means determines that the shielding object image does not overlap the display object image, stores in the first storage device the display object image specified by the specifying means together with the current time measured by time measuring means for measuring time,
    and wherein the second display control means, when the overlap determining means determines that the shielding object image overlaps the display object image, causes the display means to display the display object image stored in the first storage device by the image storage control means together with an image indicating the stored time.
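Claim 6 adds a timestamp so viewers can judge how stale the substitute image is. A minimal sketch, assuming a plain dictionary as the "first storage device" and a text annotation standing in for the "image indicating the time" (both assumptions, not details from the specification):

```python
import time


def store_with_timestamp(storage: dict, display_object_image) -> None:
    """When no occlusion is detected, store the display object image
    together with the current time from the time measuring means."""
    storage["image"] = display_object_image
    storage["captured_at"] = time.time()


def render_fallback(storage: dict) -> str:
    """While occluded, show the stored image annotated with its capture
    time so viewers know how old the substitute view is."""
    ts = time.strftime("%H:%M:%S", time.localtime(storage["captured_at"]))
    return f"{storage['image']} (as of {ts})"
```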
  7. The video conference system according to claim 1, further comprising:
    object contour extraction means for extracting object contours, which are contours of images of objects included in the acquired image acquired by the image acquisition means; and
    shielding object contour identification means for identifying, from the object contours extracted by the object contour extraction means, a shielding object contour that is the contour of the shielding object image,
    wherein the specifying means specifies the display object image by identifying, from the object contours extracted by the object contour extraction means, a display object contour that is the contour of the display object image, and the overlap determining means determines whether or not the shielding object image overlaps the display object image by determining whether or not the shielding object contour identified by the shielding object contour identification means overlaps the display object contour identified by the specifying means.
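Claim 7 reduces the overlap test to geometry on extracted contours. The specification does not fix a representation; the sketch below simplifies each contour to an axis-aligned bounding box and, purely as an assumption, takes the largest box to be the display object (a whiteboard usually dominates the frame) and all others to be candidate shields.

```python
def boxes_intersect(a, b) -> bool:
    """Bounding-box overlap test standing in for the contour-overlap
    determination. Boxes are (x0, y0, x1, y1)."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1


def classify_contours(contours):
    """Hypothetical identification step: the largest contour is taken
    to be the display object; the rest are candidate shielding objects."""
    def area(c):
        x0, y0, x1, y1 = c
        return (x1 - x0) * (y1 - y0)
    display = max(contours, key=area)
    shields = [c for c in contours if c is not display]
    return display, shields


def occluded(contours) -> bool:
    display, shields = classify_contours(contours)
    return any(boxes_intersect(display, s) for s in shields)
```

In practice the contours would come from an edge or segmentation stage; the identification heuristic here is only one plausible reading of the claim.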
  8. The video conference system according to claim 1, wherein the overlap determining means determines whether or not the shielding object image overlaps the display object image by determining whether or not the ratio of the area in which the shielding object image overlaps the display object image, relative to the area of the display means in which the display object image is displayed, is equal to or greater than a predetermined value.
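Claim 8 makes the determination proportional rather than binary: the display object counts as shielded only once the occluded fraction reaches a threshold. A sketch with axis-aligned boxes (the 10% threshold is an assumption; the specification only says "a predetermined value"):

```python
def overlap_ratio(display_box, shield_box) -> float:
    """Fraction of the display object's area covered by the shield.
    Boxes are (x0, y0, x1, y1) in display coordinates."""
    dx0, dy0, dx1, dy1 = display_box
    sx0, sy0, sx1, sy1 = shield_box
    ix = max(0, min(dx1, sx1) - max(dx0, sx0))   # intersection width
    iy = max(0, min(dy1, sy1) - max(dy0, sy0))   # intersection height
    display_area = (dx1 - dx0) * (dy1 - dy0)
    return (ix * iy) / display_area if display_area else 0.0


def is_occluded(display_box, shield_box, threshold=0.1) -> bool:
    """Treat the display object as shielded only when the occluded
    fraction is equal to or greater than the predetermined value."""
    return overlap_ratio(display_box, shield_box) >= threshold
```

The threshold keeps momentary edge grazes (a hand passing the border) from triggering the stored-image substitution.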
  9. A video conference program executed in a video conference system including a communication device that transmits and receives images to and from another apparatus via a network, the program causing a controller of the communication device to execute:
    an image acquisition step of acquiring an image;
    a first display control step of displaying the acquired image, which is the image acquired in the image acquisition step, on display means for displaying images;
    a specifying step of specifying a display object image, which is an image, included in the acquired image, of an information display object that displays information;
    an overlap determination step of determining whether or not, in the acquired image, a shielding object image, which is an image of an object that shields the information display object, overlaps the display object image specified in the specifying step;
    a storage control step of storing the display object image specified in the specifying step in a first storage device; and
    a second display control step of causing the display means, when the overlap determination step determines that the shielding object image overlaps the display object image, to display the display object image that was stored in the first storage device by the storage control step immediately before that determination.
JP2010292524A 2010-12-28 2010-12-28 Video conference system and video conference program Pending JP2012142697A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2010292524A JP2012142697A (en) 2010-12-28 2010-12-28 Video conference system and video conference program


Publications (1)

Publication Number Publication Date
JP2012142697A true JP2012142697A (en) 2012-07-26

Family

ID=46678562

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2010292524A Pending JP2012142697A (en) 2010-12-28 2010-12-28 Video conference system and video conference program

Country Status (1)

Country Link
JP (1) JP2012142697A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103888710A (en) * 2012-12-21 2014-06-25 深圳市捷视飞通科技有限公司 Video conferencing system and method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002314990A (en) * 2001-04-12 2002-10-25 Auto Network Gijutsu Kenkyusho:Kk System for visually confirming periphery of vehicle
JP2005260636A (en) * 2004-03-12 2005-09-22 Fuji Photo Film Co Ltd Photographing system
JP2007074326A (en) * 2005-09-07 2007-03-22 Hitachi Ltd Drive support device
JP2007166245A (en) * 2005-12-14 2007-06-28 Tokyo Institute Of Technology Image processor, image processing program, and image processing method
JP2010092224A (en) * 2008-10-07 2010-04-22 Canon Inc Image processing apparatus and image processing method



Legal Events

Date        Code  Title
2012-09-14  A621  Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2013-08-21  A977  Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2013-09-17  A131  Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2014-02-12  A02   Decision of refusal (JAPANESE INTERMEDIATE CODE: A02)