WO2018008077A1 - Image projection device - Google Patents


Publication number
WO2018008077A1
Authority
WO
WIPO (PCT)
Prior art keywords
image, unit, mask pattern, mask, processing unit
Prior art date
Application number
PCT/JP2016/069834
Other languages
French (fr)
Japanese (ja)
Inventor
Takashi Yamauchi
Kosuke Nakajima
Original Assignee
Mitsubishi Electric Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to JP2017520559A (patent JP6249136B1)
Priority to PCT/JP2016/069834
Publication of WO2018008077A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/15 Conference systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering

Definitions

  • The present invention relates to an image projection apparatus used for communication between remote locations.
  • A conference system that conducts a conference between remote locations over a network such as the Internet includes an imaging device and a display device.
  • The conference system transmits images of the conference participants captured by the imaging device to the communication destination, and displays the images of the conference participants received from the communication destination on the display device.
  • The viewpoint position and the shooting position are made to follow the position of the camera in the other space according to the displacement of the viewer's viewpoint position.
  • When there are many viewpoint positions, however, it has been difficult to enhance the sense of reality by processing the video uniquely or by determining the shooting location.
  • As a means of solving this problem, there is a technique that uses as many cameras as there are conference participants to match the line of sight between the speaker and the listener (Patent Document 1).
  • The present invention has been made to solve such a problem. Its purpose is to make it feel as if the rooms are connected to each other, even in a place where an unspecified number of people communicate dynamically, and thereby to promote smooth communication.
  • An image projection apparatus according to the present invention is an image projection apparatus used in an information communication system whose users can communicate with each other by images. It includes a reception image processing unit that performs mask processing on a part of an image input from the outside, and a projection unit that projects the image masked by the reception image processing unit.
  • The image projection apparatus thus performs mask processing on a part of an image input from the outside and projects the masked image. The shift between the perspective in the image and the viewpoint of the viewer's eyes is therefore alleviated by the mask applied to the image, making it possible to feel a higher sense of realism and a stronger feeling of being in the same room.
  • FIG. 6 is a functional configuration diagram of an image projection apparatus according to Embodiment 2.
  • FIG. 7 is a flowchart showing the flow of mask processing of the image projection apparatus according to the second embodiment.
  • FIG. 8 is a functional configuration diagram of an image projection apparatus according to Embodiment 3.
  • FIG. 9 is a flowchart showing the flow of mask processing of the image projection apparatus according to the third embodiment.
  • FIG. 10 is a functional configuration diagram of an image projection apparatus according to a fourth embodiment.
  • FIG. 11 is a flowchart showing the flow of mask processing of the image projection apparatus according to the fourth embodiment.
  • FIG. 12 is a functional configuration diagram of an image projection apparatus according to a fifth embodiment.
  • FIG. 13 is a flowchart showing the flow of mask processing of the image projection apparatus according to the fifth embodiment.
  • FIG. 1 is a diagram showing a configuration of a communication system using the image projection apparatus according to the first embodiment.
  • FIG. 2 is a hardware configuration diagram of the image projection apparatus according to the present invention.
  • FIG. 3 is a functional configuration diagram of the image projection apparatus according to the first embodiment.
  • FIG. 4 is a diagram illustrating a pattern example of a mask pattern stored in the pattern storage unit according to the first embodiment.
  • FIG. 5 is a flowchart showing a flow of mask processing of the image projection apparatus according to the first embodiment.
  • A remote communication system 1 includes a plurality of image projection devices 2a to 2b connected to a network 100 such as the Internet, and a server device 3.
  • Hereinafter, "image projection device 2" denotes an arbitrary one of the plurality of image projection devices 2a to 2b.
  • The image projection device 2 is a terminal device of the remote communication system 1 and is an example of an information processing device.
  • The image projection device 2 may be a general-purpose information processing device such as a personal computer (hereinafter "PC"), a tablet terminal, or a smartphone, or it may be a dedicated terminal for the remote communication system 1.
  • Each function of the image projection device 2 is realized by, for example, an application program installed in the information processing device.
  • The server device 3, for example, monitors whether it is connected to the image projection devices 2a to 2b, controls connections at the start and end of remote communication, and controls the transmission and reception of data (video, audio, etc.) during remote communication.
  • FIG. 2 is a hardware configuration diagram of the image projection apparatus according to the present invention.
  • The image projection apparatus 2 according to the present invention has a general computer configuration: for example, a CPU (Central Processing Unit) 4, a memory 5, a recording device 6, a remote controller 7, a camera 8, a microphone 9, and a speaker 10, connected by a bus or the like.
  • The CPU 4 is an arithmetic device that realizes each function of the image projection device 2 by, for example, reading a program and data from the recording device 6 and executing processing.
  • The CPU 4 executes the program of the image projection device 2 to realize functions such as overall control of the device, control of remote communication, image processing of images to be transmitted and received, and sound processing of sound to be transmitted and received.
  • The memory 5 includes storage devices such as a RAM (Random Access Memory) and a ROM (Read Only Memory).
  • The RAM is a volatile memory used as a work area for the CPU 4.
  • The ROM is, for example, a non-volatile memory that stores a startup program and setting values for the image projection apparatus 2.
  • The recording device 6 is a storage device that records, for example, the programs for device control and video communication control executed by the CPU 4, as well as data, and consists of an HDD (Hard Disk Drive), an SSD (Solid State Drive), a flash ROM, or the like.
  • The remote controller 7 is a means, such as operation buttons or a keyboard, by which the image projection device 2 accepts user operations.
  • The camera 8 is, for example, an imaging unit that captures images of the telecommunication participants and the like.
  • The camera 8 converts a captured image, such as that of a meeting, into predetermined image data and transmits the image data to the CPU 4 or the like via a bus.
  • The camera 8 can change, for example, the brightness of the captured image based on set camera parameters.
  • The camera 8 has default camera parameters, and their values can be changed dynamically by a program or the like running on the CPU 4.
  • The microphone 9 acquires, for example, the voices of the telecommunication participants, converts them into predetermined voice data, and transmits the voice data to the CPU 4 or the like via the bus.
  • The microphone 9 includes, for example, a plurality of directional microphones.
  • The speaker 10 converts predetermined sound data received from the CPU 4 or the like into sound and outputs it.
  • Some or all of the remote controller 7, the camera 8, the microphone 9, and the speaker 10 may be external devices connected to the outside of the main body of the image projector 2.
  • Alternatively, the display device (display unit) connected to the image projection device 2 may incorporate the remote controller 7, the camera 8, the microphone 9, the speaker 10, and the like.
  • FIG. 3 is a functional configuration diagram of the image projection apparatus according to the first embodiment.
  • The image projection apparatus 2 includes an imaging unit 11, a transmission image processing unit 12, a communication unit 15 including a transmission unit 13 and a reception unit 14, an operation unit 16, a reception image processing unit 17, a mask pattern storage unit 18, and a projection unit 19.
  • The imaging unit 11 is a unit that captures an image of the state of the room, such as the telecommunication participants and furniture, to be sent to the communication partner, and is realized by, for example, the camera 8 in FIG. 2.
  • The transmission image processing unit 12 performs image processing, such as brightness adjustment, on the image captured by the imaging unit 11.
  • The transmission image processing unit 12 is realized by, for example, a program that runs on the CPU 4 in FIG. 2.
  • Alternatively, the transmission image processing unit 12 may be an image processing unit such as a camera DSP (Digital Signal Processor) included in the camera 8 of FIG. 2.
  • The image processing performed by the transmission image processing unit 12 includes processing that substantially changes the brightness of the image, such as brightness adjustment, contrast adjustment, and gamma adjustment.
  • The transmission unit 13 transmits the image (image data) processed by the transmission image processing unit 12 to the communication-destination image projection apparatus 2 via the server apparatus 3.
  • The transmission unit 13 can also transmit information other than image data, for example, communication control information and usage scene information.
  • The transmission unit 13 is realized by, for example, a control program.
  • The receiving unit 14 is a unit that receives an image transmitted from the communication-destination image projection apparatus 2 via the network 100.
  • The receiving unit 14 transmits the received image to the received image processing unit 17.
  • The receiving unit 14 is realized by, for example, a control program.
  • The operation unit 16 is a means by which the user operates the image projection apparatus 2.
  • The user can select whether to perform mask processing on the received image by operating the operation unit 16.
  • The user can also select a desired mask pattern from the plurality of mask patterns stored in a mask pattern storage unit 18 described later.
  • In addition, by operating the operation unit 16, the user can cause the imaging unit 11 to capture an image to be transmitted.
  • The operation unit 16 is realized by, for example, the remote controller 7 of FIG. 2 or its control program.
  • The received image processing unit 17 is a means for performing image processing on the image received by the receiving unit 14, and is realized by, for example, a program executed by the CPU 4 in FIG. 2.
  • The received image processing unit 17 performs mask processing by applying the mask pattern selected by the user via the operation unit 16 to the received image, and then transmits the masked image to the projection unit 19.
  • The mask process is a process for displaying or hiding only a specific part of an image.
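As a concrete illustration of this definition, the mask process can be sketched as compositing the received image with a binary mask. The following is a minimal Python sketch; the patent does not specify any implementation, so the function name and pixel representation are illustrative:

```python
def apply_mask(image, mask, hidden=0):
    """Keep a pixel where mask == 1; replace it with `hidden` where mask == 0."""
    return [
        [px if keep else hidden for px, keep in zip(img_row, mask_row)]
        for img_row, mask_row in zip(image, mask)
    ]

image = [[1, 2, 3],
         [4, 5, 6]]
mask = [[1, 0, 1],
        [1, 0, 1]]  # middle column hidden: a one-pixel vertical "stripe"
print(apply_mask(image, mask))  # [[1, 0, 3], [4, 0, 6]]
```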
  • The projection unit 19 is a means for projecting the image processed by the received image processing unit 17 onto the display unit 20, and is realized by a program executed by the CPU 4 in FIG. 2.
  • The projection unit 19 forms an image of the light emitted from a subject on an imaging element (not shown) such as a CCD (Charge Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor.
  • An imaging lens constituting the projection unit 19 is disposed so as to face the display unit 20.
  • The display unit 20 may be of any type, such as a liquid crystal display or a projector.
  • FIG. 4 is a diagram illustrating pattern examples of the mask patterns stored in the mask pattern storage unit.
  • FIG. 4(a) shows an example in which vertical strip-shaped masks, that is, a striped mask, are applied to the received image.
  • When a striped mask is formed on the image, the lateral viewpoint shift is alleviated. The effect can therefore be expected, for example, when viewing a person at the opposite end of the display unit 20 from one end, or when looking at the display unit 20 from directly in front of it.
  • The width of each mask strip is desirably one that does not completely hide a person, for example, 300 mm or less.
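The 300 mm guideline is a physical width, so a practical implementation would have to convert it to pixels from the display's physical width and resolution. A minimal sketch; the display dimensions below are illustrative, not from the patent:

```python
def max_stripe_px(display_width_mm, display_width_px, max_stripe_mm=300):
    """Widest mask strip, in pixels, that stays at or under max_stripe_mm."""
    return int(max_stripe_mm * display_width_px / display_width_mm)

# e.g. a wall display 1600 mm wide driven at 1920 px across:
print(max_stripe_px(1600, 1920))  # 360
```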
  • FIG. 4(b) applies a vertical strip-shaped (striped) mask as in FIG. 4(a), but in addition the mask portions become wider, and the intervals between them narrower, toward the left and right ends of the display area.
  • FIG. 4(c) shows an example in which horizontal strip-shaped masks, that is, a border-like mask, are applied to the received image.
  • When a border-like mask is formed on the image, the vertical viewpoint shift is alleviated, so that the difference in appearance caused by differences in the users' heights can be reduced.
  • FIG. 4(d) applies a horizontal strip-shaped (border-like) mask as in FIG. 4(c), but in addition the intervals between the mask portions become narrower toward the upper and lower ends of the display area.
  • FIG. 4(e) shows an example in which a lattice mask is applied to the received image. It combines FIGS. 4(a) and 4(c), and the effects of both described above can be obtained.
  • FIG. 4(f) likewise applies a lattice mask, but the mask portions become wider, and the intervals between them narrower, toward the left, right, upper, and lower ends of the display area. It combines FIGS. 4(b) and 4(d), and the effects of both described above can be obtained.
  • FIG. 4(g) applies a vertical striped mask as in FIG. 4(a), but the mask is applied so that the displayed region gradually becomes narrower toward the center of the received image. This uses the property that a narrower region appears deeper and a wider region appears closer to the front.
  • FIG. 4(h) applies a vertical striped mask as in FIG. 4(a), but the mask is applied so that the displayed region gradually becomes wider toward the center of the received image. Applying the mask in this way reduces the video display area at the screen edges, where the viewpoint deviation is large, which removes the sense of discomfort.
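The uniform patterns of FIGS. 4(a), 4(c), and 4(e) can be sketched as simple periodic rules, with 0 marking a masked pixel and 1 a visible one. The band and gap widths below are illustrative; the patent fixes neither units nor values:

```python
def stripe_mask(width, height, band=4, gap=8):
    """Vertical stripes as in FIG. 4(a): `band` masked columns followed by
    `gap` visible columns, repeated across the width."""
    row = [0 if x % (band + gap) < band else 1 for x in range(width)]
    return [row[:] for _ in range(height)]

def border_mask(width, height, band=4, gap=8):
    """Horizontal bands as in FIG. 4(c): the same rule applied to rows."""
    return [[0 if y % (band + gap) < band else 1] * width for y in range(height)]

def lattice_mask(width, height, band=4, gap=8):
    """FIG. 4(e): a pixel is visible only if both rules leave it visible."""
    stripes = stripe_mask(width, height, band, gap)
    borders = border_mask(width, height, band, gap)
    return [[s & b for s, b in zip(srow, brow)]
            for srow, brow in zip(stripes, borders)]
```

The edge-weighted variants of FIGS. 4(b), 4(d), and 4(f) would replace the fixed `band`/`gap` pair with values that grow toward the display edges, and the gradient variants of FIGS. 4(g) and 4(h) with values that vary toward the image center.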
  • FIG. 5 is a flowchart showing the flow of mask processing of the image projection apparatus 2 according to the first embodiment. Here, the flow of mask processing when the user selects the mask pattern of FIG. 4(a) will be described.
  • The receiving unit 14 of the image projection apparatus 2 receives an image transmitted from the communication-destination image projection apparatus via the network 100 (ST11), and transmits the received image to the received image processing unit 17.
  • The received image processing unit 17 determines whether the user has selected a predetermined mask pattern, via the operation unit 16, from the mask patterns (FIGS. 4(a) to 4(h)) stored in the mask pattern storage unit 18 (ST21). Here, since the striped mask pattern of FIG. 4(a) has been selected, the process proceeds to ST31, where mask processing is performed (YES in ST21). If no mask pattern has been selected by the user, the received image processing unit 17 transmits the image without mask processing to the projection unit 19 (NO in ST21).
  • The received image processing unit 17 combines the received image with the striped mask information selected by the user (ST31), and transmits only the image of the unmasked portion to the projection unit 19.
  • The projection unit 19 projects the image with the striped mask transmitted from the received image processing unit 17 onto the display unit 20 (ST41-1). If no mask pattern has been selected by the user, the projection unit 19 projects the unmasked image onto the display unit 20 (ST41-2).
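The branch structure of ST21 through ST41 can be sketched as follows. This is only an illustration of the flow; the function name and the shape of the pattern store are assumptions, not from the patent:

```python
def process_received_image(image, patterns, selected=None):
    """Embodiment 1 flow: ST21 checks for a user-selected pattern; ST31
    composites it with the received image; otherwise the image passes
    through unmasked (ST41-2)."""
    if selected is None:                      # NO in ST21
        return image
    mask = patterns[selected]                 # from mask pattern storage unit 18
    return [[px if keep else 0 for px, keep in zip(irow, mrow)]
            for irow, mrow in zip(image, mask)]   # ST31

patterns = {"stripes": [[1, 0, 1]]}
print(process_received_image([[7, 8, 9]], patterns, "stripes"))  # [[7, 0, 9]]
print(process_received_image([[7, 8, 9]], patterns))             # [[7, 8, 9]]
```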
  • As described above, by applying the mask pattern selected by the user to the received image, the deviation between the perspective in the image and the viewpoint of the viewer's eyes is alleviated by the mask applied to the image, making it possible to feel a higher sense of realism and a stronger feeling of being in the same room.
  • FIG. 6 is a functional configuration diagram of the image projection apparatus according to the second embodiment.
  • FIG. 7 is a flowchart showing a flow of mask processing of the image projection apparatus according to the second embodiment.
  • Hereinafter, differences from the first embodiment will mainly be described; descriptions of configurations and effects identical to those of the first embodiment are omitted, and the same reference numerals are used for the same configurations.
  • The configuration necessary for the image projection apparatus 21 according to the second embodiment to receive an image from the counterpart image projection apparatus will be described with reference to FIG. 6. The configuration necessary for transmitting an image to the counterpart apparatus is the same as in the first embodiment, and its description is omitted.
  • The receiving unit 14 is a means for receiving an image transmitted via the network 100 from the communication-destination image projection apparatus. The receiving unit 14 transmits the received image to an image recognition unit 23 and a received image processing unit 24 described later.
  • The receiving unit 14 is realized by, for example, a control program.
  • The operation unit 22 is a means by which the user operates the image projection device 21. The user can select whether to perform mask processing on the received image by operating the operation unit 22. In addition, by operating the operation unit 22, the user can cause the imaging unit 11 to capture an image to be transmitted.
  • The operation unit 22 is realized by, for example, the remote controller 7 in FIG. 2 or its control program.
  • The image recognition unit 23 is a means for receiving the image transmitted from the receiving unit 14 and recognizing its contents, and is realized by, for example, a program executed by the CPU 4 in FIG. 2.
  • The image recognition unit 23 recognizes the people, furniture arrangement, walls, floors, ceilings, and the like that appear in the image, and sends the recognition results to the received image processing unit 24.
  • Any recognition method that can recognize the contents shown in the image may be used, such as one based on a camera, a sound wave sensor, or an infrared sensor.
  • The received image processing unit 24 is a means for performing image processing on the image received by the receiving unit 14, and is realized by, for example, a program executed by the CPU 4 in FIG. 2. Specifically, the received image processing unit 24 extracts an optimal mask pattern from the mask patterns stored in the mask pattern storage unit 18 based on the recognition result from the image recognition unit 23, performs mask processing on the received image, and transmits the masked image to the projection unit 19.
  • The projection unit 19 is a means for projecting the image masked by the received image processing unit 24 onto the display unit 20, and is realized by a program executed by the CPU 4 in FIG. 2.
  • Here, the flow of mask processing when the received image processing unit 24 selects the mask pattern of FIG. 4(a) based on the recognition result of the image recognition unit will be described.
  • The receiving unit 14 of the image projection apparatus 21 receives an image transmitted from the communication-destination image projection apparatus via the network 100 (ST12), and transmits the received image to the image recognition unit 23 and the received image processing unit 24.
  • The image recognition unit 23 recognizes the contents of the image transmitted from the receiving unit 14 (ST22). Here, it recognizes that, if the image were displayed on the display unit 20 as it is, there would be, for example, a large horizontal deviation of the viewpoint. The image recognition unit 23 transmits this information to the received image processing unit 24.
  • The received image processing unit 24 acquires the image from the receiving unit 14 and the information about its contents from the image recognition unit 23. Based on the information about the contents of the image, it extracts from the mask pattern storage unit 18 an optimal mask pattern to apply to the received image (ST32).
  • Since the received image processing unit 24 wants to alleviate the horizontal viewpoint deviation, the striped mask pattern shown in FIG. 4(a) is extracted.
  • The received image processing unit 24 performs mask processing by combining the received image with the striped mask information extracted based on the recognition result (ST42), and transmits only the image of the unmasked portion to the projection unit 19. The projection unit 19 then projects the image with the striped mask onto the display unit 20 (ST52).
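The selection step ST32 amounts to mapping the recognition result to one of the stored patterns. A sketch under assumed recognition keys; the patent does not define the format of the recognition result, so the dictionary keys and pattern names below are hypothetical:

```python
def select_pattern(recognition):
    """Return the name of the stored mask pattern best matching the
    recognized viewpoint deviation; None means no mask is applied."""
    if recognition.get("horizontal_deviation_large"):
        return "stripes"   # FIG. 4(a): relieves lateral viewpoint shift
    if recognition.get("vertical_deviation_large"):
        return "borders"   # FIG. 4(c): relieves height differences
    return None

print(select_pattern({"horizontal_deviation_large": True}))  # stripes
```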
  • According to the second embodiment, an optimal mask pattern can be extracted based on the information about the contents of the image recognized by the image recognition unit, and an image masked with the extracted pattern can be displayed on the display unit. For this reason, the deviation between the perspective in the image and the viewpoint of the viewer's eyes is alleviated by the mask applied to the video, and a higher sense of realism and a stronger feeling of being in the same room can be felt.
  • Embodiment 3. The configuration of the image projection apparatus according to the third embodiment of the present invention will be described with reference to FIGS. 8 and 9.
  • FIG. 8 is a functional configuration diagram of the image projection apparatus according to the third embodiment.
  • FIG. 9 is a flowchart showing a flow of mask processing of the image projection apparatus according to the third embodiment.
  • Hereinafter, differences from the first and second embodiments will mainly be described; descriptions of configurations and effects identical to those embodiments are omitted.
  • The same reference numerals are used for the same configurations as in the first and second embodiments.
  • The configuration necessary for the image projection apparatus 25 according to the third embodiment to receive an image from the counterpart image projection apparatus will be described with reference to FIG. 8. The configuration necessary for transmitting an image to the counterpart apparatus is the same as in the first embodiment, and its description is omitted.
  • The receiving unit 14 is a means for receiving an image transmitted via the network 100 from the communication-destination image projection apparatus. The receiving unit 14 transmits the received image to an image recognition unit 23 and a received image processing unit 26 described later.
  • The receiving unit 14 is realized by, for example, a control program.
  • The image recognition unit 23 is a means for receiving the image transmitted from the receiving unit 14 and recognizing its contents, and is realized by, for example, a program executed by the CPU 4 in FIG. 2.
  • The image recognition unit 23 recognizes the people, furniture arrangement, walls, floors, ceilings, and the like that appear in the image, and sends the recognition results to the received image processing unit 26.
  • Any recognition method that can recognize the contents shown in the image may be used, such as one based on a camera, a sound wave sensor, or an infrared sensor.
  • The received image processing unit 26 is a means for performing image processing on the image received by the receiving unit 14, and is realized by, for example, a program executed by the CPU 4 in FIG. 2. Specifically, the received image processing unit 26 extracts an optimal mask pattern from the mask patterns stored in the mask pattern storage unit 18 based on the recognition result from the image recognition unit 23. If the received image processing unit 26 determines that a sufficient sense of presence or of being in the same room cannot be obtained when the extracted mask pattern is applied to the received image, it sends to a mask pattern correction unit 27 described later the information on the extracted mask pattern, together with the correction information necessary to obtain a sufficient sense of presence. The correction information is, for example, the information necessary for changing the mask interval, mask width, mask shape, and so on of the extracted pattern.
  • When the received image processing unit 26 receives information on the mask pattern corrected by the mask pattern correction unit 27, it performs mask processing on the received image using the corrected mask pattern.
  • The received image processing unit 26 transmits the masked image to the projection unit 19.
  • The mask pattern correction unit 27 receives the information on the extracted mask pattern and the correction information from the received image processing unit 26, corrects the extracted mask pattern based on the correction information, and transmits the corrected mask pattern information back to the received image processing unit 26.
  • The projection unit 19 is a means for projecting the image processed by the received image processing unit 26 onto the display unit 20, and is realized by a program executed by the CPU 4 in FIG. 2.
  • Here, the flow of mask processing in which the received image processing unit extracts the mask pattern of FIG. 4(a), the extracted pattern is corrected, and the corrected mask pattern is applied to the received image will be described.
  • The receiving unit 14 of the image projection apparatus 25 receives an image transmitted from the communication-destination image projection apparatus via the network 100 (ST13), and transmits the received image to the image recognition unit 23 and the received image processing unit 26.
  • The image recognition unit 23 recognizes the contents of the image transmitted from the receiving unit 14 (ST23). Here, it recognizes that, if the image were displayed on the display unit 20 as it is, there would be, for example, a large horizontal deviation of the viewpoint. The image recognition unit 23 transmits this information to the received image processing unit 26.
  • The received image processing unit 26 acquires the image from the receiving unit 14 and the information about its contents from the image recognition unit 23. Based on the information about the contents of the image, it extracts from the mask pattern storage unit 18 an optimal mask pattern to apply to the received image (ST33).
  • Since the received image processing unit 26 wants to alleviate the horizontal viewpoint deviation, the striped mask pattern shown in FIG. 4(a) is extracted. If the received image processing unit 26 determines that the extracted striped mask pattern of FIG. 4(a) does not provide a sufficient sense of presence or of being in the same room (YES in ST43), it transmits the mask pattern information of FIG. 4(a) and the correction information to the mask pattern correction unit 27.
  • The mask pattern correction unit 27 corrects the mask pattern of FIG. 4(a) based on the correction information (ST53), and transmits the corrected mask pattern of FIG. 4(a) to the received image processing unit 26.
  • The received image processing unit 26 combines the received image with the corrected mask pattern information of FIG. 4(a) (ST63), and transmits only the image of the unmasked portion to the projection unit 19. The projection unit 19 then projects the image with the corrected striped mask onto the display unit 20 (ST73).
  • the received image and the mask of FIG. 4A are combined (S63), and only the image of the part not covered with the mask is transmitted to the projection unit 19. Then, the projection unit 19 projects the image with the striped mask transmitted from the reception image processing unit 26 onto the display unit 20 (ST73).
  • As described above, the mask pattern extracted based on the information about the content of the image recognized by the image recognition unit is further corrected so as to give a stronger sense of presence and of sharing the same room, and the resulting image can be displayed on the display unit. The deviation between the perspective in the image and the viewpoint of the viewer's eyes is therefore alleviated by the mask applied to the video, and a higher sense of presence and of sharing the same room can be felt.
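The flow described above (recognize the image content, extract a stored mask pattern, correct it, composite it with the received image, project the result) can be illustrated with a minimal sketch. All function names and data representations below are hypothetical assumptions for illustration, not part of the patent: a mask pattern is modeled as a list of per-column visibility flags and an image as rows of pixel values.

```python
# Illustrative sketch (not from the patent text) of the Embodiment 3 flow:
# extract a mask pattern for the recognized viewpoint deviation, correct it
# if needed, and composite it with the received image.

MASK_PATTERN_STORAGE = {
    # True = visible column, False = masked column (striped pattern, FIG. 4A)
    "stripe": [True, True, False, True, True, False, True, True, False],
}

def extract_mask_pattern(recognition_result):
    """ST33: pick a stored pattern based on the recognized deviation direction."""
    if recognition_result["deviation"] == "horizontal":
        return list(MASK_PATTERN_STORAGE["stripe"])
    raise ValueError("no suitable pattern stored")

def correct_mask_pattern(pattern, correction):
    """ST53: e.g. widen each masked stripe by one column on its left side."""
    if correction == "widen":
        widened = list(pattern)
        for i, visible in enumerate(pattern):
            if not visible and i > 0:
                widened[i - 1] = False
        return widened
    return pattern

def apply_mask(image_rows, pattern):
    """ST63: keep only pixels in unmasked columns (masked pixels become None)."""
    return [[px if pattern[x] else None for x, px in enumerate(row)]
            for row in image_rows]

image = [[1, 2, 3, 4, 5, 6, 7, 8, 9]]          # one row of pixel values
pattern = extract_mask_pattern({"deviation": "horizontal"})
pattern = correct_mask_pattern(pattern, "widen")
masked = apply_mask(image, pattern)
```

The "widen" correction stands in for the correction information of ST43/ST53; a real implementation would operate on pixel-level masks rather than column flags.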
  • Embodiment 4. The configuration of the image projection apparatus according to the fourth embodiment of the present invention will be described with reference to FIGS. 10 and 11. FIG. 10 is a functional configuration diagram of the image projection apparatus according to the fourth embodiment.
  • FIG. 11 is a flowchart showing a flow of mask processing of the image projection apparatus according to the fourth embodiment.
  • In the following, differences from the first to third embodiments will be mainly described, and descriptions of configurations and effects identical to those of the first to third embodiments will be omitted. The same reference numerals are used for the same configurations as in the first to third embodiments.
  • First, the configuration of the image projection apparatus 28 according to the fourth embodiment that is necessary for receiving an image from the counterpart image projection apparatus will be described with reference to FIG. 10. Note that the configuration of the image projection apparatus 28 necessary for transmitting an image to the counterpart image projection apparatus is the same as that of the first embodiment, and its description is therefore omitted.
  • The operation unit 29 is a means by which the user operates the image projection apparatus 28.
  • the user can select whether to perform mask processing on the received image by operating the operation unit 29.
  • the user can select a desired mask pattern from a plurality of mask patterns stored in the mask pattern storage unit 18 by operating the operation unit 29.
  • In addition, by operating the operation unit 29, the user can cause the imaging unit 11 to capture an image to be transmitted.
  • the operation unit 29 is realized by, for example, the remote controller 7 in FIG. 2 or its control program.
  • the receiving unit 14 is a means for receiving an image transmitted via the network 100 from the image projection apparatus that is the communication destination. Further, the reception unit 14 transmits the received image to the image recognition unit 23 and a received image processing unit 30 described later.
  • the receiving unit 14 is realized by, for example, a control program.
  • the received image processing unit 30 is a means for performing image processing on the image received by the receiving unit 14, and is realized, for example, by a program executed by the CPU 4 in FIG.
  • The received image processing unit 30 retrieves from the mask pattern storage unit 18 the mask pattern selected by the user via the operation unit 29. Furthermore, when the received image processing unit 30 determines that applying the mask pattern selected by the user to the received image would not provide a sufficient sense of presence or of sharing the same room, it transmits information on the selected mask pattern, together with the correction information necessary to obtain a sufficient sense of presence and of sharing the same room, to the mask pattern correcting unit 27.
  • When the received image processing unit 30 receives information on the mask pattern corrected by the mask pattern correcting unit 27, it performs mask processing on the received image using the corrected mask pattern, and then transmits the masked image to the projection unit 19.
  • The projection unit 19 is a means for projecting the image processed by the received image processing unit 30 onto the display unit 20, and is realized by a program executed by the CPU 4 in FIG. 2.
  • Next, the mask processing flow of the image projection apparatus according to the fourth embodiment will be described with reference to the flowchart of FIG. 11.
  • Here, the flow of mask processing in which the user selects the mask pattern of FIG. 4A, the selected mask pattern of FIG. 4A is corrected, and the corrected mask pattern is applied to the received image will be described.
  • First, the receiving unit 14 of the image projection apparatus 28 receives an image transmitted from the communication-destination image projection apparatus via the network 100 (ST14). The receiving unit 14 then transmits the received image to both the image recognition unit 23 and the received image processing unit 30.
  • Next, the received image processing unit 30 determines whether the user has selected a predetermined mask pattern via the operation unit 29 from the mask patterns (FIGS. 4A to 4H) stored in the mask pattern storage unit 18 (ST24). Here, since the mask pattern of FIG. 4A, that is, the striped mask pattern, has been selected, the process proceeds to ST34 (YES in ST24). If no mask pattern has been selected by the user (NO in ST24), the process proceeds to ST74-2: the received image processing unit 30 transmits the image without mask processing to the projection unit 19, and the projection unit 19 projects the unmasked image onto the display unit 20 (ST74-2).
  • The image recognition unit 23 recognizes the content of the image transmitted from the receiving unit 14 (ST34). Here, the image recognition unit 23 recognizes, for example, that a large horizontal viewpoint deviation would occur if the image were displayed on the display unit 20 as it is. The image recognition unit 23 then transmits this information to the received image processing unit 30.
  • When the received image processing unit 30 determines that the striped mask pattern of FIG. 4A selected by the user does not provide a sufficient sense of presence or of sharing the same room (YES in ST44), it transmits the mask pattern information of FIG. 4A and the correction information to the mask pattern correcting unit 27.
  • The mask pattern correcting unit 27 corrects the mask pattern of FIG. 4A based on the correction information (ST54), and then transmits the corrected mask pattern of FIG. 4A to the received image processing unit 30.
  • The received image processing unit 30 combines the received image with the mask pattern of FIG. 4A corrected by the mask pattern correcting unit 27 (ST64), and transmits only the image of the unmasked portion to the projection unit 19. The projection unit 19 then projects the image with the corrected striped mask transmitted from the received image processing unit 30 onto the display unit 20 (ST74-1).
  • As described above, the mask pattern selected by the user is further corrected so as to give a stronger sense of presence and of sharing the same room, so that an image with a more suitable mask pattern can be displayed on the display unit. The deviation between the perspective in the image and the viewpoint of the viewer's eyes is therefore alleviated by the mask applied to the video, and a higher sense of presence and of sharing the same room can be felt.
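The ST24/ST44 branching of the FIG. 11 flowchart can be sketched as follows. The helper names and the toy "correction" function are illustrative assumptions, not taken from the patent; the point is the control flow: no user selection means the image is projected unmasked, while a selected pattern may first be corrected before masking.

```python
# Illustrative sketch (hypothetical names) of the Embodiment 4 branch logic.

def apply_mask(image_rows, pattern):
    """Keep only pixels in visible (True) columns; mask the rest to None."""
    return [[px if pattern[x] else None for x, px in enumerate(row)]
            for row in image_rows]

def process_received_image(image_rows, selected_pattern, needs_correction,
                           correct):
    """ST24/ST44 branch of the FIG. 11 flowchart (sketch)."""
    if selected_pattern is None:                  # NO in ST24
        return image_rows                         # ST74-2: project unmasked
    pattern = selected_pattern
    if needs_correction:                          # YES in ST44
        pattern = correct(pattern)                # ST54: corrected pattern
    return apply_mask(image_rows, pattern)        # ST64 -> ST74-1

image = [[10, 20, 30, 40]]
stripe = [True, False, True, False]               # user-selected FIG. 4A pattern

unmasked = process_received_image(image, None, False, None)
masked = process_received_image(image, stripe, True,
                                lambda p: [not v for v in p])  # toy correction
```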
  • Embodiment 5. The configuration of the image projection apparatus according to the fifth embodiment of the present invention will be described with reference to FIGS. 12 and 13.
  • FIG. 12 is a functional configuration diagram of the image projection apparatus according to the fifth embodiment.
  • FIG. 13 is a flowchart showing a flow of mask processing of the image projection apparatus according to the fifth embodiment.
  • In the following, differences from the first to fourth embodiments will be mainly described, and descriptions of configurations and effects identical to those of the first to fourth embodiments will be omitted. The same reference numerals are used for the same configurations as in the first to fourth embodiments.
  • First, the configuration of the image projection apparatus 31 according to the fifth embodiment that is necessary for receiving an image from the counterpart image projection apparatus will be described with reference to FIG. 12. Note that the configuration of the image projection apparatus 31 necessary for transmitting an image to the counterpart image projection apparatus is the same as that of the first embodiment, and its description is therefore omitted.
  • the receiving unit 14 is a means for receiving an image transmitted via the network 100 from the image projection apparatus that is the communication destination. Further, the receiving unit 14 transmits the received image to the image recognition unit 23 and a received image processing unit 32 described later.
  • the receiving unit 14 is realized by, for example, a control program.
  • The received image processing unit 32 is a means for performing image processing on the image received by the receiving unit 14, and is realized by, for example, a program executed by the CPU 4 in FIG. 2. Specifically, when the received image processing unit 32 receives the recognition result transmitted from the image recognition unit 23, it transmits the recognition result to a mask pattern generation unit 33 described later. When the received image processing unit 32 receives information on the mask generated by the mask pattern generation unit 33, it performs mask processing on the received image using the generated mask pattern. The received image processing unit 32 then transmits the masked image to the projection unit 19.
  • The mask pattern generation unit 33 receives the information about the recognition result from the received image processing unit 32 and generates a mask pattern based on that information. For example, if the mask pattern generation unit 33 determines that a sense of presence and of sharing the same room can be obtained by reducing the horizontal viewpoint deviation, it generates a striped mask; if it determines that they can be obtained by reducing the vertical viewpoint deviation, it generates a bordered mask. The mask pattern generation unit 33 then transmits information about the generated mask pattern to the received image processing unit 32.
  • The projection unit 19 is a means for projecting the image processed by the received image processing unit 32 onto the display unit 20, and is realized by a program executed by the CPU 4 in FIG. 2.
  • Next, the mask processing flow of the image projection apparatus 31 according to the fifth embodiment will be described with reference to the flowchart of FIG. 13.
  • First, the receiving unit 14 of the image projection apparatus 31 receives an image transmitted from the communication-destination image projection apparatus via the network 100 (ST15). The receiving unit 14 then transmits the received image to both the image recognition unit 23 and the received image processing unit 32.
  • the image recognition unit 23 recognizes the content of the image transmitted from the reception unit 14 (ST25).
  • Specifically, the image recognition unit 23 recognizes information such as the people, furniture arrangement, walls, floors, and ceilings shown in the image, and transmits the information about these recognition results to the received image processing unit 32. When the received image processing unit 32 receives the recognition result transmitted from the image recognition unit 23, it transmits the recognition result to the mask pattern generation unit 33.
  • When the mask pattern generation unit 33 receives the information about the recognition result, it generates a mask pattern based on the recognition result (ST35). For example, if it determines that a sense of presence and of sharing the same room can be obtained by reducing the horizontal viewpoint deviation, it generates a striped mask; if it determines that they can be obtained by reducing the vertical viewpoint deviation, it generates a bordered mask. The mask pattern generation unit 33 then transmits information about the generated mask to the received image processing unit 32.
  • The received image processing unit 32 combines the received image with the mask pattern generated by the mask pattern generation unit 33 (ST45), and transmits only the image of the unmasked portion to the projection unit 19.
  • the projecting unit 19 projects the masked image transmitted from the received image processing unit 32 on the display unit 20 (ST55).
  • As described above, compared with the case where mask patterns are stored in the storage unit in advance and the optimum one is selected from among them, it is possible to display on the display unit an image processed with a more optimal mask pattern. The deviation between the perspective in the image and the viewpoint of the viewer's eyes is therefore alleviated by the mask applied to the video, and a higher sense of presence and of sharing the same room can be felt.
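The generation step of ST35 can be sketched as below. The names, the deviation-axis representation, and the stripe geometry are all illustrative assumptions, not from the patent: a striped mask is produced when the horizontal viewpoint deviation dominates, and a bordered mask when the vertical deviation dominates, as a 2D grid of visibility flags.

```python
# Illustrative sketch (hypothetical names) of the Embodiment 5 mask pattern
# generation unit: generate, rather than select, a mask for the recognized
# dominant viewpoint deviation (ST35).

def generate_mask_pattern(width, height, recognition_result,
                          period=3, gap=1):
    """Return a 2D boolean mask; True = visible pixel, False = masked."""
    axis = recognition_result["dominant_deviation"]
    if axis == "horizontal":   # striped mask: mask `gap` of every `period` columns
        return [[(x % period) >= gap for x in range(width)]
                for _ in range(height)]
    if axis == "vertical":     # bordered mask: mask `gap` of every `period` rows
        return [[(y % period) >= gap for _ in range(width)]
                for y in range(height)]
    raise ValueError("unknown deviation axis")

stripes = generate_mask_pattern(6, 2, {"dominant_deviation": "horizontal"})
borders = generate_mask_pattern(2, 6, {"dominant_deviation": "vertical"})
```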


Abstract

The purpose of the present invention is to promote smooth communication, in a case where an unspecified large number of people are participating in dynamic communication, by providing a strong sense of presence and of being in the same room, as if the rooms were connected to each other. This image projection device 2 is used in an information communication system that is capable of performing communication with a remote location using images. The image projection device 2 comprises: a received image processing unit 17 that performs mask processing on a portion of an image that has been input; and a projection unit 19 that projects the image that was mask-processed by the received image processing unit 17.

Description

Image projection device
The present invention relates to an image projection apparatus used for communication between remote locations.
A conference system that holds a conference between remote locations through communication such as the Internet includes an imaging device and a display device; it transmits images of conference participants captured by the imaging device to the communication destination, and receives and displays on the display device images of conference participants captured at the communication destination. In such a conference system, in the case of one-to-one communication, the viewpoint position and the shooting position can be made to coincide by making the position of the camera in the other space follow the displacement of the viewer's viewpoint position. In the many-to-many case, however, there are many viewpoint positions, and it has been difficult to enhance the sense of presence by uniquely processing the video or determining the shooting position.
As a means of solving this problem, there is a technology that uses as many cameras as there are conference participants so that the lines of sight of the speaker and the listener meet (Patent Document 1).
JP-A-5-22722
However, this means is intended for a static meeting place where conference participants sit facing the camera around a table, and does not consider situations in which participants move. For this reason, the sense of incongruity caused by the shift in depth perception seen from multiple viewpoints has been a major obstacle to creating a sense of presence.
The present invention has been made to solve such problems, and its object is to provide, in a place where an unspecified large number of people communicate dynamically, a strong sense of presence and of sharing the same room, as if the rooms were connected to each other, and thereby to promote smooth communication.
An image projection apparatus according to the present invention is an image projection apparatus used in an information communication system capable of mutual communication with a remote location by means of images, and includes a received image processing unit that performs mask processing on a part of an input image, and a projection unit that projects the image mask-processed by the received image processing unit.
Since the image projection apparatus according to the present invention performs mask processing on a part of an image input from the outside and projects the mask-processed image, the deviation between the perspective in the image and the viewpoint of the viewer's eyes is alleviated by the mask applied to the video, making it possible to feel a higher sense of presence and of sharing the same room.
FIG. 1 is a diagram showing the configuration of a remote communication system using the image projection apparatus according to the first embodiment.
FIG. 2 is a hardware configuration diagram of the image projection apparatus according to the present invention.
FIG. 3 is a functional configuration diagram of the image projection apparatus according to the first embodiment.
FIG. 4 is a diagram showing examples of the mask patterns stored in the pattern storage unit according to the first embodiment.
FIG. 5 is a flowchart showing the flow of mask processing of the image projection apparatus according to the first embodiment.
FIG. 6 is a functional configuration diagram of the image projection apparatus according to the second embodiment.
FIG. 7 is a flowchart showing the flow of mask processing of the image projection apparatus according to the second embodiment.
FIG. 8 is a functional configuration diagram of the image projection apparatus according to the third embodiment.
FIG. 9 is a flowchart showing the flow of mask processing of the image projection apparatus according to the third embodiment.
FIG. 10 is a functional configuration diagram of the image projection apparatus according to the fourth embodiment.
FIG. 11 is a flowchart showing the flow of mask processing of the image projection apparatus according to the fourth embodiment.
FIG. 12 is a functional configuration diagram of the image projection apparatus according to the fifth embodiment.
FIG. 13 is a flowchart showing the flow of mask processing of the image projection apparatus according to the fifth embodiment.
Embodiment 1
The configuration of the image projection apparatus according to the first embodiment of the present invention will be described with reference to FIGS. 1 to 5. FIG. 1 is a diagram showing the configuration of a communication system using the image projection apparatus according to the first embodiment. FIG. 2 is a hardware configuration diagram of the image projection apparatus according to the present invention. FIG. 3 is a functional configuration diagram of the image projection apparatus according to the first embodiment. FIG. 4 is a diagram showing examples of the mask patterns stored in the pattern storage unit according to the first embodiment. FIG. 5 is a flowchart showing the flow of mask processing of the image projection apparatus according to the first embodiment.
As shown in FIG. 1, a remote communication system 1 according to the present invention includes a plurality of image projection apparatuses 2a to 2b connected to a network 100 such as the Internet, and a server apparatus 3. In the following description, "image projection apparatus 2" is used to denote an arbitrary one of the plurality of image projection apparatuses 2a to 2b.
The image projection apparatus 2 is a terminal device compatible with the remote communication system 1 and is an example of an information processing apparatus. The image projection apparatus 2 may be, for example, a general-purpose information processing apparatus such as a personal computer (hereinafter, "PC"), a tablet terminal, or a smartphone, or may be a dedicated terminal for the remote communication system 1. When the image projection apparatus 2 is a general-purpose information processing apparatus, each function of the image projection apparatus 2 is realized by, for example, an application program installed in the information processing apparatus.
The server apparatus 3 performs, for example, monitoring of the connection state with the image projection apparatuses 2a to 2b, connection control at the start and end of remote communication, and control of the transmission and reception of data such as images (video) and audio during remote communication.
FIG. 2 is a hardware configuration diagram of the image projection apparatus according to the present invention. The image projection apparatus 2 according to the present invention has a general computer configuration and includes, for example, a CPU (Central Processing Unit) 4, a memory 5, a recording device 6, a remote controller 7, a camera 8, a microphone 9, and a speaker 10, which are connected by a bus or the like.
The CPU 4 is an arithmetic device that implements each function of the image projection apparatus 2 by, for example, reading programs and data from the recording device 6 and executing processing. By executing a program for the image projection apparatus 2, the CPU 4 realizes functions such as overall control of the image projection apparatus 2, control of remote communication, image processing of transmitted and received images, and audio processing of transmitted and received audio.
The memory 5 includes storage devices such as a RAM (Random Access Memory) and a ROM (Read Only Memory). The RAM is a volatile memory used as a work area for the CPU 4. The ROM is, for example, a non-volatile memory that stores the startup program of the image projection apparatus 2, setting values, and the like.
The recording device 6 is a storage device that records, for example, the programs for device control and television communication control executed by the CPU 4, as well as data, and is composed of, for example, an HDD (Hard Disk Device), an SSD (Solid State Drive), a flash ROM, or the like.
The remote controller 7 is a means, such as operation buttons or a keyboard, by which the image projection apparatus 2 accepts user operations.
The camera 8 is, for example, an imaging means that captures images of telecommunication participants and the like. The camera 8 converts a captured image of a meeting or the like into predetermined image data and transmits it to the CPU 4 or the like via a bus or the like. At this time, the camera 8 can, for example, change the brightness of the captured image based on set camera parameters. The camera 8 has, for example, default camera parameters, and the values of the camera parameters can be changed dynamically by a program or the like running on the CPU 4.
The microphone 9 acquires, for example, the voices of telecommunication participants and the like, converts them into predetermined audio data, and transmits the data to the CPU 4 or the like via the bus. The microphone 9 includes, for example, a plurality of directional microphones.
The speaker 10, for example, converts predetermined audio data received from the CPU 4 or the like into sound and outputs it.
Note that, in FIG. 2, some of the remote controller 7, the camera 8, the microphone 9, the speaker 10, and the like may be external devices connected to the outside of the main body of the image projection apparatus 2. Alternatively, a display device (display unit) connected to the image projection apparatus 2 may incorporate the remote controller 7, the camera 8, the microphone 9, the speaker 10, and the like.
FIG. 3 is a functional configuration diagram of the image projection apparatus according to the first embodiment. As shown in FIG. 3, the image projection apparatus 2 includes an imaging unit 11, a transmission image processing unit 12, a communication unit 15 composed of a transmission unit 13 and a reception unit 14, an operation unit 16, a received image processing unit 17, a mask pattern storage unit 18, and a projection unit 19.
First, the configuration of the image projection apparatus 2 that is necessary for transmitting an image to the counterpart image projection apparatus will be described.
The imaging unit 11 is a means for capturing images such as the telecommunication participants, the furniture, and other conditions of the room to be shown to the communication partner, and is realized by, for example, the camera 8 in FIG. 2 and its control program.
The transmission image processing unit 12 performs image processing such as brightness adjustment on the image captured by the imaging unit 11. The transmission image processing unit 12 is realized by, for example, a program running on the CPU 4 in FIG. 2. Alternatively, the transmission image processing unit 12 may be an image processing unit such as a camera DSP (Digital Signal Processor) included in the camera 8 of FIG. 2. Note that the image processing performed by the transmission image processing unit 12 includes processing that substantially changes the brightness of the image, such as luminance adjustment, contrast adjustment, and gamma adjustment.
The transmission unit 13 transmits the image (image data) processed by the transmission image processing unit 12 to the communication-destination image projection apparatus 2 via the server apparatus 3. The transmission unit 13 can also transmit information other than image data, for example, communication control information and usage scene information. The transmission unit 13 is realized by, for example, a control program.
Next, the configuration of the image projection apparatus 2 that is necessary for receiving an image from the counterpart image projection apparatus 2b will be described.
The receiving unit 14 is a means for receiving an image transmitted from the communication-destination image projection apparatus 2 via the network 100. The receiving unit 14 transmits the received image to the received image processing unit 17. The receiving unit 14 is realized by, for example, a control program.
The operation unit 16 is a means by which the user operates the image projection apparatus 2. By operating the operation unit 16, the user can select whether to perform mask processing on the received image. When performing mask processing, the user can select a desired mask pattern from the plurality of mask patterns stored in a mask pattern storage unit 18 described later. In addition, by operating the operation unit 16, the user can cause the imaging unit 11 to capture an image to be transmitted. The operation unit 16 is realized by, for example, the remote controller 7 in FIG. 2 and its control program.
The received image processing unit 17 is a means for performing image processing on the image received by the receiving unit 14, and is realized by, for example, a program executed by the CPU 4 in FIG. 2. The received image processing unit 17 performs mask processing by applying the mask pattern selected by the user via the operation unit 16 to the received image, and then transmits the masked image to the projection unit 19. Note that mask processing is processing that displays or hides only specific portions of an image.
The projection unit 19 is a means for projecting the image processed by the received image processing unit 17 onto the display unit 20, and is realized by a program executed by the CPU 4 in FIG. 2. The projection unit 19 includes, for example, an imaging element (not shown) such as a CCD (Charge Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor, and an imaging lens for forming an image of the light emitted from a subject on the imaging element. The imaging lens constituting the projection unit 19 is disposed so as to face the display unit 20.
 表示部20は、例えば、液晶ディスプレイ、プロジェクタ等、方法を問わない。 The display unit 20 may be a method such as a liquid crystal display or a projector.
FIG. 4 is a diagram illustrating examples of the mask patterns stored in the pattern storage unit.
FIG. 4(a) is an example in which vertical strip-shaped masks, that is, stripe masks, are applied to the received image. Striping the image mitigates the horizontal viewpoint shift, so an effect can be expected when a viewer at one end of the display unit 20 looks at a person shown at the opposite end, or views the image while passing in front of the display unit 20. The mask width is desirably narrow enough that a person is not completely hidden, for example 300 mm or less.
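As an illustration only (not part of the patent disclosure), a vertical stripe mask like that of FIG. 4(a) can be sketched in a few lines of array code; the stripe and gap widths in pixels are hypothetical parameters:

```python
import numpy as np

def stripe_mask(height, width, stripe_px=40, gap_px=120):
    """Boolean mask: True where the image remains visible.

    Vertical opaque stripes of stripe_px pixels are drawn every
    (stripe_px + gap_px) pixels, leaving gap_px-wide visible bands,
    as in the FIG. 4(a) pattern. All widths are illustrative.
    """
    mask = np.ones((height, width), dtype=bool)
    period = stripe_px + gap_px
    for x in range(0, width, period):
        mask[:, x:x + stripe_px] = False  # hidden (masked) columns
    return mask

def apply_mask(image, mask):
    """Black out masked pixels of an H x W x 3 image array."""
    out = image.copy()
    out[~mask] = 0
    return out
```

In a real device the masked regions would simply not be projected; blacking them out is a stand-in for that behavior.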
FIG. 4(b), like FIG. 4(a), applies vertical strip-shaped masks, that is, stripe masks, but additionally widens the mask portions, or narrows the intervals between them, toward the left and right ends of the display area. By reducing the image display area at the left and right edges of the screen, where the horizontal viewpoint shift is most pronounced, the sense of incongruity can be further reduced.
FIG. 4(c) is an example in which horizontal strip-shaped masks, that is, border masks, are applied to the received image. Bordering the image mitigates the vertical viewpoint shift, making it possible to reduce the differences in appearance caused by differences in the users' heights.
FIG. 4(d), like FIG. 4(c), applies horizontal strip-shaped masks, that is, border masks, but additionally widens the mask portions, or narrows the intervals between them, toward the top and bottom ends of the display area. By reducing the image display area at the top and bottom edges of the screen, where the vertical viewpoint shift is pronounced, the sense of incongruity is further reduced.
FIG. 4(e) is an example in which a grid mask is applied to the received image. This combines FIGS. 4(a) and 4(c), and provides the effects of both.
FIG. 4(f), like FIG. 4(e), applies a grid mask to the received image, but additionally widens the mask portions, or narrows the intervals between them, toward the left, right, top, and bottom ends of the display area. This combines FIGS. 4(b) and 4(d), and provides the effects of both.
FIG. 4(g), like FIG. 4(a), applies vertical stripe masks, but further applies the mask so that the image display area gradually narrows toward the center of the received image. This exploits the characteristic that narrower bands appear farther away while wider bands appear closer: applied this way, the top and bottom edges of the image appear to be in the foreground and the center of the screen in the background, which has the effect of making the positional relationships within the image clearer.
FIG. 4(h), like FIG. 4(a), applies vertical stripe masks, but further applies the mask so that the image display area gradually widens toward the center of the received image. Applying the mask in this way reduces the image display area at the screen edges, where the viewpoint shift is large, and so eliminates the sense of incongruity.
By partially hiding the displayed image as in FIGS. 4(a) to 4(h), the sense of incongruity can be reduced and the depth emphasized, giving the user a stronger sense of presence and of sharing the same room.
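Of the patterns above, the graded variants can also be sketched. The following illustrative code (not from the patent; the stripe count and widths are assumptions) builds a FIG. 4(b)-style mask whose stripes widen with distance from the screen center:

```python
import numpy as np

def graded_stripe_mask(height, width, n_stripes=8, base_px=20, extra_px=40):
    """Boolean mask with vertical stripes that widen toward the
    left and right edges, as in the FIG. 4(b) pattern.

    Stripe centers are evenly spaced across the width; each stripe's
    width grows linearly with its distance from the screen center.
    All pixel values are illustrative.
    """
    mask = np.ones((height, width), dtype=bool)
    centers = np.linspace(0, width, n_stripes + 2)[1:-1]  # skip both edges
    for c in centers:
        # 0.0 at the screen center, 1.0 at the far left/right edge
        dist = abs(c - width / 2) / (width / 2)
        half = int(base_px + extra_px * dist) // 2
        lo, hi = max(0, int(c) - half), min(width, int(c) + half)
        mask[:, lo:hi] = False
    return mask
```

The FIG. 4(d) border variant would be the transpose of the same idea, and FIG. 4(f) the elementwise AND of both.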
FIG. 5 is a flowchart showing the flow of the mask processing of the image projection device 2 according to the first embodiment. Here, the flow of the mask processing when the user has selected the mask pattern of FIG. 4(a) is described.
When telecommunication is started between a plurality of image projection devices 2, the receiving unit 14 of the image projection device 2 receives an image transmitted via the network 100 from the image projection device at the communication destination (S11). The receiving unit 14 then transmits the received image to the received image processing unit 17.
Next, the received image processing unit 17 determines whether the user has selected, via the operation unit 16, one of the mask patterns (FIGS. 4(a) to 4(h)) stored in the mask pattern storage unit 18 (ST21). Here, the mask pattern of FIG. 4(a), that is, the stripe mask pattern, has been selected, so the process proceeds to ST31, where the mask processing is performed (YES in ST21). If no mask pattern has been selected by the user, the received image processing unit 17 transmits the image to the projection unit 19 without applying mask processing (NO in ST21).
The received image processing unit 17 combines the stripe mask information selected by the user with the received image (ST31), and transmits only the unmasked portions of the image to the projection unit 19.
The projection unit 19 projects the stripe-masked image transmitted from the received image processing unit 17 onto the display unit 20 (ST41-1). If no mask pattern has been selected by the user, the projection unit 19 projects the image onto the display unit 20 without mask processing (ST41-2).
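The FIG. 5 flow can be sketched as follows. This is a hypothetical model, not the patented implementation: the receiving, processing, and projection steps are reduced to plain functions, and `MASK_PATTERNS` stands in for the mask pattern storage unit 18.

```python
import numpy as np

MASK_PATTERNS = {
    # FIG. 4(a)-style stripes: columns 0-39 of every 160 are hidden
    "stripe": lambda h, w: np.arange(w) % 160 >= 40,
}

def process_received_image(image, selected_pattern=None):
    """ST21/ST31: apply the user-selected mask, or pass through (NO path)."""
    if selected_pattern is None:                    # ST21: NO
        return image
    visible_cols = MASK_PATTERNS[selected_pattern](*image.shape[:2])
    out = image.copy()                              # ST31: combine mask
    out[:, ~visible_cols] = 0                       # hide masked columns
    return out

def project(image):
    """ST41: stand-in for projection onto the display unit 20."""
    return image  # a real projection unit would render this
```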
In the first embodiment, by applying the mask pattern selected by the user to the received image, the shift between the perspective within the image and the viewpoint given by the viewer's eyes is mitigated by the mask applied to the image, making it possible to feel a stronger sense of presence and of sharing the same room.
Embodiment 2
The configuration of an image projection device according to Embodiment 2 of the present invention will be described with reference to FIGS. 6 and 7. FIG. 6 is a functional configuration diagram of the image projection device according to the second embodiment. FIG. 7 is a flowchart showing the flow of the mask processing of the image projection device according to the second embodiment. The second embodiment is described with a focus on its differences from the first embodiment, and configurations and effects common to the first embodiment are omitted. The same reference numerals are used for configurations identical to those of the first embodiment.
With reference to FIG. 6, the configuration of the image projection device 21 according to the second embodiment that is necessary for receiving an image from the counterpart image projection device is described. The configuration of the image projection device 21 necessary for transmitting an image to the counterpart image projection device is the same as in the first embodiment, and its description is therefore omitted.
The receiving unit 14 is a means for receiving an image transmitted via the network 100 from the image projection device at the communication destination. The receiving unit 14 further transmits the received image to an image recognition unit 23 and a received image processing unit 24, described later. The receiving unit 14 is realized by, for example, a control program.
The operation unit 22 is a means operated when the user wishes to control the image projection device 21. By operating the operation unit 22, the user can select whether or not to apply mask processing to the received image. The user can also operate the operation unit 22 to capture, with the imaging unit 11, the image to be transmitted. The operation unit 22 is realized by, for example, the remote controller 7 of FIG. 2 and its control program.
The image recognition unit 23 is a means for receiving the image transmitted from the receiving unit 14 and recognizing its content, and is realized by, for example, a program executed by the CPU 4 of FIG. 2. The image recognition unit 23 recognizes the people, furniture arrangement, walls, floor, ceiling, and the like appearing in the image, and sends the recognition result to the received image processing unit 24. Any recognition method capable of recognizing the content of the image may be used, such as a camera, an acoustic sensor, or an infrared sensor.
The received image processing unit 24 is a means for performing image processing on the image received by the receiving unit 14, and is realized by, for example, a program executed by the CPU 4 of FIG. 2. Specifically, based on the recognition result from the image recognition unit 23, the received image processing unit 24 extracts the optimal mask pattern from among the mask patterns stored in the mask pattern storage unit 18 and applies mask processing to the received image. The received image processing unit 24 then transmits the masked image to the projection unit 19.
The projection unit 19 is a means for projecting the image masked by the received image processing unit 24 onto the display unit 20, and is realized by a program executed by the CPU 201 of FIG. 2.
FIG. 7 is a flowchart showing the flow of the mask processing of the image projection device 21 according to the second embodiment. Here, the flow of the mask processing when the received image processing unit 24 selects the mask pattern of FIG. 4(a) based on the recognition result of the image recognition unit is described.
When telecommunication is started between a plurality of image projection devices 21, the receiving unit 14 of the image projection device 21 receives an image transmitted via the network 100 from the image projection device at the communication destination (ST12). The receiving unit 14 then transmits the received image to the image recognition unit 23 and the received image processing unit 24.
The image recognition unit 23 recognizes the content of the image transmitted from the receiving unit 14 (ST22). Here it recognizes, for example, that displaying the image on the display unit 20 as-is would produce a large horizontal viewpoint shift. The image recognition unit 23 then transmits this information to the received image processing unit 24.
Next, the received image processing unit 24 acquires the information on the image from the receiving unit 14 and the information on the image content recognized by the image recognition unit 23. Based on the information on the image content, it extracts from the mask pattern storage unit 18 the optimal mask pattern to apply to the received image (ST32). Here, since the received image processing unit 24 seeks to mitigate the horizontal viewpoint shift, it extracts the stripe mask pattern of FIG. 4(a).
The received image processing unit 24 performs the mask processing by combining the received image with the stripe mask information extracted based on the recognition result of the image recognition unit (ST42), and transmits only the unmasked portions of the image to the projection unit 19. The projection unit 19 then projects the stripe-masked image transmitted from the received image processing unit 24 onto the display unit 20 (ST52).
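The selection step (ST22/ST32) can be sketched as a small decision function. This is an illustrative assumption, not the patent's method: the recognition result is reduced to two shift magnitudes, and the threshold and pattern keys are hypothetical.

```python
def select_mask_pattern(horizontal_shift, vertical_shift, threshold=0.2):
    """Map recognized viewpoint-shift magnitudes to a FIG. 4 pattern key."""
    h = horizontal_shift > threshold
    v = vertical_shift > threshold
    if h and v:
        return "grid"     # FIG. 4(e): both directions need mitigation
    if h:
        return "stripe"   # FIG. 4(a): horizontal shift -> vertical stripes
    if v:
        return "border"   # FIG. 4(c): vertical shift -> horizontal stripes
    return None           # no mask needed
```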
In the second embodiment, the optimal mask pattern can be extracted based on the information on the image content recognized by the image recognition unit, and the image masked with the extracted pattern can be displayed on the display unit. The shift between the perspective within the image and the viewpoint given by the viewer's eyes is therefore mitigated by the mask applied to the video, making it possible to feel a stronger sense of presence and of sharing the same room.
Embodiment 3
The configuration of an image projection device according to Embodiment 3 of the present invention will be described with reference to FIGS. 8 and 9. FIG. 8 is a functional configuration diagram of the image projection device according to the third embodiment. FIG. 9 is a flowchart showing the flow of the mask processing of the image projection device according to the third embodiment. The third embodiment is described with a focus on its differences from the first and second embodiments, and configurations and effects common to those embodiments are omitted. The same reference numerals are used for configurations identical to those of the first and second embodiments.
With reference to FIG. 8, the configuration of the image projection device 25 according to the third embodiment that is necessary for receiving an image from the counterpart image projection device is described. The configuration of the image projection device 25 necessary for transmitting an image to the counterpart image projection device is the same as in the first embodiment, and its description is therefore omitted.
The receiving unit 14 is a means for receiving an image transmitted via the network 100 from the image projection device at the communication destination. The receiving unit 14 further transmits the received image to the image recognition unit 23 and a received image processing unit 26, described later. The receiving unit 14 is realized by, for example, a control program.
The image recognition unit 23 is a means for receiving the image transmitted from the receiving unit 14 and recognizing its content, and is realized by, for example, a program executed by the CPU 4 of FIG. 2. The image recognition unit 23 recognizes the people, furniture arrangement, walls, floor, ceiling, and the like appearing in the image, and sends the recognition result to the received image processing unit 26. Any recognition method capable of recognizing the content of the image may be used, such as a camera, an acoustic sensor, or an infrared sensor.
The received image processing unit 26 is a means for performing image processing on the image received by the receiving unit 14, and is realized by, for example, a program executed by the CPU 4 of FIG. 2. Specifically, based on the recognition result from the image recognition unit 23, the received image processing unit 26 extracts the optimal mask pattern from among the mask patterns stored in the mask pattern storage unit 18. When the received image processing unit 26 determines that applying the extracted mask pattern to the received image would not yield a sufficient sense of presence or of sharing the same room, it transmits information on the extracted mask pattern, together with the correction information needed to obtain such a sense, to a mask pattern correction unit 27 described later. The correction information is, for example, the information needed to change the mask interval, mask width, mask shape, and the like of the extracted mask pattern. Upon receiving the information on the mask pattern corrected by the mask pattern correction unit 27, the received image processing unit 26 applies mask processing to the received image using the corrected mask pattern, and transmits the masked image to the projection unit 19.
The mask pattern correction unit 27 receives the information on the extracted mask pattern and the correction information from the received image processing unit 26, corrects the extracted mask pattern based on the correction information, and transmits the corrected mask pattern information back to the received image processing unit 26.
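The role of the mask pattern correction unit 27 can be sketched as below. This is a minimal model under stated assumptions: a mask pattern is represented as a small parameter record and the correction information as deltas to its width and interval; all field names are illustrative, not from the patent.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class MaskPattern:
    shape: str        # e.g. "stripe", "border", "grid"
    width_px: int     # width of each mask band
    interval_px: int  # gap between mask bands

def correct_pattern(pattern, correction):
    """Apply correction info (a dict of field deltas) and return
    a new, corrected pattern; the original is left unchanged."""
    return replace(
        pattern,
        width_px=max(1, pattern.width_px + correction.get("width_px", 0)),
        interval_px=max(1, pattern.interval_px + correction.get("interval_px", 0)),
    )
```

Returning a new record rather than mutating in place mirrors the flow in which the corrected pattern is sent back to the received image processing unit 26 while the stored pattern remains intact.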
The projection unit 19 is a means for projecting the image processed by the received image processing unit 26 onto the display unit 20, and is realized by a program executed by the CPU 201 of FIG. 2.
FIG. 9 is a flowchart showing the flow of the mask processing of the image projection device according to the third embodiment. Here, the flow is described in which, based on the recognition result of the image recognition unit, the received image processing unit extracts the mask pattern of FIG. 4(a), corrects the extracted pattern, and applies the corrected mask pattern to the received image.
When telecommunication is started between a plurality of image projection devices 25, the receiving unit 14 of the image projection device 25 receives an image transmitted via the network 100 from the image projection device at the communication destination (ST13). The receiving unit 14 then transmits the received image to the image recognition unit 23 and the received image processing unit 26.
The image recognition unit 23 recognizes the content of the image transmitted from the receiving unit 14 (ST23). Here it recognizes, for example, that displaying the image on the display unit 20 as-is would produce a large horizontal viewpoint shift. The image recognition unit 23 then transmits this information to the received image processing unit 26.
Next, the received image processing unit 26 acquires the information on the image from the receiving unit 14 and the information on the image content recognized by the image recognition unit 23. Based on the information on the image content, it extracts from the mask pattern storage unit 18 the optimal mask pattern to apply to the received image (ST33). Here, since the received image processing unit 26 seeks to mitigate the horizontal viewpoint shift, it extracts the stripe mask pattern of FIG. 4(a). When the received image processing unit 26 determines that the extracted stripe mask pattern of FIG. 4(a) would not yield a sufficient sense of presence or of sharing the same room (YES in ST43), it transmits the mask pattern information of FIG. 4(a) and the correction information to the mask pattern correction unit 27. The mask pattern correction unit 27 corrects the mask pattern of FIG. 4(a) based on the correction information (ST53), and transmits the corrected mask pattern to the received image processing unit 26.
The received image processing unit 26 combines the received image with the mask pattern information of FIG. 4(a) corrected by the mask pattern correction unit 27 (S63), and transmits only the unmasked portions of the image to the projection unit 19. The projection unit 19 then projects the corrected stripe-masked image transmitted from the received image processing unit 26 onto the display unit 20 (S73).
On the other hand, when the determination in ST43 is NO, that is, when the extracted mask pattern of FIG. 4(a) is judged to yield a sufficient sense of presence and of sharing the same room, the received image and the mask pattern of FIG. 4(a) are combined (S63), and only the unmasked portions of the image are transmitted to the projection unit 19. The projection unit 19 then projects the stripe-masked image transmitted from the received image processing unit 26 onto the display unit 20 (ST73).
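The ST43 branch can be sketched as a self-contained decision step. This is illustrative only: the pattern representation (a plain dict of parameters), the adequacy flag, and the correction values are assumptions, not the patent's criteria.

```python
def choose_pattern(extracted, adequate, correction):
    """ST43/ST53: return the mask pattern actually combined with the
    received image (S63). When the extracted pattern is judged adequate
    (ST43: NO), it is used as-is; otherwise (ST43: YES) the correction
    deltas are applied (ST53)."""
    if adequate:                       # ST43: NO -> use as extracted
        return dict(extracted)
    corrected = dict(extracted)        # ST43: YES -> correct at ST53
    for key, delta in correction.items():
        corrected[key] = corrected.get(key, 0) + delta
    return corrected
```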
In the third embodiment, the mask pattern extracted based on the information on the image content recognized by the image recognition unit is further corrected so as to yield a greater sense of presence and of sharing the same room, making it possible to display on the display unit an image to which a more optimal mask pattern has been applied. The shift between the perspective within the image and the viewpoint given by the viewer's eyes is therefore mitigated by the mask applied to the video, making it possible to feel a stronger sense of presence and of sharing the same room.
Embodiment 4
The configuration of an image projection device according to Embodiment 4 of the present invention will be described with reference to FIGS. 10 and 11. FIG. 10 is a functional configuration diagram of the image projection device according to the fourth embodiment. FIG. 11 is a flowchart showing the flow of the mask processing of the image projection device according to the fourth embodiment. The fourth embodiment is described with a focus on its differences from the first to third embodiments, and configurations and effects common to those embodiments are omitted. The same reference numerals are used for configurations identical to those of the first to third embodiments.
With reference to FIG. 10, the configuration of the image projection device 28 according to the fourth embodiment that is necessary for receiving an image from the counterpart image projection device is described. The configuration of the image projection device 28 necessary for transmitting an image to the counterpart image projection device is the same as in the first embodiment, and its description is therefore omitted.
The operation unit 29 is a means operated when the user wishes to control the image projection device 28. By operating the operation unit 29, the user can select whether or not to apply mask processing to the received image. When mask processing is to be performed, the user can select a desired mask pattern from among the plurality of mask patterns stored in the mask pattern storage unit 18 by operating the operation unit 29. The user can also operate the operation unit 29 to capture, with the imaging unit 11, the image to be transmitted. The operation unit 29 is realized by, for example, the remote controller 7 of FIG. 2 and its control program.
The receiving unit 14 is a means for receiving an image transmitted via the network 100 from the image projection device at the communication destination. The receiving unit 14 further transmits the received image to the image recognition unit 23 and a received image processing unit 30, described later. The receiving unit 14 is realized by, for example, a control program.
The received image processing unit 30 is a means for performing image processing on the image received by the receiving unit 14, and is realized by, for example, a program executed by the CPU 4 of FIG. 2. The received image processing unit 30 selects from the mask pattern storage unit 18 the mask pattern chosen by the user via the operation unit 29. Further, when the received image processing unit 30 determines that applying the mask pattern selected by the user to the received image would not yield a sufficient sense of presence or of sharing the same room, it transmits information on the selected mask pattern, together with the correction information needed to obtain such a sense, to the mask pattern correction unit 27. Upon receiving the information on the mask pattern corrected by the mask pattern correction unit 27, the received image processing unit 30 applies mask processing to the received image using the corrected mask pattern, and transmits the masked image to the projection unit 19.
The projection unit 19 is a means for projecting the image processed by the received image processing unit 30 onto the display unit 20, and is realized by a program executed by the CPU 201 of FIG. 2.
 図11は、実施の形態4に係る画像投影装置のマスク処理の流れを示すフローチャートである。ここでは、利用者が図4(a)のマスクパターンを選択し、その選択した図4(a)のマスクパターンを修正し、その修正したマスクパターンを受信画像に適用するマスク処理の流れを説明する。 FIG. 11 is a flowchart showing a mask processing flow of the image projection apparatus according to the fourth embodiment. Here, the flow of mask processing in which the user selects the mask pattern of FIG. 4A, corrects the selected mask pattern of FIG. 4A, and applies the corrected mask pattern to the received image will be described. To do.
 複数の画像投影装置28間でテレコミュニケーションが開始されると、画像投影装置28の受信部14は、通信先の画像投影装置からネットワーク100を介して送信された画像を受信する(ST14)。そして、受信部14は、受信した画像を画像認識部23、受信画像処理部30にそれぞれ送信する。 When telecommunications are started between the plurality of image projection devices 28, the receiving unit 14 of the image projection device 28 receives an image transmitted from the image projection device of the communication destination via the network 100 (ST14). Then, the receiving unit 14 transmits the received image to the image recognition unit 23 and the received image processing unit 30, respectively.
 Subsequently, the received image processing unit 30 determines whether the user has selected, via the operation unit 29, one of the mask patterns (FIGS. 4(a) to 4(h)) stored in the mask pattern storage unit 18 (ST24). Here, since the mask pattern of FIG. 4(a), i.e., the striped mask pattern, has been selected, the process proceeds to ST34 (YES in ST24). If no mask pattern has been selected by the user (NO in ST24), the received image processing unit 30 transmits the image to the projection unit 19 without mask processing, the process proceeds to ST74-2, and the projection unit 19 projects the unmasked image onto the display unit 20 (ST74-2).
 The image recognition unit 23 recognizes the content of the image transmitted from the receiving unit 14 (ST34). Here, it recognizes information such as that displaying the recognized image on the display unit 20 as it is would result in a large horizontal viewpoint deviation. The image recognition unit 23 then transmits this information to the received image processing unit 30.
 When the received image processing unit 30 determines that the striped mask pattern of FIG. 4(a) selected by the user cannot provide a sufficient sense of presence or of sharing the same room (YES in ST44), it transmits the mask pattern information of FIG. 4(a) and the correction information to the mask pattern correction unit 27. The mask pattern correction unit 27 corrects the mask pattern of FIG. 4(a) based on the correction information (ST54), and transmits the corrected mask pattern of FIG. 4(a) to the received image processing unit 30.
 The received image processing unit 30 combines the received image with the mask pattern of FIG. 4(a) corrected by the mask pattern correction unit 27 (ST64), and transmits only the unmasked portions of the image to the projection unit 19. The projection unit 19 then projects the image with the striped mask, transmitted from the received image processing unit 30, onto the display unit 20 (ST74-1).
 On the other hand, when the determination in ST44 is NO, i.e., when it is determined that the mask pattern of FIG. 4(a) selected by the user provides a sufficient sense of presence and of sharing the same room, the received image is combined with the mask pattern of FIG. 4(a) as-is (ST64), and only the unmasked portions of the image are transmitted to the projection unit 19. The projection unit 19 then projects the image with the striped mask, transmitted from the received image processing unit 30, onto the display unit 20 (ST74-1).
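 The combining step described above (ST64) — overlaying a binary mask on the received image so that only the unmasked portions are passed to the projection unit — can be sketched as follows. This is purely an illustrative sketch, not the patented implementation: the grayscale list-of-lists image representation, the function names, and the stripe-width parameters are all assumptions introduced for illustration.

```python
def make_stripe_mask(height, width, stripe_width=2, gap=2):
    """Binary mask of vertical stripes: 1 = visible, 0 = masked."""
    period = stripe_width + gap
    return [[1 if (x % period) < stripe_width else 0 for x in range(width)]
            for _ in range(height)]

def apply_mask(image, mask):
    """Keep only the unmasked pixels; masked pixels are set to 0 (black)."""
    return [[px if m else 0 for px, m in zip(img_row, mask_row)]
            for img_row, mask_row in zip(image, mask)]

# A 4x8 grayscale "received image" of all 255s, masked with 2-pixel stripes
frame = [[255] * 8 for _ in range(4)]
mask = make_stripe_mask(4, 8, stripe_width=2, gap=2)
masked = apply_mask(frame, mask)
# each row of `masked` is [255, 255, 0, 0, 255, 255, 0, 0]
```

 In a real device the masked-out regions would simply not be driven by the projector, but setting them to black has the same visual effect in this sketch.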
 In the fourth embodiment, the mask pattern selected by the user is further corrected so as to obtain a greater sense of presence and of sharing the same room, so an image with a more suitable mask pattern can be displayed on the display unit. As a result, the deviation between the perspective in the image and the viewpoint of the viewer's eyes is mitigated by the mask applied to the video, making it possible to feel a higher sense of presence and of sharing the same room.
 Embodiment 5
 The configuration of an image projection device according to the fifth embodiment of the present invention will be described with reference to FIGS. 12 and 13. FIG. 12 is a functional configuration diagram of the image projection device according to the fifth embodiment. FIG. 13 is a flowchart showing the flow of mask processing in the image projection device according to the fifth embodiment. The fifth embodiment is described focusing on the differences from the first to fourth embodiments, and descriptions of configurations and effects identical to those of the first to fourth embodiments are omitted. The same reference numerals are used for the same configurations as in the first to fourth embodiments.
 The configuration of the image projection device 31 according to the fifth embodiment that is necessary for receiving an image from the counterpart image projection device will be described with reference to FIG. 12. The configuration of the image projection device 31 necessary for transmitting an image to the counterpart image projection device is the same as in the first embodiment, and its description is omitted.
 The receiving unit 14 is a means for receiving an image transmitted via the network 100 from the image projection device at the communication destination. The receiving unit 14 transmits the received image to the image recognition unit 23 and to a received image processing unit 32 described later. The receiving unit 14 is realized, for example, by a control program or the like.
 The received image processing unit 32 is a means for performing image processing on the image received by the receiving unit 14, and is realized, for example, by a program executed by the CPU 4 in FIG. 2. Specifically, upon receiving the recognition result transmitted from the image recognition unit 23, the received image processing unit 32 transmits this recognition result to a mask pattern generation unit 33 described later. Then, upon receiving information on the mask pattern generated by the mask pattern generation unit 33, the received image processing unit 32 performs mask processing on the received image using the generated mask pattern, and transmits the masked image to the projection unit 19.
 The mask pattern generation unit 33 receives the information on the recognition result from the received image processing unit 32 and generates a mask pattern based on that information. For example, when the mask pattern generation unit 33 determines that mitigating the horizontal viewpoint deviation would yield a greater sense of presence and of sharing the same room, it generates a striped mask (vertical strips); when it determines that mitigating the vertical viewpoint deviation would do so, it generates a border-shaped mask (horizontal strips). The mask pattern generation unit 33 transmits information on the generated mask pattern to the received image processing unit 32.
 The projection unit 19 is a means for projecting the image processed by the received image processing unit 32 onto the display unit 20, and is realized by a program or the like executed by the CPU 201 in FIG. 2.
 FIG. 13 is a flowchart showing the flow of mask processing in the image projection device 31 according to the fifth embodiment.
 When telecommunication is started between a plurality of image projection devices 31, the receiving unit 14 of the image projection device 31 receives an image transmitted via the network 100 from the image projection device at the communication destination (ST15). The receiving unit 14 then transmits the received image to the image recognition unit 23 and to the received image processing unit 32.
 The image recognition unit 23 recognizes the content of the image transmitted from the receiving unit 14 (ST25). Here, the image recognition unit 23 recognizes, for example, information such as the people appearing in the image, the arrangement of furniture, and the walls, floor, and ceiling. The image recognition unit 23 then transmits information on these recognition results to the received image processing unit 32, which in turn transmits the recognition results to the mask pattern generation unit 33.
 Upon receiving the information on the recognition results, the mask pattern generation unit 33 generates a mask pattern based on the recognition results (ST35). For example, when the mask pattern generation unit 33 determines that mitigating the horizontal viewpoint deviation would yield a greater sense of presence and of sharing the same room, it generates a striped mask; when it determines that mitigating the vertical viewpoint deviation would do so, it generates a border-shaped mask. The mask pattern generation unit 33 then transmits information on the generated mask to the received image processing unit 32.
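 The generation rule of ST35 — a striped (vertical-strip) mask when the horizontal viewpoint deviation dominates, a border-shaped (horizontal-strip) mask when the vertical deviation dominates — can be illustrated with the sketch below. The numeric deviation scores, the tie-breaking rule, the band/gap parameters, and the function names are all assumptions for illustration; the patent does not specify how the deviations are quantified.

```python
def choose_mask_orientation(horizontal_deviation, vertical_deviation):
    """Pick the mask orientation that mitigates the larger viewpoint deviation.

    'stripe' = vertical strips (mitigates horizontal deviation),
    'border' = horizontal strips (mitigates vertical deviation).
    """
    if horizontal_deviation >= vertical_deviation:
        return "stripe"
    return "border"

def generate_mask(height, width, orientation, band=2, gap=2):
    """Binary mask (1 = visible, 0 = masked) in the chosen orientation."""
    period = band + gap
    if orientation == "stripe":
        # vertical strips: visibility varies along x, constant along y
        return [[1 if (x % period) < band else 0 for x in range(width)]
                for _ in range(height)]
    # border: horizontal strips, visibility varies along y
    return [[1 if (y % period) < band else 0 for _ in range(width)]
            for y in range(height)]

# A scene whose horizontal viewpoint deviation is larger -> striped mask
orientation = choose_mask_orientation(horizontal_deviation=0.8,
                                      vertical_deviation=0.3)
mask = generate_mask(4, 8, orientation)
```

 The same `generate_mask` helper covers both claim 7 (vertical strips) and claim 8 (horizontal strips) by switching the axis along which the band pattern repeats.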
 The received image processing unit 32 combines the received image with the mask pattern generated by the mask pattern generation unit 33 (ST45), and transmits only the unmasked portions of the image to the projection unit 19.
 The projection unit 19 then projects the masked image transmitted from the received image processing unit 32 onto the display unit 20 (ST55).
 In the fifth embodiment, an optimal mask pattern is newly generated according to the content of the received image, so an image with a more suitable mask pattern can be displayed on the display unit than when mask patterns are stored in the storage unit in advance and one is selected from among them. As a result, the deviation between the perspective in the image and the viewpoint of the viewer's eyes is mitigated by the mask applied to the video, making it possible to feel a higher sense of presence and of sharing the same room.
 DESCRIPTION OF REFERENCE SYMBOLS: 1 remote communication system, 2 image projection device, 3 server device, 4 CPU, 5 memory, 6 recording device, 7 remote control, 8 camera, 9 microphone, 10 speaker, 11 imaging unit, 12 transmission image processing unit, 13 transmission unit, 14 receiving unit, 15 communication unit, 16 operation unit, 17 received image processing unit, 18 mask pattern storage unit, 19 projection unit, 20 display unit, 21 image projection device, 22 operation unit, 23 image recognition unit, 24 received image processing unit, 25 image projection device, 26 received image processing unit, 27 mask pattern correction unit, 28 image projection device, 29 operation unit, 30 received image processing unit, 31 image projection device, 32 received image processing unit, 33 mask pattern generation unit

Claims (9)

  1.  An image projection device used in an information communication system capable of mutual communication by images with a remote location, the image projection device comprising:
     a received image processing unit that performs mask processing on a part of an input image; and
     a projection unit that projects the image mask-processed by the received image processing unit.
  2.  The image projection device according to claim 1, further comprising:
     a mask pattern storage unit that stores mask patterns for the mask processing,
     wherein the received image processing unit selects from among the mask patterns stored in the mask pattern storage unit and performs mask processing corresponding to the selected pattern.
  3.  The image projection device according to claim 2, further comprising:
     an operation unit capable of selecting a mask pattern stored in the mask pattern storage unit,
     wherein the received image processing unit performs mask processing corresponding to the mask pattern selected via the operation unit.
  4.  The image projection device according to claim 2, further comprising:
     image recognition means for recognizing the state of the input image,
     wherein the received image processing unit determines the mask pattern based on a recognition result of the image recognition means.
  5.  The image projection device according to claim 4, further comprising:
     a mask pattern correction unit that corrects the mask pattern stored in the mask pattern storage unit based on the recognition result of the image recognition means,
     wherein the received image processing unit performs mask processing corresponding to the mask pattern corrected by the mask pattern correction unit.
  6.  The image projection device according to claim 2, further comprising:
     an operation unit capable of selecting a mask pattern stored in the mask pattern storage unit;
     image recognition means for recognizing the state of the input image; and
     a mask pattern correction unit that corrects the mask pattern stored in the mask pattern storage unit based on the mask pattern selected via the operation unit and the recognition result of the image recognition means,
     wherein the received image processing unit performs mask processing corresponding to the mask pattern corrected by the mask pattern correction unit.
  7.  The image projection device according to any one of claims 1 to 6, wherein the mask pattern is shaped as vertical strips with respect to the input image.
  8.  The image projection device according to any one of claims 1 to 6, wherein the mask pattern is shaped as horizontal strips with respect to the input image.
  9.  The image projection device according to claim 1, further comprising:
     image recognition means for recognizing the state of the input image; and
     a mask pattern generation unit that generates the mask pattern based on the recognition result of the image recognition means,
     wherein the received image processing unit performs mask processing corresponding to the mask pattern generated by the mask pattern generation unit.
PCT/JP2016/069834 2016-07-05 2016-07-05 Image projection device WO2018008077A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2017520559A JP6249136B1 (en) 2016-07-05 2016-07-05 Image projection device
PCT/JP2016/069834 WO2018008077A1 (en) 2016-07-05 2016-07-05 Image projection device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/069834 WO2018008077A1 (en) 2016-07-05 2016-07-05 Image projection device

Publications (1)

Publication Number Publication Date
WO2018008077A1 true WO2018008077A1 (en) 2018-01-11

Family

ID=60685592

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/069834 WO2018008077A1 (en) 2016-07-05 2016-07-05 Image projection device

Country Status (2)

Country Link
JP (1) JP6249136B1 (en)
WO (1) WO2018008077A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006148425A (en) * 2004-11-18 2006-06-08 Keio Gijuku Method and apparatus for image processing, and content generation system
WO2016072118A1 (en) * 2014-11-07 2016-05-12 ソニー株式会社 Information processing system, storage medium, and control method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012120105A (en) * 2010-12-03 2012-06-21 Oki Electric Ind Co Ltd Video communication system, video communication apparatus, and information disclosure degree adjusting apparatus

Also Published As

Publication number Publication date
JP6249136B1 (en) 2017-12-20
JPWO2018008077A1 (en) 2018-07-12

Similar Documents

Publication Publication Date Title
US10148909B2 (en) Immersive telepresence anywhere
RU2665872C2 (en) Stereo image viewing
WO2017113734A1 (en) Video multipoint same-screen play method and system
US20220264068A1 (en) Telepresence system and method
US20100103244A1 (en) device for and method of processing image data representative of an object
EP2352290B1 (en) Method and apparatus for matching audio and video signals during a videoconference
JP2010206307A (en) Information processor, information processing method, information processing program, and network conference system
WO2003081921A1 (en) 3-dimensional image processing method and device
JP2017085372A (en) Communication system, communication device, communication method and program
JP7074056B2 (en) Image processing equipment, image processing systems, and image processing methods, and programs
JP2024506390A (en) Video conference device, video conference method, and computer program using spatial virtual reality environment
WO2017141584A1 (en) Information processing apparatus, information processing system, information processing method, and program
WO2021095573A1 (en) Information processing system, information processing method, and program
Cooperstock Multimodal telepresence systems
JP6157077B2 (en) Display device with camera
TW202244899A (en) Customized audio mixing for users in virtual conference calls
JP6149433B2 (en) Video conference device, video conference device control method, and program
CN112740150A (en) Apparatus and method for processing audiovisual data
JP6249136B1 (en) Image projection device
JP2022054192A (en) Remote conference system, server, photography device, audio output method, and program
WO2019146425A1 (en) Image processing device, image processing method, program, and projection system
WO2018198790A1 (en) Communication device, communication method, program, and telepresence system
KR102404130B1 (en) Device for transmitting tele-presence image, device for receiving tele-presence image and system for providing tele-presence image
JP2020530218A (en) How to project immersive audiovisual content
KR20150113795A (en) Apparatus and Method for Controlling Eye-contact Function

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2017520559

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16908121

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16908121

Country of ref document: EP

Kind code of ref document: A1