WO2021193171A1 - Image projection device and image correction method - Google Patents

Image projection device and image correction method

Info

Publication number
WO2021193171A1
Authority
WO
WIPO (PCT)
Prior art keywords: image, projection, unit, projected, obstacle
Prior art date
Application number
PCT/JP2021/010239
Other languages
French (fr)
Japanese (ja)
Inventor
浩 竹下
真哉 三原
礼子 近藤
友樹 杉山
直史 古川
Original Assignee
株式会社Jvcケンウッド
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2020054877A external-priority patent/JP2021158445A/en
Priority claimed from JP2020054879A external-priority patent/JP2021158447A/en
Priority claimed from JP2020054878A external-priority patent/JP2021158446A/en
Application filed by 株式会社Jvcケンウッド
Publication of WO2021193171A1 publication Critical patent/WO2021193171A1/en

Classifications

    • G03B21/00 Projectors or projection-type viewers; Accessories therefor
    • G03B21/14 Details of projectors or projection-type viewers
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/10 Intensity circuits
    • G09G5/36 Control arrangements or circuits characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38 Control arrangements or circuits characterised by the display of a graphic pattern, with means for controlling the display position
    • H04N5/74 Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • The present invention relates to an image projection device and an image correction method.
  • A so-called projection mapping technique is known in which images are projected from a plurality of image projection devices onto one object to be projected to form an image (for example, Patent Document 1).
  • In Patent Document 1, when a region occurs in which the light for projecting the image is blocked, the image projection device is controlled so as to prohibit the projection of light onto that region.
  • It may be difficult to form the image intended for output on the object to be projected simply by prohibiting the projection of light as in Patent Document 1. That is, image quality deterioration may occur, such as a part of the image becoming dark or missing.
  • An object of the present invention is to provide an image projection device and an image correction method capable of preventing image quality deterioration such as changes in brightness or missing portions in part of a projected image.
  • The image projection device of the present invention includes a plurality of projection units that project from different image projection angles so that a plurality of projected images form an overlapping image in the same region, a plurality of imaging units, each provided so as to be paired with one of the plurality of projection units, that capture the same region from an angle corresponding to the image projection angle of the paired projection unit to generate captured images, and a shadow area identification unit that identifies, based on the captured images, a shadow region generated in the same region due to an obstacle between a projection unit and the same region.
  • The image correction method of the present invention uses imaging units provided so as to be paired with each of a plurality of projection devices that project from different image projection angles so that a plurality of projected images form an overlapping image in the same region.
  • Further, the image projection device of the present invention includes a plurality of projection units that project from different image projection angles so that a plurality of projected images form an overlapping image in the same region, an imaging unit that images the same region to generate a captured image, an obstacle detection unit that detects an obstacle between a projection unit and the same region based on the captured image, and a control unit that controls the projected image to be projected by the projection units.
  • The control unit performs control so as to specify the position of the shadow in the projected image based on the projected image and on a captured image obtained by causing the plurality of projection units to project while switching for each frame and causing the imaging unit to capture the projected image switched for each frame.
  • Further, the image projection device of the present invention includes a plurality of projection units that project from different image projection angles so that a plurality of projected images form an overlapping image in the same region, a plurality of imaging units, each provided so as to be paired with one of the plurality of projection units, that capture the same region from an angle corresponding to the image projection angle of the paired projection unit to generate captured images, an extraction unit that extracts an image of an obstacle between a projection unit and the same region based on the captured images, an image correction unit that generates a corrected image in which the extracted image of the obstacle is corrected to a position and size corresponding to the shadow region, and a control unit that causes a projection unit different from the projection unit paired with the imaging unit that captured the image of the obstacle to project the corrected image.
  • Further, the image projection device of the present invention has a transparent screen onto which an image can be projected from both sides, a first surface side projection unit that projects a projected image onto the same region of the screen from the first surface side, a second surface side projection unit that projects a projected image from the second surface side, and imaging units provided so as to form a pair with each of the first surface side projection unit and the second surface side projection unit, each imaging from an angle corresponding to the image projection angle of the paired projection unit.
  • According to the present invention, it is possible to prevent image quality deterioration such as changes in brightness or missing portions in part of the projected image.
  • FIG. 1 is a functional block diagram showing a configuration of an image projection system according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram showing an example of a configuration for realizing the functional block shown in FIG.
  • FIG. 3 is a diagram showing an example of the positional relationship between the projected object, the projection device, and the image pickup device for projecting an image by the plurality of projection devices and imaging by the plurality of image pickup devices.
  • FIG. 4 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image.
  • FIG. 5 is a diagram showing an example of a case where there is an obstacle within the angle of view of projection by the projection device and within the angle of view of imaging by the imaging device.
  • FIG. 6 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image.
  • FIG. 7 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image when the corrected image is projected.
  • FIG. 8 is a flowchart showing the flow of processing related to the detection of obstacles and shadow areas.
  • FIG. 9 is a schematic diagram showing an example of the relationship between the optical axes of the projection device and the image pickup device, which are treated as one set.
  • FIG. 10 is a schematic view showing a mechanism of a configuration in which an optical axis on which a projection device projects an image via a half mirror and an optical axis on which an imaging device images an overlapping region are provided so as to overlap each other.
  • FIG. 11 is a functional block diagram showing a configuration of an image projection system according to a second embodiment of the present invention.
  • FIG. 12 is a block diagram showing an example of a configuration for realizing the functional block shown in FIG.
  • FIG. 13 is a diagram showing an example of projecting an image on an overlapping region and imaging the overlapping region when there is no obstacle.
  • FIG. 14 is a diagram showing an example of projecting an image on an overlapping region and imaging the overlapping region when there is an obstacle.
  • FIG. 15 is a diagram showing an example of projection of an image on an overlapping region and imaging of the overlapping region after control corresponding to the detection of an obstacle is performed.
  • FIG. 16 is a flowchart showing the flow of processing related to the detection of obstacles and shadow areas.
  • FIG. 17 is a diagram showing a configuration example when there is only one imaging unit.
  • FIG. 18 is a functional block diagram showing a configuration of an image projection system according to a third embodiment of the present invention.
  • FIG. 19 is a block diagram showing an example of a configuration for realizing the functional block shown in FIG.
  • FIG. 20 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the first captured image, and the second captured image.
  • FIG. 21 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the first captured image, and the second captured image when the corrected image is projected.
  • FIG. 22 is a flowchart showing the flow of processing related to the detection of obstacles and shadow areas.
  • FIG. 23 is a diagram showing the positional relationship between the projected object, the projection device, and the image pickup device according to the fourth embodiment of the present invention.
  • FIG. 24 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the back surface image, the first captured image, and the second captured image.
  • FIG. 25 is a diagram showing an example of a case where the obstacle OB2 is within the angle of view of the projection by the projection device and the angle of view of the image captured by the image pickup device in the fourth embodiment.
  • FIG. 26 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the back surface image, the first captured image, and the second captured image.
  • FIG. 27 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the back surface image, the first captured image, and the second captured image when the corrected image is projected.
  • FIG. 1 is a functional block diagram showing a configuration of an image projection system 1 according to a first embodiment of the present invention.
  • the image projection system 1 includes a plurality of projection units, a plurality of imaging units, a storage unit 11, an image acquisition unit 12, a shadow area identification unit 13, an image correction unit 14, and a control unit 15.
  • the projection unit 21 and the projection unit 22 are illustrated as a plurality of projection units.
  • an imaging unit 31 and an imaging unit 32 are illustrated as a plurality of imaging units.
  • FIG. 2 is a block diagram showing an example of a configuration for realizing the functional block shown in FIG.
  • the image projection system 1 includes, for example, a plurality of projection devices, a plurality of image pickup devices, and an information processing device 50.
  • In FIG. 2, the data flow in the information processing device 50 is indicated by solid arrows, and optical projection and imaging are indicated by broken arrows.
  • the information processing device 50 is a so-called computer.
  • the connection form between the information processing device 50 and the plurality of projection devices 61 and 62 and the connection form between the information processing device 50 and the plurality of imaging devices 71 and 72 may be wired or wireless. It may be a mixture of wired and wireless.
  • The specific connection form may be a bus interface such as USB (Universal Serial Bus), a network communication line, or a dedicated connection form.
  • Each projection device shown in FIG. 2 includes one projection unit.
  • a projection device 61 and a projection device 62 are illustrated as a plurality of projection devices.
  • the projection devices 61 and 62 are so-called projectors, and project an image onto the projected object 80.
  • the projected object 80 is a so-called screen.
  • The projected object 80 is, for example, a diffusion type, reflective type, or retroreflective type screen, and is provided on the assumption that the projected surface itself is viewed.
  • the projected object 80 may be provided on the assumption that the back surface of the projected surface can be visually recognized, such as a transmissive screen.
  • the projection device 61 includes a projection unit 21.
  • the projection device 62 includes a projection unit 22.
  • The projection units 21 and 22 each include a display element, a light source that irradiates the display element with light, an optical member such as a lens that causes the light reflected by or transmitted through the display element to form an image on the projected object 80, and a control circuit or the like that operates the display element according to image data input from the outside.
  • Examples of the display element include, but are not limited to, an LCOS (Liquid Crystal On Silicon) device, a digital micromirror device (DMD), or a liquid crystal device, and the display element can be changed as appropriate.
  • Each imaging device shown in FIG. 2 includes one imaging unit.
  • an image pickup device 71 and an image pickup device 72 are illustrated as a plurality of image pickup devices.
  • the imaging device 71 includes an imaging unit 31.
  • the imaging device 72 includes an imaging unit 32.
  • The imaging units 31 and 32 each function as a so-called digital camera, and each include an image pickup element and a circuit that generates an image based on the output of the image pickup element and outputs it as a captured image.
  • Examples of the image pickup element include a CMOS (Complementary Metal Oxide Semiconductor) image sensor and a CCD (Charge Coupled Device) image sensor, but the image pickup element is not limited to these and can be changed as appropriate.
  • FIG. 3 is a diagram showing an example of the positional relationship among the projected object 80, the projection devices 61 and 62, and the imaging devices 71 and 72 when images are projected by the plurality of projection devices and captured by the plurality of imaging devices.
  • the projection device 61 and the image pickup device 71 are treated as one set P1. That is, the projection unit 21 and the imaging unit 31 are provided so as to form a set P1. Further, the projection device 62 and the image pickup device 72 are treated as another set P2. That is, the projection unit 22 and the imaging unit 32 are provided so as to form a set P2.
  • the angle of view of the projection by the projection device 61 is shown within the acute angle formed by the two solid lines L1. Further, the angle of view of the projection by the projection device 62 is shown within the acute angle formed by the two solid lines L2.
  • the projection device 61 and the projection device 62 project images from different image projection angles so that the projected images overlap each other in the overlapping region 81 of the projected object 80 to form an overlapping image.
  • That is, the plurality of projection units 21 and 22 shown in FIGS. 1 and 2 project from different image projection angles so that the projected image projected by the projection unit 21 and the projected image projected by the projection unit 22 overlap in the same region to form an overlapping image.
  • the overlapping area 81 functions as the same area.
  • the angle of view of the image captured by the imaging device 71 is shown within the acute angle formed by the two broken lines L3. Further, the angle of view of the image captured by the imaging device 72 is shown within the acute angle formed by the two broken lines L4.
  • The imaging device 71 shown in FIG. 3 is provided so as to be paired with the projection device 61, and images the overlapping region 81 from an angle corresponding to the image projection angle of the paired projection device 61 to generate a captured image. Further, the imaging device 72 is provided so as to be paired with the projection device 62, and images the overlapping region 81 from an angle corresponding to the image projection angle of the projection device 62 to generate a captured image. That is, the imaging unit 31 shown in FIGS. 1 and 2 is provided so as to be paired with the projection unit 21, and images the overlapping region 81 from an angle corresponding to the image projection angle of the paired projection unit 21 to generate a captured image.
  • Similarly, the imaging unit 32 is provided so as to be paired with the projection unit 22, and images the overlapping region 81 from an angle corresponding to the image projection angle of the paired projection unit 22 to generate a captured image.
  • In the following, the image projected by the projection unit 21 is referred to as the first projected image.
  • The image projected by the projection unit 22 is referred to as the second projected image.
  • The image formed in the overlapping region 81 of the projected object 80 by the images projected by the projection unit 21 and the projection unit 22 is referred to as the overlapping image.
  • The image captured by the imaging unit 31 is referred to as the first captured image.
  • The image captured by the imaging unit 32 is referred to as the second captured image.
  • FIG. 4 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image.
  • the relationship shown in FIG. 4 corresponds to the arrangement shown in FIG.
  • It is assumed that the trapezoidal distortion of the image, which arises because the projection devices 61 and 62 project images obliquely onto the projected object 80 and because the imaging devices 71 and 72 capture images from oblique angles, is appropriately corrected in advance.
  • For example, the projection unit 21 of the projection device 61 in FIG. 2 may be configured to apply a predetermined trapezoidal distortion correction process to the first projected image according to its installation position.
  • the projection device 62 of FIG. 2 may be configured in the same manner as the projection device 61, and the image pickup device 72 may be configured in the same manner as the image pickup device 71.
  • the predetermined trapezoidal distortion correction process may be any generally used trapezoidal distortion correction process.
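  • As an illustration only (not part of this publication), such a trapezoidal distortion correction is commonly implemented as a perspective (homography) warp. The following Python/OpenCV sketch assumes hypothetical corner coordinates obtained from calibration; the function name and values are assumptions.

```python
import cv2
import numpy as np

def keystone_correct(image, src_corners, dst_corners):
    # Map the observed quadrilateral (src_corners) onto the intended rectangle
    # (dst_corners) with a perspective (homography) warp.
    h, w = image.shape[:2]
    matrix = cv2.getPerspectiveTransform(np.float32(src_corners), np.float32(dst_corners))
    return cv2.warpPerspective(image, matrix, (w, h))

# Hypothetical corner coordinates for an obliquely placed projector/camera pair.
src = [(40, 12), (602, 0), (622, 468), (28, 460)]
dst = [(0, 0), (640, 0), (640, 480), (0, 480)]
# corrected = keystone_correct(captured_frame, src, dst)
```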
  • FIG. 4 and the like exemplify a case where an image corresponding to the projected image data 11b (see FIG. 2) stored in the storage unit 11 is projected.
  • The case where the image corresponding to the projected image data 11b stored in the storage unit 11 is projected is taken as an example, but the projected image is not limited to this. Images corresponding to projected image data read from an external storage device connected to the image projection system 1, projected image data input from an external information processing device connected to the image projection system 1, and the like may be projected.
  • When there is no obstacle, an image similar to the images projected by the projection unit 21 and the projection unit 22 is formed in the overlapping region 81, as in the overlapping image V1 shown in FIG. 4.
  • The images captured by the imaging unit 31 and the imaging unit 32 are the same as the overlapping image V1. This is because there are no obstacles within the angle of view of projection by either the projection device 61 or the projection device 62, and no obstacles within the angle of view of imaging by either the imaging device 71 or the imaging device 72.
  • FIG. 5 is a diagram showing an example of a case where there is an obstacle OB within the angle of view of the projection by the projection device 61 and within the angle of view of the image captured by the image pickup device 71.
  • FIG. 6 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image. The relationship shown in FIG. 6 corresponds to the arrangement shown in FIG.
  • When there is an obstacle OB within the angle of view of projection by the projection device 61 and within the angle of view of imaging by the imaging device 71, the obstacle OB blocks a part of the projected light with which the projection device 61 projects the image. Therefore, as in the overlapping image V2 shown in FIG. 6, an image including a shadow region DA corresponding to the light blocked by the obstacle OB is formed in the overlapping region 81.
  • In this way, when there is an obstacle OB between one of the plurality of projection units and the overlapping region 81, the obstacle OB causes a shadow region DA in the overlapping image V2.
  • Since the obstacle OB is within the angle of view of imaging by the imaging device 71, the imaging unit 31 captures an image in which the obstacle OB is located inside the shadow region DA, as in the first captured image 31b shown in FIG. 6.
  • When the obstacle OB is instead within the angle of view of projection by the projection device 62 and the angle of view of imaging by the imaging device 72, the contents of the first captured image 31b and the contents of the second captured image 32b are reversed.
  • The information processing device 50 performs such identification and carries out processing according to the identification result.
  • the information processing device 50 includes a storage unit 11 and a calculation unit 51.
  • The storage unit 11 includes a storage device capable of storing software programs (hereinafter simply referred to as programs) and data. Examples of the storage device included in the storage unit 11 include, but are not limited to, a hard disk drive, a solid state drive, a flash memory, and the like, and the storage device can be changed as appropriate.
  • the storage unit 11 may be a combination of a reading device for a recording medium such as an optical disc and a recording medium set in the reading device.
  • the storage unit 11 stores the image projection program 11a and the projected image data 11b.
  • the image projection program 11a is a program read and executed by the calculation unit 51.
  • The calculation unit 51 includes an arithmetic circuit, such as a CPU (Central Processing Unit), that realizes various functions by reading and executing programs.
  • The calculation unit 51 of the first embodiment functions as the image acquisition unit 12, the shadow area identification unit 13, the image correction unit 14, and the control unit 15 shown in FIGS. 1 and 2 by reading and executing the image projection program 11a.
  • the image acquisition unit 12 acquires the image projected by the projection units 21 and 22. Specifically, the image acquisition unit 12 reads and acquires the projected image data 11b from, for example, the storage unit 11.
  • the projected image data 11b acquired by the image acquisition unit 12 can be referred to by the shadow area identification unit 13 and the image correction unit 14. Further, unless the corrected image is generated by the image correction unit 14 described later, the projection units 21 and 22 project an image corresponding to the content of the projected image data 11b acquired by the image acquisition unit 12 (see FIG. 4).
  • The shadow area identification unit 13 identifies a shadow region generated in the overlapping region 81 due to an obstacle between one projection unit and the overlapping region 81, based on the captured images. Specifically, the shadow area identification unit 13 performs image analysis that compares the contents of the images captured by the imaging units 31 and 32 with the content of the image acquired by the image acquisition unit 12. More specifically, if a partial region whose brightness is reduced relative to the brightness distribution expected from the content of the projected image data 11b appears in the brightness distribution of the images captured by the imaging units 31 and 32, the shadow area identification unit 13 extracts that partial region as a shadow region.
  • For example, the image acquisition unit 12 acquires the first projected image, the difference between the first captured image and the first projected image is taken, and a difference signal is generated. Based on this difference signal, a partial region in which the brightness has decreased by a certain amount may be specified as the shadow region. Further, a partial region of the difference signal that shows a difference but whose features differ from those of the region in which the brightness is merely reduced by a certain amount may be specified as the region of the obstacle, and that partial region may be extracted as an image of the obstacle.
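  • Purely as an illustration of this difference-signal comparison (not taken from the publication), a minimal Python/NumPy sketch might look as follows; the threshold values and function names are assumptions.

```python
import numpy as np

def detect_shadow_and_obstacle(projected, captured, shadow_drop=0.3, feature_diff=0.5):
    # projected, captured: geometrically aligned grayscale images in [0, 1].
    diff = projected.astype(np.float32) - captured.astype(np.float32)

    # Shadow region: the captured brightness fell below the expected brightness
    # by at least a certain amount.
    shadow_mask = diff > shadow_drop

    # Obstacle candidate: pixels that differ strongly from the projected content
    # but do not look like a plain brightness drop.
    obstacle_mask = (np.abs(diff) > feature_diff) & ~shadow_mask

    return shadow_mask, obstacle_mask
```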
  • In this way, the shadow area identification unit 13 identifies the shadow region.
  • The image correction unit 14 generates a corrected image in which the region corresponding to the shadow region is corrected in the image projected by one or more projection units other than the projection unit that has the obstacle within its angle of view of projection. Specifically, the image correction unit 14 generates a corrected image in which the brightness of a partial region of the projected image data 11b, at a position and size corresponding to the shadow region, is raised so that the overlapping image in the overlapping region 81 is drawn with its original brightness. The degree of correction corresponds to the degree of the decrease in brightness that the shadow region extracted by the shadow area identification unit 13 causes in the content of the projected image data 11b. For example, when the second captured image has a shadow region, it is preferable to perform the correction for increasing the brightness based on the difference signal obtained by the image analysis in the shadow area identification unit 13. When the first captured image has a shadow region, the same correction may be performed.
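  • Again as an illustration only, the corrected image could be generated by adding the measured brightness loss back into the region corresponding to the shadow; the gain model and names below are assumptions, not the publication's method.

```python
import numpy as np

def make_corrected_image(projected, shadow_mask, brightness_loss, gain=1.0):
    # projected: image data to be projected by the unobstructed projection unit, in [0, 1]
    # shadow_mask: boolean mask of the shadow region from the detection step
    # brightness_loss: per-pixel brightness drop measured from the difference signal
    corrected = projected.astype(np.float32).copy()
    corrected[shadow_mask] += gain * brightness_loss[shadow_mask]
    return np.clip(corrected, 0.0, 1.0)
```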
  • The control unit 15 comprehensively controls each component and each functional component of the image projection system 1. Specifically, the control unit 15 of the first embodiment causes a projection unit to project the corrected image.
  • It is preferable that the control unit 15 causes the corrected image to be projected by a projection unit different from the projection unit that is paired with the imaging unit whose captured image includes the obstacle causing the identified shadow region within its angle of view of projection.
  • FIG. 7 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image when the corrected image is projected.
  • the relationship shown in FIG. 7 corresponds to the arrangement shown in FIG. Further, FIG. 7 assumes a case where the corrected image 14a is generated in response to the result of extracting the images of the shadow region DA and the obstacle OB shown in FIG.
  • a partial region having a position and a size corresponding to a shadow region and having an increased brightness in the corrected image 14a is shown as a correction region BA.
  • The detection that "there is an obstacle OB within the angle of view of projection by the projection unit 21" is configured, for example, so that it is determined that "the obstacle OB is within the angle of view of projection by the projection unit 21" based on the fact that a shadow that should not be present in the projected image stored in the storage unit 11 appears in the second captured image, which is the image captured by the imaging unit 32.
  • Alternatively, it may be configured so that it is determined that there is an obstacle OB within the angle of view of projection by the projection unit 21.
  • the image correction unit 14 generates a corrected image 14a in which correction for increasing the brightness of a partial region having a position and a size corresponding to the shadow region DA in the second captured image is performed on the content of the projected image data 11b.
  • the control unit 15 projects the corrected image 14a on the projection unit 22 as a second projection image.
  • the control unit 15 causes the projection unit 21 to project the contents of the projected image data 11b.
  • The decrease in brightness of the shadow region caused by the obstacle OB in the projection corresponding to the projected image data 11b by the projection unit 21 is offset by the increase in brightness of the correction region BA due to the projection corresponding to the corrected image 14a by the projection unit 22. Therefore, like the overlapping image V3 shown in FIG. 7, an image similar to the content of the projected image data 11b is formed in the overlapping region 81.
  • the control unit 15 can confirm that the correction of the shadow region is normally performed based on the first captured image 31c and the second captured image 32c.
  • In FIG. 1 and FIG. 2, the data flow of the output image is shown as going from the image acquisition unit 12 to the projection units 21 and 22 via the image correction unit 14, but the data flow is not limited to this. There may be a data flow from the image acquisition unit 12 directly to the projection units 21 and 22. Whichever data flow is adopted, when a corrected image such as the corrected image 14a is generated, the control unit 15 causes the corrected image to be projected by a projection unit that does not have the obstacle within its angle of view, and causes the image acquired by the image acquisition unit 12 to be projected by the projection unit that has the obstacle within its angle of view. In the configuration illustrated in FIG. 1 and in FIG. 11 described later, for the projection unit that does not output a corrected image and projects the content corresponding to the image acquired by the image acquisition unit 12, the image correction unit 14 passes the content of the acquired image through as it is.
  • FIG. 8 is a flowchart showing the flow of processing related to the detection of obstacles and shadow areas. As a premise for this processing, images corresponding to the image acquired by the image acquisition unit 12 are projected by the projection units 21 and 22.
  • the imaging units 31 and 32 acquire a plurality of captured images by imaging the overlapping region 81 (step S1).
  • the shadow area identification unit 13 detects images of the shadow area and obstacles based on the captured image acquired in the process of step S1 (step S2). When the image of the shadow area and the obstacle is not detected in the process of step S2 (step S2; No), the process ends.
  • When an image of a shadow region and an obstacle is detected in the process of step S2 (step S2; Yes), the image correction unit 14 generates a corrected image in which the brightness of the portion corresponding to the shadow region is increased so as to compensate for the decrease in brightness of the shadow region (step S3). Then, the control unit 15 causes the projection unit that has no obstacle within its angle of view of image projection, specified based on the process of step S2, to project the corrected image (step S4). In the case of the example shown in FIG. 7, the projection unit treated as having no obstacle within its angle of view of image projection in the process of step S4 is the projection unit 22.
  • Then, as in step S1, a plurality of captured images are acquired again.
  • the control unit 15 determines whether or not the decrease in brightness of the shadow region has been eliminated based on the plurality of captured images (step S5). When it is determined in the process of step S5 that the decrease in brightness of the shadow region has been eliminated (step S5; Yes), the process ends. On the other hand, if it is determined in the process of step S5 that the decrease in brightness of the shadow region has not been resolved (step S5; No), the process returns to the process of step S3.
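  • The flow of FIG. 8 can be summarized by the following illustrative sketch, in which all function names are assumptions standing in for the units described above.

```python
def correction_loop(capture_all, detect, make_correction, project_corrected, max_iters=10):
    # capture_all()      -> list of captured images of the overlapping region (step S1)
    # detect(images)     -> shadow/obstacle result, or None when nothing is detected (step S2)
    # make_correction(r) -> corrected image compensating the shadow region (step S3)
    # project_corrected(img) projects the corrected image with the unobstructed unit (step S4)
    result = detect(capture_all())                    # steps S1-S2
    if result is None:                                # step S2; No
        return
    for _ in range(max_iters):
        project_corrected(make_correction(result))    # steps S3-S4
        result = detect(capture_all())                # re-check (step S5)
        if result is None:                            # step S5; Yes
            return
```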
  • It is preferable that, for the projection unit and the imaging unit treated as one set, the optical axis along which the projection unit projects the image and the optical axis along which the imaging unit captures the image coincide with each other as much as possible.
  • FIG. 9 is a schematic diagram showing an example of the relationship between the optical axes of the projection device 61 and the image pickup device 71, which are treated as one set P1.
  • The projection device 61 and the imaging device 71 are arranged side by side so that the optical axis along which the projection device 61 projects the image onto the overlapping region 81 and the optical axis along which the imaging device 71 images the overlapping region 81 correspond to each other.
  • the angle of view of the projection device 61 and the angle of view of the image pickup device 71 can be matched.
  • FIG. 10 is a schematic diagram showing a mechanism of a configuration in which an optical axis on which a projection device 61 projects an image via a half mirror 90 and an optical axis on which an imaging device 71 images an overlapping region 81 are provided so as to overlap each other.
  • The half mirror 90 is arranged on the optical axis along which the projection device 61 projects the image.
  • The half mirror 90 transmits the light emitted by the projection device 61 for projecting the image, and can reflect a part of the reflected light from the overlapping region 81 and direct it at an angle different from that optical axis.
  • The imaging device 71 is arranged at a position corresponding to the reflection angle at which the half mirror 90 reflects a part of the reflected light from the overlapping region 81.
  • the optical axis on which the projection device 61 projects an image between the overlapping region 81 and the half mirror 90 can be aligned with the optical axis on which the imaging device 71 images the overlapping region 81.
  • the angle of view of the projection device 61 and the angle of view of the image pickup device 71 can be matched with higher accuracy.
  • the set P1 is taken as an example, but the same configuration can be obtained when there is a set P2 or a third or more set (not shown).
  • According to the first embodiment, the decrease in brightness caused by an obstacle can be compensated for by projecting a corrected image. Therefore, it is possible to prevent image quality deterioration such as changes in brightness or missing portions in part of the projected image, and the image quality of the image projected onto the overlapping region 81 can be maintained more reliably.
  • the angle of view of the projection device 61 and the angle of view of the image pickup device 71 can be matched with higher accuracy.
  • FIG. 11 is a functional block diagram showing the configuration of the image projection system 100 according to the second embodiment of the present invention.
  • FIG. 12 is a block diagram showing an example of a configuration for realizing the functional block shown in FIG.
  • the image projection system 100 includes an obstacle detection unit 16 and a projection control unit 17 instead of the shadow region identification unit 13 and the image correction unit 14 included in the image projection system 1.
  • the calculation unit 51 reads out and executes the image projection program 11c stored in the storage unit 11 included in the information processing device 50 of the image projection system 100.
  • the obstacle detection unit 16 and the projection control unit 17 are realized as functions that replace the shadow area identification unit 13 and the image correction unit 14.
  • The calculation unit 51 of the second embodiment functions as the image acquisition unit 12, the control unit 15, the obstacle detection unit 16, and the projection control unit 17 shown in FIGS. 11 and 12 by reading and executing the image projection program 11c.
  • The obstacle detection unit 16 provides at least the function related to extracting an image of an obstacle among the functions provided by the shadow area identification unit 13. That is, the obstacle detection unit 16 detects an obstacle between one projection unit and the overlapping region based on the images captured by the imaging units 31 and 32.
  • The obstacle detection unit 16 may be configured to further specify the projection unit that is paired with the imaging unit that captured the image including the obstacle causing the shadow region. Since the projection unit and the imaging unit that captures the captured image are configured as one set, if the imaging unit can be specified, the corresponding projection unit can easily be specified. Such identification may be performed by the obstacle detection unit 16 or the projection control unit 17. Further, the information regarding the specified projection unit may be stored in the storage unit 11.
  • The function related to the extraction of the shadow region provided by the shadow area identification unit 13 is not essential, but it may be provided. In the description of the second embodiment, it is assumed that the functions provided by the obstacle detection unit 16 and the functions provided by the shadow area identification unit 13 are the same.
  • When an obstacle is detected, the projection control unit 17 causes the image to be projected by a projection unit different from the one projection unit. Further, when no obstacle is detected, the projection control unit 17 operates the plurality of projection units so that which of the plurality of projection units projects the image onto the overlapping region 81 is switched at a predetermined cycle. In this way, the projection control unit 17 causes the plurality of projection units 21 and 22 to project the projected image while switching for each frame.
  • In this case, it is preferable that the projection control unit 17 causes the projected image to be projected by a projection unit different from the projection unit paired with the imaging unit that captured the image of the obstacle causing the identified shadow region.
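  • As an illustrative sketch of this per-frame switching (names and structure are assumptions, not the publication's implementation), a scheduler might cycle through the projection units while skipping any unit whose angle of view contains an obstacle.

```python
from itertools import cycle

def frame_scheduler(projector_ids, blocked=None):
    # Yield, frame by frame, the projection unit that should project next,
    # skipping any unit whose projection angle of view contains an obstacle.
    blocked = blocked or set()
    available = [p for p in projector_ids if p not in blocked]
    if not available:
        raise ValueError("no unobstructed projection unit available")
    yield from cycle(available)

# With no obstacle detected, units 21 and 22 alternate; with unit 21 blocked,
# every frame goes to unit 22.
alternating = frame_scheduler([21, 22])
print([next(alternating) for _ in range(4)])   # [21, 22, 21, 22]
avoiding = frame_scheduler([21, 22], blocked={21})
print([next(avoiding) for _ in range(4)])      # [22, 22, 22, 22]
```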
  • FIG. 13 is a diagram showing an example of projection of an image on the overlapping region 81 and imaging of the overlapping region 81 when there is no obstacle.
  • In the second embodiment, the image formed in the overlapping region 81 is referred to as the projected image.
  • the projection unit 21 projects an image corresponding to the projected image data 11b as the first projected image.
  • the second projected image is not projected.
  • the fact that the second projected image is not projected is illustrated by a black filled rectangle NL.
  • the configuration for projecting an image is switched at a predetermined cycle. Therefore, there is a period during which the second projected image is projected and the first projected image is not projected.
  • For example, the configuration that projects the image is switched alternately between the projection unit 21 and the projection unit 22 in accordance with the update cycle of the frame image. The switching may be performed every frame or every predetermined number of frames.
  • the captured images by the imaging unit 31 and the imaging unit 32 are the same images as the projected image V4.
  • FIG. 14 is a diagram showing an example of projecting an image on the overlapping region 81 and imaging the overlapping region 81 when there is an obstacle.
  • FIG. 14 shows a state in which the operation control of the plurality of projection units by the projection control unit 17 in response to the detection of an obstacle has not yet been performed.
  • The obstacle OB blocks a part of the projected light with which the projection device 61 projects the image. Therefore, during the period in which the projection unit 21 projects the image corresponding to the projected image data 11b as the first projected image, an image including the shadow region DA corresponding to the light blocked by the obstacle OB is formed in the overlapping region 81, as in the projected image V5 shown in FIG. 14.
  • Since the obstacle OB is within the angle of view of imaging by the imaging device 71, the imaging unit 31 captures an image in which the obstacle OB is located inside the shadow region DA, as in the first captured image 31e shown in FIG. 14.
  • FIG. 15 is a diagram showing an example of projection of an image on the overlapping region 81 and imaging of the overlapping region 81 after the control corresponding to the detection of the obstacle is performed.
  • The obstacle detection unit 16 detects that the obstacle OB is within the angle of view of projection by the projection device 61 and within the angle of view of imaging by the imaging device 71 based on the first captured image 31e. After such detection, the projection control unit 17 controls operation so that the projection device 62 projects the image onto the overlapping region 81 and the projection device 61 does not project the image.
  • the first projected image is not projected, and the image corresponding to the projected image data 11b is projected as the second projected image. Since there are no obstacles within the angle of view of the projection by the projection device 62, an image similar to the image projected by the projection device 62 is formed in the overlapping region 81 as in the projection image V6 shown in FIG.
  • The image captured by the imaging unit 31 includes an image of the obstacle OB, as in the first captured image 31f shown in FIG. 15.
  • On the other hand, the image captured by the imaging unit 32 is the same as the projected image V6, as in the second captured image 32f shown in FIG. 15.
  • FIG. 16 is a flowchart showing the flow of processing related to the detection of obstacles and shadow areas.
  • the projection of the image by the projection unit 21 or the projection unit 22 corresponding to the image acquired by the image acquisition unit 12 is performed while switching at a predetermined cycle.
  • the imaging units 31 and 32 acquire a plurality of captured images by imaging the overlapping region 81 (step S11).
  • the obstacle detection unit 16 detects an image in the shadow region based on the captured image acquired in the process of step S11 (step S12). If the image in the shadow region is not detected in the process of step S12 (step S12; No), the process ends.
  • When the shadow region is detected in the process of step S12 (step S12; Yes), the projection control unit 17 identifies the projection unit that was projecting the image onto the overlapping region 81 during the period when the shadow region was detected (step S13). Then, the projection control unit 17 causes a projection unit different from the projection unit specified based on the process of step S13 to project the image (step S14).
  • Then, as in step S11, a plurality of captured images are acquired again.
  • The obstacle detection unit 16 determines whether or not the shadow region has been eliminated based on the plurality of captured images (step S15). When it is determined in the process of step S15 that the shadow region has been eliminated (step S15; Yes), the process ends. On the other hand, if it is determined in the process of step S15 that the shadow region has not been eliminated (step S15; No), the process returns to the process of step S13.
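  • The flow of FIG. 16 can likewise be summarized by the following illustrative sketch; all function names are assumptions.

```python
def obstacle_avoidance_loop(capture_all, shadow_detected, projecting_unit,
                            projector_ids, project_with, max_attempts=5):
    # capture_all()           -> captured images of the overlapping region (step S11)
    # shadow_detected(images) -> True when a shadow region is found (steps S12/S15)
    # projecting_unit()       -> id of the unit that was projecting when the shadow appeared (step S13)
    # project_with(unit_id)   -> switch projection of the image to the given unit (step S14)
    if not shadow_detected(capture_all()):       # steps S11-S12; No
        return
    for _ in range(max_attempts):
        blocked = projecting_unit()              # step S13
        others = [p for p in projector_ids if p != blocked]
        project_with(others[0])                  # step S14
        if not shadow_detected(capture_all()):   # step S15; Yes
            return
```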
  • In this way, the obstacle detection unit 16 detects within which of the angles of view of the projection units 21 and 22 the obstacle OB is located, based on the first captured image and the second captured image. By performing such control, there is a notable effect that it can be more reliably identified within which angle of view of the projection devices 61 and 62 the obstacle OB that caused the shadow region DA is located.
  • the control for eliminating the occurrence of the shadow region in the image projected on the overlapping region 81 is not limited to this.
  • For example, the obstacle detection unit 16 identifies which of the projection unit 21 and the projection unit 22 was projecting the image at the timing when the shadow region DA was generated in the overlapping region 81.
  • the projection control unit 17 causes the image to be projected by a projection unit different from the specified projection unit. This also eliminates the occurrence of a shadow region in the image projected on the overlapping region 81.
  • the number of imaging units may be one. Therefore, in the second embodiment, it is not essential that the projection unit and the imaging unit are a set. According to this, the configuration can be further simplified.
  • In the example described above, the obstacle detection unit 16 detects the presence of the obstacle OB within the angle of view of projection by the projection device 61 and within the angle of view of imaging by the imaging device 71 based on the first captured image 31e.
  • the projection control unit 17 controls the operation so that the projection device 62 projects the image on the overlapping region 81 and the projection device 61 does not project the image, but the operation is not limited to this.
  • For example, the projection control unit 17 may control the projection of the plurality of projection devices so that projection is performed while switching the projection device for each frame, detect the presence of the obstacle OB within the projection angle of view of the projection device 61 and within the imaging angle of view of the imaging device 71 based on the first captured image 31e, and specify the shadow region based on the second captured image 32e and the first projected image corresponding to the projected image data 11b. The brightness of the area corresponding to the shadow region in the second projected image projected in the following frame may then be set high so as to compensate, by this brightness correction, for the decrease in brightness of the image area corresponding to the shadow region, and the corrected second projected image may be projected while switching frame by frame.
  • When switching and projecting for each frame, projection may be performed while switching including projection by a projection device, such as the projection device 61, that includes the obstacle OB within its angle of view of projection. Further, the projection control unit 17 may control the projection of each projection device so that switching and projection are performed at high speed, so that projection is performed at a very high frame rate when the projection is switched for each frame.
  • a very high frame rate is preferably a frame rate of 30 Hz or 60 Hz or higher, which is common in video display.
  • Alternatively, the projection control unit 17 may perform control so that a projection device having the obstacle OB within its projection angle of view is prevented from projecting, and projection of the image onto the overlapping region 81 is performed by a projection device having no obstacle OB within its projection angle of view.
  • The projection device having no obstacle OB within its projection angle of view that is controlled by the projection control unit 17 may be a predetermined projection device, or a plurality of projection devices having no obstacle OB within their projection angles of view may be controlled so as to project while switching for each frame.
  • In that case, control may be performed so as to project at a frame rate different from the frame rate used when projecting with all the projection devices. For example, if the system is composed of three projection devices, one of which contains the obstacle OB within its projection angle of view, and the projection devices are switched at a frame rate of 30 Hz, then when projecting with the remaining two projection devices that do not include the obstacle OB within their projection angles of view, control may be performed so as to project at, for example, 20 Hz, which is 2/3 of 30 Hz.
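  • Assuming the frame rate is simply scaled in proportion to the number of usable projection devices (an interpretation of the 30 Hz to 20 Hz example above, not a formula stated in the publication), the adjustment can be expressed as follows.

```python
def adjusted_frame_rate(base_rate_hz, total_units, blocked_units):
    # Scale the switching frame rate in proportion to the number of usable units.
    usable = total_units - blocked_units
    if usable <= 0:
        raise ValueError("no unobstructed projection unit available")
    return base_rate_hz * usable / total_units

print(adjusted_frame_rate(30, 3, 1))  # 20.0, matching the 30 Hz -> 20 Hz example above
```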
  • FIG. 17 is a diagram showing a configuration example when there is one imaging unit.
  • the image pickup device 71 and the image pickup device 72 are omitted, and the image pickup device 75 is provided in place of them.
  • the image pickup apparatus 75 may be provided at a position independent of the projection apparatus 61 and the projection apparatus 62.
  • the image pickup apparatus 75 is provided at a position facing the overlapping region 81.
  • the imaging unit included in the imaging device 75 is the same as the imaging unit 31 included in the imaging device 71 and the imaging unit 32 included in the imaging device 72, and is set so as to include the overlapping region 81 in the imaging range.
  • the angle of view of the image captured by the imaging device 75 is shown within the acute angle formed by the two broken lines L5.
  • In the second embodiment, the various processes and controls related to the projection of the corrected image described in the first embodiment are omitted.
  • the second embodiment is the same as the first embodiment except for the matters noted.
  • According to the second embodiment, the image quality of the projected image is more easily maintained by projecting the image with a projection unit different from the projection unit within whose angle of view the obstacle was detected, and it is possible to prevent image quality deterioration such as changes in brightness or missing portions in part of the projected image.
  • Further, since the projection unit and the imaging unit are paired, it can be identified more reliably within which angle of view of the projection devices 61 and 62 the obstacle OB that caused the shadow region DA is located.
  • the process of partially prohibiting the light projected from the image projection device as in Patent Document 1 is complicated, and a method capable of maintaining the image quality of the projected image more easily has been required.
  • Moreover, in Patent Document 1, the brightness of the projected image is reduced because projection corresponding to the blocked region is prohibited.
  • FIG. 18 is a functional block diagram showing the configuration of the image projection system 200 according to the third embodiment of the present invention.
  • FIG. 19 is a block diagram showing an example of a configuration for realizing the functional block shown in FIG.
  • The image projection system 200 includes a plurality of projection units, a plurality of imaging units, a storage unit 11, an image acquisition unit 12, an identification unit 113, an extraction unit 114, an image correction unit 115, and a control unit 116.
  • the projection unit 21 and the projection unit 22 are illustrated as a plurality of projection units.
  • an imaging unit 31 and an imaging unit 32 are illustrated as a plurality of imaging units.
  • In the third embodiment, the image formed in the overlapping region 81 of the projected object 80 by the image projected by the image projection system 200 is referred to as the projected image.
  • one of the projection unit 21 and the projection unit 22 projects an image unless the correction image is generated by the image correction unit 115 described later.
  • The case where the image corresponding to the projected image data 11b stored in the storage unit 11 is projected is taken as an example, but the projected image is not limited to this. Images corresponding to projected image data read from an external storage device connected to the image projection system 200, projected image data input from an external information processing device connected to the image projection system 200, and the like may be projected.
  • In the arrangement shown in FIG. 3, when there is no obstacle within the angle of view of projection, an image similar to the image projected by the projection unit 21 is formed in the overlapping region 81, as in the projected image V4 shown in FIG. 13. Further, when there is likewise no obstacle within the angle of view of imaging, the images captured by the imaging unit 31 and the imaging unit 32 are the same as the projected image V4, as in the first captured image 31d and the second captured image 32d shown in FIG. 13.
  • FIG. 13 and the like exemplify a case where the projection unit 21 projects an image corresponding to the projected image data 11b (see FIG. 19) stored in the storage unit 11.
  • The projection unit 22 does not project an image unless the image correction unit 115, which will be described later, generates a corrected image (see FIG. 13). In FIG. 13 and the like, the fact that the projection unit 22 does not project an image is indicated by drawing the second projected image as a black filled rectangle NL.
  • FIG. 20 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the first captured image, and the second captured image.
  • the relationship shown in FIG. 20 corresponds to the arrangement shown in FIG.
  • When the obstacle OB2 is within the angle of view of the projection by the projection device 61 and within the angle of view of the image captured by the imaging device 71, the obstacle OB2 blocks part of the projection light with which the projection device 61 projects the image. Therefore, as in the projected image V7 shown in FIG. 20, an image including the shadow region DA2 corresponding to the light blocked by the obstacle OB2 is formed in the overlapping region 81. In this way, when the obstacle OB2 is between the projection unit 21 and the overlapping region 81, the obstacle OB2 creates the shadow region DA2 in the projected image V7.
  • Since the obstacle OB2 is within the angle of view of the image captured by the imaging device 71, an image in which the obstacle OB2 is located inside the shadow region DA2 is captured by the imaging unit 31, as in the first captured image 31g shown in FIG. 20.
  • On the other hand, there is no obstacle OB2 within the angle of view of the image captured by the imaging device 72. Therefore, as in the second captured image 32g shown in FIG. 20, the image captured by the imaging unit 32 is the same as the projected image V7.
  • The light with which the projection unit 21 projects an image onto the overlapping region 81 spreads toward the overlapping region 81 via the optical elements of the projection unit 21. Therefore, as shown by the first captured image 31g, the shadow region DA2 is larger than the image of the obstacle OB2 (see the geometric illustration below).
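  • The fact that the shadow is larger than the obstacle follows from the simple similar-triangle geometry of a diverging projection beam. The snippet below is an illustrative aside, not part of the patent text; it assumes a point-like projection origin and hypothetical distance values.

```python
def shadow_magnification(d_projector_to_screen: float,
                         d_projector_to_obstacle: float) -> float:
    """Approximate linear magnification of the shadow on the screen
    relative to the obstacle's size, assuming a point-like projection
    origin and an obstacle between the projector and the screen."""
    if not 0.0 < d_projector_to_obstacle < d_projector_to_screen:
        raise ValueError("obstacle must lie between projector and screen")
    return d_projector_to_screen / d_projector_to_obstacle


# Example: obstacle 1.0 m from the projector, screen 3.0 m away
# -> the shadow region is roughly 3x the obstacle's linear size.
print(shadow_magnification(3.0, 1.0))  # 3.0
```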
  • the information processing device 500 performs processing related to the projection of the image by the projection units 21 and 22 based on the images captured by the image pickup units 31 and 32.
  • the information processing device 500 is the same as the information processing device 50, except for the matters to be noted below.
  • By reading and executing the image projection program 11d, the calculation unit 51 of the third embodiment functions as the image acquisition unit 12, the identification unit 113, the extraction unit 114, the image correction unit 115, and the control unit 116 shown in FIGS. 18 and 19.
  • The image projection program 11a and the image projection program 11d are the same except for the differences in the functions they realize.
  • the image acquisition unit 12 acquires the image projected by the projection unit 21. Specifically, the image acquisition unit 12 reads and acquires the projected image data 11b from, for example, the storage unit 11.
  • the projected image data 11b acquired by the image acquisition unit 12 can be referred to by the identification unit 113 and the image correction unit 115.
  • Based on the captured images, the identification unit 113 identifies an image of an obstacle between the projection unit 21 and the overlapping region 81 and the shadow region that the obstacle generates in the projected image. Specifically, the identification unit 113 performs image analysis that compares the content of the images captured by the imaging units 31 and 32 with the content of the image acquired by the image acquisition unit 12. More specifically, using the brightness distribution corresponding to the content of the projected image data 11b as a reference, the identification unit 113 identifies as a shadow region any partial region of the images captured by the imaging units 31 and 32 whose brightness has fallen below that reference.
  • As an example, the image acquisition unit 12 acquires the first projected image, the difference between the first captured image and the first projected image is computed, and a difference signal is generated.
  • Based on this difference signal, the identification unit 113 may be configured to identify a partial region whose brightness has decreased by a certain amount as the shadow region.
  • A partial region inside the shadow region whose difference characteristics deviate from a plain brightness drop is identified as the image of the obstacle.
  • That is, the identification unit 113 may be configured to identify, within the difference signal, a partial region that has a nonzero difference amount but whose characteristics differ from those of the region where the brightness simply decreased by a certain amount, and to treat that partial region as the image of the obstacle (a code sketch of this analysis follows this item).
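  • A minimal sketch of this difference-signal analysis is shown below. It is an illustration only, not the patent's reference implementation: the function name, the fixed thresholds, and the use of NumPy/OpenCV are assumptions, and the captured image is assumed to be already keystone-corrected and aligned with the projected image.

```python
import cv2
import numpy as np


def detect_shadow_and_obstacle(projected_bgr, captured_bgr,
                               shadow_drop=60, obstacle_dev=25):
    """Return (shadow_mask, obstacle_mask) as uint8 {0, 255} images.

    projected_bgr: image handed to the projection unit (reference content).
    captured_bgr:  camera image of the overlapping region, aligned to it.
    shadow_drop / obstacle_dev: illustrative thresholds (assumptions).
    """
    proj = projected_bgr.astype(np.int16)
    capt = captured_bgr.astype(np.int16)

    # Difference signal between projected content and captured content.
    diff = proj - capt

    # Shadow region: luminance dropped by more than a fixed amount.
    luma_drop = 0.114 * diff[..., 0] + 0.587 * diff[..., 1] + 0.299 * diff[..., 2]
    shadow_mask = (luma_drop > shadow_drop).astype(np.uint8) * 255

    # Obstacle image: pixels inside the shadow whose difference is not a
    # plain brightness drop (e.g. a colour change caused by the obstacle).
    deviation = np.abs(diff - luma_drop[..., None]).max(axis=-1)
    obstacle_mask = ((deviation > obstacle_dev) & (shadow_mask > 0)).astype(np.uint8) * 255

    # Light morphological clean-up of both masks.
    kernel = np.ones((5, 5), np.uint8)
    shadow_mask = cv2.morphologyEx(shadow_mask, cv2.MORPH_OPEN, kernel)
    obstacle_mask = cv2.morphologyEx(obstacle_mask, cv2.MORPH_OPEN, kernel)
    return shadow_mask, obstacle_mask
```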
  • When identifying the shadow region, the identification unit 113 also identifies the projection unit that is paired with the imaging unit whose captured image contains the obstacle causing the shadow region. Since the projection unit associated with each imaging unit is configured as one set, once the imaging unit can be identified, the corresponding projection unit can easily be identified. Such identification may be performed by the identification unit 113 or by the control unit 116. Further, the information on the identified projection unit may be stored in the storage unit 11.
  • the extraction unit 114 extracts an image of an obstacle included in the captured image. Specifically, the extraction unit 114 extracts a region specified as an image of an obstacle by the identification unit 113. The image of the obstacle extracted by the extraction unit 114 is used by the image correction unit 115.
  • the image correction unit 115 generates a corrected image in which the extracted obstacle image is corrected to a position and size corresponding to the shadow area. Specifically, the image correction unit 115 enlarges the size of the image of the obstacle OB2 extracted by the extraction unit 114 so as to correspond to the size of the shadow region DA2.
  • the size of the enlarged obstacle OB2 may be larger than or equal to the shadow area DA2, and it is not essential that the size of the enlarged obstacle OB2 exactly matches the size of the shadow area DA2.
  • the projected position of the enlarged image of the obstacle OB2 is the position of the shadow region DA2, and another visible image is not projected around the enlarged image of the obstacle OB2.
  • Such a processed image is generated and output as the corrected image. More specifically, the image correction unit 115 generates, as the corrected image, an image in which the surroundings of the enlarged image of the obstacle OB2 are made transparent (not projected) and which has the same number of vertical and horizontal pixels as the projected image data 11b.
  • The position and size of the shadow region DA2 used in the processing performed by the image correction unit 115 may be determined from the second captured image 32g or from the first captured image 31g, and when there are slight differences between them, processing such as averaging may be performed. A minimal sketch of this correction-image generation is given below.
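  • The following sketch illustrates one way, under assumed names and with OpenCV/NumPy, to build such a corrected image from the masks produced by the detection sketch above: the obstacle's image is cut out, enlarged to the shadow region's bounding box, and placed on an otherwise black (i.e. not projected) canvas of the same size as the projected image data. It assumes the masks are already expressed in the projector's pixel coordinates.

```python
import cv2
import numpy as np


def make_corrected_image(captured_bgr, obstacle_mask, shadow_mask, out_shape):
    """Enlarge the obstacle's image to the shadow region's position and size
    and place it on a black canvas with the same pixel count as the
    projected image data."""
    h_out, w_out = out_shape[:2]
    corrected = np.zeros((h_out, w_out, 3), np.uint8)

    oy, ox = np.nonzero(obstacle_mask)
    sy, sx = np.nonzero(shadow_mask)
    if oy.size == 0 or sy.size == 0:
        return corrected  # nothing to correct

    # Bounding boxes of the obstacle (in the captured image) and the shadow.
    o_top, o_bot, o_left, o_right = oy.min(), oy.max() + 1, ox.min(), ox.max() + 1
    s_top, s_bot, s_left, s_right = sy.min(), sy.max() + 1, sx.min(), sx.max() + 1

    # Cut out the obstacle and enlarge it to the shadow region's size.
    patch = captured_bgr[o_top:o_bot, o_left:o_right]
    patch = cv2.resize(patch, (s_right - s_left, s_bot - s_top),
                       interpolation=cv2.INTER_LINEAR)

    # Place the enlarged obstacle image at the shadow's position; everything
    # else stays black so no other visible image is projected around it.
    corrected[s_top:s_bot, s_left:s_right] = patch
    return corrected
```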
  • The control unit 116 comprehensively controls each component and each functional configuration of the image projection system 200. Specifically, the control unit 116 of the third embodiment causes a projection unit to project the corrected image. Here, the control unit 116 preferably causes the corrected image to be projected by a projection unit different from the projection unit paired with the imaging unit whose captured image includes, within the angle of view, the obstacle that causes the identified shadow region.
  • FIG. 21 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the first captured image, and the second captured image when the corrected image is projected.
  • the relationship shown in FIG. 21 corresponds to the arrangement shown in FIG. Further, FIG. 21 assumes a case where the corrected image 15a is generated in response to the result of extracting the images of the shadow region DA2 and the obstacle OB2 shown in FIG. 20. In FIG. 21 and the like, the image of the obstacle OB2 included in the corrected image 15a and corrected to the position and size corresponding to the shadow region is shown as an enlarged image BOB2.
  • The control unit 116 causes the projection unit 22 to project the corrected image 15a as the second projected image.
  • The control unit 116 also causes the projection unit 21 to project the content of the projected image data 11b.
  • The enlarged image BOB2 included in the corrected image 15a is projected so as to overlap the shadow region that the obstacle OB2 creates in the projection, by the projection unit 21, of the content corresponding to the projected image data 11b. Therefore, like the projected image V8 shown in FIG. 21, an image in which the shadow region DA2 (see FIG. 20) of the projected image V7 is replaced by the enlarged image BOB2 is drawn in the overlapping region 81.
  • As a result, in the first captured image 31h and the second captured image 32h, images are obtained in which the shadow region DA2 captured in the first captured image 31g and the second captured image 32g shown in FIG. 20 has been replaced by the enlarged image BOB2.
  • the control unit 116 can confirm that the correction of the shadow region is normally performed based on the first captured image 31h and the second captured image 32h.
  • The data flow of the output image is shown as passing from the image acquisition unit 12 through the image correction unit 115 to the projection unit 21, but the present invention is not limited to this. There may be a data flow that goes directly from the image acquisition unit 12 to the projection unit 21.
  • In the illustrated configuration, the image correction unit 115 outputs the content of the image acquired by the image acquisition unit 12 to the projection unit 21 as it is.
  • FIG. 22 is a flowchart showing the flow of processing related to the detection of obstacles and shadow areas. As a premise that such processing is performed, an image is projected by the projection unit 21 corresponding to the image acquired by the image acquisition unit 12.
  • the imaging units 31 and 32 acquire a plurality of captured images by imaging the overlapping region 81 (step S21).
  • The identification unit 113 detects a shadow region and an image of an obstacle based on the captured images acquired in the process of step S21 (step S22). When no shadow region and no image of an obstacle are detected in the process of step S22 (step S22; No), the process ends.
  • When a shadow region and an image of an obstacle are detected in the process of step S22 (step S22; Yes), the extraction unit 114 extracts the detected image of the obstacle (step S23). Further, the image correction unit 115 corrects the image of the obstacle extracted in the process of step S23 to the position and size of the shadow region detected in the process of step S22 to generate a corrected image (step S24). Then, the control unit 116 causes the projection unit 22 to project the corrected image generated in the process of step S24 (step S25).
  • In step S21, a plurality of captured images are acquired.
  • Based on the plurality of captured images, the control unit 116 determines whether or not the image of the obstacle in the corrected image has been applied to the shadow region (step S26). A hypothetical orchestration of steps S21 to S26 is sketched below.
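  • The following is a hedged, illustrative control loop corresponding to steps S21 to S26; it reuses the detection and correction sketches above, and the device wrappers (`cameras`, `projectors`, their `capture()` and `project()` methods) and the camera-to-projector pairing are assumptions introduced only for this example.

```python
# Hypothetical orchestration of steps S21-S26 (illustrative names only).
PAIRS = {"cam1": "proj1", "cam2": "proj2"}  # imaging unit -> paired projection unit


def run_correction_cycle(cameras, projectors, projected_image):
    # S21: capture the overlapping region with every imaging unit.
    captures = {name: cam.capture() for name, cam in cameras.items()}

    # S22: detect the shadow region and the obstacle image in each capture.
    for cam_name, captured in captures.items():
        shadow, obstacle = detect_shadow_and_obstacle(projected_image, captured)
        if not obstacle.any():
            continue  # S22; No for this capture

        # S23/S24: extract the obstacle and build the corrected image.
        corrected = make_corrected_image(captured, obstacle, shadow,
                                         projected_image.shape)

        # S25: project the corrected image with a projection unit *different*
        # from the one paired with the imaging unit that saw the obstacle.
        other = next(p for c, p in PAIRS.items() if c != cam_name)
        projectors[other].project(corrected)

        # S26: re-capture and check whether the shadow has been filled in.
        recheck_shadow, _ = detect_shadow_and_obstacle(
            projected_image, cameras[cam_name].capture())
        return not recheck_shadow.any()

    return True  # no shadow or obstacle detected (S22; No)
```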
  • In this way, an image corresponding to the obstacle that created the shadow region can be projected. That is, it is possible to project an image whose content is changed, in accordance with the obstacle between the projection unit 21 and the projected object 80, from the content of the image that was originally supposed to be projected. Therefore, new value can be added to the projected image.
  • FIG. 23 is a diagram showing the positional relationship between the projected object 85, the projection devices 61, 62, and the image pickup devices 71, 72 according to the fourth embodiment of the present invention.
  • In the fourth embodiment, a projected object 85 is adopted.
  • The projected object 85 is provided on the assumption that an image projected from one surface side is visually recognized from the back surface side, like a transmissive screen. Because of the positional relationship between the projection device 61 and the projected object 85 shown in FIG. 23, the image projected from the projection unit 21 included in the projection device 61 of the set P1 is visually recognized as a back surface image on the visible surface 86, which faces the side opposite to the projection unit 21 with the projected object 85 in between. Further, the set P2 is provided on the side opposite to the set P1 with the projected object 85 in between. That is, the set P1 and the set P2 are arranged so as to face each other with the projected object 85 in between.
  • Although the angles of view of the imaging devices 71 and 72 are omitted in FIGS. 23 and 25, in the fourth embodiment, as in the third embodiment, the angle of view of the imaging device 71 is provided so as to correspond to the angle of view of the projection device 61, and the angle of view of the imaging device 72 is provided so as to correspond to the angle of view of the projection device 62.
  • FIG. 24 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the back surface image, the first captured image, and the second captured image.
  • the relationship shown in FIG. 24 corresponds to the arrangement shown in FIG. 23.
  • The content of the back surface image visually recognized on the visible surface 86, which faces the side opposite to the projection unit 21 with the projected object 85 in between, is the left-right inversion of the content of the projected image V9 corresponding to the content of the projected image data 11b projected by the projection unit 21.
  • The content of the second captured image corresponds to the content of the back surface image.
  • Apart from this, the relationship between the first projected image, the second projected image, the projected image, the first captured image, and the second captured image in the fourth embodiment, described with reference to FIG. 24, is similar to the corresponding relationship in the third embodiment.
  • FIG. 25 is a diagram showing an example of a case where the obstacle OB2 is within the angle of view of the projection by the projection device 61 and the angle of view of the image captured by the image pickup device 71 in the fourth embodiment.
  • FIG. 26 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the back surface image, the first captured image, and the second captured image. The relationship shown in FIG. 26 corresponds to the arrangement shown in FIG. 25.
  • When there is an obstacle OB2 within the angle of view of the projection by the projection device 61 and within the angle of view of the image captured by the imaging device 71, the obstacle OB2 blocks part of the projection light with which the projection device 61 projects the image. Therefore, as in the projected image V10 shown in FIG. 25, an image including the shadow region DA2 corresponding to the light blocked by the obstacle OB2 is formed in the overlapping region 81. Further, as in the back surface image V10R shown in FIG. 25, the content of the back surface image is the left-right inversion of the projected image V10. Therefore, the position and shape of the shadow region DA2 in the back surface image V10R are the left-right inversion of the position and shape of the shadow region DA2 in the projected image V10.
  • Since the obstacle OB2 is within the angle of view of the image captured by the imaging device 71, an image in which the obstacle OB2 is located inside the shadow region DA2 is captured by the imaging unit 31, as in the first captured image 31j.
  • On the other hand, since there is no obstacle OB2 within the angle of view of the image captured by the imaging device 72, the image captured by the imaging unit 32 is the same as the back surface image V10R.
  • FIG. 27 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the back surface image, the first captured image, and the second captured image when the corrected image is projected.
  • The relationship shown in FIG. 27 corresponds to the arrangement shown in FIG. 25. Further, FIG. 27 assumes a case where the corrected image 15a is generated in response to the result of extracting the images of the shadow region DA2 and the obstacle OB2 shown in FIG. 26.
  • When generating the corrected image, the image correction unit 115 of the fourth embodiment further flips the enlarged image of the obstacle left and right to obtain an inverted image. Further, the position and size of the shadow region DA2 referenced in the processing performed by the image correction unit 115 of the fourth embodiment are based on the content captured by the imaging unit 32, as in the second captured image 32j.
  • The processing of the image correction unit 115 of the fourth embodiment may be performed by the control unit 116 of the fourth embodiment.
  • FIG. 27 illustrates an inverted image MOB2 generated based on the obstacle OB2.
  • In FIG. 27, the direction of the diagonal line drawn on the obstacle OB2 and the direction of the diagonal line drawn on the inverted image MOB2 are symmetrical, which indicates that the inverted image MOB2 is a left-right flipped and enlarged version of the obstacle OB2 (a brief flip-and-enlarge sketch follows this item).
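  • For the rear-projection geometry of the fourth embodiment, the only change to the correction sketch shown earlier is that the enlarged obstacle patch is mirrored horizontally before being placed at the mirrored shadow position. The helper below is an illustrative assumption using OpenCV, not the patent's implementation.

```python
import cv2


def make_inverted_patch(obstacle_patch, shadow_width, shadow_height):
    """Enlarge the obstacle's image to the shadow region's size and flip it
    left-right, as needed when the correcting projector faces the opposite
    side of the transmissive screen."""
    enlarged = cv2.resize(obstacle_patch, (shadow_width, shadow_height),
                          interpolation=cv2.INTER_LINEAR)
    return cv2.flip(enlarged, 1)  # flipCode=1 -> horizontal (left-right) flip
```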
  • the set P1 and the set P2 are arranged so as to face each other with the projected body 85 in between. Therefore, the projection unit 22 projects the corrected image from the opposite side of the projection unit 21 with the projected body 85 in between, that is, the side facing the visible surface 86.
  • Because the enlarged image of the obstacle is used as an inverted image, and the position and size of the shadow region DA2 are referenced from the content captured by the imaging unit 32, the position and shape of the inverted image MOB2 included in the corrected image correspond to the left-right inverted position and shape of the shadow region DA2 in the projected image V10 shown in FIG. 26 and in the projected image V11 shown in FIG. 27.
  • Accordingly, even in a configuration in which the set P1 and the set P2 are arranged so as to face each other with the projected object 85 in between, it is possible to project an image in which an image of the obstacle, enlarged so as to correspond to the shadow region, is applied.
  • In this way, the same effect as that of the third embodiment can be obtained even with an arrangement of the projection units and the imaging units different from that of the third embodiment, as in the fourth embodiment.
  • the number of projection units may be 3 or more. That is, in the configuration shown in FIG. 2 and the like, the number of projection devices may be three or more.
  • Alternatively, the brightness of the projected image by one or more projection units having no obstacle on their optical axes may be corrected.
  • The degree of correction may be such that, by overlapping the projected images of the plurality of projection units having no obstacle on their optical axes, the decrease in brightness of the shadow region caused by the projection unit with the obstacle on its optical axis is offset (see the compensation sketch below).
  • Alternatively, the image may simply be projected by one or more projection units having no obstacle on their optical axes; that is, it is sufficient that the image is not projected by the projection unit having an obstacle on its optical axis.
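  • As one hedged illustration of such brightness compensation with three or more projection units, the unobstructed projectors' drive values can be boosted inside the shadow mask so that their sum makes up for the missing contribution. The even 1/(N-1) split and the function below are assumptions for the sake of example, not the patent's prescription.

```python
import numpy as np


def compensate_brightness(frames, occluded_index, shadow_mask):
    """Boost the unoccluded projection units' output inside the shadow region.

    frames:         list of float32 images in [0, 1], one per projection unit,
                    all expressed in the common overlapping-region coordinates.
    occluded_index: index of the projection unit whose optical axis is blocked.
    shadow_mask:    boolean mask of the shadow region.
    """
    others = [i for i in range(len(frames)) if i != occluded_index]
    missing = frames[occluded_index]      # light lost inside the shadow
    share = missing / len(others)         # split the loss evenly (assumption)

    out = [f.copy() for f in frames]
    out[occluded_index][shadow_mask] = 0.0   # the blocked unit stops projecting there
    for i in others:
        out[i][shadow_mask] = np.clip(out[i][shadow_mask] + share[shadow_mask],
                                      0.0, 1.0)
    return out
```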
  • the number of imaging units may be 3 or more. In the first embodiment, the number of imaging units corresponds to the number of projection units. In the second embodiment, as described with reference to FIGS. 9, 10 and 11, an imaging unit may be provided so as to be paired with the projection unit, or a plurality of imaging units may be provided at positions independent of the projection unit. A unit may be provided.
  • the number of projection units in the third embodiment may be 3 or more. That is, in the configuration shown in FIG. 19 and the like, the number of projection devices may be three or more. Further, although two imaging units 31 and 32 are illustrated in FIG. 18, the number of imaging units may be 3 or more. The number and arrangement of imaging units in the third embodiment correspond to the number of projection units.
  • The functional configurations of the image projection system 1 shown in FIG. 1, the image projection system 100 shown in FIG. 11, and the image projection system 200 shown in FIG. 18 may each be included in one device. That is, the present invention is not limited to a system configuration combining a plurality of devices as shown in FIGS. 2, 12, and 19, and may be realized by one device having the functions shown in FIGS. 1, 11, and 18. In other words, the configuration with reference numeral 1, the configuration with reference numeral 100, and the configuration with reference numeral 200 may each be an image projection device. As a specific example, the configuration with reference numeral 1 and the configuration with reference numeral 100 may be an integrated device such as a so-called projection television. However, in the third embodiment, the projected object is provided at a position separated from the device. That is, the device is configured so that an obstacle can intervene between the device and the projected object.
  • the positions and shapes of the obstacle OB2 and the shadow region DA2, and the contents of the corrected image 15a illustrated in FIG. 21 are merely examples and are not limited thereto.
  • the present invention can project an image having arbitrary contents. Further, the present invention can generate a corrected image according to a shadow region generated corresponding to the shape of an obstacle within the angle of view of projection.
  • The projection control unit 17 of the second embodiment may specify the position of the shadow in the projected image based on the captured images obtained by the imaging units capturing the projected images that are projected while switching between the plurality of projection units 21 and 22 for each frame, and on the content of the projected image data 11b that is the projected image.
  • That is, the projection control unit 17 may have functions related to the extraction of the shadow area DA that are included in the shadow area specifying unit of the first embodiment and in the obstacle detection unit 16 of the second embodiment.
  • the present invention is not limited by the contents of these embodiments.
  • The components described above include those that can easily be conceived by those skilled in the art and those that are substantially the same, that is, those within a so-called range of equivalents. Furthermore, the components described above can be combined as appropriate. Further, various omissions, substitutions, or changes of the components can be made without departing from the gist of the above-described embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Projection Apparatus (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

This image projection device comprises: a plurality of projection units for projecting a plurality of projection images from different image projection angles such that the projection images form overlapping images that overlap in the same region; a plurality of imaging units provided so as to be paired with the plurality of projection units, for imaging said same region from angles corresponding to image projection angles of the respective projection units to generate a taken image; a shaded region specifying unit for specifying, on the basis of the taken image, a shaded region that occurs in the overlapping images due to an obstacle between the projection unit and said same region; an image correction unit for generating a corrected image in which a region that is in the projection image projected by the projection unit and corresponds to the shaded region is corrected; and a control unit for controlling the projection unit to project the corrected image.

Description

画像投影装置及び画像補正方法Image projection device and image correction method
 本発明は、画像投影装置及び画像補正方法に関する。 The present invention relates to an image projection device and an image correction method.
 一つの被投影対象に対して複数の画像投影装置から画像を投影して画像を形成する所謂プロジェクションマッピング技術が知られている(例えば、特許文献1)。特許文献1では、画像を投影する光が遮られる領域が生じる場合、係る領域に対する光の投影を禁止するよう画像投影装置を制御している。 A so-called projection mapping technique is known in which an image is projected from a plurality of image projection devices onto one object to be projected to form an image (for example, Patent Document 1). In Patent Document 1, when a region where the light for projecting an image is blocked occurs, the image projection device is controlled so as to prohibit the projection of the light onto the region.
特表2007-520945号公報Special Table 2007-520945
 特許文献1のように単に光の投影を禁止するだけでは、出力を意図した画像を被投影対象に形成することが困難になることがある。すなわち、画像の一部が暗くなる、欠ける等の画質低下が生じることがある。 It may be difficult to form an image intended for output as an object to be projected by simply prohibiting the projection of light as in Patent Document 1. That is, image quality deterioration such as a part of the image becoming dark or chipped may occur.
 本発明は、投影画像の一部の明るさの変化や、欠ける等の画質低下の発生を防ぐことができる画像投影装置及び画像補正方法を提供することを目的とする。 An object of the present invention is to provide an image projection device and an image correction method that can prevent changes in the brightness of a part of a projected image and deterioration of image quality such as chipping.
 本発明の画像投影装置は、複数の投影画像が同一領域で重複した重複画像を形成するように、異なる画像投影角度から投影する複数の投影部と、前記複数の投影部の各々と組になるよう設けられて、それぞれの投影部の画像投影角度に対応する角度から前記同一領域を撮像して撮像画像を生成する複数の撮像部と、前記撮像画像に基づいて、前記投影部と前記同一領域との間の障害物により前記重複画像に生じた影領域を特定する影領域特定部と、前記投影部が投影する投影画像における前記影領域に対応する領域を補正した補正画像を生成する画像補正部と、前記投影部に前記補正画像を投影させる制御部とを備えることを特徴とする。 The image projection device of the present invention is paired with a plurality of projection units that project from different image projection angles and each of the plurality of projection units so that a plurality of projected images form overlapping overlapping images in the same region. A plurality of imaging units that capture the same region from an angle corresponding to the image projection angle of each projection unit to generate an captured image, and the same region as the projection unit based on the captured image. Image correction to generate a corrected image in which a shadow area specifying portion that identifies a shadow region generated in the overlapping image due to an obstacle between the two and a correction image that corrects a region corresponding to the shadow region in the projected image projected by the projection unit. It is characterized by including a unit and a control unit that projects the corrected image onto the projection unit.
 本発明の画像補正方法は、複数の投影画像が同一領域で重複した重複画像を形成するように、異なる画像投影角度から投影する複数の投影装置の各々と組になるよう設けられて、それぞれの投影装置の画像投影角度に対応する角度から前記同一領域を撮像する複数の撮像装置が生成した撮像画像に基づいて、投影装置と前記同一領域との間の障害物により前記重複画像に生じた影領域を特定するステップと、前記投影装置が投影する投影画像における前記影領域に対応する領域を補正した補正画像を生成するステップと、前記投影装置に前記補正画像を投影させるステップとを含む。 The image correction method of the present invention is provided so as to be paired with each of a plurality of projection devices projecting from different image projection angles so that a plurality of projected images form overlapping overlapping images in the same region. Based on the captured images generated by a plurality of imaging devices that image the same region from an angle corresponding to the image projection angle of the projection device, shadows generated in the overlapping image due to an obstacle between the projection device and the same region. It includes a step of specifying a region, a step of generating a corrected image in which a region corresponding to the shadow region in the projected image projected by the projection device is corrected, and a step of projecting the corrected image on the projection device.
 本発明の画像投影装置は、複数の投影画像が同一領域で重複した重複画像を形成するように、異なる画像投影角度から投影する複数の投影部と、前記同一領域を撮像して撮像画像を生成する撮像部と、前記撮像画像に基づいて、前記投影部と前記同一領域との間の障害物を検知する障害物検知部と、前記投影部に前記投影画像を投影させる制御を行う制御部とを備え、前記制御部は、投影画像をフレームごとに前記複数の投影部を切り替えて投影し、前記撮像部によりフレームごとに切り替えて投影された投影画像を撮像した撮像画像、および投影画像に基づいて、投影画像の影の位置を特定するように制御することを特徴とする。 The image projection device of the present invention generates an captured image by imaging a plurality of projection units projecting from different image projection angles and the same region so that a plurality of projected images form overlapping overlapping images in the same region. An obstacle detection unit that detects an obstacle between the projection unit and the same region based on the captured image, and a control unit that controls the projection image to be projected onto the projection unit. The control unit is based on an captured image obtained by switching and projecting the plurality of projection units for each frame and capturing the projected image switched for each frame by the imaging unit, and a projected image. Therefore, it is characterized in that it is controlled so as to specify the position of the shadow of the projected image.
 本発明の画像投影装置は、複数の投影画像が同一領域で重複した重複画像を形成するように、異なる画像投影角度から投影する複数の投影部と、前記複数の投影部の各々と組になるよう設けられて、それぞれの投影部の画像投影角度に対応する角度から前記同一領域を撮像して撮像画像を生成する複数の撮像部と、前記撮像画像に基づいて、前記投影部と前記同一領域との間の障害物の画像及び障害物により前記重複画像に生じた影領域、並びに前記障害物の画像を撮像した撮像部と組になる投影部を特定する特定部と、前記撮像画像に含まれる前記障害物の画像を抽出する抽出部と、抽出された前記障害物の画像を前記影領域に対応する位置及び大きさに補正した補正画像を生成する画像補正部と、前記障害物の画像を撮像した撮像部と組になる投影部と異なる投影部を制御して前記補正画像を投影させる制御部とを備えることを特徴とする。 The image projection device of the present invention is paired with a plurality of projection units that project from different image projection angles and each of the plurality of projection units so that a plurality of projected images form overlapping overlapping images in the same region. A plurality of imaging units that capture the same region from an angle corresponding to the image projection angle of each projection unit to generate an image, and the same region as the projection unit based on the captured image. The image of the obstacle between the two, the shadow area generated in the overlapping image due to the obstacle, the specific part that specifies the projection part to be paired with the image pickup part that captured the image of the obstacle, and the image capture image. An extraction unit that extracts an image of the obstacle, an image correction unit that generates a corrected image obtained by correcting the extracted image of the obstacle to a position and size corresponding to the shadow region, and an image of the obstacle. It is characterized in that it includes a projection unit that is paired with an image pickup unit that has imaged the image, and a control unit that controls a projection unit different from the image pickup unit to project the corrected image.
 本発明の画像投影装置は、両面から映像を投影可能な透過性を有するスクリーンと、前記スクリーンの同一領域に対して第1面側から投影画像を投影する第1面側投影部及び第2面側から投影画像を投影する第2面側投影部と、前記第1面側投影部及び前記第2面側投影部の各々と組になるように設けられて、それぞれの投影部の画像投影角度に対応する角度から前記同一領域を撮像して撮像画像を生成する撮像部と、前記撮像画像に基づいて、前記第1面側投影部または前記第2面側投影部と前記同一領域との間の障害物の画像及び障害物により重複画像に生じた影領域、並びに前記障害物の画像を撮像した撮像部と組になる投影部を特定する特定部と、前記撮像画像に含まれる前記障害物の画像を抽出する抽出部と、前記障害物の画像を撮像した撮像部と組になる投影部と異なる投影部を制御して補正画像を投影させる制御部とを備えることを特徴とする。 The image projection device of the present invention has a transparent screen capable of projecting an image from both sides, and a first surface side projection unit and a second surface that project a projected image from the first surface side onto the same area of the screen. A second surface side projection unit that projects a projected image from the side, and each of the first surface side projection unit and the second surface side projection unit are provided so as to form a pair, and the image projection angle of each projection unit. Between the imaging unit that images the same region from an angle corresponding to the above and generates an image, and the first surface side projection unit or the second surface side projection unit and the same region based on the captured image. An image of an obstacle, a shadow region generated in an overlapping image due to the obstacle, a specific part for specifying a projection part to be paired with an image pickup part that captured the image of the obstacle, and the obstacle included in the captured image. It is characterized by including an extraction unit for extracting an image of the above object, and a control unit for controlling a projection unit different from the projection unit formed with the image pickup unit that has captured the image of the obstacle to project a corrected image.
 本発明によれば、投影画像の一部の明るさの変化や、欠ける等の画質低下の発生を防ぐことができる。 According to the present invention, it is possible to prevent changes in the brightness of a part of the projected image and deterioration of image quality such as chipping.
FIG. 1 is a functional block diagram showing the configuration of an image projection system according to a first embodiment of the present invention.
FIG. 2 is a block diagram showing an example of a configuration for realizing the functional blocks shown in FIG. 1.
FIG. 3 is a diagram showing an example of the positional relationship between the projected object, the projection devices, and the imaging devices for projecting images by the plurality of projection devices and capturing images by the plurality of imaging devices.
FIG. 4 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image.
FIG. 5 is a diagram showing an example of a case where there is an obstacle within the angle of view of projection by a projection device and within the angle of view of imaging by an imaging device.
FIG. 6 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image.
FIG. 7 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image when the corrected image is projected.
FIG. 8 is a flowchart showing the flow of processing related to the detection of an obstacle and a shadow region.
FIG. 9 is a schematic diagram showing an example of the relationship between the optical axes of a projection device and an imaging device treated as one set.
FIG. 10 is a schematic diagram showing a mechanism in which the optical axis along which the projection device projects an image and the optical axis along which the imaging device images the overlapping region are made to coincide via a half mirror.
FIG. 11 is a functional block diagram showing the configuration of an image projection system according to a second embodiment of the present invention.
FIG. 12 is a block diagram showing an example of a configuration for realizing the functional blocks shown in FIG. 11.
FIG. 13 is a diagram showing an example of projection of an image onto the overlapping region and imaging of the overlapping region when there is no obstacle.
FIG. 14 is a diagram showing an example of projection of an image onto the overlapping region and imaging of the overlapping region when there is an obstacle.
FIG. 15 is a diagram showing an example of projection of an image onto the overlapping region and imaging of the overlapping region after control corresponding to the detection of an obstacle has been performed.
FIG. 16 is a flowchart showing the flow of processing related to the detection of an obstacle and a shadow region.
FIG. 17 is a diagram showing a configuration example in the case where there is only one imaging unit.
FIG. 18 is a functional block diagram showing the configuration of an image projection system according to a third embodiment of the present invention.
FIG. 19 is a block diagram showing an example of a configuration for realizing the functional blocks shown in FIG. 18.
FIG. 20 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the first captured image, and the second captured image.
FIG. 21 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the first captured image, and the second captured image when the corrected image is projected.
FIG. 22 is a flowchart showing the flow of processing related to the detection of an obstacle and a shadow region.
FIG. 23 is a diagram showing the positional relationship between the projected object, the projection devices, and the imaging devices in a fourth embodiment of the present invention.
FIG. 24 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the back surface image, the first captured image, and the second captured image.
FIG. 25 is a diagram showing an example of a case where, in the fourth embodiment, the obstacle OB2 is within the angle of view of projection by a projection device and within the angle of view of imaging by an imaging device.
FIG. 26 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the back surface image, the first captured image, and the second captured image.
FIG. 27 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the back surface image, the first captured image, and the second captured image when the corrected image is projected.
 以下、添付図面を参照して、本発明に係る実施形態を詳細に説明する。なお、この実施形態により本発明が限定されるものではなく、また、実施形態が複数ある場合には、各実施形態を組み合わせて構成するものも含む。また、以下の実施形態において、同一の部位には同一の符号を付することにより重複する説明を省略する。 Hereinafter, embodiments according to the present invention will be described in detail with reference to the accompanying drawings. The present invention is not limited to this embodiment, and when there are a plurality of embodiments, the present invention also includes a combination of the respective embodiments. Further, in the following embodiments, the same parts are designated by the same reference numerals, so that duplicate description will be omitted.
[第1実施形態]
 図1は、本発明の第1実施形態に係る画像投影システム1の構成を示す機能ブロック図である。画像投影システム1は、複数の投影部と、複数の撮像部と、記憶部11と、画像取得部12と、影領域特定部13と、画像補正部14と、制御部15とを備える。図1及び後述する図11では、複数の投影部として、投影部21と投影部22とを例示している。また、図1及び後述する図11では、複数の撮像部として、撮像部31と撮像部32とを例示している。
[First Embodiment]
FIG. 1 is a functional block diagram showing a configuration of an image projection system 1 according to a first embodiment of the present invention. The image projection system 1 includes a plurality of projection units, a plurality of imaging units, a storage unit 11, an image acquisition unit 12, a shadow area identification unit 13, an image correction unit 14, and a control unit 15. In FIG. 1 and FIG. 11 described later, the projection unit 21 and the projection unit 22 are illustrated as a plurality of projection units. Further, in FIG. 1 and FIG. 11 described later, an imaging unit 31 and an imaging unit 32 are illustrated as a plurality of imaging units.
 図2は、図1に示す機能ブロックを実現する構成の一例を示すブロック図である。図2に示すように、画像投影システム1は、例えば複数の投影装置と、複数の撮像装置と、情報処理装置50とを備える。なお、図1及び図2並びに後述する図11及び図12では、情報処理装置50内のデータフローの流れを実線の矢印で示し、光学的な投影及び撮像を破線の矢印で示している。 FIG. 2 is a block diagram showing an example of a configuration for realizing the functional block shown in FIG. As shown in FIG. 2, the image projection system 1 includes, for example, a plurality of projection devices, a plurality of image pickup devices, and an information processing device 50. In FIGS. 1 and 2 and FIGS. 11 and 12 described later, the flow of the data flow in the information processing apparatus 50 is indicated by a solid arrow, and the optical projection and imaging are indicated by a broken arrow.
 情報処理装置50は、所謂コンピュータである。情報処理装置50と複数の投影装置61,62との接続及び情報処理装置50と複数の撮像装置71,72との接続形態は、有線によるものであってもよいし、無線によるものであってもよいし、有線と無線とが混在したものによってもよい。また、具体的な接続形態は、例えばUSB(Universal Serial Bus)のようなバスインタフェースであってもよいし、ネットワーク通信回線であってもよいし、専用に設けられた接続形態であってもよい。 The information processing device 50 is a so-called computer. The connection form between the information processing device 50 and the plurality of projection devices 61 and 62 and the connection form between the information processing device 50 and the plurality of imaging devices 71 and 72 may be wired or wireless. It may be a mixture of wired and wireless. Further, the specific connection form may be a bus interface such as USB (Universal Serial Bus), a network communication line, or a dedicated connection form. ..
 図2に示す各投影装置は、1つの投影部を備える。図2及び後述する図12では、複数の投影装置として、投影装置61と投影装置62とを例示している。投影装置61,62は所謂プロジェクタであり、画像を被投射体80に投影する。被投射体80は、所謂スクリーンである。被投射体80は、例えば、拡散型、反射型又は回帰型のスクリーンであり、被投射面が視認されることを想定して設けられている。被投射体80は、透過型のスクリーンのように、被投射面の裏面が視認されることを想定して設けられていてもよい。 Each projection device shown in FIG. 2 includes one projection unit. In FIG. 2 and FIG. 12 described later, a projection device 61 and a projection device 62 are illustrated as a plurality of projection devices. The projection devices 61 and 62 are so-called projectors, and project an image onto the projected object 80. The projected object 80 is a so-called screen. The projected object 80 is, for example, a diffusion type, a reflective type, or a regression type screen, and is provided on the assumption that the projected surface can be visually recognized. The projected object 80 may be provided on the assumption that the back surface of the projected surface can be visually recognized, such as a transmissive screen.
 投影装置61は、投影部21を備える。投影装置62は、投影部22を備える。投影部21,22はそれぞれ、表示素子、表示素子に光を照射する光源、表示素子が反射又は透過した光を被投射体80に画像として収束させるレンズ等の光学部材、外部から入力される画像データに応じて表示素子を動作させる制御回路等を備える。表示素子の例として、エルコス(LCOS:Liquid Crystal On Silicon)デバイスやデジタル・ミラー・デバイス(DMD:Digital Mirror Device)又は液晶デバイスが挙げられるが、これらに限られるものでなく、適宜変更可能である。 The projection device 61 includes a projection unit 21. The projection device 62 includes a projection unit 22. The projection units 21 and 22, respectively, are a display element, a light source that irradiates the display element with light, an optical member such as a lens that causes the light reflected or transmitted by the display element to converge as an image on the projectile 80, and an image input from the outside. It is equipped with a control circuit or the like that operates the display element according to the data. Examples of the display element include, but are not limited to, an Elcos (LCOS: Liquid Crystal On Silicon) device, a digital mirror device (DMD: Digital Mirror Device), or a liquid crystal device, and can be changed as appropriate. ..
 図2に示す各撮像装置は、1つの撮像部を備える。図2及び後述する図12では、複数の撮像装置として、撮像装置71と撮像装置72とを例示している。撮像装置71は、撮像部31を備える。撮像装置72は、撮像部32を備える。撮像部31,32はそれぞれ、所謂デジタルカメラとして機能する撮像素子、撮像素子の出力に基づいて画像を生成して撮像画像として出力する回路等を備える。撮像素子の例として、CMOS(Complementary Metal Oxide Semiconductor)イメージセンサ又はCCD(Charge Coupled Device)イメージセンサが挙げられるが、これらに限られるものでなく、適宜変更可能である。 Each imaging device shown in FIG. 2 includes one imaging unit. In FIG. 2 and FIG. 12 described later, an image pickup device 71 and an image pickup device 72 are illustrated as a plurality of image pickup devices. The imaging device 71 includes an imaging unit 31. The imaging device 72 includes an imaging unit 32. The image pickup units 31 and 32 each include an image pickup element that functions as a so-called digital camera, a circuit that generates an image based on the output of the image pickup element, and outputs the image as an image pickup image. Examples of the image pickup device include a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor, but the present invention is not limited to these, and can be changed as appropriate.
 図3は、複数の投影装置による画像の投影及び複数の撮像装置による撮像を行うための被投射体80と投影装置61,62と撮像装置71,72との位置関係の一例を示す図である。図2及び図3に示すように、投影装置61と撮像装置71とは、1つの組P1として扱われる。すなわち、投影部21と撮像部31とは組P1になるよう設けられる。また、投影装置62と撮像装置72とは、他の1つの組P2として扱われる。すなわち、投影部22と撮像部32とは組P2になるよう設けられる。 FIG. 3 is a diagram showing an example of the positional relationship between the projected object 80, the projection devices 61, 62, and the image pickup devices 71, 72 for projecting an image by the plurality of projection devices and imaging by the plurality of imaging devices. .. As shown in FIGS. 2 and 3, the projection device 61 and the image pickup device 71 are treated as one set P1. That is, the projection unit 21 and the imaging unit 31 are provided so as to form a set P1. Further, the projection device 62 and the image pickup device 72 are treated as another set P2. That is, the projection unit 22 and the imaging unit 32 are provided so as to form a set P2.
 図3では、投影装置61による投影の画角を、2つの実線L1が形成する鋭角内の範囲で示している。また、投影装置62による投影の画角を、2つの実線L2が形成する鋭角内の範囲で示している。図3に示すように、投影装置61と投影装置62は、被投射体80の重複領域81で各々の投影画像が重複して重複画像を形成するようにそれぞれ異なる画像投影角度から画像を投影するよう設置される。すなわち、図1及び図2に示す投影部21が投影する投影画像と投影部22が投影する投影画像による複数の投影画像が同一領域で重複した重複画像を形成するように、複数の投影部21,22は、それぞれ異なる画像投影角度から投影する。ここで、重複領域81は、当該同一領域として機能する。 In FIG. 3, the angle of view of the projection by the projection device 61 is shown within the acute angle formed by the two solid lines L1. Further, the angle of view of the projection by the projection device 62 is shown within the acute angle formed by the two solid lines L2. As shown in FIG. 3, the projection device 61 and the projection device 62 project images from different image projection angles so that the projected images overlap each other in the overlapping region 81 of the projected object 80 to form an overlapping image. Will be installed. That is, the plurality of projection units 21 form a duplicate image in which the projection image projected by the projection unit 21 shown in FIGS. 1 and 2 and the plurality of projection images produced by the projection image projected by the projection unit 22 overlap in the same region. , 22 project from different image projection angles. Here, the overlapping area 81 functions as the same area.
 図3では、撮像装置71による撮像の画角を、2つの破線L3が形成する鋭角内の範囲で示している。また、撮像装置72による撮像の画角を、2つの破線L4が形成する鋭角内の範囲で示している。図3に示す撮像装置71は、投影装置61と組になるよう設けられて組になる投影装置61の画像投影角度に対応する角度から重複領域81を撮像して撮像画像を生成する。また、撮像装置72は、投影部22と組になるよう設けられて組になる投影装置62の画像投影角度に対応する角度から重複領域81を撮像して撮像画像を生成する。すなわち、図1及び図2に示す撮像部31は、投影部21と組になるよう設けられて組になる投影部21の画像投影角度に対応する角度から重複領域81を撮像して撮像画像を生成する。また、撮像部32は、投影部22と組になるよう設けられて組になる投影部22の画像投影角度に対応する角度から重複領域81を撮像して撮像画像を生成する。 In FIG. 3, the angle of view of the image captured by the imaging device 71 is shown within the acute angle formed by the two broken lines L3. Further, the angle of view of the image captured by the imaging device 72 is shown within the acute angle formed by the two broken lines L4. The imaging device 71 shown in FIG. 3 is provided so as to be paired with the projection device 61, and images the overlapping region 81 from an angle corresponding to the image projection angle of the projection device 61 to be paired with the projection device 61 to generate an captured image. Further, the imaging device 72 images the overlapping region 81 from an angle corresponding to the image projection angle of the projection device 62 which is provided to be paired with the projection unit 22 and generates an captured image. That is, the imaging unit 31 shown in FIGS. 1 and 2 images the overlapping region 81 from an angle corresponding to the image projection angle of the projection unit 21 which is provided to be paired with the projection unit 21 and captures the captured image. Generate. Further, the imaging unit 32 is provided so as to be paired with the projection unit 22, and images the overlapping region 81 from an angle corresponding to the image projection angle of the projection unit 22 to be paired with the projection unit 22 to generate an captured image.
 以下の説明では、特筆しない限り、投影部21によって投影される画像を第1投影画像とする。また、投影部22によって投影される画像を第2投影画像とする。また、投影部21と投影部22によって投影された画像によって被投射体80の重複領域81で形成される画像を重複画像とする。また、撮像部31による撮像画像を第1撮像画像とする。また、撮像部32による撮像画像を第2撮像画像とする。 In the following description, unless otherwise specified, the image projected by the projection unit 21 is referred to as the first projection image. Further, the image projected by the projection unit 22 is referred to as a second projection image. Further, the image formed in the overlapping region 81 of the projected object 80 by the images projected by the projection unit 21 and the projection unit 22 is regarded as the overlapping image. Further, the image captured by the imaging unit 31 is set as the first captured image. Further, the image captured by the imaging unit 32 is used as the second captured image.
 図4は、第1投影画像と、第2投影画像と、重複画像と、第1撮像画像と、第2撮像画像との関係の一例を示す図である。図4に示す関係は、図3に示す配置に対応する関係である。なお、被投射体80に対して投影装置61,62が斜めから画像を投影し、撮像装置71,72が斜めから撮像することにより発生する画像の台形歪については、予め適切に補正されているものとする。例えば、図2の投影装置61の投影部21は、第1投影画像に対して設置位置に応じた所定の台形歪補正処理を行うように構成してもよい。また、図2の撮像装置71の撮像部31は、第1撮像画像に対して設置位置に応じた所定の台形歪補正処理を行うように構成してもよい。また、図2の投影装置62は投影装置61、撮像装置72は撮像装置71と同様に構成してもよい。ここで、所定の台形歪補正処理は、一般的に用いられる台形歪補正処理であればよい。 FIG. 4 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image. The relationship shown in FIG. 4 corresponds to the arrangement shown in FIG. The trapezoidal distortion of the image generated by the projection devices 61 and 62 projecting the image obliquely to the projected body 80 and the imaging devices 71 and 72 imaging the image from an oblique angle is appropriately corrected in advance. It shall be. For example, the projection unit 21 of the projection device 61 of FIG. 2 may be configured to perform a predetermined trapezoidal distortion correction process according to the installation position on the first projected image. Further, the imaging unit 31 of the imaging device 71 of FIG. 2 may be configured to perform a predetermined trapezoidal distortion correction process according to the installation position on the first captured image. Further, the projection device 62 of FIG. 2 may be configured in the same manner as the projection device 61, and the image pickup device 72 may be configured in the same manner as the image pickup device 71. Here, the predetermined trapezoidal distortion correction process may be any generally used trapezoidal distortion correction process.
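The patent only states that the trapezoidal (keystone) distortion caused by the oblique placement of the projection and imaging devices is corrected in advance by a generally used method. Purely as an illustration of one such commonly used approach (not specified by the patent), the image can be warped with a homography estimated from the four corners of the projection area; the function below and its parameter names are assumptions.

```python
import cv2
import numpy as np


def keystone_correct(image, observed_corners):
    """Warp `image` so that the four observed corner points of the projected
    area (top-left, top-right, bottom-right, bottom-left order) map back to
    a full rectangle of the image's own size."""
    h, w = image.shape[:2]
    target = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    homography = cv2.getPerspectiveTransform(np.float32(observed_corners), target)
    return cv2.warpPerspective(image, homography, (w, h))
```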
 投影部21と投影部22は、後の記載で説明する画像補正部14による補正画像の生成が行われない限り、同一の画像データに基づいた画像の投影を行う。図4等では、記憶部11に記憶されている投影画像データ11b(図2参照)に対応する画像が投影されている場合を例示している。実施形態の説明では、記憶部11が記憶している投影画像データ11bに対応する画像が投影される場合を例としているが、投影される画像はこれに限られるものでない。画像投影システム1に接続された外部の記憶装置から読み出された投影画像データ、画像投影システム1に接続された外部の情報処理装置から入力された投影画像データ等に対応した画像が投影されてもよい。 The projection unit 21 and the projection unit 22 project an image based on the same image data unless the image correction unit 14 described later generates a corrected image. FIG. 4 and the like exemplify a case where an image corresponding to the projected image data 11b (see FIG. 2) stored in the storage unit 11 is projected. In the description of the embodiment, the case where the image corresponding to the projected image data 11b stored in the storage unit 11 is projected is taken as an example, but the projected image is not limited to this. Images corresponding to the projected image data read from the external storage device connected to the image projection system 1, the projected image data input from the external information processing device connected to the image projection system 1, and the like are projected. May be good.
 図3に示すように、投影装置61による投影の画角内及び投影装置62による投影の画角内のいずれにも障害物がない場合、図4に示す重複画像V1のように、投影部21と投影部22によって投影された画像と同様の画像が重複領域81に形成される。また、図4に示す第1撮像画像31a及び第2撮像画像32aのように、撮像部31と撮像部32による撮像画像は、重複画像V1と同様の画像になる。これは、投影装置61による投影の画角内及び投影装置62による投影の画角内のいずれにも障害物がないのと同様、撮像装置71による撮像画像の画角内及び撮像装置72による撮像画像の画角内のいずれにも障害物がないからである。 As shown in FIG. 3, when there is no obstacle in either the angle of view of the projection by the projection device 61 or the angle of view of the projection by the projection device 62, the projection unit 21 is as shown in the overlapping image V1 shown in FIG. An image similar to the image projected by the projection unit 22 is formed in the overlapping region 81. Further, as in the first captured image 31a and the second captured image 32a shown in FIG. 4, the captured images by the imaging unit 31 and the imaging unit 32 are the same images as the overlapping image V1. This means that there are no obstacles in either the angle of view of the projection by the projection device 61 or the angle of view of the projection by the projection device 62, and the image captured by the image pickup device 71 is captured by the image pickup device 72. This is because there are no obstacles in any of the angles of view of the image.
 図5は、投影装置61による投影の画角内及び撮像装置71による撮像の画角内に障害物OBがある場合の一例を示す図である。図6は、第1投影画像と、第2投影画像と、重複画像と、第1撮像画像と、第2撮像画像との関係の一例を示す図である。図6に示す関係は、図5に示す配置に対応する関係である。 FIG. 5 is a diagram showing an example of a case where there is an obstacle OB within the angle of view of the projection by the projection device 61 and within the angle of view of the image captured by the image pickup device 71. FIG. 6 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image. The relationship shown in FIG. 6 corresponds to the arrangement shown in FIG.
 図5に示すように、投影装置61による投影の画角内及び撮像装置71による撮像の画角内に障害物OBがある場合、障害物OBが投影装置61による画像の投影のための投射光の一部を遮る。このため、図6に示す重複画像V2のように、障害物OBによって遮られた光に対応した影領域DAを含む画像が重複領域81に形成される。図示しないが、投影装置61による投影の画角内に障害物OBがなく、投影装置62による投影の画角内に障害物OBがある場合も、重複画像V2のように、影領域DAを含む画像が重複領域81に形成される。このように、複数の投影部のうち1つと重複領域81との間に障害物OBがある場合、障害物OBによって重複画像V2に影領域DAが生じる。 As shown in FIG. 5, when there is an obstacle OB within the angle of view of the projection by the projection device 61 and the angle of view of the image captured by the image pickup device 71, the obstacle OB is the projected light for projecting the image by the projection device 61. Block a part of. Therefore, as in the overlapping image V2 shown in FIG. 6, an image including a shadow region DA corresponding to the light blocked by the obstacle OB is formed in the overlapping region 81. Although not shown, even when there is no obstacle OB within the angle of view of the projection by the projection device 61 and there is an obstacle OB within the angle of view of the projection by the projection device 62, the shadow region DA is included as in the overlapping image V2. The image is formed in the overlapping region 81. As described above, when there is an obstacle OB between one of the plurality of projection portions and the overlapping region 81, the obstacle OB causes a shadow region DA in the overlapping image V2.
 また、撮像装置71による撮像の画角内に障害物OBがあるため、図6に示す第1撮像画像31bのように、影領域DAの内側に障害物OBが位置する画像が撮像部31によって撮像される。一方、撮像装置72による撮像の画角内には障害物OBはない。従って、図6に示す第2撮像画像32bのように、撮像部32による撮像画像は、重複画像V2と同様の画像になる。なお、撮像装置71による撮像の画角内に障害物OBがなく、撮像装置72による撮像の画角内に障害物OBがある場合、第1撮像画像31bの内容と第2撮像画像32bの内容とは逆転する。 Further, since the obstacle OB is within the angle of view of the image captured by the imaging device 71, the image capturing unit 31 displays an image in which the obstacle OB is located inside the shadow region DA as in the first captured image 31b shown in FIG. It is imaged. On the other hand, there is no obstacle OB within the angle of view of the image captured by the imaging device 72. Therefore, as in the second captured image 32b shown in FIG. 6, the captured image by the imaging unit 32 is the same as the duplicate image V2. When there is no obstacle OB within the angle of view of the image captured by the imaging device 71 and there is an obstacle OB within the angle of view of the image captured by the imaging device 72, the contents of the first captured image 31b and the contents of the second captured image 32b. Is reversed.
 Thus, when an obstacle OB is present within the angle of view of imaging by one of the imaging device 71 and the imaging device 72 and is absent from the angle of view of imaging by the other, a difference arises between the contents of the first captured image 31b and the contents of the second captured image 32b. Based on this difference, it is possible to identify within which angle of view, that of the projection device 61 or that of the projection device 62, the obstacle OB is present. The information processing device 50 performs this identification and processing according to the result of the identification.
 As shown in FIG. 2, the information processing device 50 includes a storage unit 11 and a calculation unit 51. The storage unit 11 includes a storage device capable of storing software programs (hereinafter simply referred to as programs) and data. Examples of the storage device included in the storage unit 11 include, but are not limited to, a hard disk drive, a solid state drive, and a flash memory, and can be changed as appropriate. The storage unit 11 may be a combination of a reading device for a recording medium such as an optical disc and a recording medium set in the reading device. As shown in FIG. 2, the storage unit 11 stores an image projection program 11a and projected image data 11b. The image projection program 11a is a program read and executed by the calculation unit 51.
 The calculation unit 51 includes an arithmetic circuit, such as a CPU (Central Processing Unit), that realizes various functions by reading and executing programs. The calculation unit 51 of the first embodiment functions as the image acquisition unit 12, the shadow region identification unit 13, the image correction unit 14, and the control unit 15 shown in FIGS. 1 and 2 by reading and executing the image projection program 11a.
 The image acquisition unit 12 acquires the images to be projected by the projection units 21 and 22. Specifically, the image acquisition unit 12 reads and acquires the projected image data 11b from, for example, the storage unit 11. The projected image data 11b acquired by the image acquisition unit 12 can be referred to by the shadow region identification unit 13 and the image correction unit 14. Further, unless a corrected image is generated by the image correction unit 14 described later, the projection units 21 and 22 project an image corresponding to the contents of the projected image data 11b acquired by the image acquisition unit 12 (see FIG. 4).
 The shadow region identification unit 13 identifies, based on the captured images, a shadow region produced in the overlapping region 81 by an obstacle between one projection unit and the overlapping region 81. Specifically, the shadow region identification unit 13 performs image analysis that compares the contents of the images captured by the imaging units 31 and 32 with the contents of the image acquired by the image acquisition unit 12. More specifically, using the luminance distribution corresponding to the contents of the projected image data 11b as a reference, when a partial region of reduced luminance appears in the luminance distribution of the images captured by the imaging units 31 and 32, the shadow region identification unit 13 extracts that partial region as a shadow region. As an example, the image analysis that identifies a shadow region may be configured as follows: to identify a shadow region in the first captured image, the first projected image is acquired by the image acquisition unit 12, the difference between the first captured image and the first projected image is taken, and a difference signal is generated. Based on this difference signal, a partial region in which the luminance has decreased by a certain amount is identified. Similarly, to identify a shadow region in the second captured image, a partial region in which the luminance has decreased by a certain amount may be identified from the difference with the second projected image. Further, when there is a partial region inside the shadow region that exhibits a characteristic change different from the luminance decrease, such as a different change in luminance or a change in color, that partial region is extracted as an image of the obstacle. For example, in the difference signal between the first captured image and the first projected image, a partial region of the difference signal that has a nonzero difference amount but whose characteristics differ from those of the region where the luminance has decreased by a certain amount may be identified as the region of the obstacle. By extracting the shadow region and the image of the obstacle in this way, it becomes possible to determine which of the image captured by the imaging unit 31 and the image captured by the imaging unit 32 contains the obstacle. This determination may be made by the shadow region identification unit 13 or by the control unit 15.
 Further, when the shadow region identification unit 13 identifies the shadow region, it is preferable to further identify the projection unit that is paired with the imaging unit that captured the image containing the obstacle causing the shadow region. Since the projection unit associated with each imaging unit forms one pair, once the imaging unit is identified, the corresponding projection unit can easily be identified. This identification may be performed by the shadow region identification unit 13 or by the control unit 15. Furthermore, information on the identified projection unit may be stored in the storage unit 11.
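 A minimal sketch of this difference-based analysis is given below, assuming the captured image has already been geometrically registered to the projected image; the function name and the thresholds shadow_thresh and color_thresh are illustrative assumptions and are not taken from the embodiment.

```python
import numpy as np

def find_shadow_and_obstacle(projected_rgb, captured_rgb,
                             shadow_thresh=40, color_thresh=30):
    """Return boolean masks (shadow, obstacle) for one projector/camera pair.

    projected_rgb, captured_rgb: uint8 arrays of shape (H, W, 3), assumed to be
    geometrically registered to each other. shadow_thresh and color_thresh are
    hypothetical tuning parameters.
    """
    proj = projected_rgb.astype(np.int16)
    cap = captured_rgb.astype(np.int16)

    # Difference signal between the projected image and the captured image.
    diff = proj - cap
    luma_drop = diff.mean(axis=2)          # positive where the brightness dropped

    # Shadow region: luminance decreased by at least a certain amount.
    shadow = luma_drop > shadow_thresh

    # Obstacle image: inside the shadow, the difference is not a plain
    # luminance drop, e.g. it varies strongly between the color channels.
    chroma_change = diff.max(axis=2) - diff.min(axis=2)
    obstacle = shadow & (chroma_change > color_thresh)

    return shadow, obstacle
```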
 The image correction unit 14 generates a corrected image in which a region corresponding to the shadow region has been corrected in the image to be projected by one or more projection units other than the one projection unit whose projection angle of view contains the obstacle. Specifically, the image correction unit 14 generates a corrected image in which the luminance of the partial region of the projected image data 11b whose position and size correspond to the shadow region is raised so that the overlapping image in the overlapping region 81 is rendered with the original luminance. The degree of correction corresponds to the degree of luminance decrease that the shadow region extracted by the shadow region identification unit 13 causes in the contents of the projected image data 11b. For example, when the second captured image contains a shadow region, the correction for raising the luminance may be performed based on the difference signal obtained by the image analysis in the shadow region identification unit 13. The same correction may be performed when the first captured image contains a shadow region.
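 A minimal sketch of this compensation follows, under the assumption that the two projectors contribute additively to the overlapping region; the shadow mask and the per-pixel luminance drop can be taken from the difference analysis sketched above, and all names are illustrative rather than part of the embodiment.

```python
import numpy as np

def build_corrected_image(projected_rgb, shadow_mask, luma_drop, gain=1.0):
    """Raise the luminance of the shadow region in the image for the other projector.

    projected_rgb: uint8 (H, W, 3) contents corresponding to the projected image data 11b.
    shadow_mask:   boolean (H, W) shadow region from the difference analysis.
    luma_drop:     float (H, W) estimated luminance loss caused by the obstacle.
    gain:          hypothetical scale factor for the compensation.
    """
    corrected = projected_rgb.astype(np.float32)
    # Add back, per pixel, roughly the luminance that the obstacle removed.
    corrected[shadow_mask] += gain * luma_drop[shadow_mask][:, np.newaxis]
    return np.clip(corrected, 0, 255).astype(np.uint8)
```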
 The control unit 15 performs overall control of each configuration and each functional configuration of the image projection system 1. Specifically, the control unit 15 of the first embodiment causes a projection unit to project the corrected image. Here, the control unit 15 preferably causes the corrected image to be projected by a projection unit different from the projection unit paired with the imaging unit that captured the image containing the obstacle within the angle of view of the projection causing the identified shadow region.
 FIG. 7 is a diagram showing an example of the relationship among the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image when the corrected image is projected. The relationship shown in FIG. 7 corresponds to the arrangement shown in FIG. 5. FIG. 7 assumes a case where the corrected image 14a is generated based on the result of extracting the images of the shadow region DA and the obstacle OB shown in FIG. 6. In FIG. 7, the partial region whose position and size correspond to the shadow region and whose luminance has been raised in the corrected image 14a is shown as a correction region BA.
 In the example shown in FIG. 6, the detection that "an obstacle OB is present within the angle of view of projection by the projection unit 21" is configured so that the determination is made based on the fact that a shadow that should not exist in the projected image stored in the storage unit 11 appears in the "second captured image", which is the image captured by the imaging unit 32. Here, the detection that "an obstacle OB is present within the angle of view of projection by the projection unit 21" may instead be configured so that the obstacle OB is determined to be within the angle of view of projection by the projection unit 21 because the image of the obstacle OB is contained in the first captured image. The image correction unit 14 generates the corrected image 14a by raising, in the contents of the projected image data 11b, the luminance of the partial region whose position and size correspond to the shadow region DA in the second captured image. As shown in FIG. 7, the control unit 15 causes the projection unit 22 to project the corrected image 14a as the second projected image. The control unit 15 causes the projection unit 21 to project the contents of the projected image data 11b. As a result, the luminance decrease in the shadow region that the obstacle OB causes in the projection of the projected image data 11b by the projection unit 21 is offset by the luminance increase of the correction region BA produced by the projection of the corrected image 14a by the projection unit 22. Therefore, as in the overlapping image V3 shown in FIG. 7, an image similar to the contents of the projected image data 11b is formed in the overlapping region 81.
 Further, as shown by the first captured image 31c and the second captured image 32c in FIG. 7, the shadow region DA that was captured in the first captured image 31b and the second captured image 32b shown in FIG. 6 is no longer captured. Based on the first captured image 31c and the second captured image 32c, the control unit 15 can confirm that the correction of the shadow region has been performed normally.
 Note that in FIG. 1 and FIG. 11 described later, the data flow of the output image is illustrated as running from the image acquisition unit 12 through the image correction unit 14 to the projection units 21 and 22, but the data flow is not limited to this. There may be a data flow running directly from the image acquisition unit 12 to the projection units 21 and 22. Whichever data flow is adopted, when a corrected image such as the corrected image 14a is generated, the control unit 15 causes the projection unit whose angle of view does not contain the obstacle to project the corrected image, and causes the projection unit whose angle of view contains the obstacle to project the image acquired by the image acquisition unit 12. In the configurations illustrated in FIG. 1 and FIG. 11 described later, for a projection unit that projects the content corresponding to the image acquisition unit 12 rather than a corrected image, the image correction unit 14 outputs the content of the image acquired by the image acquisition unit 12 as it is.
 FIG. 8 is a flowchart showing the flow of processing related to the detection of an obstacle and a shadow region. As a premise of this processing, images corresponding to the image acquired by the image acquisition unit 12 are being projected by the projection units 21 and 22.
 First, the imaging units 31 and 32 acquire a plurality of captured images by imaging the overlapping region 81 (step S1). The shadow region identification unit 13 detects a shadow region and an image of an obstacle based on the captured images acquired in the processing of step S1 (step S2). When no shadow region or obstacle image is detected in the processing of step S2 (step S2; No), the processing ends.
 When a shadow region and an obstacle image are detected in the processing of step S2 (step S2; Yes), the image correction unit 14 generates a corrected image to which a luminance correction compensating for the luminance decrease in the shadow region, that is, an increase in the luminance of the partial region corresponding to the shadow region, has been applied (step S3). Then, the control unit 15 causes the projection unit identified, based on the processing of step S2, as having no obstacle within its angle of view of image projection to project the corrected image (step S4). In the example shown in FIG. 7, the projection unit treated in the processing of step S4 as having no obstacle within its angle of view of image projection is the projection unit 22.
 Thereafter, a plurality of captured images continue to be acquired in the same manner as in step S1. The control unit 15 determines, based on these captured images, whether the luminance decrease in the shadow region has been eliminated (step S5). When it is determined in the processing of step S5 that the luminance decrease in the shadow region has been eliminated (step S5; Yes), the processing ends. On the other hand, when it is determined in the processing of step S5 that the luminance decrease in the shadow region has not been eliminated (step S5; No), the processing returns to step S3.
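 The flow of FIG. 8 could be approximated by the following sketch, which reuses the illustrative helpers from the earlier sketches; the capture()/project() methods on the camera and projector objects are hypothetical interfaces, not part of the embodiment.

```python
def detection_and_correction_loop(cameras, projectors, projected_rgb):
    """Rough outline of steps S1 to S5 of FIG. 8 (illustrative only)."""
    while True:
        # S1: image the overlapping region with every camera.
        captures = [cam.capture() for cam in cameras]
        results = [find_shadow_and_obstacle(projected_rgb, cap) for cap in captures]

        # S2 / S5: no shadow detected (or shadow eliminated) -> done.
        if not any(shadow.any() for shadow, _ in results):
            return

        # S3: build a corrected image from a capture that shows the shadow
        # but not the obstacle itself.
        idx = next(i for i, (shadow, obstacle) in enumerate(results)
                   if shadow.any() and not obstacle.any())
        shadow, _ = results[idx]
        luma_drop = (projected_rgb.astype(float) -
                     captures[idx].astype(float)).mean(axis=2)
        corrected = build_corrected_image(projected_rgb, shadow, luma_drop)

        # S4: project the corrected image from the projector paired with
        # the camera whose view of the obstacle is clear.
        projectors[idx].project(corrected)
```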
 Note that for a projection unit and an imaging unit treated as one pair, it is desirable that the optical axis of image projection by the projection unit and the optical axis of image capture by the imaging unit coincide as closely as possible.
 FIG. 9 is a schematic diagram showing an example of the relationship between the optical axes of the projection device 61 and the imaging device 71, which are treated as one pair P1. In FIG. 9, the projection device 61 and the imaging device 71 are arranged side by side so that the optical axis of image projection onto the overlapping region 81 by the projection device 61 and the optical axis of imaging of the overlapping region 81 by the imaging device 71 correspond to each other. This makes it possible to match the angle of view of the projection device 61 with the angle of view of the imaging device 71.
 FIG. 10 is a schematic diagram showing a configuration in which, via a half mirror 90, the optical axis along which the projection device 61 projects an image and the optical axis along which the imaging device 71 images the overlapping region 81 are provided so as to overlap. As shown in FIG. 10, by placing the half mirror 90 on the optical axis along which the projection device 61 projects an image, the light emitted by the projection device 61 for projecting the image is transmitted, while part of the reflected light from the overlapping region 81 is reflected toward an angle different from that optical axis. Specifically, as shown in FIG. 10, the imaging device 71 is arranged so as to correspond to the reflection angle at which the half mirror 90 reflects part of the reflected light from the overlapping region 81. This makes it possible to align, between the overlapping region 81 and the half mirror 90, the optical axis along which the projection device 61 projects an image with the optical axis along which the imaging device 71 images the overlapping region 81. As a result, the angle of view of the projection device 61 and the angle of view of the imaging device 71 can be matched with higher accuracy.
 Although the descriptions referring to FIGS. 9 and 10 use the pair P1 as an example, the same configuration can be adopted for the pair P2 and for a third or further pair (not shown).
 As described above, in the first embodiment, the luminance decrease caused by an obstacle can be compensated by projecting a corrected image. Therefore, changes in the brightness of part of the projected image and degradation of image quality such as missing portions can be prevented, and the image quality of the image projected onto the overlapping region 81 can be maintained more reliably.
 Further, in the first embodiment, by adopting the half mirror 90, the angle of view of the projection device 61 and the angle of view of the imaging device 71 can be matched with higher accuracy.
[Second Embodiment]
 Next, a second embodiment different from the first embodiment will be described. In the description of the second embodiment, configurations similar to those of the first embodiment are denoted by the same reference numerals, and their description is omitted.
 FIG. 11 is a functional block diagram showing the configuration of an image projection system 100 according to the second embodiment of the present invention. FIG. 12 is a block diagram showing an example of a configuration for realizing the functional blocks shown in FIG. 11. As shown in FIG. 11, the image projection system 100 includes an obstacle detection unit 16 and a projection control unit 17 in place of the shadow region identification unit 13 and the image correction unit 14 included in the image projection system 1. Specifically, as shown in FIG. 12, the calculation unit 51 reads and executes an image projection program 11c stored in the storage unit 11 included in the information processing device 50 of the image projection system 100. As a result, the obstacle detection unit 16 and the projection control unit 17 are realized as functions replacing the shadow region identification unit 13 and the image correction unit 14. In other words, the image projection program 11a and the image projection program 11c are the same except for the difference in the realized functions. The calculation unit 51 of the second embodiment functions as the image acquisition unit 12, the control unit 15, the obstacle detection unit 16, and the projection control unit 17 shown in FIGS. 11 and 12 by reading and executing the image projection program 11c.
 The obstacle detection unit 16 has at least the function of extracting an image of an obstacle among the functions of the shadow region identification unit 13. That is, the obstacle detection unit 16 detects an obstacle between one projection unit and the overlapping region based on the images captured by the imaging units 31 and 32. Here, when the obstacle detection unit 16 detects an obstacle, it is preferable to further identify the projection unit that is paired with the imaging unit that captured the image containing the obstacle causing the shadow region. Since the projection unit associated with each imaging unit forms one pair, once the imaging unit is identified, the corresponding projection unit can easily be identified. This identification may be performed by the obstacle detection unit 16 or by the projection control unit 17. Furthermore, information on the identified projection unit may be stored in the storage unit 11.
 The function of extracting a shadow region provided by the shadow region identification unit 13 is not essential as a function of the obstacle detection unit 16, but the obstacle detection unit 16 may have it. In the description of the second embodiment, it is assumed that the functions of the obstacle detection unit 16 and the functions of the shadow region identification unit 13 are the same.
 When an obstacle between one projection unit and the overlapping region is detected, the projection control unit 17 causes a projection unit different from that one projection unit to project the image. Further, when no obstacle is detected, the projection control unit 17 operates the plurality of projection units so as to switch, at a predetermined cycle, which of the plurality of projection units projects the image onto the overlapping region 81. In this way, the projection control unit 17 switches, frame by frame, the projected image projected by the plurality of projection units 21 and 22. Here, the projection control unit 17 preferably causes the projected image to be projected by a projection unit different from the projection unit paired with the imaging unit that captured the image containing the obstacle within the angle of view of the projection causing the identified shadow region.
 FIG. 13 is a diagram showing an example of projecting an image onto the overlapping region 81 and imaging the overlapping region 81 when there is no obstacle. In the descriptions referring to FIGS. 13 to 15, the image formed in the overlapping region 81 is described as a projected image.
 In the example shown in FIG. 13, the projection unit 21 projects an image corresponding to the projected image data 11b as the first projected image. On the other hand, the second projected image is not projected during the period in which the first projected image is projected. In FIG. 13, the fact that the second projected image is not projected is illustrated by the black filled rectangle NL. Although not shown, the configuration that projects the image switches at a predetermined cycle. Therefore, there are also periods in which the second projected image is projected and the first projected image is not projected. Specifically, for example, the configuration that projects the image alternates between the projection unit 21 and the projection unit 22 according to the update cycle of the frame image. The switching may occur every frame or every predetermined number of frames.
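 A minimal sketch of this alternating projection is shown below, assuming hypothetical project()/blank() methods on the projection units and a frame-indexed scheduler; none of these names appear in the embodiment.

```python
def select_active_projector(frame_index, projectors, frames_per_switch=1):
    """Alternate which projection unit drives the overlapping region.

    frame_index:       running frame counter.
    projectors:        list of projection units (e.g. [unit_21, unit_22]).
    frames_per_switch: switch every frame (1) or every N frames.
    """
    return (frame_index // frames_per_switch) % len(projectors)

def project_frame(frame_index, frame_rgb, projectors):
    active = select_active_projector(frame_index, projectors)
    for i, proj in enumerate(projectors):
        if i == active:
            proj.project(frame_rgb)   # this unit forms the image in region 81
        else:
            proj.blank()              # corresponds to the black rectangle NL
```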
 As shown in FIG. 3, when there is no obstacle within the angle of view of projection by the projection devices 61 and 62, an image similar to the image projected by the projection device 61 or the projection device 62 is formed in the overlapping region 81, as in the projected image V4 shown in FIG. 13. Further, as in the first captured image 31d and the second captured image 32d shown in FIG. 13, the images captured by the imaging unit 31 and the imaging unit 32 are the same as the projected image V4.
 FIG. 14 is a diagram showing an example of projecting an image onto the overlapping region 81 and imaging the overlapping region 81 when an obstacle is present. FIG. 14 shows a state in which the projection control unit 17 has not yet performed operation control of the plurality of projection units in response to the detection of the obstacle.
 As shown in FIG. 5, when an obstacle OB is present within the angle of view of projection by the projection device 61 and within the angle of view of imaging by the imaging device 71, the obstacle OB blocks part of the projection light used by the projection device 61 to project the image. Therefore, during the period in which the projection unit 21 projects the image corresponding to the projected image data 11b as the first projected image, an image including a shadow region DA corresponding to the light blocked by the obstacle OB is formed in the overlapping region 81, as in the projected image V5 shown in FIG. 14. Further, since the obstacle OB is within the angle of view of imaging by the imaging device 71, an image in which the obstacle OB is located inside the shadow region DA is captured by the imaging unit 31, as in the first captured image 31e shown in FIG. 14. On the other hand, there is no obstacle OB within the angle of view of imaging by the imaging device 72. Therefore, as in the second captured image 32e shown in FIG. 14, the image captured by the imaging unit 32 is the same as the projected image V5.
 FIG. 15 is a diagram showing an example of projecting an image onto the overlapping region 81 and imaging the overlapping region 81 after control corresponding to the detection of the obstacle has been performed. When the state shown in FIG. 14 occurs, the obstacle detection unit 16 detects, based on the first captured image 31e, that the obstacle OB is present within the angle of view of projection by the projection device 61 and within the angle of view of imaging by the imaging device 71. After this detection, the projection control unit 17 controls the operation so that the projection device 62 projects the image onto the overlapping region 81 and the projection device 61 does not. As a result, as shown in FIG. 15, the first projected image is not projected, and an image corresponding to the projected image data 11b is projected as the second projected image. Since there is no obstacle within the angle of view of projection by the projection device 62, an image similar to the image projected by the projection device 62 is formed in the overlapping region 81, as in the projected image V6 shown in FIG. 15.
 Further, since an obstacle is present within the angle of view of imaging by the imaging device 71, the image captured by the imaging unit 31 includes the image of the obstacle OB, as in the first captured image 31f shown in FIG. 15. On the other hand, since there is no obstacle within the angle of view of imaging by the imaging device 72, the image captured by the imaging unit 32 is the same as the projected image V6, as in the second captured image 32f shown in FIG. 15.
 FIG. 16 is a flowchart showing the flow of processing related to the detection of an obstacle and a shadow region. As a premise of this processing, the projection of the image acquired by the image acquisition unit 12 is being performed by the projection unit 21 or the projection unit 22 while switching at a predetermined cycle.
 First, the imaging units 31 and 32 acquire a plurality of captured images by imaging the overlapping region 81 (step S11). The obstacle detection unit 16 detects an image of a shadow region based on the captured images acquired in the processing of step S11 (step S12). When no shadow region image is detected in the processing of step S12 (step S12; No), the processing ends.
 When a shadow region is detected in the processing of step S12 (step S12; Yes), the projection control unit 17 identifies the projection unit that was projecting the image onto the overlapping region 81 during the period in which the shadow region was detected (step S13). Then, the projection control unit 17 causes a projection unit different from the projection unit identified based on the processing of step S13 to project the image (step S14).
 Thereafter, a plurality of captured images continue to be acquired in the same manner as in step S11. The obstacle detection unit 16 determines, based on these captured images, whether the shadow region has been eliminated (step S15). When it is determined in the processing of step S15 that the luminance decrease in the shadow region has been eliminated (step S15; Yes), the processing ends. On the other hand, when it is determined in the processing of step S15 that the luminance decrease in the shadow region has not been eliminated (step S15; No), the processing returns to step S13.
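 A minimal sketch of the flow of FIG. 16 is given below, reusing the illustrative camera/projector interfaces and the find_shadow_and_obstacle() helper from the earlier sketches; all names are hypothetical and not part of the embodiment.

```python
def switch_away_from_blocked_projector(cameras, projectors, projected_rgb,
                                       active_index):
    """Steps S11 to S15 of FIG. 16: hand projection over to a unit with a clear view."""
    while True:
        captures = [cam.capture() for cam in cameras]                       # S11
        shadows = [find_shadow_and_obstacle(projected_rgb, c)[0] for c in captures]

        if not any(s.any() for s in shadows):                               # S12 / S15
            return active_index

        # S13: the unit that was projecting when the shadow appeared is blocked.
        blocked = active_index

        # S14: project from a different unit instead.
        active_index = next(i for i in range(len(projectors)) if i != blocked)
        projectors[blocked].blank()
        projectors[active_index].project(projected_rgb)
```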
 In the descriptions referring to FIGS. 14 and 15, the obstacle detection unit 16 detects, based on the first captured image and the second captured image, within which angle of view, that of the projection unit 21 or that of the projection unit 22, the obstacle OB is present. Performing such control has the notable effect that it is possible to identify more reliably within which angle of view of the projection devices 61 and 62 the obstacle OB that caused the shadow region DA is located.
 On the other hand, the control for eliminating the occurrence of a shadow region in the image projected onto the overlapping region 81 is not limited to this. For example, the obstacle detection unit 16 may identify whether the projection unit that was projecting the image at the timing when the shadow region DA occurred in the overlapping region 81 is the projection unit 21 or the projection unit 22. Based on the result of that identification, the projection control unit 17 causes a projection unit different from the identified projection unit to project the image. This also eliminates the occurrence of a shadow region in the image projected onto the overlapping region 81. In this case, there may be only one imaging unit. Therefore, in the second embodiment, it is not essential that the projection unit and the imaging unit form a pair. According to this, the configuration can be further simplified.
 Further, as described with reference to FIGS. 14 and 15 and the flowchart of FIG. 16, the obstacle detection unit 16 detects, based on the first captured image 31e, that the obstacle OB is present within the angle of view of projection by the projection device 61 and within the angle of view of imaging by the imaging device 71. After this detection, the projection control unit 17 controls the operation so that the projection device 62 projects the image onto the overlapping region 81 and the projection device 61 does not, but the control is not limited to this.
 For example, the projection control unit 17 may control the projection of the plurality of projection devices so that projection is performed while switching the projection device for each frame. In this case, the presence of the obstacle OB within the angle of view of projection by the projection device 61 and within the angle of view of imaging by the imaging device 71 is detected based on the first captured image 31e, and a shadow region is identified based on the second captured image 32e and the projected image data 11b. For the region of the second projected image to be projected in the next frame that corresponds to the identified shadow region, the luminance is set higher by a luminance correction so as to compensate for the luminance decrease of the image region corresponding to the shadow region, and the corrected second projected image may be projected while switching frame by frame. When switching and projecting frame by frame in this manner, the projection may also include projection by a projection device, such as the projection device 61, that contains the obstacle OB within its angle of view of projection. Further, when switching and projecting frame by frame, the projection control unit 17 may control the projection of each projection device so that it is switched at high speed, resulting in projection at a very high frame rate. Here, a very high frame rate is preferably a frame rate of 30 Hz or 60 Hz or higher, which is common in video display.
 By causing the image projected onto the overlapping region 81 to be projected at a very high frame rate in this way, the influence of the shadow region and the influence of the corrected projected image visually cancel each other, so that the perception of the shadow region can be reduced.
 Further, when the system is configured with three or more projection devices, the projection control unit 17 may control the projection so that a projection device having the obstacle OB within its projection angle of view does not project, and the projection of the image onto the overlapping region 81 is performed by projection devices having no obstacle OB within their projection angles of view. Here, the projection device controlled by the projection control unit 17 to project may be one predetermined projection device having no obstacle OB within its projection angle of view, or a plurality of projection devices having no obstacle OB within their projection angles of view may be controlled to project while switching frame by frame. Further, when projecting while switching frame by frame among the plurality of projection devices having no obstacle OB within their projection angles of view, the projection may be controlled so that it is performed at a frame rate different from the frame rate used when all the projection devices were used for projection. For example, when the system is composed of three projection devices, one of which contains the obstacle OB within its projection angle of view, and projection was performed while switching among the projection devices at a frame rate of 30 Hz, the projection by the remaining two projection devices that do not contain the obstacle OB within their projection angles of view may be controlled so that it is performed at, for example, 20 Hz, which is 2/3 of 30 Hz.
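 The frame-rate adjustment in the example above can be expressed as a simple proportional rule; the sketch below is only one possible interpretation of the 30 Hz to 20 Hz example and is not prescribed by the embodiment.

```python
def adjusted_frame_rate(base_rate_hz, total_projectors, blocked_projectors):
    """Scale the switching rate to the number of projection devices still in use.

    Example from the text: 3 devices at 30 Hz with 1 blocked
    -> 30 * 2/3 = 20 Hz for the remaining 2 devices.
    """
    usable = total_projectors - blocked_projectors
    return base_rate_hz * usable / total_projectors

assert adjusted_frame_rate(30, 3, 1) == 20
```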
 FIG. 17 is a diagram showing a configuration example in which there is one imaging unit. In the example shown in FIG. 17, the imaging device 71 and the imaging device 72 are omitted, and an imaging device 75 is provided in their place. The imaging device 75 may be provided at a position independent of the projection device 61 and the projection device 62. In the example shown in FIG. 17, the imaging device 75 is provided at a position directly facing the overlapping region 81. The imaging unit included in the imaging device 75 is the same as the imaging unit 31 included in the imaging device 71 and the imaging unit 32 included in the imaging device 72, and is set so as to include the overlapping region 81 in its imaging range. In FIG. 17, the angle of view of imaging by the imaging device 75 is shown as the range within the acute angle formed by the two broken lines L5.
 In the second embodiment, the various processes and controls related to the projection of the corrected image described in the first embodiment are omitted. Except for the matters noted above, the second embodiment is the same as the first embodiment.
 As described above, in the second embodiment, by causing the image to be projected by a projection unit different from the projection unit whose angle of view contains the detected obstacle, the image quality of the projected image can be maintained more simply, and changes in the brightness of part of the projected image and degradation of image quality such as missing portions can be prevented.
 Further, in the second embodiment, because the projection unit and the imaging unit form a pair, it is possible to identify more reliably within which angle of view of the projection devices 61 and 62 the obstacle OB that caused the shadow region DA is located.
 Further, the process of partially prohibiting the light projected from the image projection device as in Patent Document 1 is complicated, and a method capable of maintaining the image quality of the projected image more simply has been required. In addition, since the projection corresponding to the blocked region is prohibited, the brightness of the projected image is reduced. In contrast, according to the second embodiment, the image quality can be maintained more simply, and changes in the brightness of part of the projected image and degradation of image quality such as missing portions can be prevented.
[Third Embodiment]
 FIG. 18 is a functional block diagram showing the configuration of an image projection system 200 according to the third embodiment of the present invention. FIG. 19 is a block diagram showing an example of a configuration for realizing the functional blocks shown in FIG. 18. The image projection system 200 includes a plurality of projection units, a plurality of imaging units, the storage unit 11, the image acquisition unit 12, an identification unit 113, an extraction unit 114, an image correction unit 115, and a control unit 116. In FIG. 18 and FIG. 25 described later, the projection unit 21 and the projection unit 22 are illustrated as the plurality of projection units. Further, in FIG. 18 and FIG. 25 described later, the imaging unit 31 and the imaging unit 32 are illustrated as the plurality of imaging units.
 Hereinafter, in the descriptions of the third embodiment and the fourth embodiment, unless otherwise noted, the image formed in the overlapping region 81 of the projected object 80 by the image projected by the image projection system 200 is referred to as a projected image.
 In the third embodiment, unless a corrected image is generated by the image correction unit 115 described later, one of the projection unit 21 and the projection unit 22 projects the image. In the description of the embodiment, the case where an image corresponding to the projected image data 11b stored in the storage unit 11 is projected is taken as an example, but the projected image is not limited to this. An image corresponding to projected image data read from an external storage device connected to the image projection system 200, projected image data input from an external information processing device connected to the image projection system 200, or the like may be projected.
 As shown in FIG. 3, when there is no obstacle within the angle of view of projection, an image similar to the image projected by the projection unit 21 is formed in the overlapping region 81, as in the projected image V4 shown in FIG. 13. Further, just as there is no obstacle within the angle of view of projection, when there is no obstacle within the angle of view of imaging either, the images captured by the imaging unit 31 and the imaging unit 32 are the same as the projected image V4, as in the first captured image 31d and the second captured image 32d shown in FIG. 13. FIG. 13 and the like illustrate the case where the projection unit 21 projects the image corresponding to the projected image data 11b (see FIG. 19) stored in the storage unit 11.
 Note that the projection unit 22 does not project an image unless a corrected image is generated by the image correction unit 115 described later (see FIG. 13). In FIG. 13 and the like, the fact that the projection unit 22 does not project an image is illustrated by rendering the second projected image as the black filled rectangle NL.
 FIG. 20 is a diagram showing an example of the relationship among the first projected image, the second projected image, the projected image, the first captured image, and the second captured image. The relationship shown in FIG. 20 corresponds to the arrangement shown in FIG. 5.
 As shown in FIG. 5, when an obstacle OB2 is present within the angle of view of projection by the projection device 61 and within the angle of view of imaging by the imaging device 71, the obstacle OB2 blocks part of the projection light used by the projection device 61 to project the image. Therefore, as in the projected image V7 shown in FIG. 20, an image including a shadow region DA2 corresponding to the light blocked by the obstacle OB2 is formed in the overlapping region 81. Thus, when the obstacle OB2 is present between the projection unit 21 and the overlapping region 81, the obstacle OB2 produces a shadow region DA2 in the projected image V7.
 Further, since the obstacle OB2 is within the angle of view of imaging by the imaging device 71, an image in which the obstacle OB2 is located inside the shadow region DA2 is captured by the imaging unit 31, as in the first captured image 31g shown in FIG. 20. On the other hand, there is no obstacle OB2 within the angle of view of imaging by the imaging device 72. Therefore, as in the second captured image 32g shown in FIG. 20, the image captured by the imaging unit 32 is the same as the projected image V7.
 Note that the light that projects the image from the projection unit 21 onto the overlapping region 81 is projected so as to spread toward the overlapping region 81 via the optical elements of the projection unit 21. Therefore, as shown by the first captured image 31g, the shadow region DA2 is larger than the image of the obstacle OB2.
 The information processing device 500 performs processing related to the projection of images by the projection units 21 and 22 based on the images captured by the imaging units 31 and 32. Except for the matters noted below, the information processing device 500 is the same as the information processing device 50.
 The calculation unit 51 of the third embodiment functions as the image acquisition unit 12, the identification unit 113, the extraction unit 114, the image correction unit 115, and the control unit 116 shown in FIGS. 18 and 19 by reading and executing an image projection program 11d. The image projection program 11a and the image projection program 11d are the same except for the difference in the realized functions.
 The image acquisition unit 12 acquires the image to be projected by the projection unit 21. Specifically, the image acquisition unit 12 reads and acquires the projected image data 11b from, for example, the storage unit 11. The projected image data 11b acquired by the image acquisition unit 12 can be referred to by the identification unit 113 and the image correction unit 115.
The identification unit 113 identifies, based on the captured images, the image of an obstacle located between the projection unit 21 and the overlapping region 81 and the shadow region that the obstacle produces in the overlapping image. Specifically, the identification unit 113 performs an image analysis that compares the content of the images captured by the imaging units 31 and 32 with the content of the image acquired by the image acquisition unit 12. More specifically, using the luminance distribution expected from the content of the projected image data 11b as a reference, the identification unit 113 identifies as a shadow region any partial region of a captured image in which the luminance is lower than that reference. As an example, when the shadow region in the first captured image is to be identified, the image analysis acquires the first projected image through the image acquisition unit 12, computes the difference between the first captured image and the first projected image, and generates a difference signal. Based on this difference signal, the analysis may be configured to identify a partial region in which the luminance has dropped by at least a certain amount. Similarly, when identifying the shadow region in the second captured image, a partial region in which the luminance has dropped by a certain amount may be identified from the difference with respect to the second projected image. Further, when a partial region exists inside the shadow region that shows a characteristic change other than the luminance drop, such as a different luminance change or a color change, that partial region is identified as the image of the obstacle. For example, in the difference signal between the first captured image and the first projected image, a partial region whose difference characteristics differ from those of the region where the luminance simply dropped by a certain amount may be identified as the image region of the obstacle. By extracting the shadow region and the obstacle image in this way, it becomes possible to determine which of the images captured by the imaging units 31 and 32 contains the obstacle. This determination may be made by the identification unit 113 or by the control unit 116.
 When the identification unit 113 identifies the shadow region, it may additionally identify the projection unit that is paired with the imaging unit whose captured image contains the obstacle causing the shadow region. Because each imaging unit is paired with one projection unit, the corresponding projection unit can easily be identified once the imaging unit is known. This identification may be performed by the identification unit 113 or by the control unit 116. Information on the identified projection unit may further be stored in the storage unit 11.
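As a rough illustration of the differencing described above, the following Python sketch derives a shadow mask and an obstacle mask from one captured/projected image pair. The function name, the thresholds, and the assumption that the two images are already geometrically aligned and normalized to the same scale are assumptions of this sketch, not features recited by the embodiment.

# Illustrative sketch only; names and thresholds are assumptions, not part of the embodiment.
import numpy as np

def detect_shadow_and_obstacle(captured, projected, shadow_drop=0.3, obstacle_deviation=0.2):
    """Return (shadow_mask, obstacle_mask) for one projector/camera pair.

    captured, projected: float32 grayscale images in [0, 1], same resolution,
    already aligned to the overlapping region 81.
    """
    # "Difference signal" between the projected content and what the camera sees.
    diff = projected.astype(np.float32) - captured.astype(np.float32)

    # Shadow region: luminance dropped by at least a fixed amount.
    shadow_mask = diff > shadow_drop
    if not shadow_mask.any():
        return shadow_mask, np.zeros_like(shadow_mask)

    # Obstacle image: pixels inside the shadow whose difference behaves
    # differently from a plain luminance drop (e.g. the obstacle's own surface).
    deviation = np.abs(diff - np.median(diff[shadow_mask]))
    obstacle_mask = shadow_mask & (deviation > obstacle_deviation)
    return shadow_mask, obstacle_mask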
The extraction unit 114 extracts the image of the obstacle contained in the captured image. Specifically, the extraction unit 114 extracts the region that the identification unit 113 has identified as the image of the obstacle. The obstacle image extracted by the extraction unit 114 is used by the image correction unit 115.
The image correction unit 115 generates a corrected image in which the extracted obstacle image is adjusted to the position and size corresponding to the shadow region. Specifically, the image correction unit 115 enlarges the image of the obstacle OB2 extracted by the extraction unit 114 so that its size corresponds to that of the shadow region DA2. The enlarged obstacle OB2 image only needs to be at least as large as the shadow region DA2; it is not essential that it match the size of the shadow region DA2 exactly. The image correction unit 115 also processes the image so that the enlarged obstacle OB2 image is projected at the position of the shadow region DA2 and no other visible image is projected around it, and outputs the result as the corrected image. More specifically, the image correction unit 115 generates, as the corrected image, an image that has the same number of vertical and horizontal pixels as the projected image data 11b and in which the area surrounding the enlarged obstacle OB2 image is made transparent. The position and size of the shadow region DA2 used in this processing may be based on the second captured image 32g or on the first captured image 31g, and if the two differ slightly, processing such as averaging may be performed.
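The correction just described can be pictured with the following sketch, which assumes the obstacle mask and the shadow region's bounding box (in projector coordinates) have already been obtained; the helper name, the RGBA canvas used to realize the transparent surroundings, and the OpenCV-based resizing are illustrative assumptions rather than the embodiment itself.

# Illustrative sketch only; the RGBA/alpha representation of "transparent
# surroundings" and the OpenCV resizing are assumptions of this sketch.
import numpy as np
import cv2

def make_corrected_image(frame, obstacle_mask, shadow_bbox):
    """Return an RGBA corrected image with the same resolution as the projected data.

    frame:         captured RGB image containing the obstacle (H x W x 3, uint8)
    obstacle_mask: boolean mask of the obstacle pixels in `frame`
    shadow_bbox:   (x, y, w, h) of the shadow region in projector coordinates
    """
    h, w = frame.shape[:2]
    corrected = np.zeros((h, w, 4), dtype=np.uint8)  # fully transparent canvas

    # Crop the obstacle by its bounding box, then scale it to cover the shadow
    # region (it only needs to be at least as large as the shadow region).
    ys, xs = np.nonzero(obstacle_mask)
    ox, oy = xs.min(), ys.min()
    ow, oh = xs.max() - ox + 1, ys.max() - oy + 1
    patch = frame[oy:oy + oh, ox:ox + ow]
    alpha = obstacle_mask[oy:oy + oh, ox:ox + ow].astype(np.uint8) * 255

    sx, sy, sw, sh = shadow_bbox
    patch = cv2.resize(patch, (sw, sh), interpolation=cv2.INTER_LINEAR)
    alpha = cv2.resize(alpha, (sw, sh), interpolation=cv2.INTER_NEAREST)

    corrected[sy:sy + sh, sx:sx + sw, :3] = patch
    corrected[sy:sy + sh, sx:sx + sw, 3] = alpha  # opaque only where the obstacle is
    return corrected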
The control unit 116 performs overall control of each component and each functional component of the image projection system 200. Specifically, the control unit 116 of the third embodiment causes a projection unit to project the corrected image. Here, the control unit 116 preferably causes the corrected image to be projected by a projection unit different from the one paired with the imaging unit whose captured image contains the obstacle within the angle of view of the projection that produced the identified shadow region.
FIG. 21 is a diagram showing an example of the relationship among the first projected image, the second projected image, the projected image, the first captured image, and the second captured image when the corrected image is projected. The relationship shown in FIG. 21 corresponds to the arrangement shown in FIG. 5. FIG. 21 also assumes that the corrected image 15a has been generated from the result of extracting the images of the shadow region DA2 and the obstacle OB2 shown in FIG. 20. In FIG. 21 and subsequent figures, the image of the obstacle OB2 that is included in the corrected image 15a and has been corrected to the position and size corresponding to the shadow region is shown as the enlarged image BOB2.
As shown in FIG. 21, the control unit 116 causes the projection unit 22 to project the corrected image 15a as the second projected image, while causing the projection unit 21 to project the content of the projected image data 11b. As a result, the enlarged image BOB2 included in the corrected image 15a is projected so as to overlap the shadow region that the obstacle OB2 produces in the projection of the projected image data 11b by the projection unit 21. Therefore, as in the projected image V8 shown in FIG. 21, an image in which the shadow region DA2 of the projected image V7 (see FIG. 20) has been replaced by the enlarged image BOB2 is drawn in the overlapping region 81.
 Further, as shown by the first captured image 31h and the second captured image 32h in FIG. 21, captured images are obtained in which the shadow region DA2 seen in the first captured image 31g and the second captured image 32g of FIG. 20 has been replaced by the enlarged image BOB2. Based on the first captured image 31h and the second captured image 32h, the control unit 116 can confirm that the shadow region has been corrected normally.
In FIG. 18 and FIG. 25 described later, the data flow of the output image is drawn as running from the image acquisition unit 12 through the image correction unit 115 to the projection unit 21, but this is not limiting. There may be a data flow directly from the image acquisition unit 12 to the projection unit 21. In the configurations illustrated in FIG. 18 and FIG. 25, the image correction unit 115 passes the content of the image acquired by the image acquisition unit 12 to the projection unit 21 without modification.
FIG. 22 is a flowchart showing the flow of processing related to detecting an obstacle and a shadow region. As a precondition for this processing, the projection unit 21 is projecting the image acquired by the image acquisition unit 12.
First, the imaging units 31 and 32 image the overlapping region 81 to acquire a plurality of captured images (step S21). The identification unit 113 detects the shadow region and the obstacle image based on the captured images acquired in step S21 (step S22). If no shadow region or obstacle image is detected in step S22 (step S22; No), the processing ends.
 If a shadow region and an obstacle image are detected in step S22 (step S22; Yes), the extraction unit 114 extracts the detected obstacle image (step S23). The image correction unit 115 then corrects the obstacle image extracted in step S23 to the position and size of the shadow region detected in step S22, thereby generating the corrected image (step S24). The control unit 116 then causes the projection unit 22 to project the corrected image generated in step S24 (step S25).
 Thereafter, a plurality of captured images continue to be acquired as in step S21. Based on these captured images, the control unit 116 determines whether the obstacle image in the corrected image has been fitted to the shadow region (step S26). If it is determined in step S26 that the obstacle image in the corrected image has been fitted to the shadow region (step S26; Yes), the processing ends. If it is determined in step S26 that the obstacle image has not been fitted to the shadow region (step S26; No), the processing returns to step S23.
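The flow of FIG. 22 can be condensed into the following sketch; every function name here is a placeholder introduced for illustration, and the simple luminance check standing in for step S26 is an assumption rather than anything the embodiment specifies.

# Illustrative sketch of FIG. 22 (steps S21-S26); all names are placeholders.
import numpy as np

def shadow_filled(captured, shadow_bbox, tol=0.1):
    """Rough stand-in for step S26: the shadow region's mean luminance is back
    near the mean of the rest of the captured image."""
    x, y, w, h = shadow_bbox
    region = captured[y:y + h, x:x + w].mean()
    return abs(region - captured.mean()) < tol * max(captured.mean(), 1e-6)

def correction_loop(capture, detect, extract, correct, project):
    captured = capture()                          # S21: image the overlapping region 81
    result = detect(captured)                     # S22: detect shadow region and obstacle
    if result is None:                            # S22: No -> end
        return
    frame, obstacle_mask, shadow_bbox = result

    while True:
        patch = extract(frame, obstacle_mask)     # S23: extract the obstacle image
        corrected = correct(patch, shadow_bbox)   # S24: fit it to the shadow region
        project(corrected)                        # S25: project from projection unit 22

        captured = capture()                      # re-capture, as in S21
        if shadow_filled(captured, shadow_bbox):  # S26: Yes -> end
            return
        # S26: No -> return to S23 and try again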
As described above, the third embodiment can project an image that corresponds to the obstacle that produced the shadow region. That is, depending on the obstacle between the projection unit 21 and the projected object 80, an image whose content differs from the content originally intended for projection can be projected. New value can therefore be added to the projected image.
 As a more concrete example, even when a moving object such as a performer is located between the projection unit 21 and the projected object 80, an image containing content that matches the state of that moving object can be projected, instead of a rendering in which the projected image merely contains a shadow.
[Fourth Embodiment]
 Next, a fourth embodiment, which differs from the third embodiment, will be described. In the description of the fourth embodiment, configurations identical to those of the third embodiment are given the same reference numerals and their description is omitted.
FIG. 23 is a diagram showing the positional relationship among the projected object 85, the projection devices 61 and 62, and the imaging devices 71 and 72 in the fourth embodiment of the present invention. As shown in FIG. 23, the fourth embodiment uses a projected object 85. The projected object 85, like a transmissive screen, is provided on the assumption that an image projected onto one side is viewed from the other side. Owing to the positional relationship between the projection device 61 and the projected object 85 shown in FIG. 23, the image projected from the projection unit 21 of the projection device 61 of the pair P1 is viewed as a back-surface image on the viewing surface 86, which faces the side of the projected object 85 opposite the projection unit 21. The pair P2 is provided on the opposite side of the projected object 85 from the pair P1. That is, the pair P1 and the pair P2 are arranged so as to face each other across the projected object 85.
 Although the angles of view of the imaging devices 71 and 72 are not shown in FIGS. 23 and 25, in the fourth embodiment, as in the third embodiment, the imaging device 71 is provided so that its angle of view corresponds to that of the projection device 61, and the imaging device 72 so that its angle of view corresponds to that of the projection device 62.
FIG. 24 is a diagram showing an example of the relationship among the first projected image, the second projected image, the projected image, the back-surface image, the first captured image, and the second captured image. The relationship shown in FIG. 24 corresponds to the arrangement shown in FIG. 23. As the back-surface image V9R of FIG. 24 shows, the content of the back-surface image viewed on the viewing surface 86, which faces the side of the projected object 85 opposite the projection unit 21, is the projected image V9 corresponding to the content of the projected image data 11b projected by the projection unit 21, flipped horizontally. As the correspondence between the second captured image 32i and the back-surface image V9R in FIG. 24 shows, the content of the second captured image corresponds to the content of the back-surface image. In all other respects, the relationship among the first projected image, the second projected image, the projected image, the first captured image, and the second captured image in the fourth embodiment is the same as the corresponding relationship in the third embodiment described with reference to FIG. 13.
FIG. 25 is a diagram showing an example of a case in the fourth embodiment in which an obstacle OB2 is within the angle of view of projection by the projection device 61 and within the angle of view of imaging by the imaging device 71. FIG. 26 is a diagram showing an example of the relationship among the first projected image, the second projected image, the projected image, the back-surface image, the first captured image, and the second captured image. The relationship shown in FIG. 26 corresponds to the arrangement shown in FIG. 25.
As shown in FIG. 25, when the obstacle OB2 is within the angle of view of projection by the projection device 61 and within the angle of view of imaging by the imaging device 71, the obstacle OB2 blocks part of the projection light with which the projection device 61 projects the image. Therefore, as in the projected image V10 shown in FIG. 25, an image including the shadow region DA2 corresponding to the light blocked by the obstacle OB2 is formed in the overlapping region 81. Also, as in the back-surface image V10R shown in FIG. 25, the content of the back-surface image is the projected image V10 flipped horizontally. Accordingly, the position and shape of the shadow region DA2 in the back-surface image V10R are the horizontally flipped position and shape of the shadow region DA2 in the projected image V10.
 In the example shown in FIG. 26, only the position of the shadow region DA2 appears to be reversed between the projected image V10 and the back-surface image V10R, but this is because the obstacle OB2 is symmetric about the optical axis of the projection unit 21. If the obstacle OB2 were asymmetric, the shape of the shadow region DA2 in the back-surface image V10R would also be the horizontally flipped shape of the shadow region DA2 in the projected image V10.
 Further, since the obstacle OB2 is within the angle of view of imaging by the imaging device 71, the imaging unit 31 captures an image in which the obstacle OB2 is located inside the shadow region DA2, as in the first captured image 31j shown in FIG. 26. On the other hand, as in the second captured image 32j shown in FIG. 26, the image captured by the imaging unit 32 is the same as the back-surface image V10R.
FIG. 27 is a diagram showing an example of the relationship among the first projected image, the second projected image, the projected image, the back-surface image, the first captured image, and the second captured image when the corrected image is projected. The relationship shown in FIG. 27 corresponds to the arrangement shown in FIG. 5. FIG. 27 also assumes that the corrected image 15a has been generated from the result of extracting the images of the shadow region DA2 and the obstacle OB2 shown in FIG. 26.
When generating the corrected image, the image correction unit 115 of the fourth embodiment additionally flips the enlarged obstacle image horizontally to obtain a mirrored image. In the processing performed by the image correction unit 115 of the fourth embodiment, the position and size of the shadow region DA2 are referenced to the image captured by the imaging unit 32, as in the second captured image 32j shown in FIG. 26. This processing of the image correction unit 115 of the fourth embodiment may instead be performed by the control unit 116 of the fourth embodiment.
 FIG. 27 illustrates the mirrored image MOB2 generated based on the obstacle OB2. In FIG. 27, the direction of the single diagonal drawn on the obstacle OB2 and the direction of the single diagonal drawn on the mirrored image MOB2 are mirror images of each other, indicating that the mirrored image MOB2 is the obstacle OB2 flipped horizontally and enlarged.
 As described with reference to FIG. 23, the pair P1 and the pair P2 are arranged so as to face each other across the projected object 85. Accordingly, the projection unit 22 projects the corrected image from the side of the projected object 85 opposite the projection unit 21, that is, from the side directly facing the viewing surface 86. Here, the enlarged obstacle image has been mirrored, and the position and size of the shadow region DA2 are referenced to the image captured by the imaging unit 32. The position and shape of the mirrored image MOB2 included in the corrected image 15a therefore correspond to the horizontally flipped position and shape of the shadow region DA2 in the projected image V10 shown in FIG. 26 and in the projected image V11 shown in FIG. 27. As a result, as the back-surface image V11R of FIG. 27 shows, an image in which the obstacle image enlarged to match the shadow region is fitted into that region can be projected even in the configuration of the fourth embodiment, in which the pair P1 and the pair P2 are arranged facing each other across the projected object 85.
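A minimal sketch of this additional mirroring step is shown below; it builds on the illustrative make_corrected_image() helper introduced earlier, and both names are assumptions of these sketches rather than the embodiment's own interfaces.

# Illustrative sketch only; builds on the hypothetical make_corrected_image() above.
def make_mirrored_corrected_image(frame, obstacle_mask, shadow_bbox_rear):
    """shadow_bbox_rear is measured in the rear-side capture (imaging unit 32),
    so the patch is placed there and flipped left-right before projection by
    the projection unit facing the viewing surface 86."""
    rgba = make_corrected_image(frame, obstacle_mask, shadow_bbox_rear)
    x, y, w, h = shadow_bbox_rear
    # Mirror only the placed patch; the transparent surroundings are unchanged.
    rgba[y:y + h, x:x + w] = rgba[y:y + h, x:x + w][:, ::-1, :]
    return rgba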
As described above, the same effects as in the third embodiment can be obtained even with an arrangement of projection units and imaging units, as in the fourth embodiment, that differs from that of the third embodiment.
 Adding new value by flexibly changing the content of the projected image according to an obstacle on the projection path, such as a performer standing in the light that projects the image, was not possible with the technique described in Patent Document 1. A technique capable of flexibly changing the content of the image has therefore been sought. According to the third and fourth embodiments, new value can be added to the projected image.
Although two projection units 21 and 22 are illustrated in FIGS. 1 and 11, the number of projection units may be three or more. That is, in the configuration shown in FIG. 2 and the like, there may be three or more projection devices. In the first embodiment, it suffices that the luminance of the projected image from one or more projection units with no obstacle on their optical axis is corrected. The degree of correction only needs to be such that, by overlapping the projected images from the projection units with no obstacle on their optical axes, the luminance drop in the shadow region caused by the projection unit with an obstacle on its optical axis is offset. In the second embodiment, it suffices that the image is projected by one or more projection units with no obstacle on their optical axis, that is, that the image is not projected by a projection unit with an obstacle on its optical axis.
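As a sketch of the first-embodiment-style compensation mentioned here, the projectors with a clear optical axis could raise their output inside the shadow region so that the summed luminance on the screen stays roughly uniform; the additive blending model and the gain formula below are assumptions of this sketch.

# Illustrative sketch only; assumes projectors whose contributions add linearly.
import numpy as np

def compensate_clear_projector(frame, shadow_mask, n_clear):
    """Boost the shadow region in the frame of one unobstructed projector.

    frame:       float32 image in [0, 1] projected by an unobstructed projector
    shadow_mask: boolean mask of the shadow region in that projector's frame
    n_clear:     number of projectors with a clear optical axis
    """
    out = frame.copy()
    # One projector's light is missing in the shadow region; share the missing
    # contribution evenly across the remaining clear projectors.
    gain = (n_clear + 1) / n_clear
    out[shadow_mask] = np.clip(out[shadow_mask] * gain, 0.0, 1.0)
    return out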
Although two imaging units 31 and 32 are illustrated in FIGS. 1 and 11, the number of imaging units may be three or more. In the first embodiment, the number of imaging units corresponds to the number of projection units. In the second embodiment, as described with reference to FIGS. 9, 10, and 11, an imaging unit may be provided so as to be paired with each projection unit, or a plurality of imaging units may be provided at positions independent of the projection units.
 Although two projection units 21 and 22 are illustrated in FIG. 18, the number of projection units in the third embodiment may be three or more. That is, in the configuration shown in FIG. 19 and the like, there may be three or more projection devices. Likewise, although two imaging units 31 and 32 are illustrated in FIG. 18, the number of imaging units may be three or more. In the third embodiment, the number and arrangement of the imaging units correspond to the number of projection units.
The functional configurations shown for the image projection system 1 in FIG. 1, the image projection system 100 in FIG. 11, and the image projection system 200 in FIG. 18 may each be the functional configuration of a single device. That is, the present invention is not limited to a system configuration combining a plurality of devices as shown in FIGS. 2, 12, and 19, and may be realized by a single device having the functions shown in FIGS. 1, 11, and 18. In other words, the configuration denoted by reference numeral 1, the configuration denoted by reference numeral 100, and the configuration denoted by reference numeral 200 may each be an image projection device. As a specific example, the configuration denoted by reference numeral 1 and the configuration denoted by reference numeral 100 may each be an integrated device provided like a so-called projection television. In the third embodiment, however, the projected object is provided at a position separated from the device; that is, the device is configured so that an obstacle can intervene between the device and the projected object.
The content of the projected image data 11b illustrated in FIG. 4 and elsewhere, the positions and shapes of the obstacle OB and the shadow region DA illustrated in FIG. 6 and elsewhere, the content of the corrected image 14a illustrated in FIG. 7, the positions and shapes of the obstacle OB2 and the shadow region DA2 illustrated in FIG. 20 and elsewhere, and the content of the corrected image 15a illustrated in FIG. 21 are merely examples and are not limiting. The present invention can project an image having arbitrary content, and can generate a corrected image corresponding to the shadow region produced by the shape of an obstacle within the angle of view of the projection.
Further, the projection control unit 17 of the second embodiment may identify the position of the shadow in the projected image based on the captured images obtained by imaging the projected images projected while switching frame by frame among the plurality of projection units 21 and 22, and on the content of the projected image data 11b, which is the projected image. That is, the projection control unit 17 may have the shadow-region-extraction functions of the shadow region identification unit in the first embodiment and of the obstacle detection unit 16 in the second embodiment.
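The frame-by-frame switching mentioned here can be sketched as follows: only one projector is active in a given frame, so a luminance drop in that frame's capture can be attributed unambiguously to that projector's light path. The function name and the simple thresholding are illustrative assumptions.

# Illustrative sketch only; names and the threshold are assumptions.
import numpy as np

def locate_shadows_per_projector(captures, projected, threshold=0.3):
    """captures[i]: frame captured while only projector i was projecting.
    projected:   the projected image content (float32 in [0, 1], same size).
    Returns a dict mapping projector index -> shadow mask for projectors whose
    light path appears to be obstructed."""
    shadows = {}
    for i, cap in enumerate(captures):
        diff = projected.astype(np.float32) - cap.astype(np.float32)
        mask = diff > threshold
        if mask.any():
            shadows[i] = mask
    return shadows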
Although embodiments of the present invention have been described above, the present invention is not limited by the contents of these embodiments. The components described above include those that a person skilled in the art could easily conceive of, those that are substantially the same, and those within a so-called range of equivalents. Furthermore, the components described above can be combined as appropriate, and various omissions, substitutions, or modifications of the components can be made without departing from the gist of the embodiments described above.
1, 100, 200  Image projection system
11a, 11c, 11d  Image projection program
12  Image acquisition unit
13  Shadow region identification unit
14, 115  Image correction unit
15, 116  Control unit
16  Obstacle detection unit
17  Projection control unit
21, 22  Projection unit
31, 32  Imaging unit
50, 500  Information processing device
61, 62  Projection device
71, 72, 75  Imaging device
113  Identification unit
114  Extraction unit

Claims (7)

1.  An image projection device comprising:
    a plurality of projection units that project from different image projection angles so that a plurality of projected images form an overlapping image overlapping in a same region;
    a plurality of imaging units, each provided so as to be paired with one of the plurality of projection units, that image the same region from angles corresponding to the image projection angles of the respective projection units to generate captured images;
    a shadow region identification unit that identifies, based on the captured images, a shadow region produced in the overlapping image by an obstacle between a projection unit and the same region;
    an image correction unit that generates a corrected image in which a region corresponding to the shadow region in the projected image projected by the projection unit is corrected; and
    a control unit that causes the projection unit to project the corrected image.
2.  The image projection device according to claim 1, wherein
    when identifying the shadow region, the shadow region identification unit further identifies the projection unit paired with the imaging unit that captures the captured image containing the obstacle causing the shadow region, and
    the control unit causes the corrected image to be projected by a projection unit different from the projection unit paired with the imaging unit that captures the captured image containing the obstacle causing the identified shadow region.
3.  The image projection device according to claim 1 or 2, wherein a paired projection unit and imaging unit are provided such that, via a half mirror, the optical axis along which the projection unit projects the image and the optical axis along which the imaging unit images the same region overlap.
4.  An image correction method comprising the steps of:
    identifying, based on captured images generated by a plurality of imaging devices that are each provided so as to be paired with one of a plurality of projection devices projecting from different image projection angles so that a plurality of projected images form an overlapping image overlapping in a same region, and that image the same region from angles corresponding to the image projection angles of the respective projection devices, a shadow region produced in the overlapping image by an obstacle between a projection device and the same region;
    generating a corrected image in which a region corresponding to the shadow region in the projected image projected by the projection device is corrected; and
    causing the projection device to project the corrected image.
5.  An image projection device comprising:
    a plurality of projection units that project from different image projection angles so that a plurality of projected images form an overlapping image overlapping in a same region;
    an imaging unit that images the same region to generate a captured image;
    an obstacle detection unit that detects, based on the captured image, an obstacle between a projection unit and the same region; and
    a control unit that controls the projection units to project the projected image,
    wherein the control unit projects the projected image while switching among the plurality of projection units frame by frame, and performs control so as to identify a position of a shadow in the projected image based on the captured image obtained by the imaging unit imaging the projected image projected while being switched frame by frame and on the projected image.
6.  An image projection device comprising:
    a plurality of projection units that project from different image projection angles so that a plurality of projected images form an overlapping image overlapping in a same region;
    a plurality of imaging units, each provided so as to be paired with one of the plurality of projection units, that image the same region from angles corresponding to the image projection angles of the respective projection units to generate captured images;
    an identification unit that identifies, based on the captured images, an image of an obstacle between a projection unit and the same region, a shadow region produced in the overlapping image by the obstacle, and the projection unit paired with the imaging unit that captured the image of the obstacle;
    an extraction unit that extracts the image of the obstacle contained in the captured image;
    an image correction unit that generates a corrected image in which the extracted image of the obstacle is corrected to a position and size corresponding to the shadow region; and
    a control unit that controls a projection unit different from the projection unit paired with the imaging unit that captured the image of the obstacle to project the corrected image.
7.  An image projection device comprising:
    a screen having transparency such that images can be projected from both surfaces;
    a first-surface-side projection unit that projects a projected image onto a same region of the screen from a first surface side, and a second-surface-side projection unit that projects a projected image onto the same region from a second surface side;
    imaging units, each provided so as to be paired with one of the first-surface-side projection unit and the second-surface-side projection unit, that image the same region from angles corresponding to the image projection angles of the respective projection units to generate captured images;
    an identification unit that identifies, based on the captured images, an image of an obstacle between the first-surface-side projection unit or the second-surface-side projection unit and the same region, a shadow region produced in an overlapping image by the obstacle, and the projection unit paired with the imaging unit that captured the image of the obstacle;
    an extraction unit that extracts the image of the obstacle contained in the captured image; and
    a control unit that controls a projection unit different from the projection unit paired with the imaging unit that captured the image of the obstacle to project a corrected image.
PCT/JP2021/010239 2020-03-25 2021-03-12 Image projection device and image correction method WO2021193171A1 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2020-054877 2020-03-25
JP2020054877A JP2021158445A (en) 2020-03-25 2020-03-25 Image projection device, image correction method and program
JP2020054879A JP2021158447A (en) 2020-03-25 2020-03-25 Image projection device, image projection method and program
JP2020054878A JP2021158446A (en) 2020-03-25 2020-03-25 Image projection device, image projection method and program
JP2020-054879 2020-03-25
JP2020-054878 2020-03-25

Publications (1)

Publication Number Publication Date
WO2021193171A1 (en) 2021-09-30

Family

ID=77891826

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/010239 WO2021193171A1 (en) 2020-03-25 2021-03-12 Image projection device and image correction method

Country Status (1)

Country Link
WO (1) WO2021193171A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001222062A (en) * 2000-02-09 2001-08-17 Nikon Corp Projection type display device
JP2007248824A (en) * 2006-03-16 2007-09-27 Matsushita Electric Ind Co Ltd Image projection apparatus and system
JP2008042781A (en) * 2006-08-09 2008-02-21 Fuji Xerox Co Ltd Image processing apparatus
JP2008116565A (en) * 2006-11-01 2008-05-22 Seiko Epson Corp Image correction device, projection system, image correction method, image correction program and recording medium
JP2011257609A (en) * 2010-06-09 2011-12-22 Nippon Telegr & Teleph Corp <Ntt> Optical projection control method, optical projection control device, optical projection control system and program
WO2019188046A1 (en) * 2018-03-26 2019-10-03 富士フイルム株式会社 Projection system, projection control device, projection control method, and projection control program

Legal Events

Code  Title / Description
121   EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 21774097; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase (Ref country code: DE)
122   EP: PCT application non-entry in European phase (Ref document number: 21774097; Country of ref document: EP; Kind code of ref document: A1)