WO2021193171A1 - Image projection device and method - Google Patents

Image projection device and method

Info

Publication number
WO2021193171A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
projection
unit
projected
obstacle
Prior art date
Application number
PCT/JP2021/010239
Other languages
English (en)
Japanese (ja)
Inventor
浩 竹下
真哉 三原
礼子 近藤
友樹 杉山
直史 古川
Original Assignee
株式会社Jvcケンウッド
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2020054878A (published as JP2021158446A)
Priority claimed from JP2020054879A (published as JP2021158447A)
Priority claimed from JP2020054877A (published as JP2021158445A)
Application filed by 株式会社Jvcケンウッド (JVCKENWOOD Corporation)
Publication of WO2021193171A1

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00 Projectors or projection-type viewers; Accessories therefor
    • G03B21/14 Details
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/10 Intensity circuits
    • G09G5/36 Control arrangements or circuits characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38 Control arrangements or circuits with means for controlling the display position
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/74 Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • the present invention relates to an image projection device and an image correction method.
  • A so-called projection mapping technique is known in which images are projected from a plurality of image projection devices onto one object to be projected so as to form a single image (for example, Patent Document 1).
  • In Patent Document 1, when a region occurs in which the light for projecting an image is blocked, the image projection device is controlled so as to prohibit the projection of light onto that region.
  • However, it may be difficult to form the image intended for output on the object to be projected simply by prohibiting the projection of light as in Patent Document 1. That is, image quality deterioration may occur, such as a part of the image becoming dark or chipped.
  • An object of the present invention is to provide an image projection device and an image correction method capable of preventing deterioration of image quality such as changes in the brightness of a part of the projected image or chipping.
  • The image projection device of the present invention includes a plurality of projection units that project from different image projection angles so that a plurality of projected images form an overlapping image in the same region, a plurality of imaging units, provided so as to be paired with each of the plurality of projection units, that capture the same region from an angle corresponding to the image projection angle of the paired projection unit to generate a captured image, and a unit that identifies, based on the captured image, a shadow region generated in the same region due to an obstacle between a projection unit and the same region.
  • In the image correction method of the present invention, imaging units are provided so as to be paired with each of a plurality of projection devices that project from different image projection angles so that a plurality of projected images form an overlapping image in the same region.
  • The image projection device of the present invention includes a plurality of projection units that project from different image projection angles so that a plurality of projected images form an overlapping image in the same region, and an imaging unit that images the same region to generate a captured image.
  • It further includes an obstacle detection unit that detects an obstacle between a projection unit and the same region based on the captured image, and a control unit that controls the projection image to be projected by each projection unit.
  • The control unit causes the plurality of projection units to project by switching among them frame by frame, and is characterized in that it performs control so as to specify the position of the shadow in the projected image based on the projected image and on the captured images obtained by the imaging unit capturing the projection switched for each frame.
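The frame-by-frame switching described above can be illustrated with a small sketch. This is hypothetical code, not from the patent (the function name and observation format are assumptions): each projector projects alone on its assigned frames, so a shadow seen in a frame's capture is attributed to the projector that was active in that frame.

```python
def attribute_shadows(frame_observations):
    """Attribute shadows to projectors under frame-interleaved projection.

    frame_observations: iterable of (projector_id, shadow_seen) pairs,
    one per frame, where only projector_id was projecting that frame.
    Returns the set of projector ids whose light path is blocked.
    """
    blocked = set()
    for projector_id, shadow_seen in frame_observations:
        if shadow_seen:
            blocked.add(projector_id)
    return blocked


# Projector 1's frames show a shadow; projector 2's do not, so the
# obstacle lies between projector 1 and the projection surface.
observations = [(1, True), (2, False), (1, True), (2, False)]
print(attribute_shadows(observations))  # → {1}
```

Because each frame carries light from exactly one projector, a single camera suffices to decide which projector's path is blocked.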
  • The image projection device of the present invention includes a plurality of projection units that project from different image projection angles so that a plurality of projected images form an overlapping image in the same region, and a plurality of imaging units, provided so as to be paired with each of the plurality of projection units, that capture the same region from an angle corresponding to the image projection angle of the paired projection unit to generate a captured image.
  • It further includes an extraction unit that extracts an image of the obstacle, an image correction unit that generates a corrected image by correcting the extracted obstacle image to a position and size corresponding to the shadow region, and a control unit that is characterized in that it causes a projection unit different from the one paired with the imaging unit that captured the obstacle image to project the corrected image.
  • The image projection device of the present invention includes a transmissive screen onto which an image can be projected from both surfaces, a first-surface-side projection unit that projects a projected image onto a given area of the screen from the first surface side, a second-surface-side projection unit that projects a projected image onto the same area from the second surface side, and imaging units provided so as to form a pair with each of the first-surface-side projection unit and the second-surface-side projection unit, imaging from an angle corresponding to the image projection angle of each projection unit.
  • According to the present invention, it is possible to prevent deterioration of image quality such as changes in the brightness of a part of the projected image or chipping.
  • FIG. 1 is a functional block diagram showing a configuration of an image projection system according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram showing an example of a configuration for realizing the functional block shown in FIG.
  • FIG. 3 is a diagram showing an example of the positional relationship between the projected object, the projection device, and the image pickup device for projecting an image by the plurality of projection devices and imaging by the plurality of image pickup devices.
  • FIG. 4 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image.
  • FIG. 5 is a diagram showing an example of a case where there is an obstacle within the angle of view of projection by the projection device and within the angle of view of imaging by the imaging device.
  • FIG. 6 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image.
  • FIG. 7 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image when the corrected image is projected.
  • FIG. 8 is a flowchart showing the flow of processing related to the detection of obstacles and shadow areas.
  • FIG. 9 is a schematic diagram showing an example of the relationship between the optical axes of the projection device and the image pickup device, which are treated as one set.
  • FIG. 10 is a schematic view showing a mechanism of a configuration in which an optical axis on which a projection device projects an image via a half mirror and an optical axis on which an imaging device images an overlapping region are provided so as to overlap each other.
  • FIG. 11 is a functional block diagram showing a configuration of an image projection system according to a second embodiment of the present invention.
  • FIG. 12 is a block diagram showing an example of a configuration for realizing the functional block shown in FIG.
  • FIG. 13 is a diagram showing an example of projecting an image on an overlapping region and imaging the overlapping region when there is no obstacle.
  • FIG. 14 is a diagram showing an example of projecting an image on an overlapping region and imaging the overlapping region when there is an obstacle.
  • FIG. 15 is a diagram showing an example of projection of an image on an overlapping region and imaging of the overlapping region after control corresponding to the detection of an obstacle is performed.
  • FIG. 16 is a flowchart showing the flow of processing related to the detection of obstacles and shadow areas.
  • FIG. 17 is a diagram showing a configuration example when there is only one imaging unit.
  • FIG. 18 is a functional block diagram showing a configuration of an image projection system according to a third embodiment of the present invention.
  • FIG. 19 is a block diagram showing an example of a configuration for realizing the functional block shown in FIG.
  • FIG. 20 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the first captured image, and the second captured image.
  • FIG. 21 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the first captured image, and the second captured image when the corrected image is projected.
  • FIG. 22 is a flowchart showing the flow of processing related to the detection of obstacles and shadow areas.
  • FIG. 23 is a diagram showing the positional relationship between the projected object, the projection device, and the image pickup device according to the fourth embodiment of the present invention.
  • FIG. 24 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the back surface image, the first captured image, and the second captured image.
  • FIG. 25 is a diagram showing an example of a case where the obstacle OB2 is within the angle of view of the projection by the projection device and the angle of view of the image captured by the image pickup device in the fourth embodiment.
  • FIG. 26 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the back surface image, the first captured image, and the second captured image.
  • FIG. 27 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the back surface image, the first captured image, and the second captured image when the corrected image is projected.
  • FIG. 1 is a functional block diagram showing a configuration of an image projection system 1 according to a first embodiment of the present invention.
  • the image projection system 1 includes a plurality of projection units, a plurality of imaging units, a storage unit 11, an image acquisition unit 12, a shadow area identification unit 13, an image correction unit 14, and a control unit 15.
  • the projection unit 21 and the projection unit 22 are illustrated as a plurality of projection units.
  • an imaging unit 31 and an imaging unit 32 are illustrated as a plurality of imaging units.
  • FIG. 2 is a block diagram showing an example of a configuration for realizing the functional block shown in FIG.
  • the image projection system 1 includes, for example, a plurality of projection devices, a plurality of image pickup devices, and an information processing device 50.
  • In FIG. 2, the data flow in the information processing apparatus 50 is indicated by solid arrows, and optical projection and imaging are indicated by broken arrows.
  • the information processing device 50 is a so-called computer.
  • The connection form between the information processing device 50 and the plurality of projection devices 61 and 62, and between the information processing device 50 and the plurality of imaging devices 71 and 72, may be wired, wireless, or a mixture of both.
  • The specific connection form may be a bus interface such as USB (Universal Serial Bus), a network communication line, or a dedicated connection form.
  • Each projection device shown in FIG. 2 includes one projection unit.
  • a projection device 61 and a projection device 62 are illustrated as a plurality of projection devices.
  • the projection devices 61 and 62 are so-called projectors, and project an image onto the projected object 80.
  • the projected object 80 is a so-called screen.
  • The projected object 80 is, for example, a diffusing, reflective, or retroreflective screen, and is provided on the assumption that the projection surface itself is visually recognized.
  • the projected object 80 may be provided on the assumption that the back surface of the projected surface can be visually recognized, such as a transmissive screen.
  • the projection device 61 includes a projection unit 21.
  • the projection device 62 includes a projection unit 22.
  • The projection units 21 and 22 each include a display element, a light source that irradiates the display element with light, an optical member such as a lens that converges the light reflected by or transmitted through the display element as an image on the projected object 80, and a control circuit that drives the display element according to image data input from the outside.
  • Examples of the display element include, but are not limited to, an LCOS (Liquid Crystal On Silicon) device, a DMD (Digital Micromirror Device), or a liquid crystal device, and the element can be changed as appropriate.
  • Each imaging device shown in FIG. 2 includes one imaging unit.
  • an image pickup device 71 and an image pickup device 72 are illustrated as a plurality of image pickup devices.
  • the imaging device 71 includes an imaging unit 31.
  • the imaging device 72 includes an imaging unit 32.
  • The imaging units 31 and 32 each function as a so-called digital camera, including an image pickup element and a circuit that generates an image based on the output of the image pickup element and outputs it as a captured image.
  • Examples of the image pickup device include a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor, but the present invention is not limited to these, and can be changed as appropriate.
  • FIG. 3 is a diagram showing an example of the positional relationship between the projected object 80, the projection devices 61 and 62, and the image pickup devices 71 and 72 for projection of an image by the plurality of projection devices and imaging by the plurality of image pickup devices.
  • the projection device 61 and the image pickup device 71 are treated as one set P1. That is, the projection unit 21 and the imaging unit 31 are provided so as to form a set P1. Further, the projection device 62 and the image pickup device 72 are treated as another set P2. That is, the projection unit 22 and the imaging unit 32 are provided so as to form a set P2.
  • the angle of view of the projection by the projection device 61 is shown within the acute angle formed by the two solid lines L1. Further, the angle of view of the projection by the projection device 62 is shown within the acute angle formed by the two solid lines L2.
  • the projection device 61 and the projection device 62 project images from different image projection angles so that the projected images overlap each other in the overlapping region 81 of the projected object 80 to form an overlapping image.
  • That is, the plurality of projection units 21 and 22 shown in FIGS. 1 and 2 project from different image projection angles so that the projection image projected by the projection unit 21 and the projection image projected by the projection unit 22 overlap in the same region to form an overlapping image.
  • the overlapping area 81 functions as the same area.
  • the angle of view of the image captured by the imaging device 71 is shown within the acute angle formed by the two broken lines L3. Further, the angle of view of the image captured by the imaging device 72 is shown within the acute angle formed by the two broken lines L4.
  • The imaging device 71 shown in FIG. 3 is provided so as to be paired with the projection device 61, and images the overlapping region 81 from an angle corresponding to the image projection angle of the paired projection device 61 to generate a captured image. Similarly, the imaging device 72 is provided so as to be paired with the projection device 62, and images the overlapping region 81 from an angle corresponding to the image projection angle of the projection device 62 to generate a captured image.
  • That is, the imaging unit 31 shown in FIGS. 1 and 2 is provided so as to be paired with the projection unit 21, and the imaging unit 32 is provided so as to be paired with the projection unit 22; each images the overlapping region 81 from an angle corresponding to the image projection angle of its paired projection unit to generate a captured image.
  • Hereinafter, the image projected by the projection unit 21 is referred to as the first projected image, the image projected by the projection unit 22 as the second projected image, the image formed in the overlapping region 81 of the projected object 80 by the images projected by the projection units 21 and 22 as the overlapping image, the image captured by the imaging unit 31 as the first captured image, and the image captured by the imaging unit 32 as the second captured image.
  • FIG. 4 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image.
  • the relationship shown in FIG. 4 corresponds to the arrangement shown in FIG.
  • The trapezoidal distortion that arises because the projection devices 61 and 62 project obliquely onto the projected object 80 and the imaging devices 71 and 72 image it from oblique angles shall be appropriately corrected in advance.
  • For example, the projection unit 21 of the projection device 61 of FIG. 2 may be configured to perform a predetermined trapezoidal distortion correction process on the first projected image according to its installation position.
  • the projection device 62 of FIG. 2 may be configured in the same manner as the projection device 61, and the image pickup device 72 may be configured in the same manner as the image pickup device 71.
  • the predetermined trapezoidal distortion correction process may be any generally used trapezoidal distortion correction process.
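One generally used approach models keystone (trapezoidal) distortion as a planar homography estimated from four corner correspondences. The following NumPy sketch is illustrative only (coordinates, function names, and the DLT estimator are assumptions, not taken from the patent):

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 homography mapping 4 source points to 4
    destination points via the direct linear transform (DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography vector spans the null space of this 8x9 matrix;
    # take the right-singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def warp_point(h, point):
    """Apply a homography to a 2-D point (homogeneous divide included)."""
    x, y, w = h @ np.array([point[0], point[1], 1.0])
    return (x / w, y / w)

# Corners of the trapezoid an oblique projector would produce, and the
# rectangle they should occupy after keystone correction.
trapezoid = [(10, 0), (90, 0), (100, 100), (0, 100)]
rectangle = [(0, 0), (100, 0), (100, 100), (0, 100)]
H = homography(trapezoid, rectangle)
```

Pre-warping each output frame with the inverse of this mapping makes the obliquely projected image land as a rectangle on the projection surface.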
  • FIG. 4 and the like exemplify a case where an image corresponding to the projected image data 11b (see FIG. 2) stored in the storage unit 11 is projected.
  • Here, the case where the image corresponding to the projected image data 11b stored in the storage unit 11 is projected is taken as an example, but the projected image is not limited to this. Images corresponding to projected image data read from an external storage device connected to the image projection system 1, or to projected image data input from an external information processing device connected to the image projection system 1, may also be projected.
  • As in the overlapping image V1 shown in FIG. 4, an image in which the projection by the projection unit 21 and the projection by the projection unit 22 coincide is formed in the overlapping region 81.
  • In this case, the images captured by the imaging unit 31 and the imaging unit 32 are the same as the overlapping image V1. This is because there is no obstacle within the angle of view of projection by either the projection device 61 or the projection device 62, nor within the angle of view of imaging by either the image pickup device 71 or the image pickup device 72.
  • FIG. 5 is a diagram showing an example of a case where there is an obstacle OB within the angle of view of the projection by the projection device 61 and within the angle of view of the image captured by the image pickup device 71.
  • FIG. 6 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image. The relationship shown in FIG. 6 corresponds to the arrangement shown in FIG.
  • When there is an obstacle OB within the angle of view of projection by the projection device 61 and within the angle of view of imaging by the image pickup device 71, the obstacle OB blocks part of the projected light with which the projection device 61 projects the image. Therefore, as in the overlapping image V2 shown in FIG. 6, an image including a shadow region DA corresponding to the light blocked by the obstacle OB is formed in the overlapping region 81.
  • In this way, when there is an obstacle OB between one of the plurality of projection units and the overlapping region 81, the obstacle OB causes a shadow region DA in the overlapping image V2.
  • In this case, the imaging unit 31 captures an image in which the obstacle OB is located inside the shadow region DA, as in the first captured image 31b shown in FIG. 6. The imaging unit 32, however, captures the overlapping region 81 from a different angle, so the contents of the first captured image 31b and the contents of the second captured image 32b differ from each other.
  • The information processing device 50 identifies the shadow region and the obstacle, and performs processing according to the identification results.
  • the information processing device 50 includes a storage unit 11 and a calculation unit 51.
  • The storage unit 11 includes a storage device capable of storing software programs (hereinafter simply referred to as programs) and data. Examples of the storage device included in the storage unit 11 include, but are not limited to, a hard disk drive, a solid state drive, and a flash memory, and the device can be changed as appropriate.
  • the storage unit 11 may be a combination of a reading device for a recording medium such as an optical disc and a recording medium set in the reading device.
  • the storage unit 11 stores the image projection program 11a and the projected image data 11b.
  • the image projection program 11a is a program read and executed by the calculation unit 51.
  • The arithmetic unit 51 includes an arithmetic circuit, such as a CPU (Central Processing Unit), that realizes various functions by reading and executing programs.
  • The calculation unit 51 of the first embodiment functions as the image acquisition unit 12, the shadow area identification unit 13, the image correction unit 14, and the control unit 15 shown in FIGS. 1 and 2 by reading and executing the image projection program 11a.
  • The image acquisition unit 12 acquires the image to be projected by the projection units 21 and 22. Specifically, the image acquisition unit 12 reads the projected image data 11b from, for example, the storage unit 11.
  • the projected image data 11b acquired by the image acquisition unit 12 can be referred to by the shadow area identification unit 13 and the image correction unit 14. Further, unless the corrected image is generated by the image correction unit 14 described later, the projection units 21 and 22 project an image corresponding to the content of the projected image data 11b acquired by the image acquisition unit 12 (see FIG. 4).
  • The shadow area specifying unit 13 identifies a shadow area generated in the overlapping area 81 due to an obstacle between one projection unit and the overlapping area 81, based on the captured images. Specifically, the shadow region specifying unit 13 performs image analysis comparing the content of the images captured by the imaging units 31 and 32 with the content of the image acquired by the image acquisition unit 12. More specifically, when a partial region occurs in which the brightness of the captured images by the imaging units 31 and 32 has decreased relative to the brightness distribution expected from the content of the projected image data 11b, the shadow region specifying unit 13 extracts that partial region as a shadow region.
  • For example, the identification may be configured as follows: the image acquisition unit 12 acquires the first projected image, a difference between the first captured image and the first projected image is computed to generate a difference signal, and a partial region in which the brightness has decreased by a certain amount is specified from this difference signal as the shadow region. Further, a partial region of the difference signal whose difference amount differs from the characteristics of the region where the brightness decreased by a certain amount may be specified as the region of the obstacle, and that partial region is extracted as an image of the obstacle. In this way, the shadow region specifying unit 13 specifies the shadow region.
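A minimal sketch of this difference-signal thresholding, assuming single-channel luminance images normalized to [0, 1] (the function name and threshold value are illustrative, not from the patent):

```python
import numpy as np

def detect_shadow(projected, captured, drop_threshold=0.3):
    """Threshold the 'difference signal' between the expected (projected)
    luminance and the captured luminance: pixels where the brightness
    dropped by more than drop_threshold are flagged as shadow."""
    difference = projected.astype(float) - captured.astype(float)
    return difference > drop_threshold

# Expected luminance of the projection, and a capture with a shadowed patch.
projected = np.full((4, 4), 0.8)
captured = projected.copy()
captured[1:3, 1:3] = 0.2  # an obstacle blocked the light here
shadow_mask = detect_shadow(projected, captured)
print(int(shadow_mask.sum()))  # → 4
```

A second, different threshold band on the same difference signal could likewise separate the obstacle region from the shadow region, as the passage above suggests.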
  • The image correction unit 14 generates a corrected image in which the region corresponding to the shadow region is corrected in the image projected by one or more projection units other than the projection unit that has the obstacle within its angle of view of projection. Specifically, the image correction unit 14 generates a corrected image in which the brightness of a partial region of the projected image data 11b, at the position and size corresponding to the shadow region, is raised so that the overlapping image in the overlapping region 81 is drawn with its original brightness. The degree of correction corresponds to the degree of brightness decrease that the shadow region extracted by the shadow region specifying unit 13 causes in the content of the projected image data 11b. For example, when the second captured image has a shadow region, it is preferable to perform the brightness-raising correction based on the difference signal obtained by the image analysis in the shadow region specifying unit 13; when the first captured image has a shadow region, the same correction may be performed.
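The brightness-raising correction can be sketched as follows, under the simplifying assumptions that the two projectors' contributions add linearly and that images are single-channel luminance in [0, 1] (names and values are illustrative, not from the patent):

```python
import numpy as np

def correct_image(other_image, projected, captured, shadow_mask):
    """Raise the other projector's image inside the shadow region by the
    luminance lost there, so the overlap regains its original brightness."""
    lost = np.clip(projected - captured, 0.0, 1.0)
    corrected = other_image.astype(float).copy()
    corrected[shadow_mask] = np.clip(
        corrected[shadow_mask] + lost[shadow_mask], 0.0, 1.0)
    return corrected

projected = np.full((4, 4), 0.4)   # each projector contributes 0.4
captured = projected.copy()
captured[1:3, 1:3] = 0.0           # projector 1's light fully blocked here
mask = captured < projected
corrected = correct_image(projected, projected, captured, mask)
# Overlap brightness: the blocked projector contributes `captured`, the
# other projector contributes `corrected`; their sum is uniform again.
overlap = captured + corrected
```

In this toy case the overlap is uniform after correction, mirroring how the raised correction region BA offsets the darkened shadow region DA in the overlapping image.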
  • The control unit 15 comprehensively controls each component and each functional unit of the image projection system 1. Specifically, the control unit 15 of the first embodiment causes a projection unit to project the corrected image. More specifically, the control unit 15 causes the corrected image to be projected by a projection unit different from the one paired with the imaging unit whose captured image includes, within its angle of view, the obstacle that caused the specified shadow region.
  • FIG. 7 is a diagram showing an example of the relationship between the first projected image, the second projected image, the overlapping image, the first captured image, and the second captured image when the corrected image is projected.
  • the relationship shown in FIG. 7 corresponds to the arrangement shown in FIG. Further, FIG. 7 assumes a case where the corrected image 14a is generated in response to the result of extracting the images of the shadow region DA and the obstacle OB shown in FIG.
  • a partial region having a position and a size corresponding to a shadow region and having an increased brightness in the corrected image 14a is shown as a correction region BA.
  • The determination that "there is an obstacle OB within the angle of view of projection by the projection unit 21" is made based on the fact that a shadow that should not exist in the projected image stored in the storage unit 11 appears in the second captured image, which is the image captured by the imaging unit 32. Alternatively, it may be configured so that it is determined from the first captured image, in which the obstacle OB itself appears, that there is an obstacle OB within the angle of view of projection by the projection unit 21.
  • the image correction unit 14 generates a corrected image 14a in which a correction for increasing the brightness of a partial region, having a position and size corresponding to the shadow region DA in the second captured image, is applied to the content of the projected image data 11b.
  • the control unit 15 causes the projection unit 22 to project the corrected image 14a as the second projected image.
  • the control unit 15 causes the projection unit 21 to project the contents of the projected image data 11b.
  • the decrease in brightness of the shadow region caused by the obstacle OB in the projection corresponding to the projected image data 11b by the projection unit 21 is offset by the increase in brightness of the correction region BA in the projection corresponding to the corrected image 14a by the projection unit 22. Therefore, as in the overlapping image V3 shown in FIG. 7, an image similar to the content of the projected image data 11b is formed in the overlapping region 81.
  • the control unit 15 can confirm that the correction of the shadow region is normally performed based on the first captured image 31c and the second captured image 32c.
  • the data flow of the output image is shown, in FIG. 1 and in FIG. 11 described later, as reaching the projection units 21 and 22 from the image acquisition unit 12 via the image correction unit 14; however, the data flow is not limited to this. There may be a data flow directly from the image acquisition unit 12 to the projection units 21 and 22. Whichever data flow is adopted, when the control unit 15 generates a corrected image such as the corrected image 14a, the corrected image is projected by the projection unit that does not have the obstacle within its angle of view, and the image acquired by the image acquisition unit 12 is projected by the projection unit that has the obstacle within its angle of view. In the configurations illustrated in FIG. 1 and FIG. 11, for the projection unit to which no corrected image is output, the image correction unit 14 passes through, as it is, the output corresponding to the content of the image acquired by the image acquisition unit 12.
  • FIG. 8 is a flowchart showing the flow of processing related to the detection of obstacles and shadow regions. As a premise of this processing, images corresponding to the image acquired by the image acquisition unit 12 are projected by the projection units 21 and 22.
  • the imaging units 31 and 32 acquire a plurality of captured images by imaging the overlapping region 81 (step S1).
  • the shadow area identification unit 13 detects images of the shadow area and obstacles based on the captured image acquired in the process of step S1 (step S2). When the image of the shadow area and the obstacle is not detected in the process of step S2 (step S2; No), the process ends.
  • when an image of a shadow region and an obstacle is detected in the process of step S2 (step S2; Yes), the image correction unit 14 generates a corrected image in which the brightness of the region corresponding to the shadow region is increased so as to compensate for the decrease in brightness of the shadow region (step S3). Then, the control unit 15 causes the projection unit having no obstacle within the angle of view of the image projection, specified based on the process of step S2, to project the corrected image (step S4). In the example shown in FIG. 7, the projection unit treated in the process of step S4 as having no obstacle within the angle of view of the image projection is the projection unit 22.
  • following the process of step S4, a plurality of captured images are acquired again as in step S1.
  • the control unit 15 determines whether or not the decrease in brightness of the shadow region has been eliminated based on the plurality of captured images (step S5). When it is determined in the process of step S5 that the decrease in brightness of the shadow region has been eliminated (step S5; Yes), the process ends. On the other hand, if it is determined in the process of step S5 that the decrease in brightness of the shadow region has not been resolved (step S5; No), the process returns to the process of step S3.
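The flow of steps S1 to S5 can be sketched as a control loop; the callables (`capture_images`, `detect_shadow`, and so on) are hypothetical placeholders for the units described above, not names from the original:

```python
def run_shadow_correction(capture_images, detect_shadow, make_corrected_image,
                          project, shadow_cleared, max_rounds=10):
    """Steps S1-S5: capture, detect the shadow region, generate and
    project a corrected image, then re-check until the darkening is
    eliminated (or give up after max_rounds attempts)."""
    images = capture_images()                          # step S1
    detection = detect_shadow(images)                  # step S2
    if detection is None:                              # step S2; No
        return True
    for _ in range(max_rounds):
        corrected = make_corrected_image(detection)    # step S3
        project(corrected, avoid=detection.blocked_unit)  # step S4
        images = capture_images()                      # re-acquire as in step S1
        if shadow_cleared(images):                     # step S5; Yes
            return True
        detection = detect_shadow(images)              # step S5; No -> back to S3
        if detection is None:
            return True
    return False
```

The `max_rounds` bound is an added safeguard; the flowchart itself simply loops back to step S3 until the brightness decrease is resolved.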
  • it is preferable that, for the projection unit and the imaging unit treated as one set, the optical axis of the image projection by the projection unit and the optical axis of the imaging by the imaging unit coincide with each other as closely as possible.
  • FIG. 9 is a schematic diagram showing an example of the relationship between the optical axes of the projection device 61 and the image pickup device 71, which are treated as one set P1.
  • the projection device 61 and the image pickup device 71 are arranged side by side so that the optical axis along which the projection device 61 projects an image onto the overlapping region 81 and the optical axis along which the image pickup device 71 images the overlapping region 81 correspond to each other.
  • the angle of view of the projection device 61 and the angle of view of the image pickup device 71 can be matched.
  • FIG. 10 is a schematic diagram showing a mechanism of a configuration in which an optical axis on which a projection device 61 projects an image via a half mirror 90 and an optical axis on which an imaging device 71 images an overlapping region 81 are provided so as to overlap each other.
  • the half mirror 90 is provided on the optical axis along which the projection device 61 projects an image. The half mirror 90 transmits the light emitted by the projection device 61 for projecting the image, and can reflect a part of the reflected light from the overlapping region 81 so as to direct it at an angle different from that optical axis.
  • the image pickup device 71 is arranged at a position corresponding to the angle at which the half mirror 90 reflects a part of the reflected light from the overlapping region 81.
  • with this configuration, between the overlapping region 81 and the half mirror 90, the optical axis along which the projection device 61 projects an image can be aligned with the optical axis along which the imaging device 71 images the overlapping region 81.
  • the angle of view of the projection device 61 and the angle of view of the image pickup device 71 can be matched with higher accuracy.
  • here, the set P1 is taken as an example, but the same configuration can be applied when there is a set P2 or a third or further set (not shown).
  • the decrease in brightness caused by an obstacle can be compensated by projecting a corrected image. Therefore, it is possible to prevent a change in the brightness of a part of the projected image and deterioration of the image quality such as chipping, and it is possible to more reliably maintain the image quality of the image projected on the overlapping region 81.
  • FIG. 11 is a functional block diagram showing the configuration of the image projection system 100 according to the second embodiment of the present invention.
  • FIG. 12 is a block diagram showing an example of a configuration for realizing the functional block shown in FIG.
  • the image projection system 100 includes an obstacle detection unit 16 and a projection control unit 17 instead of the shadow region identification unit 13 and the image correction unit 14 included in the image projection system 1.
  • the calculation unit 51 reads out and executes the image projection program 11c stored in the storage unit 11 included in the information processing device 50 of the image projection system 100.
  • the obstacle detection unit 16 and the projection control unit 17 are realized as functions that replace the shadow area identification unit 13 and the image correction unit 14.
  • the calculation unit 51 of the second embodiment functions as the image acquisition unit 12, the control unit 15, the obstacle detection unit 16, and the projection control unit 17 shown in FIGS. 11 and 12 by reading and executing the image projection program 11c.
  • the obstacle detection unit 16 provides at least the function of extracting an image of an obstacle among the functions provided by the shadow region specifying unit 13. That is, the obstacle detection unit 16 detects an obstacle between one projection unit and the overlapping region based on the images captured by the imaging units 31 and 32.
  • the obstacle detection unit 16 may be configured to further specify the projection unit that is paired with the imaging unit that captured the image including the obstacle causing the shadow region. Since the projection unit associated with the imaging unit that captured the image is configured as one set with it, if the imaging unit can be specified, the corresponding projection unit can easily be specified. Such specification may be performed by the obstacle detection unit 16 or by the projection control unit 17. Further, information on the specified projection unit may be stored in the storage unit 11.
  • the shadow-region extraction function of the shadow region specifying unit 13 is not essential for the obstacle detection unit 16, but may be included. In the description of the second embodiment, it is assumed that the functions provided by the obstacle detection unit 16 are the same as those provided by the shadow region specifying unit 13.
  • when an obstacle is detected between one projection unit and the overlapping region, the projection control unit 17 causes a projection unit different from that one projection unit to project the image. Further, when no obstacle is detected, the projection control unit 17 operates the plurality of projection units so as to switch, at a predetermined cycle, which of the plurality of projection units projects the image onto the overlapping region 81. In this way, the projection control unit 17 switches, for each frame, which of the plurality of projection units 21 and 22 projects the projected image.
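The frame-by-frame switching performed by the projection control unit 17 can be sketched as a simple scheduler; the generator below is an illustrative assumption about one way to realize it, not the patented implementation:

```python
def frame_scheduler(projector_ids, is_blocked=lambda pid: False):
    """Yield, one per frame, the projector that should draw the
    overlapping region: alternate among all projectors at the frame
    cycle, skipping any projector currently reported as having an
    obstacle within its angle of view."""
    while True:
        for pid in projector_ids:
            if not is_blocked(pid):
                yield pid
```

With no obstacle this alternates 21, 22, 21, 22, and so on; once projector 21 is reported blocked, every frame goes to projector 22.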
  • the projection control unit 17 preferably causes the projected image to be projected by a projection unit different from the projection unit that is paired with the imaging unit that captured the image including, within the angle of view of the projection, the obstacle causing the specified shadow region.
  • FIG. 13 is a diagram showing an example of projection of an image on the overlapping region 81 and imaging of the overlapping region 81 when there is no obstacle.
  • the image formed in the overlapping region 81 is described as a projected image.
  • the projection unit 21 projects an image corresponding to the projected image data 11b as the first projected image.
  • the second projected image is not projected.
  • the fact that the second projected image is not projected is illustrated by a black filled rectangle NL.
  • the configuration for projecting an image is switched at a predetermined cycle. Therefore, there is a period during which the second projected image is projected and the first projected image is not projected.
  • the configuration for projecting an image according to the update cycle of the frame image is alternately switched between the projection unit 21 and the projection unit 22. It may be switched every frame, or it may be switched every predetermined number of frames.
  • the captured images by the imaging unit 31 and the imaging unit 32 are the same images as the projected image V4.
  • FIG. 14 is a diagram showing an example of projecting an image on the overlapping region 81 and imaging the overlapping region 81 when there is an obstacle.
  • FIG. 14 shows a state in which the operation control of the plurality of projection units by the projection control unit 17 in response to the detection of an obstacle has not yet been performed.
  • the obstacle OB blocks a part of the projected light with which the projection device 61 projects the image. Therefore, during the period in which the projection unit 21 is projecting the image corresponding to the projected image data 11b as the first projected image, an image including the shadow region DA corresponding to the light blocked by the obstacle OB is formed in the overlapping region 81, as in the projected image V5 shown in FIG. 14.
  • an image in which the obstacle OB is located inside the shadow region DA, as in the first captured image 31e shown in FIG. 14, is captured by the imaging unit 31.
  • FIG. 15 is a diagram showing an example of projection of an image on the overlapping region 81 and imaging of the overlapping region 81 after the control corresponding to the detection of the obstacle is performed.
  • the obstacle detection unit 16 detects, based on the first captured image 31e, that the obstacle OB is within the angle of view of the projection by the projection device 61 and within the angle of view of the imaging by the image pickup device 71. After such detection, the projection control unit 17 controls the operation so that the projection device 62 projects the image onto the overlapping region 81 and the projection device 61 does not project the image.
  • the first projected image is not projected, and the image corresponding to the projected image data 11b is projected as the second projected image. Since there are no obstacles within the angle of view of the projection by the projection device 62, an image similar to the image projected by the projection device 62 is formed in the overlapping region 81 as in the projection image V6 shown in FIG.
  • the image captured by the imaging unit 31 includes an image of the obstacle OB as in the first captured image 31f shown in FIG.
  • the image captured by the imaging unit 32 is the same image as the projected image V6, as in the second captured image 32f shown in FIG. 15.
  • FIG. 16 is a flowchart showing the flow of processing related to the detection of obstacles and shadow areas.
  • the projection of the image by the projection unit 21 or the projection unit 22 corresponding to the image acquired by the image acquisition unit 12 is performed while switching at a predetermined cycle.
  • the imaging units 31 and 32 acquire a plurality of captured images by imaging the overlapping region 81 (step S11).
  • the obstacle detection unit 16 detects an image in the shadow region based on the captured image acquired in the process of step S11 (step S12). If the image in the shadow region is not detected in the process of step S12 (step S12; No), the process ends.
  • when the shadow region is detected in the process of step S12 (step S12; Yes), the projection control unit 17 identifies the projection unit that was projecting the image onto the overlapping region 81 during the period when the shadow region was detected (step S13). Then, the projection control unit 17 causes a projection unit different from the projection unit identified based on the process of step S13 to project the image (step S14).
  • following the process of step S14, a plurality of captured images are acquired again as in step S11.
  • the obstacle detection unit 16 determines whether or not the shadow region has been eliminated based on the plurality of captured images (step S15). When it is determined in the process of step S15 that the decrease in brightness of the shadow region has been eliminated (step S15; Yes), the process ends. On the other hand, if it is determined in the process of step S15 that the decrease in brightness of the shadow region has not been resolved (step S15; No), the process returns to the process of step S13.
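The flow of steps S11 to S15 differs from the first embodiment in that the response is to switch projectors rather than to correct brightness; the following is a sketch with hypothetical callables standing in for the units described above:

```python
def run_projector_switch(capture, find_shadow, active_projector, switch_to,
                         projectors, max_rounds=5):
    """Steps S11-S15: when a shadow region is seen, identify which
    projector was drawing the overlapping region in that frame and hand
    projection to a different one, then re-check."""
    frames = capture()                          # step S11
    if not find_shadow(frames):                 # step S12; No
        return True
    for _ in range(max_rounds):
        culprit = active_projector(frames)      # step S13
        others = [p for p in projectors if p != culprit]
        switch_to(others[0])                    # step S14
        frames = capture()                      # re-acquire as in step S11
        if not find_shadow(frames):             # step S15; Yes
            return True
    return False
```

As in the flowchart, the loop returns to step S13 while the shadow persists; `max_rounds` is an added safeguard.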
  • the obstacle detection unit 16 detects within which of the angles of view of the projection units 21 and 22 the obstacle OB is located, based on the first captured image and the second captured image. By performing such control, there is a remarkable effect that it can be identified more reliably within which of the angles of view of the projection devices 61 and 62 the obstacle OB that caused the shadow region DA is located.
  • the control for eliminating the occurrence of the shadow region in the image projected on the overlapping region 81 is not limited to this.
  • the obstacle detection unit 16 identifies which of the projection unit 21 and the projection unit 22 was projecting the image at the timing when the shadow region DA occurred in the overlapping region 81.
  • the projection control unit 17 causes the image to be projected by a projection unit different from the specified projection unit. This also eliminates the occurrence of a shadow region in the image projected on the overlapping region 81.
  • the number of imaging units may be one. Therefore, in the second embodiment, it is not essential that the projection unit and the imaging unit are a set. According to this, the configuration can be further simplified.
  • the obstacle detection unit 16 detects, based on the first captured image 31e, the presence of the obstacle OB within the angle of view of the projection by the projection device 61 and within the angle of view of the imaging by the image pickup device 71.
  • the projection control unit 17 controls the operation so that the projection device 62 projects the image on the overlapping region 81 and the projection device 61 does not project the image, but the operation is not limited to this.
  • alternatively, the projection control unit 17 may control the projection of the plurality of projection devices so as to project while switching the projection device for each frame. In this case, the presence of the obstacle OB within the angle of view of the projection by the projection device 61 and within the angle of view of the imaging by the image pickup device 71 is detected based on the first captured image 31e, and the shadow region is specified based on the second captured image 32e and the first projected image corresponding to the projected image data 11b. The brightness of the region corresponding to the shadow region in the second projected image projected in the corresponding frame is then set high so that the brightness correction compensates for the decrease in brightness of the image region corresponding to the shadow region, and the corrected second projected image may be switched frame by frame and projected.
  • when switching and projecting for each frame, projection may be performed while switching among the projection devices, including a projection device such as the projection device 61 that includes the obstacle OB within its projection angle of view. Further, the projection control unit 17 may control the projection of each projection device so as to switch and project at high speed, so that projection is performed at a very high frame rate when the projection is switched for each frame.
  • a very high frame rate here is preferably equal to or higher than 30 Hz or 60 Hz, which are frame rates common in video display.
  • alternatively, the projection control unit 17 may control so that the projection device having the obstacle OB within its projection angle of view does not project, and the projection of the image onto the overlapping region 81 is performed by a projection device having no obstacle OB within its projection angle of view.
  • the projection device without the obstacle OB within its projection angle of view that is controlled by the projection control unit 17 may be a predetermined projection device, or a plurality of projection devices without the obstacle OB within their projection angles of view may be controlled so as to project while switching for each frame.
  • in this case, the projection may be controlled so as to be performed at a frame rate different from the frame rate used when projecting with all the projection devices. For example, in a configuration with three projection devices, one of which includes the obstacle OB within its projection angle of view, where projection is performed while switching the projection device at a frame rate of 30 Hz, the projection by the remaining two projection devices that do not include the obstacle OB may be controlled to be performed at, for example, 20 Hz, which is 2/3 of 30 Hz.
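The frame-rate adjustment in the example above (30 Hz with three projectors, 20 Hz when only two remain usable) amounts to scaling the rate by the fraction of usable projectors; a small sketch of that arithmetic follows, with an illustrative function name:

```python
def reduced_frame_rate(base_rate_hz, total_projectors, blocked_projectors):
    """Scale the switching frame rate in proportion to the number of
    projectors still allowed to project, e.g. 30 Hz * 2/3 = 20 Hz when
    one of three projectors is blocked."""
    usable = total_projectors - blocked_projectors
    if usable <= 0:
        raise ValueError("no usable projector left")
    return base_rate_hz * usable / total_projectors
```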
  • FIG. 17 is a diagram showing a configuration example when there is one imaging unit.
  • the image pickup device 71 and the image pickup device 72 are omitted, and the image pickup device 75 is provided in place of them.
  • the image pickup apparatus 75 may be provided at a position independent of the projection apparatus 61 and the projection apparatus 62.
  • the image pickup apparatus 75 is provided at a position facing the overlapping region 81.
  • the imaging unit included in the imaging device 75 is the same as the imaging unit 31 included in the imaging device 71 and the imaging unit 32 included in the imaging device 72, and is set so as to include the overlapping region 81 in the imaging range.
  • the angle of view of the image captured by the imaging device 75 is shown within the acute angle formed by the two broken lines L5.
  • in the second embodiment, the various processes and controls related to the projection of the corrected image described in the first embodiment are omitted.
  • the second embodiment is the same as the first embodiment except for the matters noted.
  • according to the second embodiment, the image quality of the projected image is more easily maintained by projecting the image with a projection unit different from the projection unit within whose angle of view the obstacle was detected, and it is possible to prevent changes in the brightness of a part of the projected image and deterioration of image quality such as chipping.
  • further, since the projection unit and the imaging unit are paired, it can be identified more reliably within which of the angles of view of the projection devices 61 and 62 the obstacle OB that caused the shadow region DA is located.
  • the process of partially prohibiting the light projected from the image projection device as in Patent Document 1 is complicated, and a method capable of maintaining the image quality of the projected image more easily has been required.
  • the brightness of the projected image is reduced because the projection corresponding to the blocked area is prohibited.
  • FIG. 18 is a functional block diagram showing the configuration of the image projection system 200 according to the third embodiment of the present invention.
  • FIG. 19 is a block diagram showing an example of a configuration for realizing the functional block shown in FIG.
  • the image projection system 200 includes a plurality of projection units, a plurality of imaging units, the storage unit 11, the image acquisition unit 12, a specifying unit 113, an extraction unit 114, an image correction unit 115, and a control unit 116.
  • the projection unit 21 and the projection unit 22 are illustrated as a plurality of projection units.
  • an imaging unit 31 and an imaging unit 32 are illustrated as a plurality of imaging units.
  • the image formed in the overlapping region 81 of the projection object 80 by the image projected by the image projection system 200 is referred to as a projected image.
  • one of the projection unit 21 and the projection unit 22 projects an image unless the correction image is generated by the image correction unit 115 described later.
  • here, the case where an image corresponding to the projected image data 11b stored in the storage unit 11 is projected is taken as an example, but the projected image is not limited to this. An image corresponding to projected image data read from an external storage device connected to the image projection system 200, or to projected image data input from an external information processing device connected to the image projection system 200, may be projected.
  • as shown in FIG. 13, when there is no obstacle within the angle of view of the projection, an image similar to the image projected by the projection unit 21 is formed in the overlapping region 81, as in the projected image V4 shown in FIG. 13. Further, when there is likewise no obstacle within the angle of view of the imaging, the images captured by the imaging unit 31 and the imaging unit 32 are the same as the projected image V4, as in the first captured image 31d and the second captured image 32d shown in FIG. 13.
  • FIG. 13 and the like exemplify a case where the projection unit 21 projects an image corresponding to the projected image data 11b (see FIG. 19) stored in the storage unit 11.
  • the projection unit 22 does not project an image unless a corrected image is generated by the image correction unit 115 described later (see FIG. 13). In FIG. 13 and the like, the fact that the projection unit 22 does not project an image is shown by drawing the second projected image as a black filled rectangle NL.
  • FIG. 20 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the first captured image, and the second captured image.
  • the relationship shown in FIG. 20 corresponds to the arrangement shown in FIG.
  • when the obstacle OB2 is within the angle of view of the projection by the projection device 61 and within the angle of view of the imaging by the image pickup device 71, the obstacle OB2 blocks a part of the projected light with which the projection device 61 projects the image. Therefore, as in the projected image V7 shown in FIG. 20, an image including the shadow region DA2 corresponding to the light blocked by the obstacle OB2 is formed in the overlapping region 81. In this way, when there is an obstacle OB2 between the projection unit 21 and the overlapping region 81, the obstacle OB2 creates a shadow region DA2 in the projected image V7.
  • since the obstacle OB2 is within the angle of view of the imaging by the imaging device 71, an image in which the obstacle OB2 is located inside the shadow region DA2, as in the first captured image 31g shown in FIG. 20, is captured by the imaging unit 31.
  • on the other hand, there is no obstacle OB2 within the angle of view of the imaging by the image pickup device 72. Therefore, as in the second captured image 32g shown in FIG. 20, the image captured by the imaging unit 32 is the same as the projected image V7.
  • the light that projects an image from the projection unit 21 onto the overlapping region 81 is projected so as to spread toward the overlapping region 81 side via the optical element of the projection unit 21. Therefore, as shown by the first captured image 31g, the shadow region DA2 is larger than the image of the obstacle OB2.
  • the information processing device 500 performs processing related to the projection of the image by the projection units 21 and 22 based on the images captured by the image pickup units 31 and 32.
  • the information processing device 500 is the same as the information processing device 50, except for the matters to be noted below.
  • the calculation unit 51 of the third embodiment reads and executes the image projection program 11d, and thereby functions as the image acquisition unit 12, the specifying unit 113, the extraction unit 114, the image correction unit 115, and the control unit 116 shown in FIGS. 18 and 19.
  • the image projection program 11a and the image projection program 11d are the same except for the difference in the realized functions.
  • the image acquisition unit 12 acquires the image projected by the projection unit 21. Specifically, the image acquisition unit 12 reads and acquires the projected image data 11b from, for example, the storage unit 11.
  • the projected image data 11b acquired by the image acquisition unit 12 can be referred to by the specifying unit 113 and the image correction unit 115.
  • the specifying unit 113 specifies, based on the captured images, an image of an obstacle between the projection unit 21 and the overlapping region 81 and a shadow region generated in the overlapping image due to the obstacle. Specifically, the specifying unit 113 performs image analysis that compares the content of the images captured by the imaging units 31 and 32 with the content of the image acquired by the image acquisition unit 12. More specifically, when a partial region whose brightness has decreased relative to the tendency of the brightness distribution corresponding to the content of the projected image data 11b occurs in the tendency of the brightness distribution of the content of the images captured by the imaging units 31 and 32, the specifying unit 113 specifies that partial region as a shadow region.
  • for example, the image acquisition unit 12 acquires the first projected image, the difference between the first captured image and the first projected image is computed, and a difference signal is generated. Based on this difference signal, a partial region whose brightness has decreased by a certain amount or more may be specified as the shadow region. Further, a partial region of the difference signal that has a difference amount but differs from the characteristics of a portion whose brightness has merely decreased by a certain amount may be specified as the region of the image of the obstacle.
  • the projection unit that is paired with the imaging unit that captured the image including the obstacle causing the shadow region is specified by the specifying unit 113. Since the projection unit associated with the imaging unit that captured the image is configured as one set with it, if the imaging unit can be specified, the corresponding projection unit can easily be specified. Such specification may be performed by the specifying unit 113 or by the control unit 116. Further, information on the specified projection unit may be stored in the storage unit 11.
  • the extraction unit 114 extracts an image of an obstacle included in the captured image. Specifically, the extraction unit 114 extracts a region specified as an image of an obstacle by the identification unit 113. The image of the obstacle extracted by the extraction unit 114 is used by the image correction unit 115.
  • the image correction unit 115 generates a corrected image in which the extracted image of the obstacle is corrected to a position and size corresponding to the shadow region. Specifically, the image correction unit 115 enlarges the image of the obstacle OB2 extracted by the extraction unit 114 so that its size corresponds to the size of the shadow region DA2.
  • the size of the enlarged image of the obstacle OB2 may be equal to or larger than the shadow region DA2, and it is not essential that it exactly match the size of the shadow region DA2.
  • the projection position of the enlarged image of the obstacle OB2 is set to the position of the shadow region DA2, and the image is processed so that no other visible image is projected around the enlarged image of the obstacle OB2; the processed image is generated and output as the corrected image. More specifically, the image correction unit 115 generates, as the corrected image, an image in which transmission processing is applied around the enlarged image of the obstacle OB2 and which has the same number of vertical and horizontal pixels as the projected image data 11b. The position and size of the shadow region DA2 used in this processing may be determined from the second captured image 32g or from the first captured image 31g, and when there are slight differences between them, processing such as averaging may be performed.
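The enlargement and placement performed by the image correction unit 115 can be sketched as follows, assuming the extracted obstacle image and the shadow region's bounding box are known; nearest-neighbour enlargement and a zero alpha channel for the transparent surroundings are illustrative choices, not from the original:

```python
import numpy as np

def build_mask_image(obstacle_img, shadow_box, canvas_shape):
    """Place the extracted obstacle image, enlarged to the shadow
    region's size, at the shadow's position on an otherwise transparent
    canvas matching the projector's resolution."""
    top, left, h, w = shadow_box
    # Nearest-neighbour enlargement to the shadow region's size.
    ys = np.arange(h) * obstacle_img.shape[0] // h
    xs = np.arange(w) * obstacle_img.shape[1] // w
    enlarged = obstacle_img[ys][:, xs]
    canvas = np.zeros(canvas_shape, dtype=obstacle_img.dtype)
    alpha = np.zeros(canvas_shape, dtype=np.uint8)  # 0 = fully transparent
    canvas[top:top + h, left:left + w] = enlarged
    alpha[top:top + h, left:left + w] = 255         # opaque only over the shadow
    return canvas, alpha
```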
  • The control unit 116 comprehensively controls each component and each functional configuration of the image projection system 200. Specifically, the control unit 116 of the third embodiment causes a projection unit to project the corrected image. Here, the control unit 116 preferably causes the corrected image to be projected by a projection unit different from the one paired with the imaging unit that captured the image containing the obstacle responsible for the identified shadow region.
  • FIG. 21 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the first captured image, and the second captured image when the corrected image is projected.
  • the relationship shown in FIG. 21 corresponds to the arrangement shown in FIG. Further, FIG. 21 assumes a case where the corrected image 15a is generated in response to the result of extracting the images of the shadow region DA2 and the obstacle OB2 shown in FIG. 20. In FIG. 21 and the like, the image of the obstacle OB2 included in the corrected image 15a and corrected to the position and size corresponding to the shadow region is shown as an enlarged image BOB2.
  • The control unit 116 causes the projection unit 22 to project the corrected image 15a as the second projection image.
  • the control unit 116 causes the projection unit 21 to project the contents of the projected image data 11b.
  • the enlarged image BOB2 included in the corrected image 15a is projected so as to overlap the shadow region generated by the obstacle OB2 in the projection corresponding to the projected image data 11b by the projection unit 21. Therefore, like the projected image V8 shown in FIG. 21, an image in which the shadow region DA2 (see FIG. 20) in the projected image V7 is replaced by the enlarged image BOB2 is drawn in the overlapping region 81.
  • As a result, an image is obtained in which the shadow region DA2, captured in the first captured image 31g and the second captured image 32g shown in FIG. 20, has been replaced by the enlarged image BOB2.
  • the control unit 116 can confirm that the correction of the shadow region is normally performed based on the first captured image 31h and the second captured image 32h.
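The patent does not specify how the control unit confirms the correction from the new captured images. One hedged way to express the check is: no pixel inside the previously detected shadow box should remain darker than a threshold once the corrected image is overlaid (names and threshold are illustrative assumptions):

```python
# Hypothetical confirmation step for a unit like the control unit 116:
# verify from a re-captured frame that the previously detected shadow box
# no longer contains dark (uncorrected) pixels.

def shadow_filled(frame, shadow_box, threshold=40):
    """True if every pixel inside `shadow_box` is at least `threshold` bright."""
    top, left, bottom, right = shadow_box
    return all(frame[y][x] >= threshold
               for y in range(top, bottom)
               for x in range(left, right))

ok_frame = [[200, 200], [200, 180]]
bad_frame = [[200, 10], [200, 200]]
```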
  • The data flow of the output image is shown as running from the image acquisition unit 12 through the image correction unit 115 to the projection unit 21, but the present invention is not limited to this; there may be a data flow running directly from the image acquisition unit 12 to the projection unit 21. Alternatively, the image correction unit 115 may output the image corresponding to the content acquired by the image acquisition unit 12 to the projection unit 21 as it is.
  • FIG. 22 is a flowchart showing the flow of processing related to the detection of obstacles and shadow areas. As a premise that such processing is performed, an image is projected by the projection unit 21 corresponding to the image acquired by the image acquisition unit 12.
  • the imaging units 31 and 32 acquire a plurality of captured images by imaging the overlapping region 81 (step S21).
  • The identification unit 113 detects a shadow region and an image of an obstacle based on the captured images acquired in the process of step S21 (step S22). When no shadow region and no image of an obstacle are detected in the process of step S22 (step S22; No), the process ends.
  • When a shadow region and an image of an obstacle are detected in the process of step S22 (step S22; Yes), the extraction unit 114 extracts the detected image of the obstacle (step S23). Further, the image correction unit 115 corrects the image of the obstacle extracted in step S23 to the position and size of the shadow region detected in step S22, generating a corrected image (step S24). Then, the control unit 116 causes the projection unit 22 to project the corrected image generated in step S24 (step S25).
  • After step S25, a plurality of captured images are acquired again (step S21), and the control unit 116 determines, based on these captured images, whether or not the image of the obstacle in the corrected image has been applied to the shadow region (step S26).
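The control flow of FIG. 22 (steps S21 to S26) can be sketched in outline as below. The callables are hypothetical stand-ins for the imaging units and the functional blocks 113 to 116; none of these names come from the patent, and the real device would loop rather than return after one pass:

```python
# Hedged sketch of the FIG. 22 flow: capture (S21), detect (S22), extract
# (S23), correct (S24), project (S25), then re-capture and verify (S26).

def correction_loop(capture, detect, extract, fit_to_shadow, project, verify):
    frames = capture()                            # S21: acquire captured images
    found = detect(frames)                        # S22: shadow/obstacle detection
    if found is None:                             # S22 "No": end processing
        return False
    shadow_box, obstacle_region = found
    patch = extract(obstacle_region)              # S23: extract obstacle image
    corrected = fit_to_shadow(patch, shadow_box)  # S24: build corrected image
    project(corrected)                            # S25: project via the other unit
    return verify(capture(), shadow_box)          # S21 again + S26: confirm

# Minimal stand-in wiring to exercise the flow once.
projected = []
result = correction_loop(
    capture=lambda: ["frame"],
    detect=lambda frames: ((0, 0, 2, 2), "obstacle"),
    extract=lambda region: region.upper(),
    fit_to_shadow=lambda patch, box: (patch, box),
    project=projected.append,
    verify=lambda frames, box: True,
)
```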
  • As described above, an image corresponding to the obstacle that created the shadow region can be projected. That is, it is possible to project an image whose content is changed, according to the obstacle between the projection unit 21 and the projected object 80, from the content originally intended to be projected. Therefore, new value can be added to the projected image.
  • FIG. 23 is a diagram showing the positional relationship between the projected object 85, the projection devices 61, 62, and the image pickup devices 71, 72 according to the fourth embodiment of the present invention.
  • In the fourth embodiment, a projected object 85 is adopted. The projected object 85 is provided on the assumption that an image projected onto one surface is visually recognized from the back surface side, as with a transmissive screen. Due to the positional relationship between the projection device 61 and the projected object 85 shown in FIG. 23, the image projected from the projection unit 21 of the projection device 61 of the set P1 is visually recognized as a back surface image on the visible surface 86, which faces the side opposite to the projection unit 21 across the projected object 85. Further, the set P2 is provided on the side opposite to the set P1 across the projected object 85. That is, the set P1 and the set P2 are arranged so as to face each other with the projected object 85 in between.
  • Although the angles of view of the imaging devices 71 and 72 are omitted in FIGS. 23 and 25, in the fourth embodiment, as in the third embodiment, the angle of view of the imaging device 71 is provided so as to correspond to the angle of view of the projection device 61, and the angle of view of the imaging device 72 is provided so as to correspond to the angle of view of the projection device 62.
  • FIG. 24 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the back surface image, the first captured image, and the second captured image.
  • the relationship shown in FIG. 24 corresponds to the arrangement shown in FIG. 23.
  • The content of the back surface image visually recognized on the visible surface 86, which faces the side opposite to the projection unit 21 across the projected object 85, is the left-right inversion of the content of the projected image V9, which corresponds to the content of the projected image data 11b projected by the projection unit 21. The content of the second captured image corresponds to the content of the back surface image.
  • The relationship between the first projected image, the second projected image, the projected image, the first captured image, and the second captured image in the fourth embodiment, described with reference to FIG. 24, is otherwise similar to the corresponding relationship in the third embodiment.
  • FIG. 25 is a diagram showing an example of a case where the obstacle OB2 is within the angle of view of the projection by the projection device 61 and the angle of view of the image captured by the image pickup device 71 in the fourth embodiment.
  • FIG. 26 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the back surface image, the first captured image, and the second captured image. The relationship shown in FIG. 26 corresponds to the arrangement shown in FIG. 25.
  • When the obstacle OB2 is within the angle of view of the projection by the projection device 61 and within the angle of view of the imaging by the image pickup device 71, the obstacle OB2 blocks a part of the projected light used by the projection device 61 to project the image. Therefore, as in the projected image V10 shown in FIG. 26, an image including the shadow region DA2 corresponding to the light blocked by the obstacle OB2 is formed in the overlapping region 81. Further, as in the back surface image V10R shown in FIG. 26, the content of the back surface image is the left-right inversion of the projected image V10; accordingly, the position and shape of the shadow region DA2 in the back surface image V10R are the left-right inversion of the position and shape of the shadow region DA2 in the projected image V10.
  • In this case, as in the first captured image 31j shown in FIG. 26, the imaging unit 31 captures an image in which the obstacle OB2 is located inside the shadow region DA2. The content of the image captured by the imaging unit 32 is the same as that of the back surface image V10R.
  • FIG. 27 is a diagram showing an example of the relationship between the first projected image, the second projected image, the projected image, the back surface image, the first captured image, and the second captured image when the corrected image is projected. The relationship shown in FIG. 27 corresponds to the arrangement shown in FIG. 25. Further, FIG. 27 assumes a case where the corrected image 15a is generated in response to the result of extracting the images of the shadow region DA2 and the obstacle OB2 shown in FIG. 26.
  • When generating the corrected image, the image correction unit 115 of the fourth embodiment further flips the enlarged image of the obstacle left and right to obtain an inverted image. Further, in the processing performed by the image correction unit 115 of the fourth embodiment, the position and size of the shadow region DA2 are referenced to the content captured by the imaging unit 32, as in the second captured image 32j shown in FIG. 26.
  • the processing of the image correction unit 115 of the fourth embodiment may be performed by the control unit 116 of the fourth embodiment.
  • FIG. 27 illustrates an inverted image MOB2 generated based on the obstacle OB2.
  • In FIG. 27, the direction of the diagonal line attached to the obstacle OB2 and the direction of the diagonal line attached to the inverted image MOB2 are symmetrical, indicating that the inverted image MOB2 is a left-right flipped and enlarged version of the obstacle OB2.
  • As described above, the set P1 and the set P2 are arranged so as to face each other with the projected object 85 in between. Therefore, the projection unit 22 projects the corrected image from the side opposite to the projection unit 21 across the projected object 85, that is, from the side facing the visible surface 86.
  • As described above, the enlarged image of the obstacle is turned into an inverted image, and the position and size of the shadow region DA2 are referenced to the content captured by the imaging unit 32. Therefore, the position and shape of the inverted image MOB2 included in the corrected image correspond to the left-right inverted position and shape of the shadow region DA2 in the projected image V10 shown in FIG. 26 and the projected image V11 shown in FIG. 27.
  • Even when the set P1 and the set P2 are arranged so as to face each other with the projected object 85 in between, it is thus possible to project an image to which an image of the obstacle, enlarged so as to correspond to the shadow region, has been applied.
  • As in the fourth embodiment, the same effect as that of the third embodiment can be obtained even with an arrangement of the projection units and the imaging units different from that of the third embodiment.
  • the number of projection units may be 3 or more. That is, in the configuration shown in FIG. 2 and the like, the number of projection devices may be three or more.
  • the brightness of the projected image by one or more projection units having no obstacle on the optical axis may be corrected.
  • The degree of correction may be such that the projection images of the plurality of projection units having no obstacle on their optical axes are overlapped so that the decrease in brightness of the shadow region caused by the projection unit having an obstacle on its optical axis is offset.
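The offset idea above admits a simple back-of-the-envelope form: with N overlapping projectors, a region shadowed for one projector loses 1/N of its light, so the remaining unobstructed units can each raise their drive level inside that region to restore the sum. The gain formula below is an illustrative simplification (it assumes equal-brightness projectors and linear addition), not a claimed method:

```python
# Hedged sketch of the brightness-offset idea: per-unit gain that the
# unobstructed projection units would apply inside the shadow region so the
# summed brightness matches the unshadowed area.

def shadow_gain(total_units, obstructed_units):
    """Gain for each unobstructed unit inside the shadow region."""
    clear = total_units - obstructed_units
    if clear <= 0:
        raise ValueError("no unobstructed projection unit available")
    return total_units / clear

# Two projectors, one obstructed: the clear unit doubles its contribution.
# Three projectors, one obstructed: each clear unit scales by 1.5.
```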
  • the image may be projected by one or more projection units having no obstacle on the optical axis. That is, it is sufficient that the image is not projected by the projection unit having an obstacle on the optical axis.
  • The number of imaging units may also be 3 or more. In the first embodiment, the number of imaging units corresponds to the number of projection units. In the second embodiment, as described with reference to FIGS. 9, 10 and 11, an imaging unit may be provided so as to be paired with each projection unit, or a plurality of imaging units may be provided at positions independent of the projection units.
  • the number of projection units in the third embodiment may be 3 or more. That is, in the configuration shown in FIG. 19 and the like, the number of projection devices may be three or more. Further, although two imaging units 31 and 32 are illustrated in FIG. 18, the number of imaging units may be 3 or more. The number and arrangement of imaging units in the third embodiment correspond to the number of projection units.
  • The functional configurations shown in the image projection system 1 of FIG. 1, the image projection system 100 of FIG. 11, and the image projection system 200 of FIG. 18 may be functional configurations included in a single device. That is, the present invention is not limited to a system configuration combining a plurality of devices as shown in FIGS. 2, 12, and 19, and may be realized by a single device having the functions shown in FIGS. 1, 11, and 18. In other words, the configuration with reference numeral 1, the configuration with reference numeral 100, and the configuration with reference numeral 200 may each be an image projection device. As a specific example, the configuration with reference numeral 1 and the configuration with reference numeral 100 may each be provided as an integrated device, like a so-called projection television. However, in the third embodiment, the projected object is provided at a position separated from the device; that is, the device is configured so that an obstacle can intervene between the device and the projected object.
  • the positions and shapes of the obstacle OB2 and the shadow region DA2, and the contents of the corrected image 15a illustrated in FIG. 21 are merely examples and are not limited thereto.
  • the present invention can project an image having arbitrary contents. Further, the present invention can generate a corrected image according to a shadow region generated corresponding to the shape of an obstacle within the angle of view of projection.
  • The projection control unit 17 of the second embodiment may specify the position of a shadow in the projected image based on captured images obtained by imaging the projection images projected while being switched frame by frame by the plurality of projection units 21 and 22, and on the content of the projected image data 11b, which is the projected image. That is, the projection control unit 17 may have a function related to the extraction of the shadow area DA that is included in the shadow area identification unit of the first embodiment and in the obstacle detection unit 16 of the second embodiment.
  • the present invention is not limited by the contents of these embodiments.
  • The components described above include those that can be easily conceived by a person skilled in the art and those that are substantially the same, that is, those within a so-called range of equivalents. Furthermore, the components described above can be combined as appropriate, and various omissions, replacements, or changes of the components can be made without departing from the gist of the above-described embodiments.


Abstract

The present invention provides an image projection device comprising: a plurality of projection units for projecting a plurality of projection images from different image projection angles such that the projection images form overlapping images in the same region; a plurality of imaging units, arranged so as to be paired with the plurality of projection units, for imaging said same region from angles corresponding to the image projection angles of the respective projection units to generate captured images; a shadow region identification unit for identifying, on the basis of the captured images, a shadow region occurring in the overlapping images due to an obstacle between a projection unit and said same region; an image correction unit for generating a corrected image in which a region of the projection image projected by the projection unit that corresponds to the shadow region is corrected; and a control unit for controlling the projection unit to project the corrected image.
PCT/JP2021/010239 2020-03-25 2021-03-12 Dispositif et procédé de projection d'image WO2021193171A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2020054878A JP2021158446A (ja) 2020-03-25 2020-03-25 Image projection device, image projection method, and program
JP2020-054877 2020-03-25
JP2020-054878 2020-03-25
JP2020-054879 2020-03-25
JP2020054879A JP2021158447A (ja) 2020-03-25 2020-03-25 Image projection device, image projection method, and program
JP2020054877A JP2021158445A (ja) 2020-03-25 2020-03-25 Image projection device, image correction method, and program

Publications (1)

Publication Number Publication Date
WO2021193171A1 true WO2021193171A1 (fr) 2021-09-30

Family

ID=77891826

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/010239 WO2021193171A1 (fr) 2020-03-25 2021-03-12 Dispositif et procédé de projection d'image

Country Status (1)

Country Link
WO (1) WO2021193171A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001222062A * 2000-02-09 2001-08-17 Nikon Corp Projection type display device
JP2007248824A * 2006-03-16 2007-09-27 Matsushita Electric Ind Co Ltd Image projection apparatus and image projection system
JP2008042781A * 2006-08-09 2008-02-21 Fuji Xerox Co Ltd Image processing apparatus
JP2008116565A * 2006-11-01 2008-05-22 Seiko Epson Corp Image correction apparatus, projection system, image correction method, image correction program, and recording medium
JP2011257609A * 2010-06-09 2011-12-22 Nippon Telegr & Teleph Corp <Ntt> Optical projection control method, optical projection control device, optical projection control system, and program
WO2019188046A1 * 2018-03-26 2019-10-03 Fujifilm Corporation Projection system, projection control device, projection control method, and projection control program



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21774097; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21774097; Country of ref document: EP; Kind code of ref document: A1)