WO2017061213A1 - Projection display device and image correction method - Google Patents
Projection display device and image correction method
- Publication number
- WO2017061213A1 (PCT/JP2016/076095)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- projection
- image
- linear light
- unit
- light
- Prior art date
Classifications
- H04N9/3185 — Geometric adjustment, e.g. keystone or convergence (under H04N9/3179, video signal processing for projection devices for colour picture display)
- G03B17/54 — Details of cameras or camera bodies; accessories adapted for combination with a projector
- G03B21/00 — Projectors or projection-type viewers; accessories therefor
- G03B21/26 — Projecting separately subsidiary matter simultaneously with main image
- G06T5/80 — Geometric correction (image enhancement or restoration)
- G06T7/0002 — Inspection of images, e.g. flaw detection
- H04N23/56 — Cameras or camera modules comprising electronic image sensors, provided with illuminating means
- H04N5/74 — Projection arrangements for image reproduction, e.g. using eidophor
- H04N9/3191 — Projection devices for colour picture display: testing thereof
- H04N9/3194 — Testing thereof including sensor feedback
- G06T2207/30168 — Image quality inspection
- G06T2207/30204 — Marker; G06T2207/30208 — Marker matrix
- G09G2320/0693 — Calibration of display systems
Definitions
- the present disclosure relates to a projection display device having an imaging function and an image correction method.
- a projection display device called a short-focus (ultra-short-focus) projector can project and display a large-screen, high-definition image at an extremely short projection distance. For this reason, even when the user approaches the screen (projection surface), the user's shadow and the like are hardly cast on the projection surface, and a realistic video display can be realized. For example, a large-screen image can be projected onto a wall surface of a living room or the like, and such devices are attracting attention as a new image display technology for everyday spaces.
- a method has been proposed in which a pattern image for distortion calibration is projected, the projected pattern image is photographed by a separately installed camera, distortion is calculated from the photographed image, and the projected image is corrected (for example, Patent Document 1). Further, as a technique for correcting a projected image, a method has been proposed in which a pattern image is captured by a camera mounted on the projection display body and the shape of the projection screen is corrected (keystone correction) based on the captured image (Patent Document 2).
- the method of Patent Document 1, however, lacks convenience because the user must install a camera and take photographs.
- with the apparatus configuration of Patent Document 2, it is difficult to detect distortion caused by the non-flatness of the projection surface. Realization of a technique capable of improving display image quality by correcting distortion of the projected image while maintaining convenience is therefore desired.
- a projection display device of the present disclosure includes: an image display element that displays an image; a projection unit that projects the image displayed by the image display element toward a projection surface; a light irradiation unit that irradiates the projection surface with linear light extending along a first direction within the projection surface, at an incident angle shallower than that of the projection light; an imaging unit that has an optical axis different from that of the light irradiation unit and images the projection surface; and a signal processing unit that performs signal processing on an imaging signal output from the imaging unit.
- the imaging unit captures the linear light irradiated on the projection surface, and the signal processing unit corrects distortion of the projected image based on the captured image of the linear light.
- in the projection display device of the present disclosure, the light irradiation unit irradiates the projection surface with predetermined linear light at an incident angle shallower than that of the projection light, and the linear light is imaged by the imaging unit, whose optical axis differs from that of the light irradiation unit.
- the signal processing unit corrects distortion of the projected image based on the captured image of the linear light. Distortion of the image resulting from the non-flatness of the projection surface can thereby be corrected without the user installing a camera or taking photographs.
- an image correction method of the present disclosure irradiates the projection surface with linear light extending along a first direction within the projection surface, at an incident angle shallower than that of the projection light; photographs the linear light irradiated onto the projection surface from an optical path different from that of the linear light; and corrects distortion of the projected image based on the captured image of the linear light.
- in the projection display device and the image correction method of the present disclosure, the projection surface is irradiated with predetermined linear light at an incident angle shallower than that of the projection light, the linear light is photographed from an optical path different from that of the linear light (by an imaging unit whose optical axis differs from that of the light irradiation unit), and distortion of the projected image is corrected based on the captured image of the linear light. It is thereby possible to improve the image quality of the projected image while maintaining convenience.
- FIG. 11B is an enlarged view of the vicinity of the upper end position in FIG. 11A; a further figure shows a numerical example of the parameters at the time of linear-light imaging.
- Embodiment an example of a projection display device that corrects distortion of a projected image based on a captured image of linear light irradiated at an incident angle shallower than the incident angle of the projected light
- Modified example example in which linear light is irradiated to a plurality of positions and image correction is performed for each region corresponding to each irradiation position
- FIG. 3 shows a functional configuration of the projection display device 1.
- the XY plane is the projection plane A
- the XZ plane is the installation plane B
- the X direction corresponds to, for example, the horizontal direction
- the Y direction corresponds to, for example, the vertical direction.
- the projection display device 1 is a projector (so-called short focus type or ultra short focus type) that projects an image at a short projection distance toward a projection surface A such as a wall surface of a room.
- the projection display device 1 is used in a state where it is installed in the vicinity of the projection surface A.
- here, an example is shown in which the projector is installed and used on an installation surface B, such as a floor or a stand, in the vicinity of the projection surface A.
- the projection surface A is not limited to a wall surface; another surface, particularly a surface having irregularities along a certain direction, can also be used as the projection surface A.
- the projection display device 1 includes, for example, a projection unit 10, an imaging unit 20, a light irradiation unit 30, a control unit 40, a video signal processing unit 41, an image display element 42, an imaging signal processing unit 43, and a storage unit 47.
- these components are contained in, for example, one outer casing (housing 11).
- the video signal processing unit 41 and the imaging signal processing unit 43 of the present embodiment correspond to a specific example of the “signal processing unit” in the present disclosure. However, some of these constituent elements may be mounted so as to be exposed from the housing 11, or may be installed outside the housing 11. Hereinafter, a specific configuration of each unit will be described.
- the projection unit 10 projects the image displayed by the image display element 42 toward the projection surface A (for example, enlarged projection), and includes, for example, a projection optical system 10A and a reflection mirror 110.
- the projection optical system 10A includes, for example, a projection lens unit.
- the reflection mirror 110 converts the optical path of the light emitted from the projection optical system 10A and guides it to the projection surface A. The light reflected by the reflection mirror 110 becomes the projection light L1. Depending on the layout of the projection optical system 10A, the reflection mirror 110 may be omitted.
- the projection unit 10 may include a light source, an illumination optical system, a color separation optical system, and a color synthesis optical system (not shown).
- for example, when the image display element 42 is a liquid crystal display element or the like, the projection unit 10 includes a light source, and the illumination optical system, the color separation optical system, the image display element 42, and the color synthesis optical system are provided in this order on the optical path between the light source and the projection optical system 10A.
- when the image display element 42 is self-luminous, the projection unit 10 does not need to include a light source or an illumination optical system.
- the projection lens unit includes, for example, a short-focus lens with a small throw ratio.
- the throw ratio is expressed as L / H, where L is the distance from the projection lens unit to the projection surface A (projection distance) and H is the width (along the X direction) of the projectable image range (projection range B1, the projection screen).
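as a numerical illustration of the L / H definition above (the specific distances below are hypothetical, not taken from this publication):

```python
def throw_ratio(projection_distance: float, image_width: float) -> float:
    """Throw ratio L/H: projection distance L divided by the width H of the
    projected image (projection range B1)."""
    return projection_distance / image_width

# Hypothetical ultra-short-throw setup: lens 0.3 m from the wall,
# projected image 1.2 m wide -> throw ratio 0.25.
assert abs(throw_ratio(0.3, 1.2) - 0.25) < 1e-12
```

a smaller throw ratio means the projector sits closer to the wall for the same image width, which is what makes the grazing projection geometry of this device possible.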
- within the projection range B1, the position where the elevation angle a1 of the projection light L1 with respect to the projection surface A is largest is the upper end position p1, and the position where the elevation angle a1 is smallest is the lower end position p2.
- a correction pattern image is projected onto the projection range B1.
- the imaging unit 20 is configured to include, for example, an imaging element and various optical elements, and is a camera for photographing the projection surface A.
- the imaging element is configured by a solid-state imaging element such as a CMOS (Complementary Metal-Oxide Semiconductor) or a CCD (Charge Coupled Device).
- Examples of the imaging unit 20 include a camera provided with a wide-angle lens.
- the imaging unit 20 may include a plurality of cameras, and a wide-angle image may be formed by stitching the captured images obtained by the cameras.
- the imaging unit 20 is mounted, for example, on the surface S1 side of the housing 11 together with the emission port 10B of the projection unit 10 and the light irradiation unit 30.
- the imaging unit 20 is disposed, for example, in the vicinity of the exit port 10B, and is installed at a position closer to the viewer (a position on the positive side in the Z direction) than the exit port 10B.
- the imaging unit 20 captures an imaging range B2 substantially equivalent to the projection range B1.
- the imaging range B2 has a larger width in the vertical direction (Y direction) and the horizontal direction (X direction) than the projection range B1.
- the imaging unit 20 has an optical axis different from that of the light irradiation unit 30 (arranged so that the optical axis of the camera of the imaging unit 20 does not coincide with the optical axis of the light irradiation unit 30). Thereby, the imaging unit 20 images the projection surface A from an optical path different from the optical path of the laser light L2 output from the light irradiation unit 30 (irradiation optical path of the linear light L2a).
- the image (imaging signal Dt1) captured by the imaging unit 20 is output to the imaging signal processing unit 43.
- the light irradiation unit 30 is a light source unit used for distortion correction (calibration) of a projected image.
- the light irradiation unit 30 outputs, for example, laser light L2, and includes a laser light source that irradiates the projection surface A with linear light L2a extending (formed into a linear shape) along one direction (first direction) within the projection surface A.
- the linear light L2a is, for example, a line laser beam that extends along a horizontal direction (first direction, X direction) perpendicular to the vertical direction (second direction, Y direction).
- the light irradiation unit 30 desirably has a laser light source that outputs laser light L2 having high directivity.
- as long as the light irradiation unit 30 can irradiate the projection surface A with light linearly, another light source, such as an LED light source, may be used. Further, the wavelength of the linear light L2a may be in the visible range, or may be in a non-visible range such as the near infrared (NIR).
- when the projection display device 1 is equipped with a function (for example, a human sensor) for detecting that an object such as a user has approached the light irradiation unit 30, it is desirable that the irradiation of the linear light L2a be stopped upon such detection. This prevents the highly directional linear light L2a from affecting the user's eyes.
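the safety behavior described above (stop the line laser while a person is detected nearby) amounts to a simple interlock. A minimal sketch with hypothetical names, not taken from the patent:

```python
class LaserInterlock:
    """Minimal sketch of the eye-safety behavior: suppress the linear-light
    irradiation while a proximity (human) sensor is triggered.
    All names here are hypothetical illustrations, not from the patent."""

    def __init__(self) -> None:
        self.emitting = False

    def update(self, person_detected: bool) -> bool:
        # The laser may emit only while no person is detected near the unit.
        self.emitting = not person_detected
        return self.emitting
```

in a real device this logic would sit in the control unit 40 and gate the laser driver directly.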
- the elevation angle a2 of the linear light L2a (the emission angle of the laser light L2) may be fixed or variable.
- the irradiation position of the linear light L2a on the projection surface A may be fixed or variable.
- for example, the elevation angle a2 is adjusted so that the linear light L2a is irradiated at a position overlapping the projection range B1 when the projection display device 1 is installed squarely in front of the projection surface A (so that the projection range B1 is rectangular). Alternatively, as long as it lies within the imaging range B2, the linear light L2a may be irradiated outside the projection range B1, for example at a position above the upper end position p1.
- in the present embodiment, the elevation angle a2 is set so that the linear light L2a is irradiated at a position substantially equal to the upper end position p1 of the projection range B1. Even if the degree of unevenness of the projection surface A is constant in the Y direction, the amount of distortion due to the non-flatness of the projection surface A varies depending on the position in the Y direction within the projection range B1. That is, the distortion amount is largest at the upper end position p1, where the elevation angle a1 of the projection light L1 is largest, and smallest at the lower end position p2, where the elevation angle a1 is smallest. For this reason, the detection sensitivity to the distortion amount can be increased by irradiating the upper end position p1 with the linear light L2a and performing the distortion detection described later based on its captured image.
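the grazing-incidence geometry above can be illustrated with a simple flat-recess model (a sketch under assumed angles and a hypothetical recess depth, not the patent's actual optics): for a recess of depth da and an incident angle measured from the surface plane, the ray's landing point shifts along the surface by da / tan(angle), so a shallower angle yields a larger, easier-to-detect shift.

```python
import math

def surface_shift(depth_mm: float, incident_angle_deg: float) -> float:
    """Shift (along the projection surface) of a ray's landing point when the
    surface recedes by depth_mm, for an incident angle measured from the
    surface plane (smaller angle = shallower, more grazing incidence).
    Illustrative flat-recess model only."""
    return depth_mm / math.tan(math.radians(incident_angle_deg))

# Hypothetical numbers: a 2 mm recess in the wall.
dp = surface_shift(2.0, 60.0)  # projection light, steeper incidence a1'
dr = surface_shift(2.0, 10.0)  # linear light, shallower incidence a2'
assert dr > dp  # the line laser exaggerates the distortion (dr > dp)

# Inverting the same model lets the distortion of the projected image be
# estimated from the measured distortion of the linear light:
dp_est = dr * math.tan(math.radians(10.0)) / math.tan(math.radians(60.0))
```

this is why irradiating near the upper end position p1 at a shallow angle raises the detection sensitivity: the same recess depth produces a larger measurable displacement in the captured line.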
- the light irradiation unit 30 is disposed on the surface S1 of the housing 11 at a position closer to the projection surface A than the exit port 10B of the projection unit 10.
- the light irradiation unit 30 is configured so that the incident angle a2′ of the linear light L2a (laser light L2) on the projection surface A is shallower than the incident angle a1′ of the projection light L1 (a1′ > a2′).
- as a result, the linear light L2a is more easily distorted by the unevenness of the projection surface than the projection light L1.
- the optical axis of the light irradiation unit 30 is arranged so as not to coincide with the optical axis of the camera of the imaging unit 20 (having an optical axis different from that of the imaging unit 20).
- the control unit 40 includes, for example, a CPU (Central Processing Unit).
- the control unit 40 controls the operation of each unit in the projection display device 1.
- the projection unit 10, the imaging unit 20, the light irradiation unit 30, and the image display element 42 are each driven at a predetermined timing by a driving unit (not shown) based on the control of the control unit 40.
- the video signal processing unit 41 generates, for example, a video signal (Dt0) to be displayed on the image display element 42 based on a video signal (image signal) input from the outside, for example.
- the video signal processing unit 41 includes, for example, an FPGA (Field Programmable Gate Array) or a GPU (Graphics Processing Unit), and corrects the video signal Dt0 using the correction coefficient output from the imaging signal processing unit 43 described later.
- the video signal Dt0 generated by the video signal processing unit 41 or the corrected video signal (Dt2) is supplied to the image display element 42 by a timing controller, a driving unit (driver circuit) or the like (not shown).
- the video signal processing unit 41 corresponds to a specific example of “correction unit” in the present disclosure.
- the image display element 42 is, for example, a reflective liquid crystal element such as LCOS (Liquid Crystal On Silicon), a transmissive liquid crystal element, or a DMD (Digital Micromirror Device).
- the image display element 42 modulates light from an illumination optical system (not shown) based on a video signal input from the video signal processing unit 41.
- the light modulated by the image display element 42 is output to the projection surface A via the projection unit 10.
- the image display element 42 may be a self-luminous element such as an organic electroluminescent element. In this case, an illumination optical system is unnecessary.
- the imaging signal processing unit 43 includes, for example, a CPU or a GPU, and performs various types of signal processing on the captured image (imaging signal Dt1) input from the imaging unit 20.
- the imaging signal processing unit 43 includes, for example, a distortion detection unit 44, a distortion estimation unit 45, and a correction coefficient calculation unit 46.
- the distortion detection unit 44, the distortion estimation unit 45, and the correction coefficient calculation unit 46 correspond to specific examples of “detection unit”, “estimation unit”, and “correction coefficient calculation unit” in the present disclosure, respectively.
- the distortion detection unit 44 calculates the distortion amount of the linear light L2a (a first distortion amount; a distortion amount dr described later) using predetermined arithmetic processing.
- the distortion amount dr is caused by the non-flatness (unevenness) of the projection surface A, and here is the amount of deformation of the linear light L2a in the vertical direction (Y direction) orthogonal to the horizontal direction (X direction).
- the distortion estimation unit 45 calculates a distortion amount (second distortion amount, a distortion amount dp described later) of the projection image based on the distortion amount dr of the linear light L2a using a predetermined calculation process.
- the distortion amount dp is caused by the non-flatness (unevenness) of the projection surface A, and is, for example, the deformation amount of the projection image in the vertical direction (Y direction).
- the correction coefficient calculation unit 46 calculates a correction coefficient for applying a deformation that cancels the distortion amount dp to the projection image based on the distortion amount dp calculated by the distortion estimation unit 45.
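the division of labor among the three units can be sketched as follows (a minimal sketch with hypothetical names, and a scalar `gain` standing in for the incident-angle geometry; the patent's actual arithmetic is not reproduced here):

```python
def detect_line_distortion(line_y, reference_y):
    """Distortion detection unit 44 (sketch): distortion dr of the linear
    light at each horizontal sampling point, as the vertical (Y) deviation
    of the captured line from a straight reference line."""
    return [y - reference_y for y in line_y]

def estimate_image_distortion(dr, gain):
    """Distortion estimation unit 45 (sketch): scale the exaggerated
    line-light distortion dr down to the projected image's distortion dp;
    gain < 1 is a hypothetical factor standing in for the geometry."""
    return [d * gain for d in dr]

def correction_offsets(dp):
    """Correction coefficient calculation unit 46 (sketch): per-column
    vertical offsets that cancel dp when applied to the video signal."""
    return [-d for d in dp]

# Hypothetical captured line: 3 px of sag in the middle column.
dr = detect_line_distortion([100.0, 103.0, 100.0], reference_y=100.0)
dp = estimate_image_distortion(dr, gain=0.4)
offsets = correction_offsets(dp)  # cancels dp column by column
```

applying `offsets` to the displayed image deforms it in the direction opposite to the estimated distortion, so that the projection appears straight on the uneven surface.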
- the storage unit 47 stores, for example, programs and data handled by the control unit 40.
- FIG. 4 is a schematic diagram for explaining the distortion amount of the projected image due to the non-flatness of the projection surface A.
- 5A and 5B are perspective views for explaining the structure of the wall surface.
- FIG. 6 is a diagram illustrating the flow of the image correction operation.
- FIG. 7 is a schematic diagram illustrating an example of a pattern image.
- FIG. 8 is a schematic diagram showing a captured image of a pattern image and linear light.
- FIG. 9 is a schematic diagram for explaining a reference line setting operation and a difference value calculating operation.
- FIG. 10 is a schematic diagram illustrating parameters used in the calculation processing when estimating the distortion amount of the projection image.
- since the “image correction method” of the present disclosure is embodied by the operation of the projection display device 1 described below, a separate description thereof is omitted.
- in the projection display device 1, with the device placed in front of the projection surface A (placed on the installation surface B), a video (image) based on the video signal Dt0 (or a video signal Dt2 described later) is displayed on the image display element 42. This image (projection light L1) is then projected onto the projection surface A via the projection unit 10. In this way, an image can be displayed on the projection surface A.
- the projection surface A is a wall surface or the like, it is difficult to ensure flatness.
- specifically, a wall surface is likely to be uneven mainly along a certain direction (for example, the horizontal direction, the X direction), and unevenness rarely occurs in other directions (for example, the vertical direction, the Y direction).
- such unevenness of the wall surface is caused by, for example, a dimensional error in the structural material on which the gypsum board serving as the base for wallpaper or paint is placed, or by a deviation during construction.
- the projection surface A such as a wall surface is basically uneven only in a certain direction.
- noticeable unevenness that may affect the projected image occurs along a certain direction (here, the X direction).
- assuming a concave portion (projection surface A′) in such a projection surface A, the projection light L1 that would reach the upper end position p1 of the projection surface A is instead incident on a position p11 above it (on the positive side in the Y direction) on the projection surface A′, according to the amount of displacement due to the dent (the displacement amount da between the projection surfaces A and A′).
- when the projection surface A (projection surface A′) is observed by the user from the front (from the observation direction Dz), distortion such as undulation caused by the unevenness of the projection surface A is visible in the projected image.
- the distortion generated in the projected image becomes larger as it approaches the upper end position p1 in the projection range B1.
- FIG. 6 shows the flow of image correction processing according to the present embodiment.
- the imaging signal processing unit 43 corrects the distortion of the projection image based on the captured image of the linear light L2a.
- the light irradiation unit 30 is disposed closer to the projection surface A than the emission port 10B of the projection unit 10.
- the light irradiation unit 30 is configured so that the linear light L2a (laser light L2) is irradiated onto the projection surface A at a shallower incident angle than the projection light L1 (a1′ > a2′).
- the laser light L2 output toward the upper end position p1 of the projection surface A is incident on the position p12 on the projection surface A ′, which is further above the position p11 where the projection light L1 is incident.
- the distortion amount dr (first distortion amount) of the linear light L2a is larger than the distortion amount dp (second distortion amount) of the projection image (dr> dp).
- the distance along the Y direction between the positions p1 and p12 corresponds to the distortion amount dr of the linear light L2a, and the distance along the Y direction between the positions p1 and p11 corresponds to the distortion amount dp of the projection light L1.
- in other words, the linear light L2a is distorted more strongly than the projected image; the distortion of the projected image is therefore estimated based on the distortion of the linear light L2a, and correction is performed accordingly.
- the projection unit 10 projects the pattern image 120 onto the projection surface A, and the light irradiation unit 30 irradiates the linear light L2a toward the upper end position p1 (step S11 illustrated in FIG. 6).
- An example of the pattern image 120 is shown in FIG.
- the pattern image 120 is, for example, a regular arrangement of lines or points (dots).
- a pattern in a grid shape (lattice shape) is shown.
- the pattern image 120 is used to specify coordinates when calculating the distortion amount of the linear light L2a described later.
- specifically, the vertical lines in the pattern image 120 specify the horizontal coordinates (X coordinates) of the sampling points used when calculating the distortion amount.
- since the horizontal (X direction) interval of the lines in the pattern image 120 is the sampling interval for distortion detection, a narrower horizontal interval is desirable because it increases the correction accuracy.
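as an illustration of such a pattern (a minimal sketch; the size and pitch below are hypothetical), a grid image with vertical lines at a fixed horizontal pitch can be generated as:

```python
def grid_pattern(width, height, pitch):
    """Binary grid (lattice) image: 1 on grid lines, 0 elsewhere. The
    vertical lines fix the horizontal (X) coordinates of the sampling
    points, so a smaller pitch gives a finer sampling interval for
    distortion detection."""
    return [[1 if (x % pitch == 0 or y % pitch == 0) else 0
             for x in range(width)]
            for y in range(height)]

img = grid_pattern(width=8, height=6, pitch=4)
# Row y=0 lies on a horizontal grid line; the rows in between carry only
# the vertical lines, here at x = 0 and x = 4.
```

a finer pitch trades projection/capture contrast for denser sampling of the distortion along the X direction.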
- the installation position of the projection display device 1 is finely adjusted according to the shape and position of the pattern image 120 and the linear light L2a displayed on the projection surface A (step S12). Specifically, the installation position is finely adjusted as necessary so that the horizontal line of the pattern image 120 on the projection surface A and the linear light L2a are substantially parallel. This fine adjustment is performed manually by the user, for example.
- the projection display device 1 may be equipped with an adjustment function and automatically performed by the adjustment function.
- the imaging unit 20 individually captures the pattern image 120 and the linear light L2a displayed on the projection surface A (step S13). Specifically, first, the imaging unit 20 captures the projection surface A in a state where the projection unit 10 projects the pattern image 120 onto the projection surface A and the light irradiation unit 30 does not irradiate the linear light L2a. Thereafter, the imaging unit 20 captures the projection surface A in a state where the light irradiation unit 30 irradiates, for example, the upper end position p1 of the projection surface A with the linear light L2a and the projection unit 10 does not project the pattern image 120.
- the imaging order of the pattern image 120 and the linear light L2a may be reversed. Alternatively, the pattern image 120 and the linear light L2a may be photographed at the same time, depending on the content of the arithmetic processing described later.
- FIG. 8 schematically shows a captured image of the pattern image 120 (captured image 120A) and a captured image of the linear light L2a (captured image L2a1).
- wavy distortion occurs due to the unevenness of the projection surface A.
- the amount of distortion in the Y direction of the captured image 120A of the pattern image 120 is the largest at the upper end position p1 and decreases as it approaches the lower end position p2.
- the distortion amount of the linear light L2a in the Y direction of the captured image L2a1 (distortion amount corresponding to the distortion amount dr) is larger than the distortion amount at the upper end position p1 of the captured image 120A.
- the imaging signal Dt1 including the captured image L2a1 of the linear light L2a and the captured image 120A of the pattern image 120 is output to the imaging signal processing unit 43.
- the imaging signal processing unit 43 performs lens distortion correction and projection conversion to a front-view image on each of the captured image L2a1 of the linear light L2a and the captured image 120A of the pattern image 120 (step S14).
- the distortion detection unit 44 performs these processes on each of the captured image L2a1 of the linear light L2a and the captured image 120A of the pattern image 120. Thereby, the distortion of the linear light L2a can be detected with high accuracy.
- the distortion detection unit 44 performs a thinning process on each of the captured images L2a1 and 120A subjected to each process of step S14 (step S15).
- the captured images L2a1 and 120A are thinned to a width corresponding to one pixel, for example.
- the distortion of the linear light L2a can be detected with high accuracy.
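The thinning to a one-pixel width can be sketched, for illustration, as a column-wise intensity centroid. This is a hypothetical stand-in: the embodiment does not specify the actual thinning algorithm, and the function name and sample values are assumptions.

```python
import numpy as np

def thin_to_one_pixel(img):
    """Collapse a bright, roughly horizontal stripe in a grayscale image
    to a single Y coordinate per column (a one-pixel-wide line)."""
    h, w = img.shape
    ys = np.arange(h, dtype=float)
    line = np.full(w, np.nan)
    for x in range(w):
        col = img[:, x].astype(float)
        if col.sum() > 0:
            # Intensity-weighted centroid of the column.
            line[x] = (ys * col).sum() / col.sum()
    return line

# Toy 5x3 image: a stripe at row 2 in column 0, spread over rows 1 and 3
# in column 1, and absent in column 2.
img = np.zeros((5, 3))
img[2, 0] = 1.0
img[1, 1] = 1.0
img[3, 1] = 1.0
line = thin_to_one_pixel(img)
```

In this sketch the same routine would be applied to both the captured image L2a1 and the captured image 120A before the reference-line and difference-value steps.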
- the distortion detection unit 44 sets a reference line based on the captured image L2a1 of the linear light L2a (step S16).
- the reference line can be calculated by a technique such as linear fitting for the captured image L2a1.
- An example is shown in FIG.
- a linear reference line 130 is set for the captured image L2a1 of the linear light L2a.
- the distortion detection unit 44 calculates the distortion amount dr of the captured image L2a1 of the linear light L2a based on the captured images L2a1 and 120A of the linear light L2a and the pattern image 120 (step S17). Specifically, the difference value between the captured image L2a1 of the linear light L2a and the reference line 130 is calculated at selective points on the captured image 120A of the pattern image 120. For example, as shown in FIG. 9, the horizontal coordinates (X coordinates) of the vertical lines of the captured image 120A of the pattern image 120 are used as the horizontal coordinates of the sampling points (..., x-1, x, x+1, ...: x is an integer).
- a difference value S (x) in the vertical direction (Y direction) between the captured image L2a1 and the reference line 130 is calculated.
- the distortion amount dr (specifically, the distortion amount dr (x)) can be calculated from the difference value S (x).
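As a sketch of the reference-line fit of step S16 and the difference values S(x) of step S17, a least-squares line can serve as the reference line 130; the sample coordinates below are hypothetical, and the source does not specify the fitting method beyond "linear fitting".

```python
import numpy as np

def difference_values(xs, ys):
    """Fit a straight reference line (least squares) to the thinned
    linear-light samples and return S(x): the vertical (Y) difference
    between the captured line and the reference line at each sample."""
    slope, intercept = np.polyfit(xs, ys, 1)  # reference line 130
    return ys - (slope * xs + intercept)      # S(x) per sampling point

# X coordinates taken from the vertical lines of the pattern image 120,
# Y coordinates from the thinned captured image L2a1 (illustrative values
# with a bump near the middle).
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([10.0, 10.1, 10.6, 10.1, 10.0])
S = difference_values(xs, ys)
```

The distortion amount dr(x) is then derived from S(x) as described in the text.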
- the signal Dt11 for the distortion amount dr is output to the distortion estimation unit 45.
- the distortion estimation unit 45 estimates the distortion amount dp of the projection image based on the input signal Dt11 (step S18).
- the projection surface A has unevenness mainly in one direction, here the horizontal direction (X direction), and there is a tendency that the unevenness hardly occurs in the vertical direction (Y direction) orthogonal to the horizontal direction.
- the displacement amount da due to the unevenness of the projection surface A at a certain horizontal coordinate x can be regarded as being constant from the upper end position p1 to the lower end position p2 of the projection range B1.
- the distortion amount of the linear light L2a (laser light L2) is dr
- the distortion amount of the projection image (projection light L1) is dp
- the elevation angle of the projection light L1 (the emission angle of the projection light L1) is a1
- the elevation angle of the laser beam L2 forming the linear light L2a (the emission angle of the laser beam L2) is a2 (see FIG. 10)
- da = dr / tan(a2) ………(1)
- dp = da・tan(a1) ………(2)
- dp = [tan(a1) / tan(a2)]・dr ………(3)
- since the elevation angles a1 and a2 are values determined by the design specifications or by measurement of the projection display device 1, if the distortion amount dr of the linear light L2a can be detected, the displacement amount da of the projection surface A and the distortion amount dp of the projected image can be calculated by arithmetic processing. Strictly speaking, as the distance from the irradiation position of the linear light L2a (here, the upper end position p1) increases, the displacement amount da changes slightly, and an error may occur in the estimated distortion amount dp. However, the lower the position in the projection range B1 (the closer to the lower end position p2), the smaller the elevation angle a1 and hence the smaller the distortion amount dp, so the influence of the error is small.
- the signal Dt12 for the distortion amount dp estimated in this way is output to the correction coefficient calculation unit 46.
- the correction coefficient calculation unit 46 calculates a correction coefficient based on the input signal Dt12 (step S19). Specifically, based on the distortion amount dp of the projection image, a correction coefficient for performing a deformation that cancels the distortion amount dp on the projection image is calculated.
- the correction coefficient calculation unit 46 can set correction coefficients in consideration of such a change in the distortion amount dp and perform distortion correction of the projected image accordingly; the correction coefficients may also be set so that only part of the image is corrected.
- the calculated correction coefficient (correction coefficient signal Dt13) is held in the storage unit 47, for example.
- the correction coefficient signal Dt13 may be directly input to the video signal processing unit 41.
- the video signal processing unit 41 performs geometric correction processing of the projected image (video signal Dt0) using the correction coefficient signal Dt13 held in the storage unit 47 (step S20).
- the corrected projection image data (video signal Dt2) is sent to the image display element 42 at a predetermined timing under the control of the control unit 40. In this way, an image that has been subjected to distortion correction due to the non-flatness of the projection surface A can be projected onto the projection surface A.
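As an illustration of the geometric correction step (a simplification, not the actual processing of the video signal processing unit 41, which the source does not detail), a per-column vertical shift canceling an estimated distortion profile might look like:

```python
import numpy as np

def cancel_vertical_distortion(img, dp_px):
    """Shift each column of the image by the estimated per-column
    distortion dp_px[x] (in pixels) so the projected result appears
    straight. Nearest-pixel version; a real implementation would
    interpolate sub-pixel shifts and handle borders instead of
    wrapping around as np.roll does."""
    out = np.empty_like(img)
    for x in range(img.shape[1]):
        out[:, x] = np.roll(img[:, x], -int(round(dp_px[x])))
    return out

# Toy 4x2 image: one bright pixel per column; column 0 is distorted
# downward by 1 pixel, column 1 is undistorted.
img = np.zeros((4, 2))
img[2, 0] = 1.0
img[1, 1] = 1.0
out = cancel_vertical_distortion(img, [1.0, 0.0])
```

After correction, both bright pixels sit on the same row, i.e. the line appears straight.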
- a series of processing operations by the imaging signal processing unit 43 and the video signal processing unit 41 may be performed by an electronic circuit mounted in the projection display device 1 as described above, or may be executed by software in the control unit 40 or in an external device, for example.
- the imaging signal processing unit 43 and the video signal processing unit 41 are configured by one or a plurality of electronic circuit devices (such as a CPU or a GPU); the processing operations may be performed by a single electronic circuit device or distributed over a plurality of circuit devices.
- FIG. 11A and FIG. 12 show numerical examples of each parameter.
- FIG. 11B shows an enlarged view of the vicinity of the upper end position p1 in FIG. 11A.
- the size of the projection range B1 of the projection display device 1 is 150 inches
- the elevation angle of the imaging unit 20 (camera) (the elevation angle of the optical axis of the imaging unit 20) is 57.2 °
- the diagonal field angle is 76.7° and the number of pixels is 20.7 megapixels.
- the angle at which the imaging unit 20 looks at the upper end position p1 is 72.39 °
- the angle at which the imaging unit 20 sees the position p11 is 72.40°.
- the elevation angle a1 of the projection light L1 incident on the upper end position p1 is 74.8 °
- the elevation angle a2 of the laser light L2 incident on the upper end position p1 is 82.4 °.
- the distortion amount dr of the linear light L2a is 22.5 mm
- the distortion amount dp of the projection image (pattern image 120) is 11.0 mm.
- the displacement amount da of the projection surface A is 3 mm.
- the distortion amount dc2 seen from the mounting position of the imaging unit 20 is 4.1 mm with respect to the distortion amount dr of 22.5 mm when viewed from the front. Since the size of 4.1 mm corresponds to, for example, about 8 pixels of the camera, the distortion amounts dc2 and dr can be detected from the captured image with sufficient accuracy. Since the distortion amount dp of the projection image can be estimated by the arithmetic processing as described above, the distortion amount dr of the linear light L2a can be easily converted into the distortion amount dp of the projection image. Thus, by estimating the distortion amount dp of the projection image based on the distortion amount dr of the linear light L2a, the detection sensitivity of the distortion amount dp can be increased, and the correction accuracy can be improved.
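Expressions (1) to (3) and the numerical example above can be checked with a short script (units in mm and degrees; the function name is our own):

```python
from math import radians, tan

def estimate_dp(dr, a1_deg, a2_deg):
    """Estimate the projected-image distortion dp from the measured
    linear-light distortion dr via expressions (1)-(3)."""
    da = dr / tan(radians(a2_deg))   # eq. (1): surface displacement da
    dp = da * tan(radians(a1_deg))   # eq. (2); combined this is eq. (3)
    return da, dp

# Values from the numerical example: dr = 22.5 mm, a1 = 74.8°, a2 = 82.4°.
da, dp = estimate_dp(22.5, 74.8, 82.4)
# da ≈ 3.0 mm and dp ≈ 11.0 mm, consistent with the figures quoted above.
```

This confirms that the quoted da = 3 mm and dp = 11.0 mm follow from the quoted dr and elevation angles.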
- as described above, in the present embodiment, the light irradiation unit 30 irradiates the projection surface A with the predetermined linear light L2a at an incident angle (a2') shallower than the incident angle (a1') of the projection light L1, and the linear light L2a is photographed by the imaging unit 20, which has an optical axis different from that of the light irradiation unit 30.
- the imaging signal processing unit 43 and the video signal processing unit 41 correct the distortion of the projection image based on the captured image of the linear light L2a. Thereby, the distortion of the image resulting from the non-flatness of the projection surface A can be corrected without the user installing or photographing the camera. Therefore, it is possible to improve the image quality of the projected image while maintaining convenience.
- FIG. 13 is a schematic diagram illustrating a schematic configuration and a usage state of a projection display device according to a modification.
- in the above embodiment, the case where the light irradiation unit 30 irradiates the linear light L2a toward the upper end position p1 of the projection range B1 was described. As in this modification, however, the light irradiation unit 30 may be configured to irradiate the linear light L2a toward a plurality of positions within the projection range B1.
- the elevation angle (emission angle) a2 of the laser light L2 is variable.
- the laser beams L21, L22, and L23 can be emitted at the elevation angles a21, a22, and a23, respectively.
- the laser beam L21 is emitted at an elevation angle a21 toward the uppermost position p3, for example, the laser beam L22 is emitted at an elevation angle a22 toward a position p4 below the position p3, and the laser beam L23 is emitted at an elevation angle a23 toward a position p5 below the position p4.
- the projection range B1 is divided into several regions, and the linear light L2a is irradiated to each of the divided regions.
- the elevation angle a2 of the laser light L2 can be changed using, for example, a stepping motor or the like, and the linear light L2a can be sequentially irradiated onto the positions p3, p4, and p5.
- the light irradiation unit 30 may include a plurality of laser light sources with different elevation angles of the output laser light.
- for example, the light irradiation unit 30 may include a first laser light source that emits a laser beam L21 at an elevation angle a21 toward the position p3, a second laser light source that emits a laser beam L22 at an elevation angle a22 toward the position p4, and a third laser light source that emits a laser beam L23 at an elevation angle a23 toward the position p5.
- the linear light L2a can be irradiated sequentially or simultaneously to the positions p3, p4, and p5.
- the light irradiation unit 30 is arranged so that the linear light L2a formed by each laser beam is applied to the projection surface A at an incident angle shallower than that of the corresponding projection light; for example, the linear light L2a formed by the laser light L21 is applied at an incident angle shallower than the projection light L11 emitted toward the position p3.
- the linear light L2a formed by the laser light L22 is applied to the projection surface A at an incident angle shallower than the projection light L12 emitted toward the position p4 at the elevation angle a12.
- the linear light L2a formed by the laser light L23 is applied to the projection surface A at an incident angle shallower than the projection light L13 emitted toward the position p5 at the elevation angle a13.
- the imaging signal processing unit 43 and the video signal processing unit 41 correct the distortion of the projected image, based on the captured images of the linear light L2a irradiated to each of the plurality of positions, for each region corresponding to the irradiation position of the linear light L2a.
- the projection range B1 is virtually divided into three regions E1, E2, E3 along the Y direction (vertical direction), and the linear light L2a is sequentially irradiated to each of the divided regions E1, E2, E3. Each time, the projection surface A is photographed.
- the series of signal processing operations (detection of the distortion amount dr, estimation of the distortion amount dp, calculation of the correction coefficient, and image correction) described in the above embodiment is performed based on the captured images of the regions E1, E2, and E3.
- the linear light L2a irradiated to the position p3 in the region E1 is photographed, and the distortion amount dr1 of the linear light L2a is detected.
- the distortion amount dr1 corresponds to the distance along the Y direction between the position p3 on the projection surface A and the position on the projection surface A ′ (the position where the laser beam L21 enters on the projection surface A ′) p31.
- the linear light L2a irradiated to the position p4 in the region E2 is photographed, and the distortion amount dr2 of the linear light L2a is detected.
- the distortion amount dr2 corresponds to the distance along the Y direction between the position p4 on the projection surface A and the position on the projection surface A ′ (position where the laser light L22 is incident on the projection surface A ′) p41.
- the linear light L2a applied to the position p5 in the region E3 is photographed, and the distortion amount dr3 of the linear light L2a is detected.
- the distortion amount dr3 corresponds to the distance along the Y direction between the position p5 on the projection surface A and the position on the projection surface A ′ (position where the laser beam L23 is incident on the projection surface A ′) p51.
- in this example, the linear light L2a is irradiated at three positions and the projection range B1 is divided into three areas, but the number of divisions is not limited to this and may be two, or four or more.
- the distortion amounts dp1, dp2, and dp3 of the projected image are estimated based on the distortion amounts dr1, dr2, and dr3 for the areas E1, E2, and E3 thus detected.
- the distortion amount dp1 corresponds to the distance along the Y direction between the position p3 on the projection surface A and the position on the projection surface A ′ (the position where the projection light L11 is incident on the projection surface A ′) p32.
- the distortion amount dp2 corresponds to the distance along the Y direction between the position p4 on the projection surface A and the position on the projection surface A ′ (the position where the projection light L12 enters on the projection surface A ′) p42.
- the distortion amount dp3 corresponds to the distance along the Y direction between the position p5 on the projection surface A and the position on the projection surface A ′ (a position where the projection light L13 is incident on the projection surface A ′) p52.
- the distortion amount dp(n) of the projected image in each area En, when the projection range is divided into n areas (n is an integer equal to or greater than 2), can be calculated using the following expressions (4) to (6):
- da(n) = dr(n) / tan(a2n) ………(4)
- dp(n) = da(n)・tan(a1n) ………(5)
- dp(n) = [tan(a1n) / tan(a2n)]・dr(n) ………(6)
- the displacement amount of the projection surface A is da (n)
- the distortion amount of the linear light L2a is dr (n)
- the distortion amount of the projection image is dp (n)
- the elevation angle of the projection light L1n is a1n
- the elevation angle of the laser beam L2n is a2n.
- since the elevation angles a1n and a2n are values determined by the design specifications or by measurement of the projection display device 1, if the distortion amount dr(n) of the linear light L2a can be detected for each region, the displacement amount da(n) of the projection surface A and the distortion amount dp(n) of the projection image can be calculated by arithmetic processing.
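The per-region estimation of expressions (4) to (6) can be sketched as a loop over the areas En; only the first triple of values below reuses the numerical example given earlier (dr = 22.5 mm, a1 = 74.8°, a2 = 82.4°), the other two regions are purely illustrative.

```python
from math import radians, tan

def estimate_dp_per_region(dr, a1, a2):
    """Apply expressions (4)-(6) region by region: for each area En,
    da(n) = dr(n) / tan(a2n), then dp(n) = da(n) * tan(a1n)."""
    assert len(dr) == len(a1) == len(a2)
    return [dr_n / tan(radians(a2_n)) * tan(radians(a1_n))
            for dr_n, a1_n, a2_n in zip(dr, a1, a2)]

dp = estimate_dp_per_region([22.5, 10.0, 5.0],   # dr(n) in mm
                            [74.8, 60.0, 45.0],  # a1n in degrees
                            [82.4, 75.0, 60.0])  # a2n in degrees
```

Because a1n < a2n in every region here, each estimated dp(n) comes out smaller than the measured dr(n), matching the magnification relationship of expression (3).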
- according to this modification, correction can be made for each region divided along the Y direction (vertical direction), which is particularly effective when the displacement amount da differs along the vertical direction of the projection surface A. That is, the estimation accuracy of the distortion amount of the projected image can be improved, and the correction accuracy can be improved (very fine distortion correction can be performed).
- the present disclosure is not limited to the above-described embodiments and the like, and various modifications can be made.
- in the above embodiment, the extending direction of the linear light is the horizontal direction, but the present disclosure is not limited to this. Distortion in a direction orthogonal to the extending direction of the linear light can be corrected, and the extending direction of the linear light can take various directions depending on the purpose.
- in the above embodiment, the projection surface A is a wall surface, but the projection surface A is not limited to a wall surface; various surfaces having non-flatness other than a wall surface can be assumed.
- the extending direction of the linear light may be set according to the non-flatness of the projection surface A.
- the light irradiation unit may have a mechanism that can rotate the extending direction of the linear light within the projection plane.
- in addition, the present disclosure may take the following configurations.
- (1) A projection display device including: an image display element that displays an image; a projection unit that projects the image displayed by the image display element toward a projection surface; a light irradiation unit that irradiates the projection surface with linear light extending along a first direction in the projection surface, at an incident angle shallower than that of the projection light; an imaging unit that has an optical axis different from that of the light irradiation unit and photographs the projection surface; and a signal processing unit that performs signal processing on an imaging signal output from the imaging unit, wherein the imaging unit photographs the linear light irradiated onto the projection surface, and the signal processing unit corrects distortion of the projected image based on a captured image of the linear light.
- (2) The projection display device according to (1), wherein the signal processing unit includes: a detection unit that detects a first distortion amount of the linear light based on the captured image of the linear light; an estimation unit that estimates a second distortion amount of the image based on the detected first distortion amount; a correction coefficient calculation unit that calculates a correction coefficient based on the estimated second distortion amount; and a correction unit that corrects the image using the correction coefficient.
- (3) The projection display device according to (2), wherein the light irradiation unit irradiates the projection surface with the linear light, the projection unit projects onto the projection surface a pattern image in which lines or points are regularly arranged, and the detection unit sets a reference line based on the captured image of the linear light, calculates, based on the captured images of the linear light and the pattern image, difference values between the linear light and the reference line at selective points on the pattern image, and calculates the first distortion amount from the difference values.
- (4) The projection display device according to (2) or (3), wherein the estimation unit estimates the second distortion amount by arithmetic processing using the conditional expressions da = dr / tan(a2), dp = da・tan(a1), and dp = [tan(a1) / tan(a2)]・dr, where da is the displacement amount due to unevenness of the projection surface, dr is the distortion amount of the linear light (first distortion amount), dp is the distortion amount of the image (second distortion amount), a1 is the emission angle of the projection light, and a2 is the emission angle of the linear light.
- (5) The projection display device according to (3), wherein the detection unit performs a thinning process on each of the captured image of the linear light and the captured image of the pattern image.
- (6) The projection display device according to any one of (2) to (5), wherein the detection unit detects a distortion amount of the linear light in a second direction orthogonal to the first direction.
- (7) The projection display device according to (6), wherein the first direction is a horizontal direction and the second direction is a vertical direction.
- (8) The projection display device according to any one of (1) to (7), wherein the light irradiation unit is configured to irradiate the linear light toward an upper end position of a projection range on the projection surface.
- (9) The projection display device according to (8), wherein the signal processing unit corrects distortion of a part or the whole of the image based on a captured image of the linear light irradiated to the upper end position.
- (10) The projection display device according to any one of (1) to (9), wherein the light irradiation unit is configured to be able to irradiate the linear light toward a plurality of positions in a projection range on the projection surface.
- (11) The projection display device according to (10), wherein the signal processing unit corrects distortion of the image for each region corresponding to an irradiation position of the linear light, based on captured images of the linear light irradiated to each of the plurality of positions.
- (12) The projection display device according to any one of (1) to (11), wherein the light irradiation unit includes a laser light source.
- (13) The projection display device according to (12), having a function of detecting an object approaching the light irradiation unit, and configured to stop the irradiation of the linear light by the light irradiation unit when the object is detected.
- (14) The projection display device according to any one of (1) to (13), wherein the projection unit includes a short-focus lens.
Description
1. Embodiment (an example of a projection display device that corrects distortion of a projected image based on a captured image of linear light irradiated at an incident angle shallower than that of the projection light)
2. Modification (an example in which the linear light is irradiated to a plurality of positions and image correction is performed for each region corresponding to each irradiation position)
[Configuration]
FIGS. 1 and 2 illustrate the configuration and usage state of a projection display device (projection display device 1) according to an embodiment of the present disclosure, and FIG. 3 illustrates its functional configuration. In FIGS. 1 and 2, the XY plane is the projection surface A and the XZ plane is the installation surface B; the X direction corresponds to, for example, the horizontal direction, and the Y direction to, for example, the vertical direction.
The operation of the projection display device 1 of the present embodiment will be described with reference to FIGS. 1 to 10. FIG. 4 is a schematic diagram for explaining the distortion amount of a projected image caused by the non-flatness of the projection surface A. FIGS. 5A and 5B are perspective views for explaining the structure of a wall surface. FIG. 6 is a diagram showing the flow of the image correction operation. FIG. 7 is a schematic diagram showing an example of a pattern image. FIG. 8 is a schematic diagram showing captured images of the pattern image and the linear light. FIG. 9 is a schematic diagram for explaining the setting of the reference line and the calculation of the difference values. FIG. 10 is a schematic diagram showing the parameters used in the arithmetic processing for estimating the distortion amount of a projected image. Since the "image correction method" of the present disclosure is embodied by the operation of the projection display device 1 described below, its separate description is omitted.
da=dr/tan(a2) ………(1)
dp=da・tan(a1) ………(2)
dp=[tan(a1)/tan(a2)]・dr ………(3)
FIG. 13 is a schematic diagram showing the schematic configuration and usage state of a projection display device according to a modification. In the above embodiment, the case where the light irradiation unit 30 irradiates the linear light L2a toward the upper end position p1 of the projection range B1 was described; as in this modification, the light irradiation unit 30 may be configured to irradiate the linear light L2a toward a plurality of positions in the projection range B1.
da(n)=dr(n)/tan(a2n) ………(4)
dp(n)=da(n)・tan(a1n) ………(5)
dp(n)=[tan(a1n)/tan(a2n)]・dr(n) ………(6)
(1)
A projection display device including:
an image display element that displays an image;
a projection unit that projects the image displayed by the image display element toward a projection surface;
a light irradiation unit that irradiates the projection surface with linear light extending along a first direction in the projection surface, at an incident angle shallower than that of the projection light;
an imaging unit that has an optical axis different from that of the light irradiation unit and photographs the projection surface; and
a signal processing unit that performs signal processing on an imaging signal output from the imaging unit,
wherein the imaging unit photographs the linear light irradiated onto the projection surface, and
the signal processing unit corrects distortion of the projected image based on a captured image of the linear light.
(2)
The projection display device according to (1), wherein the signal processing unit includes:
a detection unit that detects a first distortion amount of the linear light based on the captured image of the linear light;
an estimation unit that estimates a second distortion amount of the image based on the detected first distortion amount;
a correction coefficient calculation unit that calculates a correction coefficient based on the estimated second distortion amount; and
a correction unit that corrects the image using the correction coefficient.
(3)
The projection display device according to (2), wherein
the light irradiation unit irradiates the projection surface with the linear light,
the projection unit projects onto the projection surface a pattern image in which lines or points are regularly arranged, and
the detection unit sets a reference line based on the captured image of the linear light, calculates, based on the captured images of the linear light and the pattern image, difference values between the linear light and the reference line at selective points on the pattern image, and calculates the first distortion amount from the difference values.
(4)
The projection display device according to (2) or (3), wherein the estimation unit estimates the second distortion amount by arithmetic processing using the following conditional expressions [1] to [3]:
da = dr / tan(a2) ………[1]
dp = da・tan(a1) ………[2]
dp = [tan(a1) / tan(a2)]・dr ………[3]
where
da: displacement amount due to unevenness of the projection surface,
dr: distortion amount of the linear light (first distortion amount),
dp: distortion amount of the image (second distortion amount),
a1: emission angle of the projection light, and
a2: emission angle of the linear light.
(5)
The projection display device according to (3), wherein the detection unit performs a thinning process on each of the captured image of the linear light and the captured image of the pattern image.
(6)
The projection display device according to any one of (2) to (5), wherein the detection unit detects a distortion amount of the linear light in a second direction orthogonal to the first direction.
(7)
The projection display device according to (6), wherein the first direction is a horizontal direction and the second direction is a vertical direction.
(8)
The projection display device according to any one of (1) to (7), wherein the light irradiation unit is configured to irradiate the linear light toward an upper end position of a projection range on the projection surface.
(9)
The projection display device according to (8), wherein the signal processing unit corrects distortion of a part or the whole of the image based on a captured image of the linear light irradiated to the upper end position.
(10)
The projection display device according to any one of (1) to (9), wherein the light irradiation unit is configured to be able to irradiate the linear light toward a plurality of positions in a projection range on the projection surface.
(11)
The projection display device according to (10), wherein the signal processing unit corrects distortion of the image for each region corresponding to an irradiation position of the linear light, based on captured images of the linear light irradiated to each of the plurality of positions.
(12)
The projection display device according to any one of (1) to (11), wherein the light irradiation unit includes a laser light source.
(13)
The projection display device according to (12), having a function of detecting an object approaching the light irradiation unit, and configured to stop the irradiation of the linear light by the light irradiation unit when the object is detected.
(14)
The projection display device according to any one of (1) to (13), wherein the projection unit includes a short-focus lens.
(15)
An image correction method including, when correcting an image to be projected toward a projection surface:
irradiating the projection surface with linear light extending along a first direction in the projection surface, at an incident angle shallower than that of the projection light;
photographing the linear light irradiated onto the projection surface from an optical path different from that of the linear light; and
correcting distortion of the projected image based on a captured image of the linear light.
(16)
The image correction method according to (15), including:
detecting a first distortion amount of the linear light based on the captured image of the linear light;
estimating a second distortion amount of the image based on the detected first distortion amount; and
correcting the image based on the estimated second distortion amount.
(17)
The image correction method according to (16), including:
irradiating the projection surface with the linear light and projecting a pattern image in which lines or points are regularly arranged;
setting a reference line based on the captured image of the linear light; and
calculating, based on the captured images of the linear light and the pattern image, difference values between the linear light and the reference line at selective points on the pattern image, and calculating the first distortion amount from the difference values.
(18)
The image correction method according to (16) or (17), wherein the second distortion amount is estimated by arithmetic processing using the following conditional expressions [1] to [3]:
da = dr / tan(a2) ………[1]
dp = da・tan(a1) ………[2]
dp = [tan(a1) / tan(a2)]・dr ………[3]
where
da: displacement amount due to unevenness of the projection surface,
dr: distortion amount of the linear light (first distortion amount),
dp: distortion amount of the projected image (second distortion amount),
a1: emission angle of the projection light, and
a2: emission angle of the linear light.
(19)
The image correction method according to any one of (15) to (18), including irradiating the linear light toward an upper end position of a projection range on the projection surface, and correcting distortion of a part or the whole of the image based on a captured image of the linear light irradiated to the upper end position.
(20)
The image correction method according to any one of (15) to (18), including irradiating the linear light sequentially or simultaneously toward a plurality of positions in a projection range on the projection surface, and correcting distortion of the image for each region corresponding to an irradiation position of the linear light, based on captured images of the linear light irradiated to each of the plurality of positions.
Claims (20)
- 1. A projection display device including: an image display element that displays an image; a projection unit that projects the image displayed by the image display element toward a projection surface; a light irradiation unit that irradiates the projection surface with linear light extending along a first direction in the projection surface, at an incident angle shallower than that of the projection light; an imaging unit that has an optical axis different from that of the light irradiation unit and photographs the projection surface; and a signal processing unit that performs signal processing on an imaging signal output from the imaging unit, wherein the imaging unit photographs the linear light irradiated onto the projection surface, and the signal processing unit corrects distortion of the projected image based on a captured image of the linear light.
- 2. The projection display device according to claim 1, wherein the signal processing unit includes: a detection unit that detects a first distortion amount of the linear light based on the captured image of the linear light; an estimation unit that estimates a second distortion amount of the image based on the detected first distortion amount; a correction coefficient calculation unit that calculates a correction coefficient based on the estimated second distortion amount; and a correction unit that corrects the image using the correction coefficient.
- 3. The projection display device according to claim 2, wherein the light irradiation unit irradiates the projection surface with the linear light, the projection unit projects onto the projection surface a pattern image in which lines or points are regularly arranged, and the detection unit sets a reference line based on the captured image of the linear light, calculates, based on the captured images of the linear light and the pattern image, difference values between the linear light and the reference line at selective points on the pattern image, and calculates the first distortion amount from the difference values.
- 4. The projection display device according to claim 2, wherein the estimation unit estimates the second distortion amount by arithmetic processing using the following conditional expressions (1) to (3):
da = dr / tan(a2) ………(1)
dp = da・tan(a1) ………(2)
dp = [tan(a1) / tan(a2)]・dr ………(3)
where da is the displacement amount due to unevenness of the projection surface, dr is the distortion amount of the linear light (first distortion amount), dp is the distortion amount of the image (second distortion amount), a1 is the emission angle of the projection light, and a2 is the emission angle of the linear light.
- 5. The projection display device according to claim 3, wherein the detection unit performs a thinning process on each of the captured image of the linear light and the captured image of the pattern image.
- 6. The projection display device according to claim 2, wherein the detection unit detects a distortion amount of the linear light in a second direction orthogonal to the first direction.
- 7. The projection display device according to claim 6, wherein the first direction is a horizontal direction and the second direction is a vertical direction.
- 8. The projection display device according to claim 1, wherein the light irradiation unit is configured to irradiate the linear light toward an upper end position of a projection range on the projection surface.
- 9. The projection display device according to claim 8, wherein the signal processing unit corrects distortion of a part or the whole of the image based on a captured image of the linear light irradiated to the upper end position.
- 10. The projection display device according to claim 1, wherein the light irradiation unit is configured to be able to irradiate the linear light toward a plurality of positions in a projection range on the projection surface.
- 11. The projection display device according to claim 10, wherein the signal processing unit corrects distortion of the image for each region corresponding to an irradiation position of the linear light, based on captured images of the linear light irradiated to each of the plurality of positions.
- 12. The projection display device according to claim 1, wherein the light irradiation unit includes a laser light source.
- 13. The projection display device according to claim 12, having a function of detecting an object approaching the light irradiation unit, and configured to stop the irradiation of the linear light by the light irradiation unit when the object is detected.
- 14. The projection display device according to claim 1, wherein the projection unit includes a short-focus lens.
- 15. An image correction method including, when correcting an image to be projected toward a projection surface: irradiating the projection surface with linear light extending along a first direction in the projection surface, at an incident angle shallower than that of the projection light; photographing the linear light irradiated onto the projection surface from an optical path different from that of the linear light; and correcting distortion of the projected image based on a captured image of the linear light.
- 16. The image correction method according to claim 15, including: detecting a first distortion amount of the linear light based on the captured image of the linear light; estimating a second distortion amount of the image based on the detected first distortion amount; and correcting the image based on the estimated second distortion amount.
- 17. The image correction method according to claim 16, including: irradiating the projection surface with the linear light and projecting a pattern image in which lines or points are regularly arranged; setting a reference line based on the captured image of the linear light; and calculating, based on the captured images of the linear light and the pattern image, difference values between the linear light and the reference line at selective points on the pattern image, and calculating the first distortion amount from the difference values.
- 18. The image correction method according to claim 16, wherein the second distortion amount is estimated by arithmetic processing using the following conditional expressions (1) to (3):
da = dr / tan(a2) ………(1)
dp = da・tan(a1) ………(2)
dp = [tan(a1) / tan(a2)]・dr ………(3)
where da is the displacement amount due to unevenness of the projection surface, dr is the distortion amount of the linear light (first distortion amount), dp is the distortion amount of the projected image (second distortion amount), a1 is the emission angle of the projection light, and a2 is the emission angle of the linear light.
- 19. The image correction method according to claim 15, including: irradiating the linear light toward an upper end position of a projection range on the projection surface; and correcting distortion of a part or the whole of the image based on a captured image of the linear light irradiated to the upper end position.
- 20. The image correction method according to claim 15, including: irradiating the linear light sequentially or simultaneously toward a plurality of positions in a projection range on the projection surface; and correcting distortion of the image for each region corresponding to an irradiation position of the linear light, based on captured images of the linear light irradiated to each of the plurality of positions.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/765,050 US10397534B2 (en) | 2015-10-06 | 2016-09-06 | Projection display and image correction method |
CN201680056351.1A CN108141561B (zh) | 2015-10-06 | 2016-09-06 | 投影型显示装置和图像校正方法 |
JP2017544421A JP6809477B2 (ja) | 2015-10-06 | 2016-09-06 | 投射型表示装置および画像補正方法 |
US16/510,787 US10721447B2 (en) | 2015-10-06 | 2019-07-12 | Projection display and image correction method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015198762 | 2015-10-06 | ||
JP2015-198762 | 2015-10-06 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/765,050 A-371-Of-International US10397534B2 (en) | 2015-10-06 | 2016-09-06 | Projection display and image correction method |
US16/510,787 Continuation US10721447B2 (en) | 2015-10-06 | 2019-07-12 | Projection display and image correction method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017061213A1 true WO2017061213A1 (ja) | 2017-04-13 |
Family
ID=58487531
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/076095 WO2017061213A1 (ja) | 2015-10-06 | 2016-09-06 | 投射型表示装置および画像補正方法 |
Country Status (4)
Country | Link |
---|---|
US (2) | US10397534B2 (ja) |
JP (1) | JP6809477B2 (ja) |
CN (1) | CN108141561B (ja) |
WO (1) | WO2017061213A1 (ja) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10397534B2 (en) | 2015-10-06 | 2019-08-27 | Sony Corporation | Projection display and image correction method |
JP6780315B2 (ja) * | 2016-06-22 | 2020-11-04 | カシオ計算機株式会社 | 投影装置、投影システム、投影方法及びプログラム |
WO2019188121A1 (ja) * | 2018-03-27 | 2019-10-03 | ソニー株式会社 | 画像表示装置 |
JP7338659B2 (ja) * | 2021-03-30 | 2023-09-05 | セイコーエプソン株式会社 | 指示体の検出方法及びプロジェクションシステム |
CN115883799A (zh) * | 2021-09-29 | 2023-03-31 | 中强光电股份有限公司 | 投影机以及投影方法 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005006272A (ja) * | 2003-05-21 | 2005-01-06 | Nec Viewtechnology Ltd | 傾斜角度測定装置を有するプロジェクタ |
JP2009200683A (ja) * | 2008-02-20 | 2009-09-03 | Seiko Epson Corp | 画像処理装置、プロジェクタ、歪み補正方法 |
JP2015118252A (ja) * | 2013-12-18 | 2015-06-25 | 株式会社リコー | 画像投影装置、画像投影システムおよび画像投影方法 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6753907B1 (en) * | 1999-12-23 | 2004-06-22 | Justsystem Corporation | Method and apparatus for automatic keystone correction |
CN100371825C (zh) * | 2002-07-24 | 2008-02-27 | Sony Corporation | Projection screen and manufacturing method thereof |
US7204596B2 (en) | 2003-09-19 | 2007-04-17 | Nec Corporation | Projector with tilt angle measuring device |
JP4199641B2 (ja) * | 2003-10-30 | 2008-12-17 | NEC Corporation | Projector device |
US8303120B2 (en) * | 2006-02-21 | 2012-11-06 | Panasonic Corporation | Image display apparatus and image distortion correction method of the same |
JP5180689B2 (ja) * | 2007-08-07 | 2013-04-10 | Sanyo Electric Co., Ltd. | Projection-type video display device |
JP5256899B2 (ja) * | 2008-07-18 | 2013-08-07 | Seiko Epson Corp | Image correction device, image correction method, projector, and projection system |
JP5309828B2 (ja) | 2008-09-19 | 2013-10-09 | Seiko Epson Corp | Projector |
JP2012151670A (ja) * | 2011-01-19 | 2012-08-09 | Renesas Electronics Corp | Image projection system and semiconductor integrated circuit |
JP2013171246A (ja) * | 2012-02-22 | 2013-09-02 | Ricoh Co Ltd | Rear screen for ultra-short-throw projector |
JP5910157B2 (ja) | 2012-02-23 | 2016-04-27 | Ricoh Co., Ltd. | Image projection device |
US10397534B2 (en) | 2015-10-06 | 2019-08-27 | Sony Corporation | Projection display and image correction method |
2016
- 2016-09-06 US US15/765,050 patent/US10397534B2/en active Active
- 2016-09-06 WO PCT/JP2016/076095 patent/WO2017061213A1/ja active Application Filing
- 2016-09-06 JP JP2017544421A patent/JP6809477B2/ja active Active
- 2016-09-06 CN CN201680056351.1A patent/CN108141561B/zh active Active
2019
- 2019-07-12 US US16/510,787 patent/US10721447B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US10397534B2 (en) | 2019-08-27 |
CN108141561A (zh) | 2018-06-08 |
JPWO2017061213A1 (ja) | 2018-08-02 |
CN108141561B (zh) | 2020-12-18 |
US20180295333A1 (en) | 2018-10-11 |
US20190342531A1 (en) | 2019-11-07 |
JP6809477B2 (ja) | 2021-01-06 |
US10721447B2 (en) | 2020-07-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10721447B2 (en) | Projection display and image correction method | |
TWI298416B (en) | Projector and image correction method | |
TWI242373B (en) | Image processing system, projector and image processing method | |
TWI249351B (en) | Image processing system, projector, and image processing method | |
JP3879858B2 (ja) | Image processing system, projector, and image processing method | |
US8727539B2 (en) | Projector and method of controlling projector | |
EP3136377B1 (en) | Information processing device, information processing method, program | |
CN102365865B (zh) | Multi-projection display system and screen image forming method | |
JP5069038B2 (ja) | Rear projection display device | |
JP2016519330A (ja) | System and method for calibrating a display system using a short-throw camera | |
TWI484283B (zh) | Image computation and measurement method, image computation and measurement device, and image inspection device | |
WO2015165209A1 (zh) | Wearable projection device, and focusing and projection methods thereof | |
US9030553B2 (en) | Projector image correction device and method | |
WO2017169186A1 (ja) | Image projection system and correction method | |
US20120206696A1 (en) | Projection display apparatus and image adjusting method | |
JP7148855B2 (ja) | Projection control device, projection device, projection method, and program | |
JP2014060549A (ja) | Illuminance output device, luminance output device, and image projection device | |
CN109644248B (zh) | Projection-type video display device and projected video adjustment method | |
JP2005303493A (ja) | Obstacle-adaptive projection display device | |
JP6182739B2 (ja) | Projection device and projection method | |
JP2014224841A (ja) | Video projection system | |
JP2010130481A (ja) | Image projection device | |
JP2010288062A (ja) | Projector, program, information storage medium, and image projection method | |
US20050078280A1 (en) | Arrangement for correction of the visual depiction of image information | |
JP3730979B2 (ja) | Projector having tilt angle measuring device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16853367 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2017544421 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 15765050 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 16853367 Country of ref document: EP Kind code of ref document: A1 |