WO2015104870A1 - Image generation apparatus (画像生成装置) - Google Patents
Image generation apparatus
- Publication number
- WO2015104870A1 (PCT/JP2014/074665, JP2014074665W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- light
- signal
- value
- component
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
- H04N23/21—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only from near infrared [NIR] radiation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/88—Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
- H04N5/33—Transforming infrared radiation
Definitions
- the present invention relates to an image generation apparatus that can acquire distance information to an object existing in an imaging space in association with a captured image.
- In such apparatuses, spatial pattern light is emitted from a projector, an image of the observation area irradiated with the pattern light is captured by a camera, and the distance to an object existing in the observation area is measured based on the irradiation direction of the pattern light, the imaging direction of the pattern light in the captured image, and the relative positional relationship between the projector and the camera (for example, see Patent Document 1).
- As the pattern light, it is common to use light in an invisible wavelength band, such as near-infrared light. In this case, in order to capture the pattern light with the camera, it is necessary to use a camera having sensitivity to the wavelength of the pattern light. However, when shooting with a camera that is sensitive to wavelengths other than visible light, even when shooting without the pattern light, the color reproducibility of the captured image is affected by components other than visible light included in the ambient light, so there is a problem that the subject cannot be correctly identified.
- The present invention has been made to solve the above-described problems, and its purpose is to remove the influence of components other than visible light included in ambient light from the image signal of a captured image when generating an image with distance information, and thereby to obtain an image with high color reproducibility.
- An image generation apparatus according to the present invention includes: a projector that projects pattern light of a near-infrared wavelength into an imaging space every predetermined number of frame periods; an imaging unit that images a subject in the imaging space, the subject being irradiated with ambient light and with the pattern light projected every predetermined number of frame periods, and that outputs an imaging signal including an R signal, a G signal, and a B signal representing the R component, G component, and B component of the captured image; a control unit that instructs the projector on the projection intensity of the pattern light; an image difference acquisition unit that generates a pattern light image by obtaining, from the imaging signals obtained by imaging by the imaging unit, the difference between the imaging signal obtained when the pattern light is projected and the imaging signal obtained when the pattern light is not projected, and that generates an ambient light image from the imaging signal obtained when the pattern light is not projected; an ambient light type determination unit that determines the type of the ambient light; and a visible light image generation unit that generates, from the ambient light type determined by the ambient light type determination unit, composition ratio information indicating the composition ratio of light of the near-infrared wavelength included in the ambient light, estimates, based on the generated composition ratio information, the component due to light of the near-infrared wavelength included in the ambient light image, and generates a visible light image by subtracting the near-infrared wavelength light component from the ambient light image.
- The ambient light type determination unit determines the type of the ambient light from the ratio of the R, G, and B components in the captured image, the ambient light image, or the visible light image.
- According to the present invention, it is possible to remove the ambient light component at wavelengths other than visible light from the image signal of the captured image when generating an image with distance information, and to obtain an image with high color reproducibility.
- FIG. 1 is a block diagram showing the configuration of an image generation apparatus according to Embodiment 1 of the present invention.
- FIG. 2 is a schematic diagram showing a configuration example of the imaging unit 20 of FIG. 1.
- FIG. 3 is a diagram illustrating the arrangement of the projector and the imaging unit in Embodiment 1, together with the imaging space.
- FIG. 4 is a diagram of the projector, the imaging unit, and a light spot as viewed from above.
- FIG. 5 is a diagram showing a configuration example of the projector 12 of FIG. 1.
- FIGS. 6(a) to 6(p) are diagrams showing the components contained in the data appearing at each part of the image processing unit 40.
- FIG. 7 is a block diagram showing a configuration example of the image difference acquisition unit 42 of FIG. 1.
- FIG. 8 is an enlarged view of a part of the projection pattern.
- FIG. 9 is a diagram showing an example of the identification codes used in the projection pattern.
- FIG. 10 is a diagram showing an example of the arrangement of the identification codes in the projection pattern.
- FIG. 11 is a block diagram showing a configuration example of the distance information generation unit 43 of FIG. 1.
- FIGS. 12(a) to 12(c) are diagrams showing signal components when the reflectance of the subject is 100%.
- FIG. 13 is a block diagram showing a configuration example of the visible light image generation unit 46 of FIG. 1.
- FIGS. 14(a) to 14(c) are diagrams showing the arrangement of the pixels added in the sensitization processing.
- FIGS. 15(a) and 15(b) are diagrams showing examples of the output image of the display processing unit of FIG. 1.
- FIG. 16 is a block diagram showing an image generation apparatus according to Embodiment 2 of the present invention.
- FIG. 17 is a block diagram showing a configuration example of the visible light image generation unit 46b of FIG. 16.
- FIG. 1 shows the configuration of an image generation apparatus according to Embodiment 1 of the present invention.
- the illustrated image generation apparatus includes a pattern light generation unit 10, an imaging unit 20, a control unit 30, and an image processing unit 40.
- the pattern light generation unit 10 includes a drive unit 11 and a projector 12.
- the imaging unit 20 includes a lens 22 and an imaging element 24.
- FIG. 3 three-dimensionally represents the imaging space (imaging target space) JS together with the projector 12 and the imaging unit 20.
- In FIG. 3, it is assumed that a rectangular parallelepiped subject OJ1 and a spherical subject OJ2 exist in the imaging space JS.
- The image generation apparatus of the present invention projects pattern light of a near-infrared wavelength from the projector 12 toward the subjects OJ1 and OJ2, obtains the distance to each part of the imaged subjects OJ1 and OJ2 based on information obtained by imaging with the imaging unit 20, and thereby obtains image information together with distance information for each part of the image.
- the pattern light projected by the projector 12 generates a projection pattern.
- the projection pattern forms light spots arranged in a matrix, that is, in the horizontal direction (row direction) and the vertical direction (column direction).
- FIG. 4 is a diagram of the projector 12 and the imaging unit 20 and one light spot SP formed at an arbitrary point on the subjects OJ1 and OJ2 in the imaging space as viewed from above.
- the projector 12 and the imaging unit 20 are arranged apart from each other by a distance Lpc in the horizontal direction.
- a straight line connecting the projector 12 and the imaging unit 20 is referred to as a base line BL, and the distance Lpc is referred to as a base line length.
- a light spot SP is formed in one of subjects OJ1 and OJ2 in the imaging space JS by the light projected from the projector 12, and the light from the light spot SP is received by the imaging unit 20.
- If the projection angle φ from the projector 12 to the light spot SP, the incident angle θ from the light spot SP to the imaging unit 20, and the baseline length Lpc are known, the distance Dz from the baseline BL to the light spot SP on the subject OJ1 or OJ2 can be obtained by calculation based on the principle of triangulation.
- The projection angle φ is the angle formed, in the plane containing the baseline BL and the light spot SP, between a line perpendicular to the baseline BL and the line connecting the projector 12 and the light spot SP.
- The incident angle θ is the angle formed, in the plane containing the baseline BL and the light spot SP, between a line perpendicular to the baseline BL and the line connecting the imaging unit 20 and the light spot SP.
- The incident angle θ in the imaging unit 20 can be obtained based on the position on the imaging surface of the imaging element 24 where the image of the light spot SP is formed, the direction of the axis of the imaging element 24, and the angle of view.
- The projection angle φ from the projector 12 is predetermined by the configuration of the projector 12 and is therefore known.
- When a large number of light spots are projected from the projector 12 at different projection angles and these light spots are imaged by the imaging unit 20, the projection angle of each light spot can be estimated based on the mutual relationship between the positions of the light spot images on the imaging surface, provided the respective projection angles are known.
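- As an illustration (not part of the patent text), the following minimal Python sketch computes Dz and Dr for one light spot. Since Expression (3) is not reproduced in this text, the formula Dz = Lpc / (tan φ + tan θ) is an assumption based on the standard triangulation geometry described above; Dr = Dz / cos θ corresponds to Expression (4) given later.

```python
import math

def distance_from_baseline(lpc, phi, theta):
    """Distance Dz from the baseline BL to the light spot SP.

    Standard triangulation with the projector and the imaging unit separated
    by the baseline length Lpc, and both angles measured from lines
    perpendicular to the baseline, as described above.  Expression (3) itself
    is not reproduced in this text, so this formula is an assumption based on
    the stated geometry.
    """
    return lpc / (math.tan(phi) + math.tan(theta))

def distance_from_camera(dz, theta):
    """Distance Dr from the imaging unit 20 to the spot, per Expression (4)."""
    return dz / math.cos(theta)

# Example: baseline length 0.10 m, projection angle 20 deg, incident angle 15 deg
dz = distance_from_baseline(0.10, math.radians(20.0), math.radians(15.0))
dr = distance_from_camera(dz, math.radians(15.0))
print(f"Dz = {dz:.3f} m, Dr = {dr:.3f} m")
```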
- the projector 12 includes a laser light source 13, a collimating lens 14, an aperture 15, and a diffraction grating 16.
- The drive unit 11 (FIG. 1) is controlled by the control unit 30 to cause the laser light source 13 to emit light; the laser light emitted from the laser light source 13 is collimated by the collimator lens 14 and shaped to a predetermined beam diameter by the aperture 15.
- the light emitted from the aperture 15 enters the diffraction grating 16.
- the diffraction grating 16 projects pattern light for generating a predetermined projection pattern onto the imaging space JS.
- the lens 22 (FIG. 2) of the imaging unit 20 focuses the subject image on the imaging surface of the imaging element 24.
- the imaging element 24 outputs an imaging signal obtained by photoelectrically converting the incident image.
- the imaging element 24 is, for example, one in which R, G, and B pixels are arranged in a Bayer shape, and R, G, and B signals are output as imaging signals.
- Each pixel of the image sensor 24 includes a photoelectric conversion element and a color filter provided on the incident side of the photoelectric conversion element.
- the imaging unit 20 images the subjects OJ1 and OJ2 in the imaging space JS. This imaging is performed at a predetermined frame frequency (frame rate), for example, 30 fps. By imaging, images of a plurality of continuous frames are obtained, and R, G, B color component signals R0, G0, B0 are output.
- the subject in the imaging space receives projection of pattern light and is irradiated with ambient light.
- When the pattern light is projected onto the subjects OJ1 and OJ2, the imaging unit 20 outputs a signal representing an image in which the component of the pattern light reflected by the subjects (pattern light component or pattern light image) is superimposed on the component of the ambient light reflected by the subjects (ambient light component or ambient light image).
- When the pattern light is not projected, the imaging unit 20 outputs a signal representing an image consisting only of the component due to the ambient light reflected by the subjects OJ1 and OJ2 (ambient light component or ambient light image).
- The image processing unit 40 (FIG. 1) includes an A/D converter 41, an image difference acquisition unit 42, a distance information generation unit 43, an ambient light type determination unit 44, a visible light image generation unit 46, a sensitization processing unit 47, a video signal processing unit 48, and a display processing unit 49.
- the A / D converter 41 converts the output of the imaging unit 20 into, for example, digital signals R0, G0, and B0 each having 8 bits (256 gradations).
- the control unit 30 controls the pattern light generation unit 10, the imaging unit 20, and the image processing unit 40. Specifically, the control unit 30 controls the imaging mode, frame frequency, exposure time, aperture, analog gain, and the like of the imaging device 24 of the imaging unit 20. In controlling the exposure time, aperture, and analog gain, adjustment is performed so that the brightness of the captured image is constant.
- the control unit 30 also supplies a signal for controlling the operation timing to the A / D converter 41.
- the control unit 30 also controls the projection intensity of the pattern light by the pattern light generation unit 10. In controlling the projection intensity, adjustment is performed so that the difference between the signal value when the pattern light is projected and the signal value when the pattern light is not projected is constant.
- The control unit 30 also performs control for synchronizing the operation of the pattern light generation unit 10 with the operation of the imaging unit 20. That is, the control unit 30 controls the imaging unit 20 to repeat imaging at a predetermined frame frequency, and controls the pattern light generation unit 10 to alternately repeat projection and non-projection of the pattern light every other frame period. Specifically, the control unit 30 controls the drive unit 11 so that the laser light source 13 alternates between a light emitting state and a non-light emitting state every other frame period.
- The control unit 30 further supplies to the image difference acquisition unit 42 a signal Snf indicating whether the pattern light generation unit 10 is in the projection state (the laser light source 13 is in the light emitting state) or in the non-projection state (the laser light source 13 is in the non-light emitting state).
- The control unit 30 also generates information Prp indicating the ratio of the R, G, and B components in the ambient light image. For this purpose, for example, the ratio of the integrated values of the R, G, and B signals R1, G1, and B1 output from the image difference acquisition unit 42 is obtained for the entire screen or for each area constituting a part of the screen.
- the control unit 30 supplies information Prp indicating the ratio of the R, G, and B components to the video signal processing unit 48.
- the video signal processing unit 48 uses the information Prp indicating the above ratio for white balance adjustment.
- the information Prp indicating the above ratio is also supplied to the ambient light type determination unit 44, and the ambient light type determination unit 44 uses this information to determine the type of ambient light.
- The control unit 30 further holds information Sdp indicating the position, within the projection pattern, of each of the light spots included in the pattern light projected by the projector 12, information Spa indicating the correspondence between positions on the projection pattern and projection angles, information Szv indicating the axial direction and the angle of view of the imaging unit 20, and information indicating the baseline length Lpc, and supplies these to the distance information generation unit 43.
- From the imaging unit 20, an image captured while the pattern light is projected (projection image) and an image captured while the pattern light is not projected (non-projection image, that is, an image by ambient light only, or ambient light image) are obtained alternately every other frame period.
- As shown in FIGS. 6(a), (b), (d), (e), (g), and (h), the color component signals R0, G0, and B0 output from the A/D converter 41 contain not only the original color components Rr, Gg, and Bb but also components due to near-infrared light (near-infrared components) IRr, IRg, and IRb, because the color filter of each pixel is also transparent in the near-infrared region.
- Here, the "original color components" Rr, Gg, and Bb are the components that would be output from the imaging unit 20 if the color filters did not transmit near-infrared light (light of near-infrared wavelengths) or the photoelectric conversion elements had no sensitivity to near-infrared light.
- As shown in FIGS. 6(a), (d), and (g), in frames in which the projector 12 is in the projection state (on), the near-infrared components IRr, IRg, and IRb of the signals R0, G0, and B0 include both the components IRrp, IRgp, and IRbp due to the pattern light and the components IRre, IRge, and IRbe due to the ambient light.
- As shown in FIGS. 6(b), (e), and (h), in frames in which the projector 12 is in the non-projection state (off), the near-infrared components IRr, IRg, and IRb do not include the components IRrp, IRgp, and IRbp due to the pattern light, and include only the components IRre, IRge, and IRbe due to the ambient light.
- By taking the difference of the signals R0, G0, and B0 between successive frames (subtracting the signal that does not include the component due to the pattern light from the signal that does), signals representing the pattern light components IRrp, IRgp, and IRbp shown in FIGS. 6(m), (n), and (o), or the image represented by those signals (pattern light image), are obtained.
- From the frames in which the pattern light is not projected, signals representing the ambient light components R1, G1, and B1 shown in FIGS. 6(b), (e), and (h), or the image represented by those signals (ambient light image or non-projection image), can be generated.
- The role of the image difference acquisition unit 42 is thus to generate the image consisting only of the component due to the pattern light (pattern light image; FIGS. 6(m), (n), and (o)) and the images R1, G1, and B1 of the frames without the pattern light component (ambient light images; FIGS. 6(b), (e), and (h)).
- Signals R1, G1, and B1 also include near-infrared components IRre, IRge, and IRbe due to ambient light, as shown in FIGS. 6 (b), (e), and (h).
- The role of the visible light image generation unit 46 is to remove the near-infrared components IRre, IRge, and IRbe from the signals R1, G1, and B1 and to generate, as shown in FIGS. 6(c), (f), and (i), signals R2, G2, and B2 consisting only of the respective color components Rr, Gg, and Bb.
- FIG. 6 (j) shows a luminance signal Y0 obtained by combining the signals R0, G0, and B0 of FIGS. 6 (a), (d), and (g).
- the luminance signal Y0 includes an original luminance component Yy and a near infrared component IRy.
- the near infrared component IRy includes a component IRyp caused by pattern light and a component IRye caused by ambient light.
- FIG. 6 (k) shows a luminance signal Y1 obtained by synthesizing the signals R1, G1, and B1 of FIGS. 6 (b), (e), and (h).
- the luminance signal Y1 includes an original luminance component Yy and a near-infrared component IRye due to ambient light.
- FIG. 6 (l) shows a luminance signal Y2 obtained by combining the signals R2, G2, and B2 of FIGS. 6 (c), (f), and (i).
- This luminance signal Y2 includes only the original luminance component Yy.
- FIG. 6 (p) shows a signal IRyp obtained by synthesizing the signals IRrp, IRgp, and IRbp of FIGS. 6 (m), (n), and (o).
- This signal IRyp represents the intensity Sr of the imaged pattern light.
- The image difference acquisition unit 42 receives the signals R0, G0, and B0 output from the A/D converter 41 and, from the image captured while the pattern light is projected and the image captured while it is not, generates an image consisting of the pattern light component (pattern light image) and an image excluding the pattern light component (ambient light image).
- An image obtained in a frame period in which the pattern light is not projected is output as the ambient light image.
- An image obtained by subtracting the image captured (exposed) in the frame period in which the pattern light is not projected from the image captured (exposed) in the frame period in which it is projected, out of two successive frame periods, is output as the pattern light image.
- That is, the R, G, and B component signals obtained during the non-projection frame period are subtracted from the R, G, and B component signals obtained during the projection frame period, and the differences are combined to generate a signal representing a single pattern light component.
- FIG. 7 shows a configuration example of the image difference acquisition unit 42.
- the frame delay unit 421 delays the imaging signal D41 (R0, G0, B0) supplied from the A / D converter 41 via the input terminal 420 and outputs a frame delayed imaging signal D421.
- The difference calculation unit 422 calculates the difference between the imaging signal D41 and the frame-delayed imaging signal D421 (the difference obtained by subtracting the imaging signal of the frame in which the pattern light is not projected from the imaging signal of the frame in which it is projected), and generates a difference signal D422.
- The switch 423 is closed at the timing when the imaging signal D41 of a frame in which the projector 12 does not project the pattern light is supplied to the input terminal 420, and this signal is output from the output terminal 427 to the visible light image generation unit 46 as the ambient light component D423 (R1, G1, B1).
- The processing by the frame delay unit 421, the difference calculation unit 422, and the switch 423 is performed separately for each of the R, G, and B color components. That is, when the R signal R0 is input to the input terminal 420, it is separated into the pattern light component IRrp and the ambient light component (Rr + IRre); when the G signal G0 is input, it is separated into the pattern light component IRgp and the ambient light component (Gg + IRge); and when the B signal B0 is input, it is separated into the pattern light component IRbp and the ambient light component (Bb + IRbe).
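- As a concrete illustration of the frame delay, difference, and switch operations just described, the following is a minimal Python/NumPy sketch (not part of the patent; array names, sizes, and the synthetic example are assumptions):

```python
import numpy as np

def separate_components(frame_on, frame_off):
    """Split two successive Bayer frames into pattern light and ambient light parts.

    frame_on  -- frame captured while the pattern light is projected
    frame_off -- frame captured while it is not (ambient light only)

    Mirrors the frame delay unit 421, difference calculation unit 422, and
    switch 423: the frame difference is the pattern light component, and the
    non-projection frame itself is the ambient light component.
    """
    diff = frame_on.astype(np.int32) - frame_off.astype(np.int32)
    pattern = np.clip(diff, 0, None)   # IRrp / IRgp / IRbp, one value per pixel
    ambient = frame_off                # R1 / G1 / B1, one value per pixel
    return pattern, ambient

# Example with synthetic 8-bit Bayer frames (sizes are illustrative)
rng = np.random.default_rng(0)
off = rng.integers(0, 200, size=(480, 640), dtype=np.uint8)
on = np.clip(off.astype(np.int32) + 30, 0, 255).astype(np.uint8)  # +30 stands in for pattern light
pattern, ambient = separate_components(on, off)
```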
- The R, G, and B signals R0, G0, and B0 input to the image difference acquisition unit 42 are obtained from pixels in a Bayer array, so not all color components exist at every pixel; each pixel has only one of the R, G, and B components. The same applies to the ambient light component output from the switch 423 and to the pattern light component output from the difference calculation unit 422: at each pixel, only the component corresponding to that pixel's color signal (R0, G0, or B0) is present.
- The interpolation unit 424 receives the R, G, and B pattern light components IRrp, IRgp, and IRbp output from the difference calculation unit 422, and interpolates at each pixel the missing components, that is, the near-infrared components estimated to have been contained in the color signals other than the color signal of that pixel.
- the synthesis unit 425 synthesizes the three components IRrp, IRgp, and IRbp for each pixel output from the interpolation unit 424. This synthesis is performed by, for example, the following calculation, as in the case where the luminance signal is generated from the R, G, and B signals.
- IRyp = a1 × IRrp + a2 × IRgp + a3 × IRbp   (1)
- the result of synthesis by the synthesis unit 425 (near infrared component for each pixel) IRyp is supplied to the distance information generation unit 43 through the output terminal 428 as the pattern light component Sr.
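- The interpolation and synthesis steps of units 424 and 425 can be sketched as follows (illustrative, not part of the patent; the 3 × 3 neighbour averaging and the BT.601 weights for a1, a2, a3 are assumptions, since the patent does not specify them):

```python
import numpy as np

def fill_missing(plane, valid):
    """Fill pixels where `valid` is False with the mean of valid 3x3
    neighbours (a crude stand-in for the interpolation unit 424)."""
    h, w = plane.shape
    p = np.pad(np.where(valid, plane, 0.0), 1)
    m = np.pad(valid.astype(np.float64), 1)
    acc = np.zeros((h, w))
    cnt = np.zeros((h, w))
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            acc += p[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
            cnt += m[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
    filled = np.where(cnt > 0, acc / np.maximum(cnt, 1.0), 0.0)
    return np.where(valid, plane, filled)

def synthesize_iryp(irp_bayer, bayer_color, a=(0.299, 0.587, 0.114)):
    """IRyp = a1*IRrp + a2*IRgp + a3*IRbp per Equation (1), after
    interpolating each color's pattern component to every pixel.

    irp_bayer   -- pattern light component, one value per pixel (Bayer layout)
    bayer_color -- array of 0/1/2 marking each pixel as R/G/B
    a           -- weights a1, a2, a3 (BT.601 luma weights assumed here)
    """
    planes = [fill_missing(irp_bayer.astype(np.float64), bayer_color == c)
              for c in range(3)]
    return a[0] * planes[0] + a[1] * planes[1] + a[2] * planes[2]
```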
- The distance information generation unit 43 generates information representing the distance to each part of the subject corresponding to each part of the pattern light image, based on the pattern light component output from the image difference acquisition unit 42 and the information about the projection pattern separately supplied from the control unit 30.
- In the present embodiment, a projection pattern including identification codes in addition to light spots is used. First, the projection pattern will be described.
- The projection pattern (projected image) projected by the projector 12 includes light spots arranged in a matrix as shown in FIG. 3 and, associated with each light spot, a group of dots that serves as an identification code.
- FIG. 8 shows an enlarged part of the projection pattern.
- The minimum area of the projection pattern is called a dot position or a cell, and is the minimum unit that can be controlled to be on (a state where light is irradiated) or off (a state where light is not irradiated).
- cells of 480 rows in the vertical direction and 650 columns in the horizontal direction are formed in the projection range.
- a dot is constituted by a cell irradiated with light.
- Each light spot MK is formed so as to occupy an area composed of cells in the on state in two rows in the vertical direction and two columns in the horizontal direction.
- Each of the rows above and below and the columns to the left and right of this 2-row, 2-column area is composed of off-state cells (cells not irradiated with light); the 4-row, 4-column area consisting of this surrounding area and the 2-row, 2-column area is called a spot area MA.
- The identification code consists of a first part DCa, a group of four cells aligned in a row adjacent to the lower side of the spot area MA, and a second part DCb, a group of four cells aligned in a column adjacent to the right side of the spot area MA.
- the four cells of the first part DCa are indicated by reference numerals c1 to c4, respectively, and the four cells of the second part DCb are indicated by reference numerals c5 to c8, respectively.
- Each cell of the first part DCa and the second part DCb can take either the on state (irradiated state) or the off state (non-irradiated state), and by the combination of on and off states, an 8-bit identification code DC is formed.
- the identification code DC associated with each light spot MK is used for identifying the light spot MK.
- The cell adjacent to the right side of the first part DCa, that is, the cell cbr adjacent to the lower side of the second part DCb, is in the off state.
- the entire projection pattern is configured by repeating a region MB composed of cells of 5 rows and 5 columns, in which the identification code DC and the cell cbr are added to the spot region MA of 4 rows and 4 columns.
- The light spot MK is used for specifying the position of each part of the projection pattern and is composed of dots in 2 rows and 2 columns. Since it occupies a relatively large area, it appears in the image captured by the imaging unit 20 as a portion of relatively high brightness.
- the identification code DC associated with each light spot MK is used to determine which of the many light spots included in the projection pattern is the light spot MK.
- FIG. 9 shows an example of an identification code used in the projection pattern.
- In the example of FIG. 9, 56 different "values", that is, different on/off combinations of the identification code (No. 0 to No. 55), are used.
- the cell values (on or off) of the identification codes c1 to c8 of each number (No.) are indicated by “1” and “0”.
- FIG. 10 shows an example of the arrangement of identification codes in the projection pattern (an example of the arrangement of areas made up of cells of 5 rows and 5 columns including each identification code).
- Each square in FIG. 10 corresponds to a region MB made up of cells of 5 rows and 5 columns.
- the numbers in each square indicate the identification code numbers (No.) in FIG.
- The same identification code is arranged in the vertical direction, while in the horizontal direction, from left to right, the identification codes No. 0 to No. 55 are arranged in order. To the right of No. 55, No. 0 is arranged again, and the same arrangement repeats (periodic arrangement).
- The arrangement of on-state and off-state cells is point-symmetric with respect to the center of the projection pattern, that is, with respect to the center of the light spot MK in the area MB that contains identification code No. 28 and is located at the vertical midpoint of the pattern.
- Between the identification codes associated with light spots that are adjacent in the horizontal direction, there is always exactly one place where the on/off state changes (a place where an on-state cell becomes off, or an off-state cell becomes on).
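- This one-change property can be checked mechanically. The following sketch (illustrative; the actual 56-entry table of FIG. 9 is not reproduced here) verifies that a cyclic code sequence changes in exactly one cell between neighbours:

```python
def changes(code_a, code_b):
    """Number of cell positions (c1..c8) whose on/off state differs."""
    return bin((code_a ^ code_b) & 0xFF).count("1")

def one_change_between_neighbours(codes, cyclic=True):
    """True if every pair of horizontally adjacent identification codes
    (including the wrap from the last code back to No. 0, when cyclic)
    differs in exactly one cell."""
    n = len(codes)
    pairs = range(n) if cyclic else range(n - 1)
    return all(changes(codes[i], codes[(i + 1) % n]) == 1 for i in pairs)

# Demonstration on a reflected Gray code cycle (an illustrative stand-in;
# the table of FIG. 9 would be checked the same way).
cycle = [i ^ (i >> 1) for i in range(16)]
print(one_change_between_neighbours(cycle))  # True
```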
- The projection pattern formed when the pattern light is projected onto a plane that is not perpendicular to the optical axis of the projector 12 is a quadrilateral other than a rectangle, the rows and columns of light spots are not parallel to each other, and their intervals are not uniform. In the projection pattern formed when the pattern light is projected onto a curved surface, the rows and columns of light spots are not straight. If the surface onto which the pattern light is projected has unevenness, steps, and the like, the relationship among the projection angles of the light spots (for example, their order from smallest) and the relationship among their incident angles (for example, their order from smallest) may not match, and "swapping" may occur.
- The 8-bit identification code by itself does not carry enough information to identify the column of a light spot among all the columns. However, as long as any deviation of a light spot from its original position (order) stays within a limited range, for example within the 56 positions spanned by the areas MB of 5 rows and 5 columns in the example shown in FIG. 10, the original position of the light spot can be identified from the "value" of its identification code, and by identifying the original position, the column of the light spot to which the identification code is attached can be identified.
- said "replacement" arises because the light projector 12 and the imaging part 20 are arrange
- an identification code may be set so that the vertical order can be identified in consideration of the possibility that the projector 12 and the imaging unit 20 are arranged at different positions in the vertical direction.
- FIG. 11 shows a configuration example of the distance information generation unit 43.
- The distance information generation unit 43 shown in FIG. 11 includes a binarization unit 431, a spot area extraction unit 432, an identification code reading unit 433, a storage unit 434, a projection angle estimation unit 436, an incident angle calculation unit 437, and a distance calculation unit 438.
- the binarization unit 431 binarizes the pattern light component output from the image difference acquisition unit 42 and outputs a binary pattern light image.
- the spot area extraction unit 432 extracts a spot area MA (area of 4 rows and 4 columns in FIG. 8) centered on each light spot from the pattern light image.
- Specifically, groups of 4 × 4 cells are searched for in which four dots in 2 rows and 2 columns, arranged at regular intervals, are located at the center, and the surroundings (one row above and below and one column to the left and right) are composed of off-state cells.
- In the projection pattern, the groups of four dots in 2 rows and 2 columns at the centers are regularly arranged at equal intervals, so the same arrangement should, as a rule, also be obtained in the captured image.
- However, in the captured image the intervals are not always equal, because of curvature, unevenness, steps, and the like of the subject surface; therefore, the spot areas MA are extracted by pattern matching based on similarity.
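- A minimal sketch of this spot-area search on the binarized pattern light image follows (illustrative; the 4 × 4 template reflects FIG. 8, while the similarity score and threshold are assumptions, since the patent does not specify the matching criterion):

```python
import numpy as np

# 4x4 spot-area template per FIG. 8: a 2x2 block of on-cells surrounded by off-cells
TEMPLATE = np.array([[0, 0, 0, 0],
                     [0, 1, 1, 0],
                     [0, 1, 1, 0],
                     [0, 0, 0, 0]], dtype=np.float64)

def find_spot_areas(binary_img, min_score=0.9):
    """Return (row, col, score) of 4x4 windows resembling the spot area MA.

    binary_img -- binarized pattern light image (0/1), as output by unit 431
    min_score  -- fraction of the 16 cells that must agree (assumed threshold)
    """
    h, w = binary_img.shape
    hits = []
    for y in range(h - 3):
        for x in range(w - 3):
            score = np.mean(binary_img[y:y + 4, x:x + 4] == TEMPLATE)
            if score >= min_score:
                hits.append((y, x, score))
    return hits
```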
- the identification code reading unit 433 reads the identification code DC from the identification code area adjacent to the extracted spot area MA.
- The projection angle estimation unit 436 receives the identification code reading result from the identification code reading unit 433, and further receives from the control unit 30 the data Sdp indicating the contents of the table of FIG. 9 (representing the relationship between each identification code and its position in the projection pattern) and the information Spa indicating the correspondence between positions on the projection pattern and projection angles, and estimates the projection angle φ of each light spot based on these. Once the data Sdp and the information Spa have been provided from the control unit 30, the projection angle estimation unit 436 may hold them in a memory (not shown).
- The projection angle estimation unit 436 estimates the projection angle as follows. It determines which of the identification codes No. 0 to No. 55 in the table of FIG. 9 the value of the read identification code DC corresponds to (that is, to which horizontal position in the pattern the light spot belongs), and, based on this determination result and on the vertical position of the light spot, specifies the position of the light spot in the projection pattern in both the horizontal and vertical directions.
- The projection angle φ is then obtained from the specified position, based on the information Spa (given from the control unit 30) indicating the relationship between positions and projection angles.
- The incident angle calculation unit 437 calculates the incident angle θ of each light spot from the position of the light spot image on the imaging surface, based on the axial direction and the angle of view of the imaging unit. The information Szv representing the axial direction and the angle of view is supplied from the control unit 30.
- The distance calculation unit 438 calculates the distance from the baseline BL to the surface of the subject on which the light spot is projected, based on the projection angle φ estimated by the projection angle estimation unit 436, the incident angle θ calculated by the incident angle calculation unit 437, and the baseline length Lpc supplied from the control unit 30.
- The distance Dr from the imaging unit 20 to the subject surface (light spot SP) on which the light spot is formed is obtained from the distance Dz to the baseline BL given by Expression (3) and the incident angle θ as
- Dr = Dz / cos θ   (4)
- the distance information Dr obtained by the distance information generation unit 43 is supplied to the display processing unit 49.
- the ambient light type determination unit 44 determines the type of ambient light from the information Prp output from the control unit 30 and indicating the ratio of the R, G, and B components. Ambient light is classified into natural light such as sunlight and artificial illumination light, and artificial light is further classified according to the type of light source. For example, the ambient light type determination unit 44 compares the ratio Prp of the R, G, and B components with one or more predetermined determination reference values, and determines the type of ambient light based on the comparison result. . The type determination result Ltp in the ambient light type determination unit 44 is transmitted to the visible light image generation unit 46.
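- A minimal sketch of such a comparison follows (illustrative; the light types and threshold values are assumptions, since the patent specifies only a comparison of the ratio Prp with one or more predetermined reference values):

```python
def classify_ambient_light(r, g, b):
    """Determine the ambient light type Ltp from integrated R, G, B values.

    The categories and threshold values below are illustrative assumptions;
    the patent states only that the ratio Prp is compared with one or more
    predetermined reference values.
    """
    total = r + g + b
    if total <= 0:
        return "unknown"
    pr, pb = r / total, b / total
    if pr > 0.45:      # red-rich spectrum, e.g. incandescent-like sources
        return "incandescent"
    if pb > 0.40:      # blue-rich spectrum, e.g. overcast daylight
        return "overcast daylight"
    return "sunlight"
```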
- The visible light image generation unit 46 calculates, based on the pixel values R1, G1, and B1 from the image difference acquisition unit 42 and the type determination result Ltp from the ambient light type determination unit 44, the pixel values obtained by removing the near-infrared components IRre, IRge, and IRbe from the pixel values R1, G1, and B1, that is, the R, G, and B component values (pixel values) R2, G2, and B2 in the visible region.
- FIGS. 12(a), (b), and (c) are the same as FIGS. 6(e), (k), and (p), respectively, except that the value of each component when the reflectance of the subject is 100% is indicated by a dotted line.
- The values when the reflectance M is 100% are denoted G1(100), Gg(100), IRge(100), Y1(100), Yy(100), and IRyp(100), with "(100)" appended.
- From the information Ltp indicating the type supplied from the ambient light type determination unit 44, the visible light image generation unit 46 estimates the composition ratio Ey, that is, the ratio of the near-infrared component contained in the G component of the ambient light image (R1, G1, B1) to the luminance value, under the assumption that the reflectance of the subject for near-infrared light and its reflectance for visible light (here, in particular, the G component) have the same value Mo.
- Since this relationship holds regardless of the value of the reflectance Mo, the following description assumes that the reflectance Mo is 100% in order to simplify the explanation.
- The visible light image generation unit 46 multiplies the composition ratio Ey by the luminance value Y1(M) to obtain the near-infrared component IRge(M) contained in the pixel value G1(M), and subtracts it from G1(M) to obtain the visible component G2(M) = Gg(M). Similarly, the visible components of the R and B pixel values are obtained so that
- R2(M) = Rr(M)
- B2(M) = Bb(M)
- For the R and B components, in place of the composition ratio Ey itself, the values (α × Ey, β × Ey) obtained by multiplying the composition ratio Ey by parameters α and β are used, to take into account the difference in the proportion of the near-infrared component between the R, G, and B signals.
- FIG. 13 shows a configuration example of the visible light image generation unit 46 of FIG.
- the visible light image generation unit 46 shown in the figure includes a composition ratio information generation unit 461, a luminance value calculation unit 463, a parameter generation unit 464, and a visible light component calculation unit 466.
- The composition ratio information generation unit 461 generates, from the type determination result Ltp of the ambient light type determination unit 44, information (composition ratio information) indicating the composition ratio Ey of the near-infrared component contained in the pixel values of the ambient light image.
- the configuration ratio information generation unit 461 includes a configuration ratio memory 462 that holds information indicating the configuration ratio of the near-infrared component included in the ambient light for each type of ambient light.
- The configuration ratio memory 462 is configured, for example, as a lookup table (LUT): when information indicating the type of ambient light is input as an address, information indicating the composition ratio of the near-infrared component is read out. "The composition ratio of the near-infrared component included in the ambient light" means the composition ratio of the near-infrared component in the output value of the imaging unit 20 under that ambient light, which depends on the spectral transmission characteristics of the color filters of the imaging unit 20 and the spectral sensitivity characteristics of the photoelectric conversion elements.
- Assuming that the reflectance of the subject has the same value Mo for both visible light and near-infrared light, the case where the reflectance Mo is 100% will be described with reference to FIGS. 12(a) to 12(c).
- In this case, the composition ratio Ey is the ratio of the near-infrared component IRge(100) contained in the G component G1(100) of the ambient light image shown in FIG. 12(a) to the luminance value Y1(100) shown in FIG. 12(b).
- the ambient light type determination unit 44 receives the information Prp indicating the ratio (ratio over the entire screen) of the R, G, and B components included in the control information, and based on this information, determines the ambient light type Ltp.
- the component ratio information generation unit 461 reads the information indicating the component ratio Ey corresponding to the type Ltp (stored in association with the type) with reference to the component ratio memory 462 and outputs the information.
- the information indicating the composition ratio Ey is obtained only once for the entire screen, for example, and is used as a common value for all pixels or regions in the screen.
- the luminance value calculation unit 463 calculates the luminance value Y1 (M) for the pixel based on the pixel values R1, G1, and B1 output from the imaging unit 20.
- the calculation of the luminance value can be performed by, for example, the following formula.
- Y1(M) = ar × R1(M) + ag × G1(M) + ab × B1(M)   (5)
- the type determination result Ltp in the ambient light type determination unit 44 is also transmitted to the parameter generation unit 464.
- The parameter generation unit 464 includes a parameter memory 465 that holds the parameters α and β for each type of ambient light.
- According to the type determination result Ltp of the ambient light type determination unit 44, the parameter generation unit 464 reads the values of the parameters α and β corresponding to that type from the parameter memory 465 and outputs them.
- The parameter α represents the ratio (IRre(100)/IRge(100)) of the near-infrared component value IRre(100) contained in the R component pixel value R1(100) of the ambient light image to the near-infrared component value IRge(100) contained in the G component pixel value G1(100), when the subject is irradiated with the corresponding type of ambient light and the reflectance of the subject is a certain value Mo, for example 100%.
- Similarly, the parameter β is the ratio (IRbe(100)/IRge(100)) of the near-infrared component value IRbe(100) contained in the B component pixel value B1(100) of the ambient light image to the near-infrared component value IRge(100) contained in the G component pixel value G1(100) under the same conditions.
- The values of α and β depend on the characteristics of the color filters in the imaging unit and the characteristics of the photoelectric conversion elements, and are obtained in advance by experiment.
- The visible light component calculation unit 466 multiplies the luminance value Y1 from the luminance value calculation unit 463 by the composition ratio Ey from the composition ratio information generation unit 461 to obtain the near-infrared component IRge contained in the G signal of the ambient light image, and multiplies Y1 by α × Ey and β × Ey to obtain the near-infrared components IRre and IRbe contained in the R signal and the B signal.
- The visible light component calculation unit 466 then subtracts the near-infrared components IRre, IRge, and IRbe thus obtained from the R signal, the G signal, and the B signal of the ambient light image, thereby calculating the visible light components (values obtained by removing the near-infrared components) R2, G2, and B2.
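- Combining the operations of units 461 to 466, the per-pixel near-infrared removal of Embodiment 1 can be sketched as follows (illustrative; the values of Ey, α, β and the BT.601 luminance weights are assumptions):

```python
def remove_nir_embodiment1(r1, g1, b1, ey, alpha, beta,
                           ar=0.299, ag=0.587, ab=0.114):
    """Visible components R2, G2, B2 from ambient light image values.

    ey          -- composition ratio (NIR component of G relative to Y1)
                   for the determined light type
    alpha, beta -- per-type parameters (IRre/IRge and IRbe/IRge)
    ar, ag, ab  -- luminance weights of Equation (5) (BT.601 values assumed)
    """
    y1 = ar * r1 + ag * g1 + ab * b1        # Equation (5)
    irge = ey * y1                          # NIR contained in the G signal
    irre = alpha * ey * y1                  # NIR contained in the R signal
    irbe = beta * ey * y1                   # NIR contained in the B signal
    return r1 - irre, g1 - irge, b1 - irbe  # R2, G2, B2

# Example with illustrative constants for one hypothetical light type
r2, g2, b2 = remove_nir_embodiment1(120.0, 140.0, 100.0,
                                    ey=0.15, alpha=1.1, beta=0.9)
```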
- The sensitization processing unit 47 performs sensitization (signal multiplication) processing on the visible light image generated by the visible light image generation unit 46, and outputs a sensitized visible light image. This sensitization processing is performed by adding, with weights, the pixel values of surrounding pixels to the pixel values R2, G2, and B2 output from the visible light image generation unit 46.
- The pixel values R2, G2, and B2 output from the visible light image generation unit 46 do not include all color components at every pixel; each pixel has the value of one color component according to its position in the Bayer array. The sensitization processing unit 47 adds, to each pixel of interest (target pixel), the pixel values of surrounding pixels of the same color (pixels having the same color component), and outputs the sensitized (multiplied) pixel values as signals R3, G3, and B3.
- FIGS. 14A to 14C show pixels added to the target pixel.
- the smallest square represents a pixel.
- When the target pixel is the R pixel RR34 as shown in FIG. 14(a), eight pixels are added to it: the pixel RR12 two rows above and two columns to the left, the pixel RR32 two rows above in the same column, the pixel RR52 two rows above and two columns to the right, the pixel RR14 two columns to the left in the same row, the pixel RR54 two columns to the right in the same row, the pixel RR16 two rows below and two columns to the left, the pixel RR36 two rows below in the same column, and the pixel RR56 two rows below and two columns to the right:
- NRR34 = RR12 + RR32 + RR52 + RR14 + RR34 + RR54 + RR16 + RR36 + RR56   (7)
- a value NRR34 obtained as a result of such pixel addition is output as a multiplied R component value R3.
- When the target pixel is the G pixel GB33 as shown in FIG. 14(b), eight pixels are added to it: the pixel GB31 two rows above in the same column, the pixel GR22 one row above and one column to the left, the pixel GR42 one row above and one column to the right, the pixel GB13 two columns to the left in the same row, the pixel GB53 two columns to the right in the same row, the pixel GR24 one row below and one column to the left, the pixel GR44 one row below and one column to the right, and the pixel GB35 two rows below in the same column:
- NGB33 = GB31 + GR22 + GR42 + GB13 + GB33 + GB53 + GR24 + GR44 + GB35   (8)
- the value NGB33 obtained as a result of such pixel addition is output as a multiplied G component value G3.
- When the target pixel is the B pixel BB43 as shown in FIG. 14(c), eight peripheral pixels are added to it: the pixel BB21 two rows above and two columns to the left, the pixel BB41 two rows above in the same column, the pixel BB61 two rows above and two columns to the right, the pixel BB23 two columns to the left in the same row, the pixel BB63 two columns to the right in the same row, the pixel BB25 two rows below and two columns to the left, the pixel BB45 two rows below in the same column, and the pixel BB65 two rows below and two columns to the right:
- NBB43 = BB21 + BB41 + BB61 + BB23 + BB43 + BB63 + BB25 + BB45 + BB65   (9)
- a value NBB43 obtained as a result of such pixel addition is output as a multiplied B component value B3.
- The above addition process mixes peripheral pixels in the same frame into the target pixel. Since peripheral pixels generally have pixel values close to that of the target pixel, this has the effect of multiplying the signal component. For example, when the pixel values of the surrounding eight pixels are added to each target pixel as described above, and assuming the surrounding pixels have the same pixel value as the target pixel, the addition result is 9 times the original pixel value. However, as a result of adding (mixing) peripheral pixels, the resolution (still resolution) is lowered.
- the weight of addition may be changed according to the peripheral pixel value, instead of adding the peripheral pixels evenly when adding the pixel values.
- a weighted addition may be performed by increasing the weight for the pixel value of a pixel having a strong correlation with the target pixel.
- Alternatively, the pixel values of the target pixel and the surrounding pixels may be compared, and the addition weight increased only for those pixels whose difference in pixel value from the target pixel is equal to or less than a predetermined value, with the addition weight decreased for the other pixels.
- Instead of peripheral pixels in the same frame, pixels at the same position as the target pixel in different frames, that is, frames before and after the frame containing the target pixel, may be added.
- the preceding and following frames are not limited to the immediately preceding frame and the immediately following frame, but may be the immediately preceding predetermined number of frames and the immediately following predetermined number of frames. If pixels at the same position in different frames are added, the signal component can be enhanced while avoiding a decrease in still resolution, which is particularly effective when there is little movement of the image. However, the motion blur increases in the case of an image with intense motion.
- Both the peripheral pixels in the same frame and the pixels at the same position in different frames may be added to the target pixel, and furthermore, the pixels around the same-position pixels in different frames may also be added. By doing so, the multiplication factor of the signal component can be further increased.
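- The same-color pixel addition of Equations (7) to (9) can be sketched for a whole RGGB Bayer image as follows (illustrative; the RGGB phase and uniform weights are assumptions, and the weighted or inter-frame variants described above would modify the accumulation accordingly):

```python
import numpy as np

def sensitize_same_color(bayer):
    """Add to each pixel its eight nearest same-color neighbours, per
    Equations (7) to (9), for an RGGB Bayer array (phase assumed).

    For R and B pixels the same-color neighbours lie at offsets {-2, 0, +2}
    in both directions (Equations (7) and (9)); for G pixels the set of
    Equation (8) is the same-color pixels of the 5x5 neighbourhood
    excluding its four corners.
    """
    h, w = bayer.shape
    img = bayer.astype(np.int64)
    yy, xx = np.mgrid[0:h, 0:w]
    color = np.where((yy % 2 == 0) & (xx % 2 == 0), 0,           # R
                     np.where((yy % 2 == 1) & (xx % 2 == 1), 2,  # B
                              1))                                # G
    pad_img = np.pad(img, 2)
    pad_col = np.pad(color, 2, constant_values=-1)
    out = np.zeros_like(img)
    for dy in range(-2, 3):
        for dx in range(-2, 3):
            shifted = pad_img[2 + dy:2 + dy + h, 2 + dx:2 + dx + w]
            same = pad_col[2 + dy:2 + dy + h, 2 + dx:2 + dx + w] == color
            if abs(dy) == 2 and abs(dx) == 2:
                same &= (color != 1)  # Equation (8) omits the 5x5 corners for G
            out += np.where(same, shifted, 0)
    return out  # sensitized values, i.e. R3 / G3 / B3 per pixel position
```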
- the video signal processing unit 48 performs color interpolation processing (interpolation of missing color components at each pixel position), tone correction processing, on the sensitized visible light image output from the sensitizing processing unit 47, Noise reduction processing, contour correction processing, white balance adjustment processing, signal amplitude adjustment processing, color correction processing, and the like are added, and images obtained as a result of these processing are output as corrected visible light images R4, G4, and B4.
- the display processing unit 49 performs processing for displaying the corrected visible light image output from the video signal processing unit 48 in association with the distance information generated by the distance information generating unit 43.
- FIGS. 15A and 15B show examples of output images of the display processing unit 49.
- FIG. 15A shows a visible light image (non-projection image)
- FIG. 15B shows an image with distance information.
- As the image with distance information, an image in which luminance or color is assigned according to the distance is displayed; for example, an image in which the visible light image is expressed by luminance and the distance is expressed by color.
- an object existing in the imaging space is recognized, and an image in which character information indicating the distance of the object is superimposed and displayed on the visible light image is output.
- the visible light image of FIG. 15 (a) may be displayed on one side, and the image with distance information shown in FIG. 15 (b) may be displayed on the other side.
- The visible light image shown in FIG. 15(a) and the image with distance information shown in FIG. 15(b) may be displayed alternately on the screen, or the one selected by the user's operation may be displayed. In this case, it is desirable that the image with distance information be displayed in synchronization with the visible light image, with the same angle of view and the same number of pixels.
- The image associated with the distance information (the signal representing it) is output to a display device (not shown).
- FIG. 16 shows the configuration of an image generation apparatus according to Embodiment 2 of the present invention.
- the illustrated image generation apparatus is generally the same as the configuration shown in FIG. 1, but differs in the following points. That is, a visible light image generation unit 46b is provided instead of the visible light image generation unit 46 of FIG.
- In Embodiment 1, the visible light image generation unit 46 obtains the composition ratio Ey of the near-infrared component contained in the G component of the ambient light image with respect to the luminance value, and calculates the pixel values R2, G2, and B2 consisting only of visible light components based on the composition ratio Ey and the parameters α and β stored in advance.
- In contrast, the visible light image generation unit 46b of Embodiment 2 calculates the pixel value R2 using the composition ratio (first composition ratio) Er of the near-infrared component IRre contained in the value of the R signal of the ambient light image, calculates the pixel value G2 using the composition ratio (second composition ratio) Eg of the near-infrared component IRge contained in the value of the G signal, and calculates the pixel value B2 using the composition ratio (third composition ratio) Eb of the near-infrared component IRbe contained in the value of the B signal.
- FIG. 17 shows the visible light image generation unit 46b used in the second embodiment.
- the visible light image generation unit 46b in FIG. 17 includes a configuration ratio information generation unit 461b, a reflectance calculation unit 467, an intensity ratio calculation unit 468, and a visible light component calculation unit 466b.
- the composition ratio information generation unit 461b includes a composition ratio memory 462b.
- The configuration ratio memory 462b stores, for each type of ambient light, information (configuration ratio information) indicating the configuration ratios Er, Eg, and Eb of the near-infrared components IRre, IRge, and IRbe contained in the pixel values R1, G1, and B1 of the R, G, and B components of the ambient light image.
- The composition ratios Er, Eg, and Eb are, for a reflectance of 100%, expressed as Er = IRre(100)/R1(100), Eg = IRge(100)/G1(100), and Eb = IRbe(100)/B1(100), respectively.
- the configuration ratio information generation unit 461b reads information indicating the configuration ratios Er, Eg, and Eb corresponding to the type Ltp determined by the ambient light type determination unit 44 from the configuration ratio memory 462b and outputs the information.
- The reflectance calculation unit 467 obtains the reflectance of the subject for the near-infrared component, for each region on the image, based on the value of the pattern light component output from the image difference acquisition unit 42 (which represents the intensity of the portion of the pattern light reflected from the subject and incident on the imaging unit 20), information indicating the pattern light projection intensity St supplied from the control unit 30, and the distance information Dr of the subject calculated by the distance information generation unit 43, and outputs information (reflectance information) M indicating the reflectance.
- This reflectance is obtained, for example, as follows.
- The control unit 30 controls the projection intensity of the projector 12, and for this purpose generates information indicating the projection intensity.
- The information indicating the projection intensity St used for this control is supplied to the visible light image generation unit 46b.
- Since the dots constituting the pattern are formed by a single laser beam collimated by the collimator lens 14,
- the size of each dot itself does not change regardless of the subject distance. Accordingly, it can be assumed that there is no attenuation over the propagation from the projector 12 to the subject. There is also some attenuation due to scattering by particles in the air, but it is negligible compared with the attenuation given by Equation (10) described later.
- Let Sr be the intensity of the light actually reaching the imaging unit 20.
- The actual reflected light intensity Sr is obtained from the difference in brightness of the captured images during pattern light projection and non-projection. That is, the magnitude of the pattern light component output from the image difference acquisition unit 42, i.e., IRyp(M) in FIG. 12(c), represents the intensity Sr of the reflected light, while Sr0 corresponds to IRyp(100). Since the intensity of the reflected light is obtained for each spot in each frame, the reflectance M is obtained for each region of the image corresponding to each spot.
- The image may be divided into a plurality of regions, and the average value (simple average or weighted average) Ma of the reflectances of the spots included in each region may be used as the reflectance M for all the spots in that region. Further, instead of obtaining the reflectance for each region, the reflectance may be obtained for each pixel. For example, for each pixel, the average value (simple average or weighted average) Mp of the reflectances of the spots included in a region centered on that pixel may be used as the reflectance M for that pixel.
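- As an illustration only (not part of the patent text), the following Python sketch shows one way such regional averaging could be implemented; the array layout and helper names are assumptions:

```python
import numpy as np

def region_reflectance(spot_xy, spot_m, image_shape, grid=(8, 8)):
    """Average per-spot reflectance estimates M into a coarse region grid.

    spot_xy     : (N, 2) array of spot pixel coordinates (x, y)
    spot_m      : (N,) array of per-spot reflectance estimates
    image_shape : (height, width) of the captured image
    Returns a grid of simple averages; NaN where a region has no spot.
    """
    h, w = image_shape
    rows, cols = grid
    acc = np.zeros(grid)
    cnt = np.zeros(grid)
    for (x, y), m in zip(spot_xy, spot_m):
        r = min(int(y) * rows // h, rows - 1)
        c = min(int(x) * cols // w, cols - 1)
        acc[r, c] += m
        cnt[r, c] += 1
    return np.where(cnt > 0, acc / np.maximum(cnt, 1), np.nan)
```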
- The intensity ratio calculation unit 468 calculates and outputs the intensity ratios Psr, Psg, and Psb for R, G, and B by multiplying the composition ratios Er, Eg, and Eb by the reflectance M.
- The intensity ratios Psr, Psg, and Psb are the ratios of the near-infrared components included in the R, G, and B pixel values (pixel values output from the imaging unit 20) in each region of the image, i.e., IRre(M)/R1(100), IRge(M)/G1(100), and IRbe(M)/B1(100).
- The visible light component calculation unit 466b multiplies the pixel values R1, G1, and B1 of the ambient light image by the intensity ratios Psr, Psg, and Psb to obtain the near-infrared components included in the R, G, and B signals of the ambient light:
  IRre = R1 × Psr
  IRge = G1 × Psg
  IRbe = B1 × Psb
- The pixel values having only visible light components (values obtained by removing the near-infrared components) are then calculated by subtracting these from the pixel values R1, G1, and B1.
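- Putting the steps of units 467, 468, and 466b together, a minimal Python sketch of this Embodiment 2 computation (illustrative only; array inputs and function names are assumptions, not the patent's implementation) could look like this:

```python
import numpy as np

def remove_nir_embodiment2(R1, G1, B1, Er, Eg, Eb, M):
    """Sketch of the Embodiment 2 pipeline: intensity ratios, then Eq. (12).

    R1, G1, B1 : ambient light image planes as float arrays
    Er, Eg, Eb : NIR composition ratios for the determined ambient light type
    M          : reflectance map broadcastable to the image shape
    """
    Psr, Psg, Psb = Er * M, Eg * M, Eb * M    # intensity ratios (unit 468)
    R2 = R1 - R1 * Psr                        # subtract IRre = R1 * Psr
    G2 = G1 - G1 * Psg                        # subtract IRge = G1 * Psg
    B2 = B1 - B1 * Psb                        # subtract IRbe = B1 * Psb
    return np.clip(R2, 0, None), np.clip(G2, 0, None), np.clip(B2, 0, None)
```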
- The pixel values R2, G2, and B2 of the visible light image obtained as described above are supplied to the sensitization processing unit 47. Except for the above, Embodiment 2 is the same as Embodiment 1.
- In the embodiments described above, the ratio Prp of the R, G, and B components is obtained based on the signals R1, G1, and B1 output from the image difference acquisition unit 42, but the present invention is not limited to this.
- The ratio Prp may instead be obtained based on the signals R2, G2, and B2 output from the visible light image generation unit 46, based on the signals R3, G3, and B3 output from the sensitization processing unit 47, based on the signals R4, G4, and B4 output from the video signal processing unit 48, or based on the signals R0, G0, and B0 output from the A/D converter 41.
- In the embodiments described above, a laser is used as the light source of the projector 12.
- If another light source such as an LED is used instead, the same operation is performed and similar effects are obtained, provided the incident-light requirements of the diffraction grating are satisfied.
- In the embodiments described above, the projector 12 projects the pattern formed by the diffraction grating 16 from the laser light source 13; however, a configuration in which the laser beam is scanned two-dimensionally at high speed (within one frame period)
- so as to project the pattern over the entire field of view operates in the same manner and provides the same effects.
- In the embodiments described above, the projector 12 projects the pattern light every other frame period, but the present invention is not limited to this; the pattern light may be projected every two or more frame periods, in short, every predetermined number of frame periods.
- In that case, the image difference acquisition unit 42 may generate the pattern light image from the difference between each projection image and the non-projection image obtained in the frame period immediately before or immediately after the frame period in which the projection image was obtained.
- Alternatively, the pattern light image may be generated from the difference between each projection image and the average of the non-projection images obtained in a plurality of frame periods before and after the frame period in which the projection image was obtained.
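- As a rough illustration of these differencing options, here is a short Python sketch (frame handling and names are assumptions, not the patent's implementation); passing a single non-projection frame gives the basic case, several give the averaged case:

```python
import numpy as np

def pattern_light_image(proj_frame, nonproj_frames):
    """Estimate the pattern-light-only image by frame differencing.

    proj_frame     : frame (numpy array) captured while the pattern is projected
    nonproj_frames : list of one neighboring non-projection frame, or several
                     to be averaged when the scene is static
    """
    ambient = np.mean(np.stack(nonproj_frames), axis=0)
    return np.clip(proj_frame.astype(float) - ambient, 0.0, None)
```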
- As described above, the present invention provides an image generation apparatus that can acquire distance information to an object present in the imaging space in association with the captured image.
- Because it can simultaneously acquire an image of an intruder and the distance to the intruder, the image generation apparatus of the present invention can be applied, for example, to intrusion monitoring in surveillance applications.
- The image generation apparatus of the present invention can also be applied to driving assistance based on detection of obstacles in front of or behind a vehicle, for example, parking assistance.
- By correcting the pixel values in accordance with the composition ratio of the near-infrared light included in the ambient light and the near-infrared reflectance of the subject, it is possible to remove the near-infrared components and obtain an image with high color reproducibility even when an imaging unit having sensitivity to near-infrared light is used.
- Each part of the image generation apparatus, or part of the steps constituting the image generation method, can be realized by software, that is, by a programmed computer.
Description
At this time, in order to capture the pattern light with a camera, it is necessary to use a camera that also has sensitivity at the wavelength of the pattern light. However, when imaging is performed with a camera having sensitivity to wavelengths outside the visible range, even images captured without pattern light projection are affected by the non-visible components included in the ambient light, so the color reproducibility of the captured image deteriorates and the subject can no longer be identified correctly.
An image generation apparatus according to the present invention comprises:
a projector that projects pattern light of a near-infrared wavelength into an imaging space every predetermined number of frame periods;
an imaging unit that images a subject in the imaging space, which is illuminated by ambient light and onto which the pattern light is projected every predetermined number of frame periods, and outputs an imaging signal including an R signal, a G signal, and a B signal respectively representing the R, G, and B components of a captured image;
a control unit that gives the projector an instruction concerning the projection intensity of the pattern light;
an image difference acquisition unit that generates a pattern light image by obtaining the difference between the imaging signal obtained when the pattern light is projected and the imaging signal obtained when the pattern light is not projected, among the imaging signals obtained by imaging by the imaging unit, and generates an ambient light image from the imaging signal obtained when the pattern light is not projected;
an ambient light type determination unit that determines the type of the ambient light; and
a visible light image generation unit that generates, from the type of the ambient light determined by the ambient light type determination unit, composition ratio information indicating the composition ratio of near-infrared wavelength light included in the ambient light, estimates, based on the generated composition ratio information, the component due to the near-infrared wavelength light included in the ambient light in the ambient light image, and generates a visible light image by subtracting the component due to the near-infrared wavelength light from the ambient light image;
wherein the ambient light type determination unit determines the type of the ambient light from the ratio of the R, G, and B components in the captured image, the ambient light image, or the visible light image.
FIG. 1 shows the configuration of an image generation apparatus according to Embodiment 1 of the present invention. The illustrated image generation apparatus includes a pattern light generation unit 10, an imaging unit 20, a control unit 30, and an image processing unit 40.
The pattern light generation unit 10 includes a drive unit 11 and a projector 12.
As shown in FIG. 2, the imaging unit 20 includes a lens 22 and an imaging element 24.
As shown in FIG. 4, the incidence angle θ is the angle formed, in the plane containing the baseline BL and the light spot SP, between a line perpendicular to the baseline BL and the line connecting the imaging unit 20 and the light spot SP.
The projection angle φ from the projector 12 is determined in advance by the configuration of the projector 12 and is therefore known.
The drive unit 11 (FIG. 1) is controlled by the control unit 30 to cause the laser light source 13 to emit light; the laser light emitted from the laser light source 13 is collimated by the collimator lens 14 and given a predetermined beam diameter by the aperture 15.
The light leaving the aperture 15 enters the diffraction grating 16. The diffraction grating 16 projects pattern light for generating a predetermined projection pattern into the imaging space JS.
The imaging element 24 outputs an imaging signal obtained by photoelectric conversion of the incident image. The imaging element 24 has, for example, R, G, and B pixels arranged in a Bayer pattern, and outputs R, G, and B signals as the imaging signal. Each pixel of the imaging element 24 consists of a photoelectric conversion element and a color filter provided on the incidence side of the photoelectric conversion element.
When the pattern light is projected onto the subjects OJ1 and OJ2, the imaging unit 20 outputs (a signal representing) an image in which a component due to the pattern light reflected by the subjects OJ1 and OJ2 (pattern light component or pattern light image) is superimposed on a component due to the ambient light reflected by the subjects OJ1 and OJ2 (ambient light component or ambient light image).
When the pattern light is not projected onto the subjects OJ1 and OJ2, the imaging unit 20 outputs (a signal representing) an image consisting only of the component due to the ambient light reflected by the subjects OJ1 and OJ2 (ambient light component or ambient light image).
Specifically, the control unit 30 controls the imaging mode, frame frequency, exposure time, aperture, analog gain, and the like of the imaging element 24 of the imaging unit 20. In controlling the exposure time, aperture, and analog gain, adjustments are made so that the brightness of the captured image remains constant.
The control unit 30 also supplies the A/D converter 41 with a signal for controlling its operation timing.
In controlling the projection intensity, adjustment is made so that the difference between the signal value during pattern light projection and the signal value during non-projection remains constant.
Control is performed so that imaging is repeated at a predetermined frame frequency, and the pattern light generation unit 10 is controlled to alternate between projection and non-projection of the pattern light every other frame period. Specifically, the control unit 30 controls the drive unit 11 so that the laser light source 13 alternates between the emitting state and the non-emitting state every other frame period.
In the present embodiment, the information Prp indicating the above ratio is also supplied to the ambient light type determination unit 44, and the ambient light type determination unit 44 uses this information to determine the type of the ambient light.
Also, by selecting and outputting only the signals of frames containing no pattern light component, it is possible to generate signals representing the ambient light components R1, G1, and B1 shown in FIGS. 6(b), (e), and (h), or the image represented by these signals (ambient light image or non-projection image).
FIG. 6(k) shows the luminance signal Y1 obtained by combining the signals R1, G1, and B1 of FIGS. 6(b), (e), and (h). This luminance signal Y1 contains the true luminance component Yy and the near-infrared component IRye due to the ambient light.
FIG. 6(l) shows the luminance signal Y2 obtained by combining the signals R2, G2, and B2 of FIGS. 6(c), (f), and (i). This luminance signal Y2 contains only the true luminance component Yy.
FIG. 6(p) shows the signal IRyp obtained by combining the signals IRrp, IRgp, and IRbp of FIGS. 6(m), (n), and (o). This signal IRyp represents the intensity Sr of the captured pattern light.
The image difference acquisition unit 42 receives the signals R0, G0, and B0 output from the A/D converter 41, and, based on the image captured while the pattern light is projected and the image captured while it is not, generates an image due to the pattern light (pattern light image) and an image with the pattern light component removed (ambient light image). For example, it outputs the image obtained in a frame period in which the pattern light is not projected as the ambient light image, and outputs, as the pattern light image, the image obtained by subtracting, of two successive frame periods, the image obtained by imaging (exposure) in the frame period without pattern light projection from the image obtained by imaging (exposure) in the frame period with pattern light projection. Specifically, the R, G, and B component signals obtained in the frame period without pattern light projection are subtracted from the R, G, and B component signals obtained in the frame period with pattern light projection, and the results are combined to generate a signal representing a single pattern light component.
The frame delay unit 421 delays the imaging signal D41 (R0, G0, B0) supplied from the A/D converter 41 via the input terminal 420 by one frame period and outputs a frame-delayed imaging signal D421.
This combination is performed, for example, by the following operation, in the same way as when a luminance signal is generated from R, G, and B signals:

IRyp = a1 × IRrp + a2 × IRgp + a3 × IRbp   (1)

In Equation (1), a1, a2, and a3 are coefficients predetermined so that a1 + a2 + a3 = 1. The coefficients a1, a2, and a3 are determined by the spectral transmission characteristics of the R, G, and B color filters of the imaging unit 20 for the near-infrared component and the spectral sensitivity characteristics of the R, G, and B photoelectric conversion elements for the near-infrared component. For simplicity, for example,

a1 = a2 = a3 = 1/3

may also be used.
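As a small illustrative sketch of Equation (1) in Python (the equal weights are the simplified choice mentioned above; real coefficients would follow the filter characteristics):

```python
def synthesize_pattern_component(IRrp, IRgp, IRbp, a=(1/3, 1/3, 1/3)):
    """Combine per-channel pattern light components per Eq. (1).

    The coefficients must satisfy a1 + a2 + a3 = 1.
    """
    a1, a2, a3 = a
    assert abs(a1 + a2 + a3 - 1.0) < 1e-9
    return a1 * IRrp + a2 * IRgp + a3 * IRbp
```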
The smallest square is called a dot position or cell, and is the smallest unit in the projection pattern whose on state (illuminated) or off state (not illuminated) can be controlled. For example, 480 rows by 650 columns of cells are formed within the projection range. A dot is formed by a cell in the illuminated state.
The one row above and below and the one column to the left and right surrounding the 2-row by 2-column area consist of off-state cells (cells not illuminated); the 4-row by 4-column area containing this surrounding area and the 2-row by 2-column area is called the spot area MA.
Each cell of the first part DCa and the second part DCb can take either the on state (illuminated) or the off state (not illuminated), and this combination of on and off states forms an 8-bit identification code DC. The identification code DC attached to each light spot MK is used to identify that light spot MK.
The entire projection pattern is composed of repetitions of an area MB of 5 rows by 5 columns of cells, obtained by adding the identification code DC and the cell cbr to the above 4-row by 4-column spot area MA.
The identification code DC attached to each light spot MK is used to determine which of the many light spots included in the projection pattern that light spot MK is.
In the example shown in FIG. 10, the same identification code is arranged in the vertical direction, while in the horizontal direction the identification codes are arranged in order from No. 0 to No. 55 from left to right; after No. 55 (to its right) No. 0 is placed again, and the same arrangement repeats thereafter (periodic arrangement).
Furthermore, between the identification codes attached to horizontally adjacent light spots, there is always exactly one change point of on/off state (a point switching from on to off or from off to on).
There are 256 possible on/off combinations of the eight cells c1 to c8, but not all of them are used; to satisfy the above condition, only 56 of the 256 combinations are used as identification codes.
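The constraint that horizontally adjacent codes differ at exactly one cell can be checked as in the following sketch (illustrative only; the actual 56-code table is not given in this excerpt, so codes are simply treated as 8-bit integers and No. 0 is taken to follow No. 55 cyclically):

```python
def is_valid_code_arrangement(codes):
    """Check that cyclically adjacent 8-bit identification codes differ
    at exactly one cell, i.e. have Hamming distance 1.
    """
    n = len(codes)
    return all(
        bin(codes[i] ^ codes[(i + 1) % n]).count("1") == 1
        for i in range(n)
    )
```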
The distance information generation unit 43 shown in FIG. 11 includes a binarization unit 431, a spot area extraction unit 432, an identification code reading unit 433, a storage unit 434, a projection angle estimation unit 436, an incidence angle calculation unit 437, and a distance calculation unit 438.
To extract the spot area MA, a search is made for groups of 4-row by 4-column cells, arranged at fixed intervals, consisting of four dots in 2 rows by 2 columns at the center (formed by on-state cells) and the off-state cells surrounding them (one row above and below, one column to the left and right). Since the central groups of four dots in 2 rows by 2 columns are regularly arranged at equal intervals in the projection pattern, the same arrangement should appear in the captured image. However, due to curvature, unevenness, steps, and the like of the subject surface, the intervals are not necessarily perfectly equal in the captured image, so the spot areas MA are extracted by, for example, pattern matching based on similarity.
Once the position of a light spot on the projection pattern has been identified, the projection angle φ is obtained based on the information Spa (given by the control unit 30) indicating the relationship between the identified position and the projection angle.
The distance can be obtained from the relationship:

Dz = Lpc / (tan φ − tan θ)   (2)

Equation (2) follows from the relationship in FIG. 4:

Dz × tan φ − Dz × tan θ = Lpc   (3)

The line-of-sight distance is then obtained by:

Dr = Dz / cos θ   (4)
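Equations (2) to (4) amount to a standard triangulation; as a minimal Python sketch (angles in radians, names chosen here for illustration):

```python
import math

def spot_distance(phi, theta, Lpc):
    """Distance to a light spot by triangulation (Eqs. (2) and (4)).

    phi   : projection angle at the projector [rad]
    theta : incidence angle at the imaging unit [rad]
    Lpc   : baseline length between projector and imaging unit
    Returns (Dz, Dr): depth perpendicular to the baseline and
    line-of-sight distance to the spot.
    """
    Dz = Lpc / (math.tan(phi) - math.tan(theta))  # Eq. (2)
    Dr = Dz / math.cos(theta)                     # Eq. (4)
    return Dz, Dr
```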
The distance information Dr obtained by the distance information generation unit 43 is supplied to the display processing unit 49.
The type determination result Ltp of the ambient light type determination unit 44 is passed to the visible light image generation unit 46.
The same result is obtained whatever the value of this reflectance Mo, but in the following, to simplify the explanation, the case where the reflectance Mo is 100% is assumed. When the reflectance Mo is 100%, the composition ratio Ey is given by:

Ey = IRge(100) / Y1(100)
   = IRge(100) / {IRye(100) + Yy(100)}

To calculate the pixel values R2(M) (= Rr(M)) and B2(M) (= Bb(M)), instead of the composition ratio Ey itself, the values obtained by multiplying Ey by parameters α and β (α × Ey, β × Ey), which account for the differences in the proportion of the near-infrared component among the R, G, and B components, are used.
The illustrated visible light image generation unit 46 includes a composition ratio information generation unit 461, a luminance value calculation unit 463, a parameter generation unit 464, and a visible light component calculation unit 466.
The "composition ratio of the near-infrared component included in the ambient light" depends on the spectral transmission characteristics of the color filters of the imaging unit 20 and the spectral sensitivity characteristics of the photoelectric conversion elements, and means the composition ratio of the near-infrared component in the output values of the imaging unit 20 due to the ambient light. For example, assuming that the reflectance of the subject has the same value Mo for both visible light and near-infrared light, the ratio (IRge(Mo)/Y1(Mo)) of the near-infrared component IRge(Mo) included in the G-component pixel value G1(Mo) of the ambient light image to the luminance value Y1(Mo) (= Yy(Mo) + IRye(Mo)) is held in the composition ratio memory 462 as the composition ratio Ey. Describing the case where the reflectance Mo is 100% with reference to FIGS. 12(a) to (c), the composition ratio Ey is the ratio (IRge(100)/Y1(100)) of the near-infrared component IRge(100) included in the G-component pixel value G1(100) of the ambient light image shown in FIG. 12(a) to the luminance value Y1(100) (= Yy(100) + IRye(100)) shown in FIG. 12(b).
The information indicating the composition ratio Ey is, for example, obtained only once for the entire screen and used as a common value for all pixels or regions within the screen.
If the information Prp indicating the ratio of the R, G, and B components is available not just once for the entire screen but for each region forming part of the screen, the composition ratio Ey may be obtained for each region based on that information.
The luminance value can be calculated, for example, by the following formula:

Y1(M) = ar × R1(M) + ag × G1(M) + ab × B1(M)   (5)

where ar, ag, and ab are coefficients predetermined so that ar + ag + ab = 1; for example,
ar = 0.3
ag = 0.59
ab = 0.11
The parameter generation unit 464 has a parameter memory 465 that holds the parameters α and β for each type of ambient light. According to the type determination result Ltp of the ambient light type determination unit 44, the parameter generation unit 464 reads the values of the parameters α and β corresponding to the type from the parameter memory 465 and outputs them.
The parameter α represents the ratio (IRre(100)/IRge(100)) of the near-infrared component value IRre(100) included in the R-component pixel value R1(100) of the ambient light image to the near-infrared component value IRge(100) included in the G-component pixel value G1(100), when the subject is illuminated by ambient light of the corresponding type and the reflectance of the subject has a certain value Mo, for example 100%.
The parameter β represents, under the same conditions, the ratio (IRbe(100)/IRge(100)) of the near-infrared component value IRbe(100) included in the B-component pixel value B1(100) of the ambient light image to the near-infrared component value IRge(100) included in the G-component pixel value G1(100).
The values of α and β depend on the characteristics of the color filters and photoelectric conversion elements in the imaging unit and are obtained in advance by experiment.
By multiplying this value (IRge = Y1 × Ey) by the parameter α, the near-infrared component included in the R signal of the ambient light image (IRre = Y1 × Ey × α) is obtained, and
by multiplying the same value (IRge = Y1 × Ey) by the parameter β, the near-infrared component included in the B signal of the ambient light image (IRbe = Y1 × Ey × β) is obtained.
The visible light component calculation unit 466 calculates the visible light components (values with the near-infrared components removed) R2, G2, and B2 by subtracting the near-infrared components IRre, IRge, and IRbe included in the R, G, and B signals of the ambient light image, obtained as described above, from the R, G, and B signals of the ambient light image.
The above calculation is expressed by the following formulas:

R2 = R1 − Y1 × Ey × α
G2 = G1 − Y1 × Ey
B2 = B1 − Y1 × Ey × β
   (6)
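Equations (5) and (6) can be summarized in a short Python sketch (illustrative only; array inputs are assumed, and the parameter values would come from the memories described above):

```python
def remove_nir_embodiment1(R1, G1, B1, Ey, alpha, beta,
                           ar=0.3, ag=0.59, ab=0.11):
    """NIR removal of Embodiment 1 via Eqs. (5) and (6).

    Ey          : composition ratio IRge / Y1 for the ambient light type
    alpha, beta : ratios IRre/IRge and IRbe/IRge for the same type
    """
    Y1 = ar * R1 + ag * G1 + ab * B1   # Eq. (5), ar + ag + ab = 1
    IRge = Y1 * Ey                     # NIR component in the G signal
    R2 = R1 - IRge * alpha             # Eq. (6)
    G2 = G1 - IRge
    B2 = B1 - IRge * beta
    return R2, G2, B2
```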
The pixel values R2, G2, and B2 output from the visible light image generation unit 46 likewise do not have all color components for all pixels; each pixel has the value of just one color component according to its position in the Bayer array. For each pixel (pixel of interest), the sensitization processing unit 47 adds the pixel values of surrounding pixels of the same color (pixels having the same color component), and outputs signals R3, G3, and B3 having sensitized (multiplied) pixel values.
When the pixel of interest is the R pixel RR34 as shown in FIG. 14(a), the following eight surrounding pixels are added: the pixel RR12 two rows up and two columns left, the pixel RR32 two rows up in the same column, the pixel RR52 two rows up and two columns right, the pixel RR14 in the same row two columns left, the pixel RR54 in the same row two columns right, the pixel RR16 two rows down and two columns left, the pixel RR36 two rows down in the same column, and the pixel RR56 two rows down and two columns right.
NRR34 = RR12 + RR32 + RR52
      + RR14 + RR34 + RR54
      + RR16 + RR36 + RR56
   (7)
The value NRR34 obtained as a result of this pixel addition is output as the multiplied R component value R3.
The case where the pixel of interest is RR34 has been described above; for R pixels at other positions, surrounding pixels in the same arrangement are likewise added.
NGB33 = GB31 + GR22 + GR42
      + GB13 + GB33 + GB53
      + GR24 + GR44 + GB35
   (8)
The value NGB33 obtained as a result of this pixel addition is output as the multiplied G component value G3.
The case where the pixel of interest is GB33 has been described above; for G pixels at other positions, surrounding pixels in the same arrangement are likewise added.
NBB43 = BB21 + BB41 + BB61
      + BB23 + BB43 + BB63
      + BB25 + BB45 + BB65
   (9)
The value NBB43 obtained as a result of this pixel addition is output as the multiplied B component value B3.
The case where the pixel of interest is BB43 has been described above; for B pixels at other positions, surrounding pixels in the same arrangement are likewise added.
For example, when the pixel values of the eight surrounding pixels are added to each pixel of interest as described above (assuming the surrounding pixels have the same pixel value as the pixel of interest), the addition result is nine times the pixel value of the pixel of interest.
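As a sketch of this same-color addition (illustrative only; an interior pixel is assumed and border handling is omitted): for R and B pixels (Eqs. (7) and (9)) the same-color neighbors sit two rows or columns away, while the G pixels lie on a checkerboard, so their same-color neighbors (Eq. (8)) form a diamond:

```python
def bin_same_color(mosaic, y, x, color):
    """Sum a pixel with its eight nearest same-color neighbors on a
    Bayer mosaic. If all nine pixels were equal, the result would be
    nine times the center value.
    """
    if color in ("R", "B"):   # Eqs. (7) and (9): stride-2 3x3 block
        offsets = [(dy, dx) for dy in (-2, 0, 2) for dx in (-2, 0, 2)]
    else:                     # "G", Eq. (8): diamond of same-color pixels
        offsets = [(0, 0), (-2, 0), (2, 0), (0, -2), (0, 2),
                   (-1, -1), (-1, 1), (1, -1), (1, 1)]
    return float(sum(mosaic[y + dy, x + dx] for dy, dx in offsets))
```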
However, as a result of adding (mixing) the surrounding pixels, the resolution (still resolution) decreases.
Here, the preceding and following frames are not limited to the immediately preceding frame and the immediately following frame; they may be a predetermined number of immediately preceding frames and a predetermined number of immediately following frames.
Adding pixels at the same position in different frames makes it possible to enhance the signal components while avoiding a loss of still resolution, which is particularly effective when there is little motion in the image.
However, motion blur increases for images with heavy motion.
Doing so makes it possible to further increase the multiplication factor of the signal components.
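A sketch of this temporal accumulation (illustrative only; the frame buffer and the radius of neighboring frames are assumptions):

```python
import numpy as np

def temporal_accumulate(frames, center_idx, radius=1):
    """Add same-position pixel values across neighboring frames.

    Avoids the still-resolution loss of spatial binning; effective for
    nearly static scenes, at the cost of motion blur when there is
    movement. Can be combined with spatial binning for a larger gain.
    """
    lo = max(center_idx - radius, 0)
    hi = min(center_idx + radius + 1, len(frames))
    return np.sum(np.stack(frames[lo:hi]).astype(float), axis=0)
```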
FIGS. 15(a) and (b) show examples of output images of the display processing unit 49. FIG. 15(a) shows the visible light image (non-projection image), and FIG. 15(b) shows the image with distance information.
As the image with distance information, an image in which luminance or color is assigned to distance is displayed. For example, an image in which the visible light image is expressed by luminance and the distance is expressed by color is displayed. Alternatively, an object present in the imaging space is recognized, and an image in which character information indicating the distance of the object is superimposed on the visible light image is output.
FIG. 16 shows the configuration of an image generation apparatus according to Embodiment 2 of the present invention. The illustrated image generation apparatus is generally the same as the configuration shown in FIG. 1, but differs in that a visible light image generation unit 46b is provided instead of the visible light image generation unit 46 of FIG. 1.
The visible light image generation unit 46b of FIG. 17 includes a composition ratio information generation unit 461b, a reflectance calculation unit 467, an intensity ratio calculation unit 468, and a visible light component calculation unit 466b.
The composition ratios Er, Eg, and Eb are each expressed by the following formulas:

Er = IRre(Mo) / R1(Mo)
Eg = IRge(Mo) / G1(Mo)
Eb = IRbe(Mo) / B1(Mo)

In these formulas, Mo represents the reflectance of the subject; as described in Embodiment 1, when Mo = 100% they can be rewritten as:

Er = IRre(100) / R1(100)
Eg = IRge(100) / G1(100)
Eb = IRbe(100) / B1(100)

The composition ratio information generation unit 461b reads the information indicating the composition ratios Er, Eg, and Eb corresponding to the type Ltp determined by the ambient light type determination unit 44 from the composition ratio memory 462b and outputs it.
As described above, the control unit 30 controls the projection intensity of the projector 12, and for that purpose generates information indicating the projection intensity. The information indicating the projection intensity St used for this control is supplied to the visible light image generation unit 46b.
That is, when the reflectance of the subject is 100%, the intensity Sr0 of the light reflected by the subject, reaching the imaging unit, and received there (or the corresponding imaging signal value) is given by:

Sr0 = St × (ka / Dr²)   (10)

In Equation (10), ka is a constant determined by the aperture ratio of the lens 22 and the sensitivity of the imaging element 24 in the imaging unit 20. The reflectance M is then given by:

M = Sr / Sr0   (11)
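Equations (10) and (11) combine into a one-line reflectance estimate; as a Python sketch (names chosen here for illustration):

```python
def estimate_reflectance(Sr, St, Dr, ka):
    """Reflectance per Eqs. (10) and (11).

    Sr : measured pattern light intensity at the sensor (e.g. IRyp)
    St : commanded projection intensity
    Dr : subject distance from the distance information
    ka : constant set by the lens aperture ratio and sensor sensitivity
    """
    Sr0 = St * ka / Dr**2   # Eq. (10): intensity at 100% reflectance
    return Sr / Sr0         # Eq. (11)
```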
Since the intensity of the reflected light is obtained for each spot in each frame, the above reflectance M is obtained for each region of the image corresponding to each spot.
Instead of obtaining the reflectance for each region, the reflectance may be obtained for each pixel. For example, for each pixel, the average value (simple average or weighted average) Mp of the reflectances of the spots included in a region centered on that pixel may be used as the reflectance M for that pixel.
R2 = R1 − R1 × Psr
G2 = G1 − G1 × Psg
B2 = B1 − B1 × Psb
   (12)
When the projector 12 projects the pattern light every two or more frame periods, a plurality of non-projection images are obtained for each projection image. In this case, the image difference acquisition unit 42 may generate the pattern light image from, for example, the difference between each projection image and the non-projection image obtained in the frame period immediately before or after the frame period in which the projection image was obtained. When there is no motion in the image, the pattern light image may be generated from the difference between each projection image and the average of the non-projection images obtained in a plurality of frame periods before and after the frame period in which the projection image was obtained.
Claims (7)
- An image generation apparatus comprising: a projector that projects pattern light of a near-infrared wavelength into an imaging space every predetermined number of frame periods; an imaging unit that images a subject in the imaging space, which is illuminated by ambient light and onto which the pattern light is projected every predetermined number of frame periods, and outputs an imaging signal including an R signal, a G signal, and a B signal respectively representing the R, G, and B components of a captured image; a control unit that gives the projector an instruction concerning the projection intensity of the pattern light; an image difference acquisition unit that generates a pattern light image by obtaining the difference between the imaging signal obtained when the pattern light is projected and the imaging signal obtained when the pattern light is not projected, among the imaging signals obtained by imaging by the imaging unit, and generates an ambient light image from the imaging signal obtained when the pattern light is not projected; an ambient light type determination unit that determines the type of the ambient light; and a visible light image generation unit that generates, from the type of the ambient light determined by the ambient light type determination unit, composition ratio information indicating the composition ratio of near-infrared wavelength light included in the ambient light, estimates, based on the generated composition ratio information, the component due to the near-infrared wavelength light included in the ambient light in the ambient light image, and generates a visible light image by subtracting the component due to the near-infrared wavelength light from the ambient light image; wherein the ambient light type determination unit determines the type of the ambient light from the ratio of the R, G, and B components in the captured image, the ambient light image, or the visible light image.
- The image generation apparatus according to claim 1, wherein the composition ratio is the ratio of the component due to near-infrared wavelength light included in the value of the G signal of the ambient light image to the luminance value of the ambient light image, and the visible light image generation unit obtains the value of the G signal of the visible light image by subtracting the product of the composition ratio and the luminance value of the ambient light image from the value of the G signal of the ambient light image.
- The image generation apparatus according to claim 2, wherein the visible light image generation unit generates, according to the type of the ambient light determined by the ambient light type determination unit, a first parameter representing the ratio of the value of the component due to near-infrared wavelength light included in the value of the R signal of the ambient light image to the value of the component due to the near-infrared wavelength light included in the value of the G signal of the ambient light image, and a second parameter representing the ratio of the value of the component due to near-infrared wavelength light included in the value of the B signal of the ambient light image to the value of the component due to the near-infrared wavelength light included in the value of the G signal of the ambient light image; obtains the value of the R signal of the visible light image by subtracting the product of the composition ratio, the first parameter, and the luminance value of the ambient light image from the value of the R signal of the ambient light image; and obtains the value of the B signal of the visible light image by subtracting the product of the composition ratio, the second parameter, and the luminance value of the ambient light image from the value of the B signal of the ambient light image.
- The image generation apparatus according to claim 1, further comprising a distance information generation unit that generates distance information representing the distance from the imaging unit to the subject based on the pattern light image generated by the image difference acquisition unit, wherein the composition ratio information indicates a first composition ratio, which is the ratio of the component due to near-infrared wavelength light included in the value of the R signal of the ambient light image, a second composition ratio, which is the ratio of the component due to near-infrared wavelength light included in the value of the G signal of the ambient light image, and a third composition ratio, which is the ratio of the component due to near-infrared wavelength light included in the value of the B signal of the ambient light image; and the visible light image generation unit estimates the reflectance of the subject from the type of the ambient light determined by the ambient light type determination unit, the projection intensity of the pattern light by the projector, the value of the signal of the pattern light image generated by the image difference acquisition unit, and the distance information generated by the distance information generation unit; calculates the component due to near-infrared wavelength light included in the R signal of the ambient light image by multiplying the estimated reflectance, the first composition ratio, and the value of the R signal of the ambient light image; calculates the component due to near-infrared wavelength light included in the G signal of the ambient light image by multiplying the estimated reflectance, the second composition ratio, and the value of the G signal of the ambient light image; and calculates the component due to near-infrared wavelength light included in the B signal of the ambient light image by multiplying the estimated reflectance, the third composition ratio, and the value of the B signal of the ambient light image.
- The image generation apparatus according to claim 4, wherein the visible light image generation unit obtains, as the reflectance, a value obtained by multiplying the ratio of the value of the signal of the pattern light image to the product of the projection intensity and the reciprocal of the square of the distance to the subject by a predetermined coefficient.
- The image generation apparatus according to claim 4 or 5, wherein the distance information generation unit identifies the projection angle of each light spot in the captured projection pattern from the arrangement of the light spots in the pattern light image and a prestored relationship between the position of each light spot in the projection pattern of the pattern light and its projection angle, and obtains the distance to the subject onto which the light spot is projected based on the identified projection angle.
- The image generation apparatus according to any one of claims 1 to 6, further comprising a sensitization processing unit that multiplies the signal value of each pixel of the visible light image generated by the visible light image generation unit by adding, with weights, the signal values of surrounding pixels to the signal value of each pixel.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112014006127.1T DE112014006127B4 (de) | 2014-01-08 | 2014-09-18 | Bilderzeugungsvorrichtung |
JP2015556707A JP5992116B2 (ja) | 2014-01-08 | 2014-09-18 | 画像生成装置 |
CN201480066632.6A CN105814887B (zh) | 2014-01-08 | 2014-09-18 | 图像生成装置 |
US15/029,746 US9900485B2 (en) | 2014-01-08 | 2014-09-18 | Image generation device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014001804 | 2014-01-08 | ||
JP2014-001804 | 2014-01-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015104870A1 true WO2015104870A1 (ja) | 2015-07-16 |
Family
ID=53523712
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/074665 WO2015104870A1 (ja) | 2014-01-08 | 2014-09-18 | 画像生成装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9900485B2 (ja) |
JP (1) | JP5992116B2 (ja) |
CN (1) | CN105814887B (ja) |
DE (1) | DE112014006127B4 (ja) |
WO (1) | WO2015104870A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017112611A (ja) * | 2015-12-17 | 2017-06-22 | 合盈光電科技股▲ふん▼有限公司H.P.B. Optoelectronics Co., Ltd. | 交通搬送手段のイメージ検出システム |
WO2017122553A1 (ja) * | 2016-01-15 | 2017-07-20 | ソニー株式会社 | 画像処理装置、画像処理方法、及び、撮像装置 |
JP2017216678A (ja) * | 2016-05-27 | 2017-12-07 | パナソニックIpマネジメント株式会社 | 撮像システム |
CN107645625A (zh) * | 2016-07-22 | 2018-01-30 | 松下知识产权经营株式会社 | 图像生成装置以及图像生成方法 |
JP2018021894A (ja) * | 2016-07-22 | 2018-02-08 | パナソニックIpマネジメント株式会社 | 画像生成装置及び画像生成方法 |
WO2021166542A1 (ja) * | 2020-02-18 | 2021-08-26 | 株式会社デンソー | 物体検出装置、受光部および物体検出装置の制御方法 |
WO2021176872A1 (ja) * | 2020-03-03 | 2021-09-10 | ソニーグループ株式会社 | 情報表示制御方法と情報表示制御装置およびプログラム |
WO2022038991A1 (ja) * | 2020-08-17 | 2022-02-24 | ソニーグループ株式会社 | 情報処理装置、情報処理方法及び情報処理プログラム |
US20220412798A1 (en) * | 2019-11-27 | 2022-12-29 | ams Sensors Germany GmbH | Ambient light source classification |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016024200A2 (en) * | 2014-08-12 | 2016-02-18 | Mantisvision Ltd. | Structured light projection and imaging |
JP6677172B2 (ja) * | 2015-01-16 | 2020-04-08 | 日本電気株式会社 | 画像処理装置、画像処理方法およびプログラム |
JP6441771B2 (ja) * | 2015-08-27 | 2018-12-19 | クラリオン株式会社 | 撮像装置 |
US10594974B2 (en) | 2016-04-07 | 2020-03-17 | Tobii Ab | Image sensor for vision based on human computer interaction |
CN108353137A (zh) * | 2016-04-07 | 2018-07-31 | 托比股份公司 | 用于基于视觉的人机交互的图像传感器 |
US10397504B2 (en) * | 2017-02-13 | 2019-08-27 | Kidde Technologies, Inc. | Correcting lag in imaging devices |
TWM570473U (zh) * | 2018-07-03 | 2018-11-21 | 金佶科技股份有限公司 | 取像模組 |
JP7401211B2 (ja) * | 2019-06-25 | 2023-12-19 | ファナック株式会社 | 外光照度測定機能を備えた測距装置及び外光照度測定方法 |
JP7463671B2 (ja) * | 2019-08-01 | 2024-04-09 | Toppanホールディングス株式会社 | 距離画像撮像装置、及び距離画像撮像方法 |
CN111429389B (zh) * | 2020-02-28 | 2023-06-06 | 北京航空航天大学 | 一种保持光谱特性的可见光和近红外图像融合方法 |
EP4319125A1 (en) * | 2021-03-24 | 2024-02-07 | Samsung Electronics Co., Ltd. | Device capable of simultaneously performing lighting function and light source detection function through common hole |
WO2023058666A1 (ja) * | 2021-10-04 | 2023-04-13 | 株式会社ブルービジョン | 距離測定装置 |
WO2023108442A1 (zh) * | 2021-12-14 | 2023-06-22 | 深圳传音控股股份有限公司 | 图像处理方法、智能终端及存储介质 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007089084A (ja) * | 2005-09-26 | 2007-04-05 | Canon Inc | 波長成分の割合検出装置及びそれを用いた撮像装置 |
JP2007202107A (ja) * | 2005-12-27 | 2007-08-09 | Sanyo Electric Co Ltd | 撮像装置 |
JP2008008700A (ja) * | 2006-06-28 | 2008-01-17 | Fujifilm Corp | 距離画像センサ |
JP2013126165A (ja) * | 2011-12-15 | 2013-06-24 | Fujitsu Ltd | 撮像装置、撮像方法及びプログラム |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4843456A (en) | 1986-07-24 | 1989-06-27 | Minolta Camera Kabushiki Kaisha | White balance adjusting device for a color video camera |
JPS63104591A (ja) | 1986-10-21 | 1988-05-10 | Minolta Camera Co Ltd | ビデオカメラの露出制御装置 |
JP2737568B2 (ja) | 1992-09-22 | 1998-04-08 | 松下電器産業株式会社 | カラー撮像装置 |
JPH0989553A (ja) | 1995-09-26 | 1997-04-04 | Olympus Optical Co Ltd | 測距装置 |
JPH11113006A (ja) | 1997-09-30 | 1999-04-23 | Olympus Optical Co Ltd | 電子的撮像装置 |
JP4892909B2 (ja) | 2005-09-22 | 2012-03-07 | ソニー株式会社 | 信号処理方法、信号処理回路およびこれを用いたカメラシステム |
US7821552B2 (en) * | 2005-12-27 | 2010-10-26 | Sanyo Electric Co., Ltd. | Imaging apparatus provided with imaging device having sensitivity in visible and infrared regions |
JP2007312304A (ja) | 2006-05-22 | 2007-11-29 | Fujitsu Ltd | 画像処理装置および画像処理方法 |
JP5061767B2 (ja) | 2006-08-10 | 2012-10-31 | 日産自動車株式会社 | 画像処理装置および画像処理方法 |
JP4761066B2 (ja) * | 2006-09-22 | 2011-08-31 | 富士フイルム株式会社 | ホワイトバランス補正方法及び撮像装置 |
JP4346634B2 (ja) * | 2006-10-16 | 2009-10-21 | 三洋電機株式会社 | 目標物検出装置 |
JP5272431B2 (ja) | 2008-02-15 | 2013-08-28 | 株式会社豊田中央研究所 | 対象物測距装置及びプログラム |
JP2010093472A (ja) * | 2008-10-07 | 2010-04-22 | Panasonic Corp | 撮像装置および撮像装置用信号処理回路 |
JP5100615B2 (ja) | 2008-10-31 | 2012-12-19 | キヤノン株式会社 | 撮像装置 |
US8760499B2 (en) * | 2011-04-29 | 2014-06-24 | Austin Russell | Three-dimensional imager and projection device |
JP5899894B2 (ja) | 2011-12-19 | 2016-04-06 | 富士通株式会社 | 撮像装置、画像処理装置、画像処理プログラムおよび画像処理方法 |
JP2013156109A (ja) | 2012-01-30 | 2013-08-15 | Hitachi Ltd | 距離計測装置 |
-
2014
- 2014-09-18 US US15/029,746 patent/US9900485B2/en not_active Expired - Fee Related
- 2014-09-18 CN CN201480066632.6A patent/CN105814887B/zh not_active Expired - Fee Related
- 2014-09-18 JP JP2015556707A patent/JP5992116B2/ja not_active Expired - Fee Related
- 2014-09-18 DE DE112014006127.1T patent/DE112014006127B4/de not_active Expired - Fee Related
- 2014-09-18 WO PCT/JP2014/074665 patent/WO2015104870A1/ja active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007089084A (ja) * | 2005-09-26 | 2007-04-05 | Canon Inc | 波長成分の割合検出装置及びそれを用いた撮像装置 |
JP2007202107A (ja) * | 2005-12-27 | 2007-08-09 | Sanyo Electric Co Ltd | 撮像装置 |
JP2008008700A (ja) * | 2006-06-28 | 2008-01-17 | Fujifilm Corp | 距離画像センサ |
JP2013126165A (ja) * | 2011-12-15 | 2013-06-24 | Fujitsu Ltd | 撮像装置、撮像方法及びプログラム |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017112611A (ja) * | 2015-12-17 | 2017-06-22 | 合盈光電科技股▲ふん▼有限公司H.P.B. Optoelectronics Co., Ltd. | 交通搬送手段のイメージ検出システム |
WO2017122553A1 (ja) * | 2016-01-15 | 2017-07-20 | ソニー株式会社 | 画像処理装置、画像処理方法、及び、撮像装置 |
US11218679B2 (en) | 2016-01-15 | 2022-01-04 | Sony Corporation | Image processing device, image processing method, and imaging device |
JP2017216678A (ja) * | 2016-05-27 | 2017-12-07 | パナソニックIpマネジメント株式会社 | 撮像システム |
JP2022033118A (ja) * | 2016-05-27 | 2022-02-28 | パナソニックIpマネジメント株式会社 | 撮像システム |
JP7281681B2 (ja) | 2016-05-27 | 2023-05-26 | パナソニックIpマネジメント株式会社 | 撮像システム |
JP6994678B2 (ja) | 2016-05-27 | 2022-01-14 | パナソニックIpマネジメント株式会社 | 撮像システム |
US11196983B2 (en) | 2016-05-27 | 2021-12-07 | Panasonic Intellectual Property Management Co., Ltd. | Imaging system including light source, image sensor, and double-band pass filter |
CN107645625A (zh) * | 2016-07-22 | 2018-01-30 | 松下知识产权经营株式会社 | 图像生成装置以及图像生成方法 |
CN107645625B (zh) * | 2016-07-22 | 2020-10-09 | 松下知识产权经营株式会社 | 图像生成装置以及图像生成方法 |
JP2018021894A (ja) * | 2016-07-22 | 2018-02-08 | パナソニックIpマネジメント株式会社 | 画像生成装置及び画像生成方法 |
US20220412798A1 (en) * | 2019-11-27 | 2022-12-29 | ams Sensors Germany GmbH | Ambient light source classification |
JP2021131229A (ja) * | 2020-02-18 | 2021-09-09 | 株式会社デンソー | 物体検出装置、受光部および物体検出装置の制御方法 |
WO2021166542A1 (ja) * | 2020-02-18 | 2021-08-26 | 株式会社デンソー | 物体検出装置、受光部および物体検出装置の制御方法 |
JP7255513B2 (ja) | 2020-02-18 | 2023-04-11 | 株式会社デンソー | 物体検出装置、受光部および物体検出装置の制御方法 |
WO2021176872A1 (ja) * | 2020-03-03 | 2021-09-10 | ソニーグループ株式会社 | 情報表示制御方法と情報表示制御装置およびプログラム |
WO2022038991A1 (ja) * | 2020-08-17 | 2022-02-24 | ソニーグループ株式会社 | 情報処理装置、情報処理方法及び情報処理プログラム |
Also Published As
Publication number | Publication date |
---|---|
US20160248953A1 (en) | 2016-08-25 |
JP5992116B2 (ja) | 2016-09-14 |
US9900485B2 (en) | 2018-02-20 |
CN105814887A (zh) | 2016-07-27 |
CN105814887B (zh) | 2017-10-27 |
JPWO2015104870A1 (ja) | 2017-03-23 |
DE112014006127T5 (de) | 2016-09-29 |
DE112014006127B4 (de) | 2019-11-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5992116B2 (ja) | 画像生成装置 | |
JP5808502B2 (ja) | 画像生成装置 | |
JP6431861B2 (ja) | 風景の背景と前景とを区別する方法、及び風景の画像において背景を置き換える方法 | |
US20180224333A1 (en) | Image capturing apparatus and image capturing computer program product | |
JP6784295B2 (ja) | 距離測定システム、距離測定方法およびプログラム | |
CN103609102B (zh) | 高分辨率多光谱图像捕捉 | |
US8199228B2 (en) | Method of and apparatus for correcting contour of grayscale image | |
US20130265438A1 (en) | Imaging apparatus, imaging method, and camera system | |
US11122180B2 (en) | Systems, methods, apparatuses, and computer-readable storage media for collecting color information about an object undergoing a 3D scan | |
US10121271B2 (en) | Image processing apparatus and image processing method | |
US10863165B2 (en) | Image processing apparatus and method | |
US10663593B2 (en) | Projector apparatus with distance image acquisition device and projection method | |
JP2013195865A (ja) | プロジェクタ、プロジェクタ制御方法及びプロジェクタ制御プログラム | |
US11202044B2 (en) | Image processing apparatus and method | |
JP6190119B2 (ja) | 画像処理装置、撮像装置、制御方法、及びプログラム | |
US20090102939A1 (en) | Apparatus and method for simultaneously acquiring multiple images with a given camera | |
JP6550827B2 (ja) | 画像処理装置、画像処理方法及びプログラム | |
JP2017191082A (ja) | 輝点画像取得装置および輝点画像取得方法 | |
WO2024202556A1 (ja) | 測距装置および測距方法 | |
US9294686B2 (en) | Image capture apparatus and image capture method | |
JP2012015834A (ja) | 撮像装置 | |
JP2007334349A (ja) | 光変調器における変位差を補償するためのラインプロファイルを生成する方法及び装置 | |
JP2020107097A (ja) | 遠赤外線画像処理装置、及びそれを備えた遠赤外線監視装置、並びに遠赤外線画像処理プログラム | |
JP2019202580A (ja) | 配光制御装置、投光システム及び配光制御方法 | |
KR20110132786A (ko) | 카메라를 이용한 전자 도서 스캐닝 장치 및 시스템 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14877695 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015556707 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15029746 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112014006127 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14877695 Country of ref document: EP Kind code of ref document: A1 |