WO2016088483A1 - Image processing apparatus and image processing method - Google Patents
Image processing apparatus and image processing method
- Publication number
- WO2016088483A1 (PCT application PCT/JP2015/080380, filed as JP2015080380W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- polarization
- image
- unit
- depth map
- imaging
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/214—Image signal generators using stereoscopic image cameras using a single 2D image sensor using spectral multiplexing
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
- G01C3/08—Use of electric radiation detectors
- G01C3/085—Use of electric radiation detectors with electronic parallax measurement
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J4/00—Measuring polarisation of light
- G01J4/04—Polarimeters using electric detection means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/218—Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/229—Image signal generators using stereoscopic image cameras using a single 2D image sensor using lenticular lenses, e.g. arrangements of cylindrical lenses
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/232—Image signal generators using stereoscopic image cameras using a single 2D image sensor using fly-eye lenses, e.g. arrangements of circular lenses
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/243—Image signal generators using stereoscopic image cameras using three or more 2D image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
Definitions
- This technology relates to an image processing apparatus and an image processing method, and makes it possible to acquire the polarization characteristics of a subject with high accuracy.
- Patent Document 1 discloses a method in which a polarization filter is arranged in front of an imaging unit, and a polarization image in a plurality of polarization directions is acquired by rotating the polarization filter and photographing.
- a method is disclosed in which a polarization filter having a different polarization direction is provided for each pixel, thereby obtaining a polarization image having a plurality of different polarization directions in one imaging.
- the normal information of the subject is acquired from the polarization images of a plurality of polarization directions.
- normal information is calculated by fitting polarization images of a plurality of polarization directions to a model equation.
- when polarization images of a plurality of polarization directions are acquired in one shot by providing a polarization filter with a different direction for each pixel of the image sensor, the spatial resolution of the polarization image is degraded. For this reason, it is difficult to acquire the polarization characteristics of the subject with high resolution.
- an object of this technique is to provide an image processing apparatus and an image processing method that can acquire the polarization characteristics of a subject with high accuracy.
- a first aspect of this technique is an image processing apparatus having an alignment unit that, based on a depth map indicating distance information of a subject, aligns polarization images obtained by imaging the subject at a plurality of viewpoint positions via polarization filters whose polarization direction differs for each viewpoint position, and a polarization characteristic acquisition unit that acquires the polarization characteristics of the subject from a desired viewpoint position using the polarization images aligned by the alignment unit.
- the alignment unit aligns polarization images obtained by imaging a subject at a plurality of viewpoint positions via polarization filters whose polarization direction differs for each viewpoint position, based on a depth map indicating distance information of the subject. The depth map is generated by the depth map generation unit from images obtained by imaging the subject from different viewpoint positions.
- the depth map generation unit generates a depth map based on the parallax between images, using either polarization images obtained by imaging the subject at a plurality of viewpoint positions via polarization filters with different polarization directions, or images captured via polarization filters with the same polarization direction or with no polarization filter at all.
- the depth map generation unit generates and integrates a depth map for each combination of images in the images at a plurality of viewpoint positions.
- the alignment unit determines the parallax based on, for example, the integrated depth map and the positional relationship of the imaging unit that generated the polarization image, and performs alignment of the polarization image so that the parallax is eliminated.
- the polarization characteristic acquisition unit acquires the polarization characteristic of the subject from a desired viewpoint position using the aligned polarization image.
- the polarization characteristic acquisition unit acquires, for example, the polarization characteristics of the subject from the desired viewpoint position based on the luminance and polarization direction of the plurality of aligned polarization images, the positional relationship between the viewpoint positions of the polarization images and the desired viewpoint position, and so on.
- when a normal map generation unit is provided, the normal map generation unit generates a normal map of the subject at the desired viewpoint position based on the polarization characteristics calculated by the polarization characteristic acquisition unit. In generating the normal map, the generation unit removes the 180-degree ambiguity in the polarization analysis based on the depth map used in the alignment unit.
- when a high-precision depth map generation unit is provided, it integrates the depth map used by the alignment unit and the normal map generated by the normal map generation unit to generate a depth map with higher spatial resolution than the depth map used by the alignment unit.
- when a polarization characteristic utilization unit is provided, it performs image processing using the polarization characteristics calculated by the polarization characteristic acquisition unit, for example, adjustment of the reflection component of the image at the desired viewpoint position, or matching processing using image feature amounts based on the polarization characteristics.
- when imaging units for generating polarization images are provided, a polarization image is generated for each of the plurality of viewpoint positions by providing a polarization filter with a different polarization direction on an imaging unit placed at each viewpoint position.
- alternatively, the imaging unit includes a plurality of lenses arranged on the light incident surface side of the image sensor in a direction orthogonal to the optical axis, with a polarizing filter of a different polarization direction provided for each lens, so that a polarization image is generated for each of a plurality of viewpoint positions.
- a second aspect of this technique is an image processing method comprising: aligning, in an alignment unit, polarization images obtained by imaging a subject at a plurality of viewpoint positions via polarization filters whose polarization direction differs for each viewpoint position, based on a depth map indicating distance information of the subject; and acquiring, in a polarization characteristic acquisition unit, the polarization characteristics of the subject from a desired viewpoint position using the aligned polarization images.
- FIG. 1 is a diagram illustrating a functional configuration according to the first embodiment of the present technology.
- the image processing apparatus 10 includes an imaging unit 21, a depth map generation unit 22, an alignment unit 23, a polarization characteristic acquisition unit 24, and a normal map generation unit 25.
- the imaging unit 21 images a subject at a plurality of viewpoint positions via a polarization filter (for example, a polarizing plate) whose polarization direction is different for each viewpoint position, and generates a polarization image.
- the imaging unit 21 includes a plurality of imaging units 211-1 to 211-4 so that polarization images in, for example, three or more different polarization directions can be acquired.
- a polarizing plate 210-1 is provided on the front surface of the imaging unit 211-1.
- polarizing plates 210-2 to 210-4 are provided in front of the imaging units 211-2 to 211-4.
- the polarizing plates 210-1 to 210-4 have different polarization directions, and the imaging units 211-1 to 211-4 generate polarized images having different polarization directions.
- the imaging unit 21 outputs the image data of the polarization image generated by the imaging units 211-1 to 211-4 to the depth map generation unit 22 and the alignment unit 23.
- FIG. 2 is a diagram illustrating the arrangement of the imaging units in the imaging unit 21.
- the imaging unit 21 may have a configuration in which the imaging units 211-1 to 211-4 are arranged at the four corners of a rectangle as shown in FIG. 2A, or the imaging units 211-1 to 211-4 may be arranged in a straight line as shown in FIG. 2B. Further, when the imaging unit 21 includes three imaging units 211-1 to 211-3, the imaging units 211-1 to 211-3 may be arranged at the vertices of a triangle as shown in FIG. 2C, or arranged linearly as illustrated in FIG. 2D.
- the imaging unit is arranged so that the polarization characteristic acquisition position in the subject can be captured by three or more imaging units.
- the imaging unit 21 may also use a multi-lens array configuration to generate a plurality of polarization images with different polarization directions.
- a plurality (four in the figure) of lenses 222 are provided on the front surface of the image sensor 221 in a direction orthogonal to the optical axis direction, and an optical image of a subject is formed on the imaging surface of the image sensor 221 by each lens 222.
- a polarizing plate 223 is provided in front of each lens 222, and the polarizing direction of the polarizing plate 223 is set to a different direction. With such a configuration, polarized images with different polarization directions can be generated by the image sensor 221.
- with the multi-lens array configuration, the spatial resolution of the polarization image is lower than when a polarization image is generated by each imaging unit. Therefore, when acquiring polarization characteristics with high spatial resolution, a polarization image is generated by each imaging unit. Conversely, since the multi-lens array configuration produces less parallax than generating a polarization image with each imaging unit, it is used when acquiring polarization characteristics with little parallax effect.
- when the imaging unit 21 is configured in this way, it is not necessary to use a special image sensor such as one that generates four polarization images of different polarization directions by treating four sub-pixels with four types of polarization directions as one pixel.
- the imaging unit 21 can be easily configured at a low cost.
- the imaging unit 21 uses, for example, a linear polarizing plate as the polarizing filter. The filter is not limited to a linear polarizing plate; a circular polarizing plate made of a linear polarizing plate and a quarter-wave plate may be used. Furthermore, a depolarization plate may be provided between the linear polarizing plate and the imaging element. For example, when exposure control or the like is performed based on an optical image passed through a half mirror inside the imaging unit, the reflectance and transmittance at the half mirror vary with the direction of the linearly polarized light, so exposure control or the like may not be performed correctly.
- with a depolarization plate, the linearly polarized optical image is converted into a non-polarized optical image, and an optical image carrying the linearly polarized component without polarization is incident on the imaging element.
- exposure control and the like can then be performed correctly based on that optical image.
- the imaging unit 21 sets the polarization direction so that a plurality of polarization images having different polarization directions can be generated.
- when the polarization direction of a linear polarizing plate is rotated by 180 degrees, the components passing through the plate become equal. Therefore, the polarization directions are set to be different from each other within the range of 0 to 180 degrees.
- the image processing apparatus 10 calculates a polarization model formula based on the luminance and polarization direction of the polarization images generated by the plurality of imaging units, as will be described later. Therefore, it is preferable to set the polarization directions with an equal angular difference so that the polarization model formula can be calculated with high accuracy.
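As an illustration of this fitting, the sketch below performs a least-squares fit of luminances observed at several polarization directions to the common sinusoidal model I(θ) = a + b·cos 2θ + c·sin 2θ. The exact polarization model formula of the patent is not reproduced here, so both the model and the function name `fit_polarization_model` are assumptions for illustration.

```python
import numpy as np

def fit_polarization_model(angles_deg, luminances):
    """Least-squares fit of I(theta) = a + b*cos(2*theta) + c*sin(2*theta).

    Hypothetical helper: the patent only states that luminances at several
    polarization directions are fitted to a polarization model formula.
    """
    th = np.deg2rad(np.asarray(angles_deg, dtype=float))
    # Design matrix with one row per observed polarization direction.
    A = np.stack([np.ones_like(th), np.cos(2 * th), np.sin(2 * th)], axis=1)
    a, b, c = np.linalg.lstsq(A, np.asarray(luminances, dtype=float), rcond=None)[0]
    i_mean = a                       # mean luminance, (I_max + I_min) / 2
    amp = np.hypot(b, c)             # amplitude, (I_max - I_min) / 2
    phase = 0.5 * np.arctan2(c, b)   # azimuth of maximum luminance (radians)
    return i_mean, amp, phase
```

With four directions at a 45-degree spacing (0, 45, 90, 135 degrees), the three unknowns are over-determined, which is one reason an equal angular difference helps the fit.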
- FIG. 3 illustrates the polarization direction in the imaging unit 21.
- FIG. 3A illustrates the polarization direction when using four imaging units.
- the imaging unit 21 generates, for example, four polarization images with the same angle difference (45 degrees) in the polarization direction by the four imaging units with the polarization directions being 0 degrees, 45 degrees, 90 degrees, and 135 degrees.
- FIG. 3B illustrates the polarization direction when three imaging units are used.
- the imaging unit 21 generates, for example, three polarization images with the same angle difference (60 degrees) in the polarization direction by three imaging units, with the polarization directions being 0 degrees, 60 degrees, and 120 degrees.
- the depth map generation unit 22 generates a depth map indicating distance information of the subject from images generated by the imaging unit 21 with different viewpoint positions.
- the depth map generation unit 22 performs a stereo matching process for each pair of polarized images in the polarized images at different viewpoint positions generated by the imaging unit 21. Further, the depth map generation unit 22 generates a depth map indicating the depth for each pixel, for example, based on the stereo matching processing result and the calibration information acquired in advance for the imaging unit.
- the calibration information has position information indicating the positional relationship between the imaging units. In addition, if the calibration information includes parameters related to optical distortion generated in each imaging unit, even if optical distortion occurs in the image generated by the imaging unit, this optical distortion can be removed. Stereo matching processing and the like can be performed with high accuracy.
- the depth map generation unit 22 integrates the depth maps generated for each pair of polarization images, and generates a depth map with higher accuracy than before the integration.
- the depth map generation unit 22 outputs the integrated depth map to the alignment unit 23 and the normal map generation unit 25.
- the alignment unit 23 aligns the polarization image generated by the imaging unit 21 based on the depth map generated by the depth map generation unit 22.
- the alignment unit 23 determines the parallax based on, for example, the integrated depth map generated by the depth map generation unit 22 and the positional relationship between the imaging units indicated by the calibration information acquired in advance.
- the polarization image is aligned for each pixel so that the parallax is “0”, that is, the subject matches.
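The per-pixel alignment can be sketched as follows, assuming rectified imaging units with a horizontal baseline so that the parallax of a pixel at depth Z is d = f·LB/Z. The function name and the nearest-pixel warp are illustrative, not the patent's actual procedure.

```python
import numpy as np

def align_to_reference(image, depth, base_length, focal_px):
    """Warp a horizontally displaced view onto the reference viewpoint.

    Minimal sketch under simplifying assumptions: per-pixel disparity is
    d = focal_px * base_length / depth, and each reference pixel samples
    the other view at x + d so that the subject coincides (parallax "0").
    """
    h, w = image.shape
    aligned = np.zeros_like(image)
    ys, xs = np.indices((h, w))
    disp = np.round(focal_px * base_length / depth).astype(int)
    src_x = xs + disp                       # source column in the other view
    valid = (src_x >= 0) & (src_x < w)      # drop pixels that fall outside
    aligned[ys[valid], xs[valid]] = image[ys[valid], src_x[valid]]
    return aligned
```

Pixels whose source falls outside the other view are left at zero here; in practice they would be marked invalid and excluded from the polarization fit.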
- the alignment unit 23 outputs the polarization image after alignment to the polarization characteristic acquisition unit 24.
- the polarization characteristics acquisition unit 24 acquires the polarization characteristics of the subject from a desired viewpoint position using the polarization image after alignment.
- the polarization characteristic acquisition unit 24 calculates a rotation matrix relating each imaging unit to the desired viewpoint position based on the positional relationship of the imaging units indicated by the calibration information and the depth map.
- the polarization characteristic acquisition unit 24 calculates a polarization model equation indicating the polarization characteristics of the subject from the desired viewpoint position, based on the polarization direction and luminance of the plurality of polarization images, the rotation matrix indicating the positional relationship between the imaging units that generated the polarization images and the desired viewpoint position, and so on.
- the polarization characteristic acquisition unit 24 outputs the polarization model formula that is the acquired polarization characteristic to the normal map generation unit 25.
- the normal map generation unit 25 generates a normal map of the subject based on the polarization characteristics of the subject from the desired viewpoint position acquired by the polarization characteristic acquisition unit 24.
- the normal map generation unit 25 obtains, for each pixel, the azimuth angle giving the highest luminance and the zenith angle derived from the degree of polarization, from the polarization characteristics acquired by the polarization characteristic acquisition unit 24, that is, the polarization model formula, and thereby determines the normal direction.
- a normal map storing the normal information (azimuth angle and zenith angle) is generated.
- the normal map generation unit 25 uses the depth map output from the depth map generation unit 22 to the alignment unit 23 and the normal map generation unit 25 to remove the 180-degree ambiguity in the normal map.
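A rough sketch of obtaining the zenith angle from the degree of polarization is given below. The patent does not specify the model, so the standard diffuse-reflection Fresnel relation with an assumed refractive index n = 1.5 is used here, inverted numerically by table lookup.

```python
import numpy as np

def zenith_from_dop(rho, n=1.5):
    """Invert the diffuse degree-of-polarization model to a zenith angle.

    Assumptions: diffuse Fresnel model and n = 1.5; the patent only states
    that the zenith angle is obtained from the degree of polarization.
    """
    theta = np.linspace(0.0, np.pi / 2 - 1e-6, 10000)
    s2 = np.sin(theta) ** 2
    # Degree of polarization for diffuse reflection as a function of zenith.
    dop = ((n - 1 / n) ** 2 * s2) / (
        2 + 2 * n ** 2 - (n + 1 / n) ** 2 * s2
        + 4 * np.cos(theta) * np.sqrt(n ** 2 - s2)
    )
    # Nearest tabulated value gives the zenith angle (radians).
    return theta[np.argmin(np.abs(dop - rho))]
```

Since the model is monotonic in the zenith angle, the lookup is unambiguous; the remaining 180-degree ambiguity concerns the azimuth, which is why the depth map is consulted as described above.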
- FIG. 4 is a flowchart showing the operation of the first embodiment, and exemplifies a case where the imaging unit 21 includes four imaging units 211-1 to 211-4.
- in steps ST1 to ST4, the imaging unit 21 generates polarization images.
- the imaging unit 211-1 of the imaging unit 21 generates a first polarization image.
- the imaging unit 211-2 generates a second polarization image.
- the imaging unit 211-3 generates a third polarization image.
- the imaging unit 211-4 generates a fourth polarization image.
- the image processing apparatus 10 generates polarization images with a different polarization direction for each viewpoint position using the imaging units 211-1 to 211-4, and proceeds to steps ST11 to ST14.
- in steps ST11 to ST14, the depth map generation unit 22 generates depth maps. For example, in step ST11, the depth map generation unit 22 generates a depth map from the first polarization image and the second polarization image.
- FIG. 5 is a flowchart showing the operation of the depth map generator.
- the depth map generation unit 22 acquires two polarization images.
- the depth map generation unit 22 acquires the first polarization image generated by the imaging unit 211-1 and the second polarization image generated by the imaging unit 211-2, and proceeds to step ST102.
- in step ST102, the depth map generation unit 22 performs edge extraction processing on each polarization image.
- images generated by the imaging unit are images having different luminances depending on the polarization direction. Therefore, the depth map generation unit 22 performs edge extraction processing on the polarization image to generate an edge image so that stereo matching processing can be performed even if a luminance change occurs due to a difference in polarization direction.
- the depth map generation unit 22 performs edge extraction processing, generates a first edge image from the first polarization image and a second edge image from the second polarization image, and proceeds to step ST103.
- the depth map generating unit 22 performs a stereo matching process using the edge image.
- the depth map generation unit 22 performs a stereo matching process between the first edge image and the second edge image.
- the depth map generation unit 22 detects a phase difference (a pixel position difference based on parallax) between the target pixel in the first edge image and the corresponding pixel in the second edge image.
- for the stereo matching processing, for example, a template matching method is used that detects the most similar image region in the second edge image for a template image set to include the target pixel.
- the stereo matching process is not limited to the template matching method, and other methods (for example, a graph cut method or the like) may be used.
- the depth map generation unit 22 performs a stereo matching process to calculate a phase difference, and proceeds to step ST105.
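The template matching step can be illustrated on a single scanline as follows; the sum-of-absolute-differences cost and the function name are assumptions for the sketch, not the patent's concrete implementation.

```python
import numpy as np

def match_template_1d(ref_row, search_row, x, half=2, max_disp=8):
    """Detect the phase difference for the pixel at column x.

    Toy template matching along one scanline (rectified stereo assumed):
    the template around x in the reference row is compared against
    horizontally shifted candidates in the other row by the sum of
    absolute differences, and the best shift is returned in pixels.
    """
    tpl = ref_row[x - half:x + half + 1]
    best_d, best_cost = 0, np.inf
    for d in range(max_disp + 1):
        if x - d - half < 0:          # candidate window out of bounds
            break
        cand = search_row[x - d - half:x - d + half + 1]
        cost = np.abs(tpl - cand).sum()
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d
```

A real implementation would run this (or a graph-cut alternative, as the text notes) for every pixel and both image axes of the rectified pair.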
- in step ST105, the depth map generation unit 22 performs a depth map generation process.
- the depth map generation unit 22 calculates the depth, which is the distance to the subject of the target pixel, based on the phase difference detected by the stereo matching process and the calibration information acquired in advance. Further, the depth map generating unit 22 generates a depth map by associating the calculated depth with the pixels of the polarization image.
- FIG. 6 is a diagram for explaining the depth calculation process.
- FIG. 6 illustrates a case where two imaging units are arranged on the left and right with the same posture.
- the imaging unit 211-1 is used as the base imaging unit
- the imaging unit 211-2 is used as the reference imaging unit.
- the reference position interval (base length) of the imaging unit is “LB”
- the focal length of the imaging unit is “f”.
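With the base length LB and the focal length f expressed in pixels, the depth Z of the target pixel follows from the detected phase difference (disparity) d of a rectified pair as Z = f·LB/d. A minimal sketch of this relation, with an illustrative function name:

```python
def depth_from_disparity(disparity_px, base_length_m, focal_px):
    """Depth Z = f * LB / d for a rectified stereo pair.

    disparity_px: phase difference in pixels detected by stereo matching.
    base_length_m: base length LB between the two imaging units (meters).
    focal_px: focal length f of the imaging units, in pixels.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * base_length_m / disparity_px
```

Note the inverse relation: halving the disparity doubles the estimated depth, which is why depth accuracy degrades for distant subjects.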
- the depth map generation unit is not limited to the case of using the edge image, and may generate the depth map using another method.
- FIG. 7 is a flowchart showing another operation of the depth map generation unit, and illustrates the case of using stereo matching processing that is resistant to changes in luminance.
- the depth map generation unit 22 captures two polarized images.
- the depth map generation unit 22 takes in the first polarization image generated by the imaging unit 211-1 and the second polarization image generated by the imaging unit 211-2, and proceeds to step ST104.
- in step ST104, the depth map generation unit 22 performs a stereo matching process that is resistant to luminance changes.
- the depth map generation unit 22 performs a stereo matching process that is resistant to changes in luminance using the first polarization image and the second polarization image, and detects the amount of positional displacement between a pixel of interest in the first polarization image and the corresponding pixel in the second polarization image.
- for example, zero-mean normalized cross-correlation (ZNCC) is used.
- expression (2) calculates the zero-mean normalized cross-correlation R_ZNCC; subtracting the mean luminance value from each luminance value and normalizing makes the matching robust to differences in luminance.
- in expression (2), T(i, j) is the luminance value of the pixel at coordinates (i, j) in the base image (template), and I(i, j) is the luminance value of the pixel at coordinates (i, j) in the reference image.
- “M” is the number of pixels indicating the width of the template, and “N” is the number of pixels indicating the height of the template. Note that stereo matching processing that is resistant to luminance changes is not limited to zero-mean normalized cross-correlation, and other methods may be used.
- The depth map generation unit 22 performs the stereo matching process robust to luminance changes, calculates the phase difference, and proceeds to step ST105.
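As a rough illustration of Expression (2), the following Python sketch computes the zero-mean normalized cross-correlation between two patches and searches along one row for the shift that maximizes it. The function names and the rectified-stereo assumption (disparity search along a single epipolar line) are hypothetical choices for this sketch, not taken from the patent.

```python
import numpy as np

def zncc(template, patch):
    """Zero-mean normalized cross-correlation between two equal-size patches.
    Subtracting each patch's mean and normalizing makes the score robust to
    brightness and contrast differences, as in Expression (2)."""
    t = template.astype(np.float64) - template.mean()
    p = patch.astype(np.float64) - patch.mean()
    denom = np.sqrt((t * t).sum() * (p * p).sum())
    if denom == 0:
        return 0.0
    return float((t * p).sum() / denom)

def match_disparity(left, right, x, y, half, max_d):
    """Find the horizontal shift of the patch around (x, y) in `left` that
    maximizes ZNCC against `right` (rectified stereo assumed)."""
    tpl = left[y - half:y + half + 1, x - half:x + half + 1]
    best_d, best_score = 0, -1.0
    for d in range(max_d + 1):
        if x - d - half < 0:
            break
        ref = right[y - half:y + half + 1, x - d - half:x - d + half + 1]
        score = zncc(tpl, ref)
        if score > best_score:
            best_score, best_d = score, d
    return best_d, best_score
```

Because ZNCC is invariant to gain and offset, the correct shift is still found even when the two views differ in brightness, which is the point of using it between polarization images with different polarization directions.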
- In step ST105, the depth map generation unit 22 performs depth map generation processing.
- The depth map generation unit 22 calculates the depth, that is, the distance to the subject at the target pixel, based on the phase difference detected by the stereo matching process and the calibration information acquired in advance. Further, the depth map generation unit 22 generates a depth map by associating the calculated depth with the pixels of the polarization image.
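The depth calculation from the detected phase difference follows the usual triangulation relation for parallel imaging units with base length LB and focal length f. A minimal sketch, with hypothetical parameter names:

```python
import math

def depth_from_disparity(disparity_px, base_length_m, focal_length_px):
    """Triangulate the depth Z = f * LB / d for parallel imaging units
    separated by the base length LB (meters), with the focal length f and
    the detected phase difference d both expressed in pixels."""
    if disparity_px <= 0:
        return math.inf  # no measurable parallax: subject effectively at infinity
    return focal_length_px * base_length_m / disparity_px
```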
- In step ST12, the depth map generation unit 22 generates a depth map from the second polarization image and the third polarization image.
- In step ST13, the depth map generation unit 22 generates a depth map from the third polarization image and the fourth polarization image.
- In step ST14, the depth map generation unit 22 generates a depth map from the fourth polarization image and the first polarization image. Note that when the number of polarization images is “J”, the depth map generation unit 22 can generate depth maps for at most “J(J-1)/2” image pairs.
- the pair of polarization images is not limited to the combination shown in FIG.
- The depth map generation unit 22 may also take one polarization image as a reference and generate a plurality of depth maps, one for each pair of that polarization image and another polarization image. For example, with the first polarization image as the reference, depth maps may be generated from the pairs of the first and second polarization images, the first and third polarization images, and the first and fourth polarization images.
- In step ST20, the depth map generation unit 22 performs depth map integration processing.
- the depth map generation unit 22 integrates the depth maps generated for each pair of polarization images, and generates a depth map with higher accuracy than before the integration.
- The depth map generation unit 22 integrates the depth maps by a method similar to, for example, that of Japanese Patent No. 5387856 “Image processing apparatus, image processing method, image processing program, and imaging apparatus”. That is, the depth map generation unit 22 performs a reliability determination based on the shape of the correlation characteristic line, which indicates the relationship between the pixel position and the correlation value expressing the similarity calculated in the stereo matching process. The reliability is determined using, for example, the kurtosis, an index indicating how sharp the correlation characteristic line is.
- Alternatively, the reliability may be determined using the difference between the correlation value at the vertex of the correlation characteristic line and the correlation values at the surrounding points, or using the integrated value of the differential values of the correlation value at each pixel position.
- Based on the reliability determination result for each depth map, the depth map generation unit 22 adopts, for each pixel, the most reliable depth among the pixels indicating the same position of the subject, and thereby generates an integrated depth map. When a plurality of depth maps are generated with one polarization image as the reference, the same pixel position indicates the same position of the subject in each depth map. Therefore, the depth maps can easily be integrated by adopting, for each pixel position, the depth with the highest reliability among the depth maps.
- the depth map generation unit 22 performs depth map integration processing, generates a combined depth map, and proceeds to step ST30.
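The per-pixel selection of the most reliable depth can be sketched as follows. This assumes reliability scores (for example, kurtosis values of the correlation characteristic line) are already available per pixel for each depth map; the array layout and function name are hypothetical.

```python
import numpy as np

def integrate_depth_maps(depth_maps, reliability_maps):
    """Merge depth maps of the same viewpoint by keeping, for each pixel,
    the depth whose matching reliability is highest."""
    depths = np.stack(depth_maps)        # shape (K, H, W)
    rel = np.stack(reliability_maps)     # shape (K, H, W)
    best = np.argmax(rel, axis=0)        # per-pixel index of the best map
    return np.take_along_axis(depths, best[None], axis=0)[0]
```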
- In step ST30, the alignment unit 23 performs polarization image alignment processing.
- The alignment unit 23 determines the parallax with respect to the desired viewpoint position based on the integrated depth map and the positional information between the imaging units indicated by the calibration information, and aligns the plurality of polarization images so that the parallax becomes “0”, that is, so that the subject coincides.
- The desired viewpoint position is not limited to one of the positions of the imaging units 211-1 to 211-4; for example, when the imaging units 211-1 to 211-4 are provided at the four corners of a rectangle, it may be a position inside the rectangle.
- the alignment unit 23 performs alignment of the polarization image and proceeds to step ST40.
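One simple way to realize the alignment is to shift each pixel by its disparity d = f·LB/Z computed from the depth map. The nearest-neighbour warp below is a hedged sketch for a horizontally displaced viewpoint, not the patent's method; occlusion handling and subpixel shifts are omitted, and the names are hypothetical.

```python
import numpy as np

def align_view(image, depth, base_length_m, focal_length_px):
    """Warp a horizontally displaced view onto the reference viewpoint by
    shifting each pixel by its disparity d = f * LB / Z, so that the
    subject coincides (parallax becomes zero)."""
    h, w = image.shape
    out = np.zeros_like(image)
    for y in range(h):
        for x in range(w):
            z = depth[y, x]
            if not np.isfinite(z) or z <= 0:
                continue  # no depth: leave the output pixel empty
            d = int(round(focal_length_px * base_length_m / z))
            if 0 <= x - d < w:
                out[y, x - d] = image[y, x]
    return out
```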
- In step ST40, the polarization characteristic acquisition unit 24 performs a polarization characteristic acquisition process.
- the polarization characteristic acquisition unit 24 acquires the polarization characteristic at a desired viewpoint position using the polarization image after alignment.
- FIG. 8 is a flowchart showing the polarization characteristic acquisition process.
- In step ST401, the polarization characteristic acquisition unit 24 acquires position information between the imaging units.
- the polarization characteristic acquisition unit 24 acquires position information between the imaging units included in preset calibration information.
- In step ST402, the polarization characteristic acquisition unit 24 acquires the polarization images after alignment.
- the polarization characteristic acquisition unit 24 acquires the aligned polarization image output from the alignment unit 23.
- In step ST403, the polarization characteristic acquisition unit 24 acquires a depth map.
- the polarization characteristic acquisition unit 24 acquires the depth map generated by the depth map generation unit 22.
- In step ST404, the polarization characteristic acquisition unit 24 calculates a rotation matrix to a desired viewpoint position.
- Based on the calibration information and the depth map, the polarization characteristic acquisition unit 24 calculates the rotation matrix R from each imaging unit that generated a polarization image to the viewpoint position desired by the user or the like, and proceeds to step ST405.
- Expression (3) illustrates the rotation matrix R.
- In step ST405, the polarization characteristic acquisition unit 24 calculates a polarization model formula for a desired viewpoint position.
- the imaging unit 211-p at a desired viewpoint position with respect to the imaging unit 211-n has the positional relationship illustrated in FIG.
- FIG. 10 is a diagram for explaining a subject surface shape and a polarization image.
- the light source LT is used to illuminate the subject OB
- the imaging unit 211-n images the subject OB via the polarizing plate 210-n.
- the luminance of the subject OB changes according to the polarization direction of the polarizing plate 210-n, and the highest luminance is Imax and the lowest luminance is Imin.
- The x-axis and y-axis of the two-dimensional coordinates lie on the plane of the polarizing plate 210-n, and the polarization direction of the polarizing plate 210-n is expressed as the angle from the x-axis toward the y-axis.
- Since the polarizing plate 210-n returns to the original polarization state when its polarization direction is rotated by 180 degrees, it has a period of 180 degrees.
- The polarization angle υ at which the maximum luminance Imax is observed is defined as the azimuth angle φ. With this definition, when the polarization direction of the polarizing plate 210-n is changed, the observed luminance I can be expressed by the polarization model equation of Expression (4).
- FIG. 11 illustrates the relationship between the luminance and the polarization angle.
- The polarization angle υ is known when the polarization image is generated, and the maximum luminance Imax, the minimum luminance Imin, and the azimuth angle φ are the variables.
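Since the cosine model of Expression (4) is linear in cos 2υ and sin 2υ, the three variables Imax, Imin, and φ can be recovered from luminances observed at three or more known polarization angles by least squares. The following sketch illustrates this; the linear-form parameterization and function name are the sketch's own, not the patent's.

```python
import numpy as np

def fit_polarization_model(angles_rad, luminances):
    """Fit I = (Imax+Imin)/2 + (Imax-Imin)/2 * cos(2*(v - phi)), as in
    Expression (4), using the linear form I = a0 + a1*cos(2v) + a2*sin(2v).
    Returns (Imax, Imin, phi) with phi in [0, pi)."""
    A = np.column_stack([np.ones_like(angles_rad),
                         np.cos(2 * angles_rad),
                         np.sin(2 * angles_rad)])
    a0, a1, a2 = np.linalg.lstsq(A, luminances, rcond=None)[0]
    amp = np.hypot(a1, a2)                   # (Imax - Imin) / 2
    phi = 0.5 * np.arctan2(a2, a1) % np.pi   # azimuth of maximum luminance
    return a0 + amp, a0 - amp, phi
```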
- the object surface normal is expressed in a polar coordinate system, and the normal information is defined as an azimuth angle ⁇ and a zenith angle ⁇ .
- the zenith angle ⁇ is an angle from the z axis toward the normal
- the azimuth angle ⁇ is an angle in the y axis direction with respect to the x axis as described above.
- the degree of polarization ⁇ can be calculated based on the equation (5).
- the degree of polarization ⁇ can be calculated using the relative refractive index n of the subject OB and the zenith angle ⁇ as shown in the equation (5).
- the relationship between the degree of polarization and the zenith angle is, for example, the characteristic shown in FIG. 12.
- the zenith angle ⁇ is determined based on the degree of polarization ⁇ .
- the characteristic shown in FIG. 12 depends on the relative refractive index n as is apparent from the equation (5), and the degree of polarization increases as the relative refractive index n increases.
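As an illustration of the characteristic in FIG. 12, the sketch below uses one commonly cited closed form for the degree of polarization of diffuse reflection as a function of the zenith angle and the relative refractive index n; that particular formula is an assumption here (the patent only references Expression (5)), but it reproduces the stated behavior: the degree of polarization grows with the zenith angle and with n, so the zenith angle can be read back from it numerically.

```python
import numpy as np

def degree_of_polarization(theta, n=1.5):
    """Degree of polarization of diffuse reflection versus zenith angle
    theta for relative refractive index n (assumed closed form; zero at
    theta = 0, increasing with theta on [0, pi/2))."""
    s2 = np.sin(theta) ** 2
    num = (n - 1.0 / n) ** 2 * s2
    den = (2.0 + 2.0 * n * n - (n + 1.0 / n) ** 2 * s2
           + 4.0 * np.cos(theta) * np.sqrt(n * n - s2))
    return num / den

def zenith_from_polarization(rho, n=1.5, steps=20000):
    """Numerically invert the monotonic curve: return the zenith angle
    whose degree of polarization is closest to the observed rho."""
    thetas = np.linspace(0.0, np.pi / 2 - 1e-6, steps)
    return float(thetas[np.argmin(np.abs(degree_of_polarization(thetas, n) - rho))])
```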
- The normal at the imaging unit 211-n has the direction given by the azimuth angle φ and the zenith angle θ, and the normal N detected by the imaging unit 211-n can be expressed as Expression (12). Since the normal N′ detected by the imaging unit 211-p can be expressed as Expression (13) using the rotation matrix R shown in Expression (3), the relationship of Expression (14) holds.
- the azimuth angle ⁇ ′ can be calculated from the component of the rotation matrix R, the zenith angle ⁇ , and the azimuth angle ⁇ based on the equation (15).
- the zenith angle ⁇ ′ can be calculated from the component of the rotation matrix R, the zenith angle ⁇ , and the azimuth angle ⁇ based on the equation (16).
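Expressions (12) to (16) amount to rotating the unit normal by R and reading the angles back from the rotated vector. A small sketch of that relationship (the helper names are hypothetical):

```python
import numpy as np

def normal_from_angles(azimuth, zenith):
    """Unit normal with the direction given by azimuth phi and zenith
    theta, as in Expression (12)."""
    return np.array([np.cos(azimuth) * np.sin(zenith),
                     np.sin(azimuth) * np.sin(zenith),
                     np.cos(zenith)])

def angles_after_rotation(R, azimuth, zenith):
    """Apply N' = R N (Expression (14)) and read back the azimuth phi' and
    zenith theta' seen from the rotated viewpoint, corresponding to
    Expressions (15) and (16)."""
    nx, ny, nz = R @ normal_from_angles(azimuth, zenith)
    return np.arctan2(ny, nx), np.arccos(np.clip(nz, -1.0, 1.0))
```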
- Using Expressions (11), (15), and (16), the polarization model equation (8) indicating the polarization characteristic at the imaging unit 211-p can be expressed, as in Expression (17), as a function of three variables: the luminance addition value A, the zenith angle θ, and the azimuth angle φ.
- By performing the same modeling for three or more imaging units, the three variables, namely the luminance addition value A, the azimuth angle φ, and the zenith angle θ, are calculated from the luminances of the polarization images obtained by imaging the subject at three or more viewpoint positions via polarization filters whose polarization directions differ for each viewpoint position, together with the calibration information.
- a polarization model expression indicating the polarization characteristics at a desired viewpoint position can be calculated based on three or more polarization images generated by the imaging unit 21 and calibration information.
- The three variables, the luminance addition value A, the azimuth angle φ, and the zenith angle θ, can be calculated analytically from the luminances and the polarization model expressions for three or more imaging units (polarization model expressions using the rotation matrices between the imaging units based on the calibration information). Alternatively, the three variables may be calculated using an optimization method, such as the Levenberg-Marquardt (LM) method or the steepest descent method, so that the error is minimized. The three variables may also be calculated approximately by assuming that the interval between the imaging units is small relative to the depth, so that the rotation matrices can be ignored.
- the polarization characteristic acquisition unit 24 performs the above processing, calculates the polarization model formula for the desired viewpoint position, that is, the polarization model formula for the imaging unit 211-p, and sets it as the polarization characteristic.
- FIG. 13 is a flowchart showing normal map generation processing.
- In step ST501, the normal map generation unit 25 calculates normals.
- the normal map generation unit 25 determines the azimuth angle ⁇ ′ at which the maximum luminance is obtained by using a polarization model expression indicating the polarization characteristic at a desired viewpoint position, that is, Expression (17). Note that the degree of polarization ⁇ ′ may be calculated based on Expression (11).
- The normal map generation unit 25 obtains the zenith angle θ′ for each pixel based on the azimuth angle φ′ at which the maximum luminance is obtained and the degree of polarization ρ′, calculates the normal information of the subject (the azimuth angle φ′ and zenith angle θ′), and proceeds to step ST502.
- In step ST502, the normal map generation unit 25 removes the 180-degree indeterminacy.
- FIG. 14 is a diagram for explaining the removal of the 180-degree indeterminacy.
- the imaging unit 211 captures an image of the subject OB.
- For example, suppose that the normal directions (indicated by arrows) are correct in the upper half region GA of the subject OB.
- Because the normals have a 180-degree indeterminacy, the normal directions in the lower half may be reversed.
- The normal map generation unit 25 determines the gradient direction of the subject OB based on the depth map.
- From the gradient direction, the normal map generation unit 25 can determine that the subject OB has a shape protruding toward the imaging unit.
- It can therefore be determined that the normal directions of the lower half region GB shown in FIG. 14 are reversed. Accordingly, the normal map generation unit 25 removes the 180-degree indeterminacy as shown in FIG. 14C by setting the normal directions of the lower half region GB to the opposite direction. In this manner, the normal map generation unit 25 removes the 180-degree indeterminacy from the normals calculated in step ST501, and generates a normal map that correctly indicates the surface shape of the subject.
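The idea of FIG. 14 can be sketched as follows. This is an illustrative sketch, not the patent's implementation, and it assumes a hypothetical convention: depth increases away from the camera, normals are stored with a positive z component toward the camera, and under that convention a correct normal's x component has the same sign as the horizontal depth gradient. Candidates that violate this are rotated 180 degrees in azimuth (x and y components negated; the zenith angle is unchanged).

```python
import numpy as np

def disambiguate_normals(normals, depth):
    """Resolve the 180-degree azimuth ambiguity of polarization normals
    using the gradient direction of the depth map.

    normals: (H, W, 3) candidate normals, nz > 0 toward the camera.
    depth:   (H, W) depth map, increasing away from the camera."""
    out = normals.astype(float)
    gx = np.gradient(depth.astype(float), axis=1)  # horizontal depth slope
    flip = gx * out[..., 0] < 0                    # contradicting candidates
    sign = np.where(flip, -1.0, 1.0)
    out[..., 0] *= sign                            # azimuth + 180 degrees:
    out[..., 1] *= sign                            # negate x and y, keep z
    return out
```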
- The image processing apparatus is not limited to performing the above-described processing in the order of the steps; for example, processing such as image and information acquisition and depth map generation may be performed in parallel. Further, if the above-described processing is performed as pipeline processing, the polarization characteristics at a desired viewpoint position can be calculated and a normal map generated sequentially, for example for each frame.
- the normal information of the subject can be generated from the polarization characteristic at a desired viewpoint position.
- The depth maps generated for each pair of polarization images having different polarization directions at the respective viewpoint positions are integrated, the polarization images are aligned using the integrated depth map, and the polarization characteristics are acquired using the aligned polarization images, so the polarization characteristics can be acquired with high accuracy.
- Since a normal map can be generated based on the polarization characteristics at a desired viewpoint position, a normal map corresponding to that viewpoint position can be obtained. Because this normal map is a feature amount corresponding to the surface shape of the subject, subject recognition, subject matching processing, and the like can be performed accurately using it.
- the depth map is generated using the polarization image, it is not necessary to provide an imaging unit used only for generating the depth map.
- <Second Embodiment> Next, a second embodiment will be described. In the second embodiment, a case will be described in which a depth map having a high spatial resolution is generated using the generated normal map.
- FIG. 15 is a diagram illustrating a functional configuration according to the second embodiment of the present technology.
- The image processing apparatus 10 includes an imaging unit 21, a depth map generation unit 22, an alignment unit 23, a polarization characteristic acquisition unit 24, and a normal map generation unit 25. Further, the image processing apparatus 10 according to the second embodiment includes a high-precision depth map generation unit 26.
- the imaging unit 21 images a subject at a plurality of viewpoint positions via a polarization filter (for example, a polarizing plate) whose polarization direction is different for each viewpoint position, and generates a polarization image.
- The imaging unit 21 includes a plurality of imaging units 211-1 to 211-4 so that polarization images in three or more different polarization directions can be generated.
- a polarizing plate 210-1 is provided on the front surface of the imaging unit 211-1.
- polarizing plates 210-2 to 210-4 are provided in front of the imaging units 211-2 to 211-4.
- the polarizing plates 210-1 to 210-4 have different polarization directions, and the imaging units 211-1 to 211-4 generate polarized images having different polarization directions.
- the imaging unit 21 outputs the image data of the polarization image generated by the imaging units 211-1 to 211-4 to the depth map generation unit 22 and the alignment unit 23.
- the imaging unit 21 uses, for example, a linear polarizing plate as a polarizing filter. Note that the imaging unit 21 may generate polarized images of three or more directions having different polarization directions in other configurations as in the first embodiment.
- the depth map generation unit 22 generates a depth map indicating distance information of a subject from polarized images generated by the imaging unit 21 and having different viewpoint positions.
- the depth map generation unit 22 performs stereo matching processing using polarized images with different viewpoint positions, and generates a depth map indicating the depth for each pixel. Further, the depth map generation unit 22 generates a depth map for each pair of polarization images at different viewpoint positions, integrates the generated depth maps, and generates a depth map with higher accuracy than before the integration.
- the depth map generation unit 22 outputs the integrated depth map to the alignment unit 23 and the normal map generation unit 25.
- the alignment unit 23 aligns the polarization image generated by the imaging unit 21 based on the depth map generated by the depth map generation unit 22.
- The alignment unit 23 determines the parallax between the polarization images based on the depth map generated by the depth map generation unit 22 and the positional relationship between the imaging units indicated by the calibration information acquired in advance, and aligns the polarization images pixel by pixel.
- the alignment unit 23 outputs the polarization image after alignment to the polarization characteristic acquisition unit 24.
- the polarization characteristics acquisition unit 24 acquires the polarization characteristics of the subject from a desired viewpoint position using the polarization image after alignment.
- the polarization characteristic acquisition unit 24 calculates a rotation matrix having the imaging unit as a desired viewpoint position based on the positional relationship of the imaging unit and the depth map indicated by the calibration information.
- The polarization characteristic acquisition unit 24 calculates a polarization model equation indicating the polarization characteristics of the subject from the desired viewpoint position, based on the polarization directions and luminances of the plurality of polarization images, the rotation matrix indicating the positional relationship between the imaging units that generated the polarization images and the desired viewpoint position, and so on.
- the polarization characteristic acquisition unit 24 outputs the polarization model formula that is the acquired polarization characteristic to the normal map generation unit 25.
- the normal map generation unit 25 generates a normal map of the subject based on the polarization characteristics of the subject from the desired viewpoint position acquired by the polarization characteristic acquisition unit 24.
- From the polarization model equation acquired by the polarization characteristic acquisition unit 24, the normal map generation unit 25 obtains, for each pixel, the zenith angle based on the azimuth angle that gives the highest luminance and the degree of polarization, and generates a normal map storing information indicating the normal direction (the azimuth angle and zenith angle).
- the normal map generation unit 25 uses the depth map to remove the 180 degree indeterminacy in the normal map, and sends the normal map from which the 180 degree indefiniteness has been removed to the high-precision depth map generation unit 26. Output.
- The high-precision depth map generation unit 26 performs a process of increasing the precision of the depth map using the normal map.
- Based on the subject surface shape indicated by the normal map generated by the normal map generation unit 25 and the depths indicated by the depth map output from the depth map generation unit 22, the high-precision depth map generation unit 26 traces the surface shape of the subject starting from pixels whose depth has been obtained.
- By tracing the surface shape in this way, the high-precision depth map generation unit 26 estimates the depth for pixels whose depth has not been obtained.
- The high-precision depth map generation unit 26 adds the estimated depths to the depth map output from the depth map generation unit 22, thereby generating a depth map with a higher spatial resolution than the depth map output from the depth map generation unit 22.
- FIG. 16 is a flowchart showing the operation of the second embodiment. As in the first embodiment, the imaging unit 21 generates polarization images in steps ST1 to ST4. For example, in step ST1, the imaging unit 211-1 of the imaging unit 21 generates a first polarization image. In step ST2, the imaging unit 211-2 generates a second polarization image. In step ST3, the imaging unit 211-3 generates a third polarization image. In step ST4, the imaging unit 211-4 generates a fourth polarization image. In this way, the image processing apparatus 10 generates polarization images having different polarization directions for each viewpoint position with the imaging units 211-1 to 211-4, and proceeds to steps ST11 to ST14.
- In steps ST11 to ST14, the depth map generation unit 22 generates depth maps.
- the depth map generation unit 22 generates a depth map from two polarized images with different viewpoint positions, and proceeds to step ST20.
- the pair of polarization images is not limited to the combination shown in FIG.
- In step ST20, the depth map generation unit 22 performs depth map integration processing.
- the depth map generator 22 integrates the depth maps generated in steps ST11 to ST14 and proceeds to step ST30.
- In step ST30, the alignment unit 23 performs polarization image alignment processing.
- the alignment unit 23 performs alignment of the polarization image using the integrated depth map and proceeds to step ST40.
- In step ST40, the polarization characteristic acquisition unit 24 performs a polarization characteristic acquisition process.
- the polarization characteristic acquisition unit 24 calculates a polarization model expression for a desired viewpoint position using the aligned polarization image, and proceeds to step ST50.
- In step ST50, the normal map generation unit 25 performs normal map generation processing.
- the normal map generation unit 25 generates a normal map indicating the surface normal of the subject for each pixel based on the polarization characteristics at the desired viewpoint position, and the process proceeds to step ST60.
- In step ST60, the high-precision depth map generation unit 26 performs high-precision depth map generation processing.
- the high-precision depth map generation unit 26 generates a depth map with a high spatial resolution from the depth map generated in step ST20 and the normal map generated in step ST50.
- FIG. 17 is a diagram for explaining high-precision depth map generation processing. For the sake of simplicity, for example, processing for one line will be described.
- Suppose the imaging unit 211 images the subject OB, and the depth map generation unit 22 obtains the depth map shown in FIG. 17B while the normal map generation unit 25 obtains the normal map shown in FIG. 17C.
- In the depth map, for example, the depth of the leftmost pixel is “2 (meters)”, and no depth is stored for the other pixels, which are indicated by “x”.
- the high-precision depth map generator 26 estimates the surface shape of the subject OB based on the normal map.
- From its normal direction, the second pixel from the left end can be determined to correspond to an inclined surface that approaches the imaging unit 21 relative to the subject surface corresponding to the leftmost pixel. Accordingly, the high-precision depth map generation unit 26 estimates the depth of the second pixel from the left end by tracing the surface shape of the subject OB starting from the leftmost pixel, and sets it to, for example, “1.5 (meters)”. The high-precision depth map generation unit 26 stores the estimated depth in the depth map. The third pixel from the left end can be determined, from its normal direction, to correspond to a surface facing the imaging unit 21.
- the high-precision depth map generation unit 26 estimates the depth of the third pixel from the left end by tracing the surface shape of the subject OB starting from the left end pixel, and sets it to “1 (meter)”, for example.
- The high-precision depth map generation unit 26 stores the estimated depth in the depth map. The fourth pixel from the left end can be determined to correspond to an inclined surface receding from the imaging unit 21 relative to the subject surface corresponding to the third pixel. Therefore, the high-precision depth map generation unit 26 estimates the depth of the fourth pixel from the left end by tracing the surface shape of the subject OB starting from the leftmost pixel, and sets it to, for example, “1.5 (meters)”. This estimated depth is also stored in the depth map. Similarly, the depth of the fifth pixel from the left end is estimated and stored in the depth map as, for example, “2 (meters)”.
- In this way, the high-precision depth map generation unit 26 refines the depth map using the depth map and the normal map, estimating depths by tracing the surface shape indicated by the normal map starting from depths stored in the depth map. Accordingly, even if some depths are missing in the depth map shown in FIG. 17B generated by the depth map generation unit 22, the high-precision depth map generation unit 26 can compensate for the missing depths and generate the depth map shown in FIG. 17D, which has a higher spatial resolution than the depth map shown in FIG. 17B.
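The one-line example of FIG. 17 can be sketched as follows. This is an illustrative sketch with a hypothetical convention, not the patent's implementation: normals (nx, ny, nz) are stored with nz > 0 toward the camera, the per-pixel depth slope is dZ/dx = nx / nz, and NaN marks a pixel whose depth is missing.

```python
import numpy as np

def propagate_depth_along_row(depth_row, normals_row, pixel_pitch=1.0):
    """Trace the subject surface along one image line: starting from a
    pixel whose depth is stored, step rightward and advance the depth by
    the slope implied by the previous pixel's normal."""
    out = depth_row.astype(float)
    for x in range(1, len(out)):
        if np.isnan(out[x]) and not np.isnan(out[x - 1]):
            nx, _, nz = normals_row[x - 1]
            slope = nx / nz if nz != 0 else 0.0
            out[x] = out[x - 1] + slope * pixel_pitch
    return out
```

With the FIG. 17 values (left end at 2 meters, surfaces sloping toward the imaging unit, facing it, then sloping away), this reproduces the 2, 1.5, 1, 1.5, 2 meter profile described above.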
- According to the second embodiment, not only can the effects of the first embodiment be obtained, but also, for subject regions where it is difficult to obtain the depth by stereo matching processing, the depth can be estimated using the normal map generated based on the plurality of polarization images. Therefore, a depth map having a spatial resolution equal to or higher than that of the depth map generated by the depth map generation unit 22 can be generated.
- <Third Embodiment> Next, a third embodiment will be described. In the third embodiment, a depth map is generated using images captured without passing through a polarization filter, so that the depth map is not affected by luminance differences between the polarization images.
- FIG. 18 is a diagram illustrating a functional configuration according to the third embodiment.
- The image processing apparatus 10 includes an imaging unit 21a and a depth map generation unit 22a, together with an alignment unit 23, a polarization characteristic acquisition unit 24, and a normal map generation unit 25 as in the first embodiment.
- the imaging unit 21a captures a subject at a plurality of viewpoint positions via a polarizing filter (polarizing plate) having a different polarization direction for each viewpoint position, and generates a polarization image.
- The imaging unit 21a includes a plurality of imaging units 211-1 to 211-4 so that polarization images in three or more different polarization directions can be generated.
- a polarizing plate 210-1 is provided on the front surface of the imaging unit 211-1.
- polarizing plates 210-2 to 210-4 are provided in front of the imaging units 211-2 to 211-4.
- the polarizing plates 210-1 to 210-4 have different polarization directions, and the imaging units 211-1 to 211-4 generate polarized images having different polarization directions.
- the imaging unit 21a outputs the image data of the polarization image generated by the imaging units 211-1 to 211-4 to the alignment unit 23.
- the imaging unit 21a uses, for example, a linear polarizing plate as a polarizing filter. Note that the imaging unit 21a may generate polarization images of three or more directions with different polarization directions in other configurations as in the first embodiment described above.
- The imaging unit 21a also has imaging units that perform imaging without a polarizing filter or through polarizing filters having the same polarization direction.
- FIG. 18 illustrates a configuration including imaging units 211-5 and 211-6 that perform imaging without using a polarizing filter.
- a polarizing filter is not provided in front of the imaging units 211-5 and 211-6, and the imaging units 211-5 and 211-6 generate non-polarized images.
- the imaging unit 21a outputs the non-polarized images generated by the imaging units 211-5 and 211-6 to the depth map generation unit 22a.
- FIG. 19 is a diagram illustrating an arrangement of the imaging units in the imaging unit 21a.
- In FIG. 19A, the imaging unit 21a has the imaging units 211-1 to 211-4 arranged at the four corners of a rectangle, with the imaging unit 211-5 arranged on the left side and the imaging unit 211-6 arranged on the right side of the rectangular arrangement.
- In FIG. 19, an imaging unit drawn without an arrow indicating the polarization direction is an imaging unit that generates a non-polarized image.
- As shown in FIGS. 19B and 19C, the imaging unit 21a may also arrange the imaging units that generate polarization images in a straight line, and arrange the imaging units that generate non-polarized images on the left and right of the linearly arranged imaging units.
- the imaging unit 21a may be configured to further provide an imaging unit that generates a non-polarized image, and to integrate a depth map generated for each pair of non-polarized images to generate a more accurate depth map.
- As shown in FIGS. 19D and 19E, imaging units that generate non-polarized images may be provided so as to surround the rectangularly arranged imaging units that generate polarization images, and the plurality of generated depth maps may be integrated to generate a highly accurate depth map.
- Note that the arrangement of the imaging units that generate non-polarized images is not limited to those shown in FIGS. 19D and 19E, as long as a plurality of depth maps can be integrated to generate a highly accurate depth map.
- the depth map generation unit 22a generates a depth map indicating distance information of the subject from the non-polarized image generated by the imaging unit 21a.
- the depth map generation unit 22a performs stereo matching processing using non-polarized images at different viewpoint positions, and generates a depth map indicating the depth for each pixel.
- the depth map generation unit 22a outputs the generated depth map to the alignment unit 23 and the normal map generation unit 25.
- When there are a plurality of pairs of non-polarized images, the depth map generation unit 22a may generate a depth map for each pair and perform the depth map integration processing described above to generate a high-precision depth map.
- the alignment unit 23 aligns the polarization image generated by the imaging unit 21a based on the depth map generated by the depth map generation unit 22a.
- The alignment unit 23 determines the parallax between the polarization images based on the depth map generated by the depth map generation unit 22a and the positional relationship between the imaging units indicated by the calibration information acquired in advance, and aligns the polarization images pixel by pixel.
- the alignment unit 23 outputs the polarization image after alignment to the polarization characteristic acquisition unit 24.
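The per-pixel alignment from depth and calibration described above can be sketched as a disparity warp: for a horizontally displaced camera pair, a pixel at depth Z has parallax d = f·baseline/Z. The function name, sign convention, and parameter values below are illustrative assumptions, not the patent's method.

```python
import numpy as np

def align_polarization_image(pol_img, ref_depth, f, baseline):
    """Warp a polarization image onto the reference viewpoint using the
    depth map.  For each reference pixel, the parallax implied by its
    depth is d = f * baseline / Z; the corresponding source pixel in
    the polarization image is sampled at x + d (sign convention chosen
    for this horizontally displaced sketch)."""
    h, w = pol_img.shape
    aligned = np.zeros_like(pol_img)
    for y in range(h):
        for x in range(w):
            z = ref_depth[y, x]
            if z <= 0:
                continue                       # invalid depth: leave pixel empty
            d = int(round(f * baseline / z))
            if 0 <= x + d < w:
                aligned[y, x] = pol_img[y, x + d]
    return aligned
```

After this warp, the luminances of the different polarization images refer to the same subject point per pixel, which is what the subsequent polarization-characteristic fit requires.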
- the polarization characteristics acquisition unit 24 acquires the polarization characteristics of the subject from a desired viewpoint position using the polarization image after alignment.
- The polarization characteristic acquisition unit 24 calculates a rotation matrix relating each imaging unit to the desired viewpoint position, based on the positional relationship of the imaging units indicated by the calibration information and on the depth map.
- The polarization characteristic acquisition unit 24 calculates a polarization model equation indicating the polarization characteristics of the subject as viewed from the desired viewpoint position, based on the polarization directions and luminances of the plurality of polarization images and on the rotation matrix indicating the positional relationship between each imaging unit that generated a polarization image and the desired viewpoint position.
- the polarization characteristic acquisition unit 24 outputs the polarization model formula that is the acquired polarization characteristic to the normal map generation unit 25.
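A polarization model of this kind relates the observed luminance to the polarizer angle as a cosine of twice the angle, so fitting it to the per-pixel luminances can be sketched as a linear least-squares problem. This is an illustrative reconstruction under that common model, not the patent's actual expression; the function name is hypothetical.

```python
import numpy as np

def fit_polarization_model(angles_deg, intensities):
    """Least-squares fit of I(v) = a0 + a1*cos(2v) + a2*sin(2v) to
    luminance samples observed through polarizers at the given angles.
    Returns (I_max, I_min, phi_deg): the extreme luminances and the
    azimuth angle phi (degrees) at which the luminance is maximum."""
    v = np.radians(np.asarray(angles_deg, dtype=float))
    A = np.column_stack([np.ones_like(v), np.cos(2 * v), np.sin(2 * v)])
    a0, a1, a2 = np.linalg.lstsq(A, np.asarray(intensities, dtype=float),
                                 rcond=None)[0]
    amp = np.hypot(a1, a2)                       # amplitude of the cosine term
    phi = 0.5 * np.degrees(np.arctan2(a2, a1)) % 180.0
    return a0 + amp, a0 - amp, phi
```

With three or more distinct polarization directions the design matrix has full rank, which is why the text requires polarized images in three or more directions.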
- the normal map generation unit 25 generates a normal map of the subject based on the polarization characteristics of the subject from the desired viewpoint position acquired by the polarization characteristic acquisition unit 24.
- The normal map generation unit 25 obtains, for each pixel, the azimuth angle that gives the highest luminance and the degree of polarization from the polarization model equation acquired by the polarization characteristic acquisition unit 24, obtains the zenith angle from the degree of polarization, and generates a normal map storing information indicating the normal direction (azimuth angle and zenith angle). Further, the normal map generation unit 25 uses the depth map to remove the 180-degree indefiniteness in the normal map, generating a normal map from which the 180-degree indefiniteness has been removed.
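Obtaining the zenith angle from the degree of polarization can be sketched as numerically inverting the diffuse-reflection polarization curve. The refractive index n is an assumed material constant (n = 1.5 is a common placeholder), and this function is an illustrative sketch of such a characteristic curve, not the one disclosed in the patent.

```python
import numpy as np

def zenith_from_polarization_degree(rho, n=1.5):
    """Invert the diffuse-reflection relation between the degree of
    polarization rho = (Imax - Imin) / (Imax + Imin) and the zenith
    angle theta via a dense lookup table, returning theta in degrees."""
    theta = np.linspace(0.0, np.pi / 2, 10000)
    s2 = np.sin(theta) ** 2
    num = (n - 1.0 / n) ** 2 * s2
    den = (2 + 2 * n**2 - (n + 1.0 / n) ** 2 * s2
           + 4 * np.cos(theta) * np.sqrt(n**2 - s2))
    table = num / den                 # increases monotonically with theta
    return float(np.degrees(np.interp(rho, table, theta)))
```

Because the curve is monotonic on [0°, 90°] for diffuse reflection, the table inversion is unambiguous; the remaining 180-degree ambiguity lies only in the azimuth, which the text resolves with the depth map.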
- FIG. 20 is a flowchart showing the operation of the third embodiment. Similar to the first embodiment, the imaging unit 21a generates a polarization image in steps ST1 to ST4. For example, in step ST1, the imaging unit 211-1 of the imaging unit 21a generates a first polarization image. In step ST2, the imaging unit 211-2 generates a second polarization image. In step ST3, the imaging unit 211-3 generates a third polarization image. In step ST4, the imaging unit 211-4 generates a fourth polarization image. In steps ST5 to ST6, the imaging unit 21a generates a non-polarized image that does not pass through the polarizing filter.
- In step ST5, the imaging unit 211-5 of the imaging unit 21a generates a first non-polarized image.
- In step ST6, the imaging unit 211-6 generates a second non-polarized image.
- As described above, the image processing apparatus 10 generates, with the imaging units 211-1 to 211-6, a plurality of polarized images having different polarization directions for each viewpoint position and non-polarized images at different viewpoint positions, and the process proceeds to step ST15.
- In step ST15, the depth map generation unit 22a generates a depth map.
- the depth map generation unit 22a performs a stereo matching process using the first non-polarized image and the second non-polarized image having different viewpoint positions, generates a depth map, and proceeds to step ST30.
- In step ST30, the alignment unit 23 performs the polarization image alignment process.
- the alignment unit 23 aligns each polarization image using the depth map generated in step ST15, and proceeds to step ST40.
- In step ST40, the polarization characteristic acquisition unit 24 performs the polarization characteristic acquisition process.
- the polarization characteristic acquisition unit 24 calculates a polarization model expression for a desired viewpoint position using the aligned polarization image, and proceeds to step ST50.
- In step ST50, the normal map generation unit 25 performs the normal map generation process.
- the normal map generation unit 25 generates a normal map indicating the surface normal of the subject for each pixel based on the polarization characteristic at a desired viewpoint position.
- Since the depth map is generated using non-polarized images, it can be generated easily and with high accuracy compared with the case of using polarized images, whose luminance may differ depending on the polarization direction.
- the polarization characteristic at a desired viewpoint position can be obtained with high accuracy for each pixel without causing a decrease in temporal resolution and spatial resolution.
- FIG. 21 is a diagram illustrating a functional configuration according to the fourth embodiment.
- The image processing apparatus 10 includes an imaging unit 21, a depth map generation unit 22, an alignment unit 23, and a polarization characteristic acquisition unit 24. Further, the image processing apparatus 10 according to the fourth embodiment includes a polarization characteristic utilization unit 27.
- the imaging unit 21 images a subject at a plurality of viewpoint positions via a polarization filter (for example, a polarizing plate) whose polarization direction is different for each viewpoint position, and generates a polarization image.
- The imaging unit 21 includes a plurality of imaging units 211-1 to 211-4 so that polarized images in three or more directions having different polarization directions can be generated.
- a polarizing plate 210-1 is provided on the front surface of the imaging unit 211-1.
- polarizing plates 210-2 to 210-4 are provided in front of the imaging units 211-2 to 211-4.
- the polarizing plates 210-1 to 210-4 have different polarization directions, and the imaging units 211-1 to 211-4 generate polarized images having different polarization directions.
- the imaging unit 21 outputs the image data of the polarization image generated by the imaging units 211-1 to 211-4 to the depth map generation unit 22 and the alignment unit 23.
- the imaging unit 21 uses, for example, a linear polarizing plate as a polarizing filter. Note that the imaging unit 21 may generate polarized images of three or more directions having different polarization directions in other configurations as in the first embodiment.
- the depth map generation unit 22 generates a depth map indicating distance information of a subject from polarized images generated by the imaging unit 21 and having different viewpoint positions.
- the depth map generation unit 22 performs stereo matching processing using polarized images with different viewpoint positions, and generates a depth map indicating the depth for each pixel. Further, the depth map generation unit 22 generates a depth map for each pair of polarization images at different viewpoint positions, integrates the generated depth maps, and generates a depth map with higher accuracy than before the integration.
- the depth map generation unit 22 outputs the integrated depth map to the alignment unit 23 and the normal map generation unit 25.
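The per-pair depth map integration described above can be sketched as a per-pixel median over the valid estimates. The median fusion rule is an illustrative assumption, since the text does not fix a particular integration formula; zero is treated here as the invalid-depth marker.

```python
import numpy as np

def integrate_depth_maps(depth_maps):
    """Fuse per-pair depth maps into one: at each pixel, take the median
    of the valid (non-zero) estimates, which suppresses outliers coming
    from any single stereo pair.  Pixels with no valid estimate stay 0."""
    stack = np.stack(depth_maps).astype(np.float32)
    masked = np.where(stack > 0, stack, np.nan)   # mark invalid depths
    fused = np.nanmedian(masked, axis=0)
    return np.nan_to_num(fused, nan=0.0)
```

A confidence-weighted average would be an alternative fusion rule; the median is shown because it is robust to a single bad pair without needing per-pair confidence values.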
- the alignment unit 23 aligns the polarization image generated by the imaging unit 21 based on the depth map generated by the depth map generation unit 22.
- The alignment unit 23 determines the parallax between the polarization images based on the depth map generated by the depth map generation unit 22 and the positional relationship between the imaging units indicated by calibration information acquired in advance, and aligns the polarization images for each pixel.
- the alignment unit 23 outputs the polarization image after alignment to the polarization characteristic acquisition unit 24.
- the polarization characteristics acquisition unit 24 acquires the polarization characteristics of the subject from a desired viewpoint position using the polarization image after alignment.
- The polarization characteristic acquisition unit 24 calculates a rotation matrix relating each imaging unit to the desired viewpoint position, based on the positional relationship of the imaging units indicated by the calibration information and on the depth map.
- The polarization characteristic acquisition unit 24 calculates a polarization model equation indicating the polarization characteristics of the subject as viewed from the desired viewpoint position, based on the polarization directions and luminances of the plurality of polarization images and on the rotation matrix indicating the positional relationship between each imaging unit that generated a polarization image and the desired viewpoint position.
- the polarization characteristic acquisition unit 24 outputs the polarization model formula that is the acquired polarization characteristic to the polarization characteristic utilization unit 27.
- The polarization characteristic utilization unit 27 uses the polarization characteristics acquired by the polarization characteristic acquisition unit 24 to process the image generated by the imaging unit 21, for example, to adjust the reflection component of the image at the desired viewpoint position. Specifically, it performs processing such as generating a polarization image of an arbitrary azimuth, removing the reflection component, and adjusting the specular reflection component to control glossiness. In addition, the polarization characteristic utilization unit 27 may use the polarization property corresponding to the surface shape of the subject as an image feature amount, and perform processing that takes the surface shape into consideration, such as recognition of a three-dimensional subject.
- FIG. 22 is a flowchart showing the operation of the fourth embodiment. Similar to the first embodiment, the imaging unit 21 generates a polarization image in steps ST1 to ST4. For example, in step ST1, the imaging unit 211-1 of the imaging unit 21 generates a first polarization image. In step ST2, the imaging unit 211-2 generates a second polarization image. In step ST3, the imaging unit 211-3 generates a third polarization image. In step ST4, the imaging unit 211-4 generates a fourth polarization image. As described above, the image processing apparatus 10 generates the polarization images having different polarization directions for each viewpoint position by the imaging units 211-1 to 211-4, and proceeds to step ST11 to step ST14.
- In steps ST11 to ST14, the depth map generation unit 22 generates depth maps.
- the depth map generation unit 22 generates a depth map from two polarized images with different viewpoint positions, and proceeds to step ST20.
- the pair of polarization images is not limited to the combination shown in FIG.
- In step ST20, the depth map generation unit 22 performs the depth map integration process.
- the depth map generator 22 integrates the depth maps generated in steps ST11 to ST14 and proceeds to step ST30.
- In step ST30, the alignment unit 23 performs the polarization image alignment process.
- the alignment unit 23 performs alignment of the polarization image using the integrated depth map and proceeds to step ST40.
- In step ST40, the polarization characteristic acquisition unit 24 performs the polarization characteristic acquisition process.
- the polarization characteristic acquisition unit 24 calculates a polarization model formula for a desired viewpoint position using the aligned polarization image, and proceeds to step ST70.
- In step ST70, the polarization characteristic utilization unit 27 performs the polarization characteristic utilization process.
- the polarization characteristic utilization unit 27 performs, for example, image processing using the obtained polarization characteristic.
- FIG. 23 to FIG. 25 exemplify the case where the reflection component of the image is adjusted as the processing using the polarization characteristic.
- FIG. 23A, FIG. 24A, and FIG. 25A illustrate the relationship between the azimuth angle and the luminance based on the polarization model equation calculated in step ST40.
- the range from the minimum luminance Imin to the maximum luminance Imax is a range in which the luminance changes depending on the polarization state.
- If the luminance is controlled according to the azimuth angle, a filter effect equivalent to that of a PL filter can be obtained in a pseudo manner, and a polarized image in a desired polarization direction can be generated from the normal captured image as if through a polarizing filter, as shown in (b) of FIG. 23.
- The range up to the minimum luminance Imin is a component whose luminance does not change regardless of the polarization state, and corresponds to the non-polarized component.
- The luminance range from the minimum luminance Imin to the maximum luminance Imax is a range in which the luminance changes depending on the polarization state, and corresponds to the polarization component. Therefore, as shown in (b) of FIG. 24, a captured image from which the reflection component has been removed can be generated by removing the luminance component corresponding to the polarization component from the normal captured image.
- the range up to the minimum luminance Imin is a component that does not change in luminance regardless of the polarization state, corresponds to a non-polarized component, and can be regarded as a diffuse reflection component.
- the luminance range from the minimum luminance Imin to the maximum luminance Imax can be regarded as a specular reflection component. Therefore, as shown in FIG. 25 (b), if the specular reflection component is suppressed, a captured image with suppressed glossiness can be generated. Further, if the specular reflection component is enhanced, a captured image with enhanced glossiness can be generated.
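The glossiness adjustment described above can be sketched by scaling the polarized part of the fitted model while keeping the unpolarized part. Treating 2·Imin as the diffuse component and Imax − Imin as the specular component is a simplifying assumption for illustration, not the patent's exact decomposition.

```python
import numpy as np

def adjust_glossiness(i_max, i_min, gain=0.5):
    """Adjust the specular reflection from the fitted polarization model.
    The unpolarized part (2 * I_min, treated as diffuse reflection) is
    kept as is, while the polarized part (I_max - I_min, treated as
    specular reflection) is scaled by `gain`: gain < 1 suppresses gloss,
    gain > 1 enhances it."""
    diffuse = 2.0 * np.asarray(i_min, dtype=np.float32)
    specular = np.asarray(i_max, dtype=np.float32) - np.asarray(i_min, dtype=np.float32)
    return diffuse + gain * specular
```

With gain = 0 this removes the polarization component entirely, corresponding to the reflection-removal case of FIG. 24; intermediate gains correspond to the glossiness control of FIG. 25.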
- The polarization characteristic utilization unit 27 may calculate an image feature amount using the polarization characteristics calculated by the polarization characteristic acquisition unit, and may perform processing that takes the surface shape of the subject into consideration using the image feature amount, for example, three-dimensional subject matching processing or stereoscopic subject recognition processing. Next, the operation when performing three-dimensional subject matching processing as processing that takes the surface shape of the subject into consideration will be described.
- The polarization characteristic utilization unit 27 uses the polarization characteristics to calculate, for each pixel, an image feature amount corresponding to the surface shape of the subject in the polarization image.
- FIG. 26 is a diagram for explaining the calculation of the image feature amount. FIGS. 26(a) and 26(b) illustrate the relationship between the angle of the polarization direction and the luminance at the pixel for which the image feature amount is calculated (hereinafter simply referred to as the "target pixel") in the polarization image. FIG. 26(b) shows a case where the illumination light is brighter than in FIG. 26(a). FIG. 26 exemplifies a case where the polarization angles are 0 degrees, 45 degrees, 90 degrees, and 135 degrees.
- The polarization characteristic utilization unit 27 normalizes the luminance so that it can be determined whether the polarization characteristics are the same even when the overall luminance differs.
- The polarization characteristic utilization unit 27 calculates the average luminance over the polarization angles, and divides the luminance at each polarization angle by the calculated average luminance to obtain the normalized luminance.
- FIG. 26C shows the luminance after normalization, and the normalization reference level corresponds to the average luminance.
- the polarization characteristic utilization unit 27 uses the luminance of each polarization angle after normalization as an image feature amount.
- Expression (18) exemplifies the image feature amount when polarization images with polarization angles of 0 degrees, 45 degrees, 90 degrees, and 135 degrees are acquired.
- the image feature amount calculated in this way is information indicating the surface shape of the subject position corresponding to the target pixel.
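The mean-normalized feature described above can be sketched in a few lines; the function name is illustrative, and the four inputs are the luminances observed at polarization angles 0, 45, 90, and 135 degrees.

```python
import numpy as np

def polarization_feature(luminances):
    """Normalize the luminances observed at the polarization angles by
    their mean (an Expression (18)-style feature), so that the same
    surface shape yields the same feature vector under brighter or
    darker illumination."""
    v = np.asarray(luminances, dtype=np.float32)
    return v / v.mean()
```

Because the feature depends only on the ratios between the angle samples, scaling the illumination leaves it unchanged, which is exactly the invariance the normalization step is meant to provide.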
- the polarization characteristic utilization unit 27 performs a three-dimensional subject matching process using the calculated image feature amount.
- FIG. 27 exemplifies the operation when matching processing is performed by the polarization characteristic utilization unit 27.
- The polarization characteristic utilization unit 27 uses the image feature amounts to determine which of the feature points detected in the other image matches the matching-target feature point detected in one image (hereinafter referred to as the "target feature point").
- Each feature point has an image feature amount calculated from the luminances at polarization angles of, for example, 0 degrees, 45 degrees, 90 degrees, and 135 degrees.
- The image feature amount of the target feature point TP0 is [F^o_0°, F^o_45°, F^o_90°, F^o_135°].
- The image feature amount of the other feature point TQj is [F^j_0°, F^j_45°, F^j_90°, F^j_135°].
- Here, j is a variable indicating the j-th feature point among the other feature points.
- The polarization characteristic utilization unit 27 determines the point at which the distance between the image feature amount vectors is minimum and sets it as the matching point. For example, the polarization characteristic utilization unit 27 performs the calculation of Expression (19) to find, among the other feature points, the feature point j that minimizes the sum of squared differences from the image feature amount of the target feature point, and sets it as the matching point.
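The nearest-feature search of Expression (19) can be sketched as an argmin over sums of squared differences between feature vectors; this is an illustrative sketch of the described matching rule, with hypothetical function and variable names.

```python
import numpy as np

def match_feature_point(target_feature, candidate_features):
    """Return the index j of the candidate feature point whose feature
    vector minimizes the sum of squared differences from the target
    feature vector (Expression (19)-style matching)."""
    t = np.asarray(target_feature, dtype=np.float32)
    c = np.asarray(candidate_features, dtype=np.float32)
    return int(np.argmin(((c - t) ** 2).sum(axis=1)))
```

In practice a threshold on the minimum distance would be added so that feature points with no true counterpart are left unmatched.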
- According to the fourth embodiment, by using the calculated polarization characteristics, various image processing operations and processes that take the surface shape of the subject into consideration can be performed easily. Further, as in the first embodiment, the polarization characteristics at a desired viewpoint position can be obtained with high accuracy for each pixel without causing a decrease in temporal resolution or spatial resolution.
- FIG. 28 is a diagram showing a functional configuration of another embodiment.
- The image processing apparatus 10 includes an imaging unit 21, an alignment unit 23a, and a normal map generation unit 25a.
- the imaging unit 21 captures a subject at a plurality of viewpoint positions via a polarization filter whose polarization direction differs for each viewpoint position, and generates a polarization image.
- The imaging unit 21 includes a plurality of imaging units 211-1 to 211-4 so that polarized images in three or more directions having different polarization directions can be generated.
- a polarizing plate 210-1 is provided on the front surface of the imaging unit 211-1.
- polarizing plates 210-2 to 210-4 are provided in front of the imaging units 211-2 to 211-4.
- the polarizing plates 210-1 to 210-4 have different polarization directions, and the imaging units 211-1 to 211-4 generate polarized images having different polarization directions.
- The imaging unit 21 outputs the image data of the polarization images generated by the imaging units 211-1 to 211-4 to the alignment unit 23a.
- the imaging unit 21 uses, for example, a linear polarizing plate as a polarizing plate. Further, the imaging unit 21 is not limited to a linear polarizing plate, and a circular polarizing plate made of a linear polarizing plate and a quarter wavelength plate may be used. Furthermore, the imaging unit 21 may provide a depolarization plate between the linearly polarizing plate and the imaging unit. Note that the imaging unit 21 may generate polarized images of three or more directions having different polarization directions in other configurations as in the first embodiment.
- the alignment unit 23a aligns the polarization image generated by the imaging unit 21.
- the alignment unit 23a uses the feature of the image to align the polarization image without using the depth map.
- The alignment unit 23a approximately models the movement between images over the entire screen, for example by a homography, and aligns the polarization images based on this model. Alternatively, for example, when a stationary subject is imaged from different viewpoint positions, the subject moves between the captured images, so the alignment unit 23a may detect an optical flow or the like and align the polarization images based on the detection result.
- Based on the aligned polarization images in three or more polarization directions, the normal map generation unit 25a obtains the relationship between luminance and polarization angle from the polarization directions and luminances of the polarization images, and determines the azimuth angle φ at which the luminance is maximum. Further, the normal map generation unit 25a calculates the degree of polarization ν using the highest and lowest luminances obtained from this relationship, and determines the zenith angle θ corresponding to the calculated degree of polarization ν based on a characteristic curve indicating the relationship between the degree of polarization and the zenith angle.
- The normal map generation unit 25a calculates the normal information (azimuth angle φ and zenith angle θ) of the subject for each pixel position based on the aligned polarization images in three or more polarization directions, and generates a normal map.
- Since it is not necessary to generate a depth map in this configuration, the normal map can be generated easily. However, because no depth map is generated, the generated normal map has a 180-degree indefiniteness.
- FIG. 30 is a block diagram illustrating a schematic configuration of a vehicle control system using the image processing apparatus of this technology.
- the vehicle control system 90 includes a plurality of control units and detection units connected via a communication network 920.
- the vehicle control system 90 includes a drive system control unit 931, a body system control unit 932, a battery control unit 933, a vehicle exterior information detection unit 934, a wireless communication unit 935, and an integrated control unit 940.
- the communication network 920 may be an in-vehicle communication network that conforms to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
- an input unit 951, an audio output unit 952, and a display unit 953 are connected to the integrated control unit 940.
- Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer and parameters used for various calculations, and a drive circuit that drives the devices to be controlled.
- The drive system control unit 931 controls the operation of devices related to the drive system of the vehicle according to various programs.
- The drive system control unit 931 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, and a steering mechanism that adjusts the steering angle of the vehicle.
- The drive system control unit 931 may also have a function as a control device for a braking device that generates the braking force of the vehicle, and a function as a control device such as an ABS (Antilock Brake System) or ESC (Electronic Stability Control).
- a vehicle state detection unit 9311 is connected to the drive system control unit 931.
- The vehicle state detection unit 9311 includes at least one of, for example, a gyro sensor that detects the angular velocity of the axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the accelerator pedal operation amount, the brake pedal operation amount, the steering wheel steering angle, the engine speed, the traveling speed, and the like.
- the drive system control unit 931 performs arithmetic processing using a signal input from the vehicle state detection unit 9311 to control the internal combustion engine, the drive motor, the electric power steering device, the brake device, or the like.
- the body system control unit 932 controls the operation of various devices mounted on the vehicle body according to various programs.
- The body system control unit 932 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, blinkers, or fog lamps.
- Radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 932.
- The body system control unit 932 receives the input of these radio waves or signals, and controls the door lock device, the power window device, the lamps, and the like of the vehicle.
- The battery control unit 933 controls the secondary battery 9331, which is the power supply source of the drive motor, according to various programs. For example, information such as the battery temperature, the battery output voltage, or the remaining battery capacity is input to the battery control unit 933 from a battery device including the secondary battery 9331. The battery control unit 933 performs arithmetic processing using these signals, and performs temperature adjustment control of the secondary battery 9331 or control of the cooling device or the like provided in the battery device.
- the outside information detection unit 934 detects information outside the vehicle on which the vehicle control system 90 is mounted.
- The vehicle exterior information detection unit 934 uses the image processing apparatus 10 of the present technology.
- FIG. 31 is a diagram showing an installation example of the imaging unit.
- the imaging unit 21 of the image processing apparatus 10 is provided, for example, at at least one position among a front nose of the vehicle 80, a side mirror, a rear bumper, a back door, and an upper portion of a windshield in the vehicle interior.
- the imaging unit 21-A provided in the front nose and the imaging unit 21-B provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 80.
- the imaging units 21-C and 21-D provided in the side mirror mainly acquire images on the side of the vehicle 80.
- the imaging unit 21-E provided in the rear bumper or the back door mainly acquires an image behind the vehicle 80.
- The imaging range AR-a indicates the imaging range of the imaging unit 21-A provided on the front nose, the imaging ranges AR-c and AR-d indicate the imaging ranges of the imaging units 21-C and 21-D provided on the side mirrors, respectively, and the imaging range AR-e indicates the imaging range of the imaging unit 21-E provided on the rear bumper or the back door.
- the vehicle exterior information detection unit 934 captures an area around the vehicle and acquires a polarization image. Further, the vehicle exterior information detection unit 934 acquires the polarization characteristics of the subject from the acquired polarization image. Furthermore, the vehicle exterior information detection unit 934 generates information that can be used for vehicle control or the like using the acquired polarization characteristics.
- The wireless communication unit 935 communicates with a management center that manages other vehicles, road conditions, and the like outside the vehicle via a wireless communication network such as DSRC (registered trademark) (Dedicated Short Range Communication), and outputs the received information to the integrated control unit 940.
- the wireless communication unit 935 transmits the polarization characteristics and the like acquired by the outside information detection unit 934 to other vehicles, the management center, and the like.
- The wireless communication unit 935 may also communicate with the management center via a wireless communication network such as a wireless LAN network or a mobile phone network such as 3G, LTE, or 4G.
- The wireless communication unit 935 may perform positioning by receiving a GNSS (Global Navigation Satellite System) signal or the like, and may output the positioning result to the integrated control unit 940.
- the integrated control unit 940 is connected to an input unit 951, an audio output unit 952, and a display unit 953.
- the input unit 951 is realized by a device that can be input by a passenger, such as a touch panel, a button, a microphone, a switch, or a lever.
- the input unit 951 generates an input signal based on information input by a passenger or the like and outputs the input signal to the integrated control unit 940.
- the audio output unit 952 outputs information based on the audio signal from the integrated control unit 940, thereby audibly notifying the vehicle passengers of the information.
- the display unit 953 displays an image based on the image signal from the integrated control unit 940 and visually notifies the vehicle occupant of the information.
- The integrated control unit 940 includes a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like.
- The ROM stores various programs executed by the CPU.
- The RAM stores information such as various parameters, calculation results, and sensor values.
- The CPU executes the various programs stored in the ROM, and controls the overall operation of the vehicle control system 90 in accordance with input signals from the input unit 951, information obtained by communication with each control unit, the vehicle exterior information detection unit, and the wireless communication unit via the communication network 920, and information stored in the RAM.
- The integrated control unit 940 generates an audio signal indicating information to be audibly notified to the vehicle occupants and outputs it to the audio output unit 952, and generates an image signal for visually notifying the information and outputs it to the display unit 953.
- the integrated control unit 940 communicates with various devices existing outside the vehicle such as other vehicles and a management center using the wireless communication unit 935.
- the integrated control unit 940 performs driving support for the vehicle based on the map information stored in the ROM or RAM and the positioning result acquired from the wireless communication unit 935.
- At least two control units connected via the communication network 920 may be integrated as one control unit.
- each control unit may be configured by a plurality of control units.
- the vehicle control system 90 may include another control unit not shown.
- some or all of the functions of any one of the control units may be provided to another control unit. That is, as long as information is transmitted and received via the communication network 920, a predetermined calculation process may be performed by any of the control units.
- When the image processing apparatus of the present technology is applied to, for example, the vehicle exterior information detection unit, the vehicle exterior information detection unit can accurately perform subject recognition and the like, and can generate a depth map with high spatial resolution and high accuracy. Further, by performing various processes such as filter processing corresponding to a PL filter, removal of reflection components, and adjustment of glossiness, it is possible to generate a captured image with reduced reflection and glare. This makes it possible to accurately detect obstacles and grasp the distance to them using the depth map and processed images generated by the vehicle exterior information detection unit, so that a vehicle control system enabling safer and smoother driving can be constructed.
- the above-described image processing apparatus may be an imaging apparatus or an electronic device having an imaging function.
- the series of processing described in the specification can be executed by hardware, software, or a combined configuration of both.
- a program in which the processing sequence is recorded is installed in a memory of a computer incorporated in dedicated hardware and then executed.
- the program can be recorded in advance on a hard disk, SSD (Solid State Drive), or ROM (Read Only Memory) as a recording medium.
- the program can also be stored (recorded), temporarily or permanently, on a removable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto-Optical) disc, a DVD (Digital Versatile Disc), a BD (Blu-ray Disc (registered trademark)), a magnetic disk, or a semiconductor memory card. Such a removable recording medium can be provided as so-called package software.
- the program may be transferred from a download site to the computer wirelessly or by wire via a network such as a LAN (Local Area Network) or the Internet.
- the computer can receive the program transferred in this way and install it on a recording medium such as a built-in hard disk.
- the effects described in this specification are merely illustrative and not restrictive; there may be additional effects that are not described.
- the present technology should not be construed as being limited to the above-described embodiments, and for example, the above-described embodiments may be combined.
- the embodiments of this technology disclose the present technology in the form of examples, and it is obvious that those skilled in the art can modify or substitute the embodiments without departing from the gist of the present technology. In other words, the scope of the claims should be considered in order to determine the gist of the present technology.
- the image processing apparatus of the present technology may also take the configurations (1) to (16) described below.
- In the image processing apparatus and the image processing method according to this technology, polarization images obtained by imaging a subject at a plurality of viewpoint positions through polarization filters whose polarization direction differs for each viewpoint position are aligned based on a depth map indicating distance information of the subject.
- The polarization characteristics of the subject from a desired viewpoint position are then acquired using the aligned polarization images. Therefore, if normals are calculated based on these polarization characteristics, the surface shape of the subject can be accurately detected from the desired position.
- By using the acquired polarization characteristics, it is also possible to obtain a desired polarization image without performing imaging at the desired viewpoint position while adjusting the polarization direction of a polarization filter. This technology is therefore suitable for devices that acquire the three-dimensional shape of a subject, devices that process captured images, and the like.
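The "desired polarization image without imaging" mentioned above can be sketched numerically: from four aligned polarization images one can estimate the linear Stokes parameters and synthesize the image a polarizer at any virtual angle would have produced. The fixed 0/45/90/135-degree polarizer directions and the function name below are illustrative assumptions, not the patent's specified implementation.

```python
import numpy as np

def virtual_polarizer(i0, i45, i90, i135, theta):
    """Synthesize the image seen through a virtual polarizer at angle
    theta (radians) from four aligned polarization images captured at
    0, 45, 90 and 135 degrees, via the linear Stokes parameters."""
    s0 = (i0 + i45 + i90 + i135) / 2.0   # total intensity
    s1 = i0 - i90                        # 0/90-degree difference
    s2 = i45 - i135                      # 45/135-degree difference
    return 0.5 * (s0 + s1 * np.cos(2 * theta) + s2 * np.sin(2 * theta))
```

At theta equal to one of the four captured directions, the synthesized image reproduces the corresponding input, which is a quick sanity check for the Stokes estimate.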
- DESCRIPTION OF REFERENCE SIGNS: 10 ... Image processing apparatus; 21, 21a, 21-A to 21-E ... Imaging unit; 22, 22a ... Depth map generation unit
Abstract
Description
1-1. Configuration of First Embodiment
1-2. Operation of First Embodiment
2. Second Embodiment
2-1. Configuration of Second Embodiment
2-2. Operation of Second Embodiment
3. Third Embodiment
3-1. Configuration of Third Embodiment
3-2. Operation of Third Embodiment
4. Fourth Embodiment
4-1. Configuration of Fourth Embodiment
4-2. Operation of Fourth Embodiment
5. Other Embodiments
6. Application Examples
<1. First Embodiment>
[1-1. Configuration of First Embodiment]
FIG. 1 is a diagram illustrating the functional configuration of the first embodiment of the present technology. The image processing apparatus 10 includes an imaging unit 21, a depth map generation unit 22, an alignment unit 23, a polarization characteristic acquisition unit 24, and a normal map generation unit 25.
Next, the operation of the first embodiment will be described. FIG. 4 is a flowchart illustrating the operation of the first embodiment, exemplifying the case where the imaging unit 21 is made up of four imaging sections 211-1 to 211-4.
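The alignment unit 23 must bring polarization images captured from different viewpoints into register using the depth map. A minimal sketch of one standard way to do this is depth-based reprojection under a pinhole camera model; the shared intrinsics `K` and nearest-neighbour sampling are simplifying assumptions for illustration, not the patent's prescribed method.

```python
import numpy as np

def warp_to_reference(depth, src_img, K, R, t):
    """Warp a source-view image into the reference view using the
    reference-view depth map (pinhole model, nearest-neighbour sampling).
    depth: (H, W) depth of the reference camera.
    K: 3x3 intrinsics shared by both cameras (an assumption here).
    R, t: rotation/translation from the reference to the source camera."""
    H, W = depth.shape
    v, u = np.mgrid[0:H, 0:W]
    pix = np.stack([u.ravel(), v.ravel(), np.ones(H * W)])  # homogeneous pixels
    rays = np.linalg.inv(K) @ pix                           # back-project rays
    pts = rays * depth.ravel()                              # 3-D points, ref frame
    proj = K @ (R @ pts + t.reshape(3, 1))                  # into source camera
    us = np.round(proj[0] / proj[2]).astype(int)
    vs = np.round(proj[1] / proj[2]).astype(int)
    out = np.zeros_like(src_img, dtype=float)
    ok = (us >= 0) & (us < W) & (vs >= 0) & (vs < H)
    out[v.ravel()[ok], u.ravel()[ok]] = src_img[vs[ok], us[ok]]
    return out
```

For a pure horizontal baseline, each pixel is fetched with the classic disparity f·B/Z, which makes the warp easy to verify on constant-depth test data.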
Next, the second embodiment will be described. The second embodiment describes the case where a depth map with high spatial resolution is generated using the generated normal map.
FIG. 15 is a diagram illustrating the functional configuration of the second embodiment of the present technology. As in the first embodiment, the image processing apparatus 10 includes the imaging unit 21, the depth map generation unit 22, the alignment unit 23, the polarization characteristic acquisition unit 24, and the normal map generation unit 25. The image processing apparatus 10 of the second embodiment further includes a high-precision depth map generation unit 26.
Next, the operation of the second embodiment will be described. FIG. 16 is a flowchart illustrating the operation of the second embodiment. As in the first embodiment, the imaging unit 21 generates polarization images in steps ST1 to ST4. For example, in step ST1 the imaging section 211-1 of the imaging unit 21 generates a first polarization image. In step ST2 the imaging section 211-2 generates a second polarization image. In step ST3 the imaging section 211-3 generates a third polarization image. In step ST4 the imaging section 211-4 generates a fourth polarization image. In this way, the image processing apparatus 10 generates polarization images whose polarization direction differs for each viewpoint position with the imaging sections 211-1 to 211-4, and proceeds to steps ST11 to ST14.
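The idea behind the high-precision depth map generation unit 26 is that normals carry fine local slope information while the coarse depth map fixes the absolute scale. The 1-D toy below integrates per-sample slopes (derivable from normals as -nx/nz), anchors the result to the coarse depth, and blends the two; it is a stand-in for the integration only, with the blending weight chosen arbitrarily.

```python
import numpy as np

def refine_depth_1d(coarse, slope, weight=0.9):
    """Fuse a coarse 1-D depth profile with per-sample surface slopes.
    The slopes are integrated into a relative profile, anchored so its
    mean matches the coarse depth, then blended with the coarse map."""
    rel = np.concatenate([[0.0], np.cumsum(slope[:-1])])  # integrate slopes
    rel += coarse.mean() - rel.mean()                     # anchor absolute scale
    return weight * rel + (1 - weight) * coarse           # blend with coarse map
```

On a quantized (low-precision) depth profile of a planar slope, the refined profile lands much closer to the true surface than the quantized input.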
Next, the third embodiment will be described. In the third embodiment, the depth map is generated using images captured without passing through a polarization filter, so that it can be generated without being affected by the luminance differences between the polarization images.
FIG. 18 is a diagram illustrating the functional configuration of the third embodiment. The image processing apparatus 10 includes an imaging unit 21a and a depth map generation unit 22a, and, as in the first embodiment, the alignment unit 23, the polarization characteristic acquisition unit 24, and the normal map generation unit 25.
Next, the operation of the third embodiment will be described. FIG. 20 is a flowchart illustrating the operation of the third embodiment. As in the first embodiment, the imaging unit 21a generates polarization images in steps ST1 to ST4. For example, in step ST1 the imaging section 211-1 of the imaging unit 21a generates a first polarization image. In step ST2 the imaging section 211-2 generates a second polarization image. In step ST3 the imaging section 211-3 generates a third polarization image. In step ST4 the imaging section 211-4 generates a fourth polarization image. In steps ST5 and ST6 the imaging unit 21a generates non-polarized images that do not pass through a polarization filter. For example, in step ST5 the imaging section 211-5 of the imaging unit 21a generates a first non-polarized image. In step ST6 the imaging section 211-6 generates a second non-polarized image. In this way, the image processing apparatus 10 uses the imaging sections 211-1 to 211-6 to generate a plurality of polarization images whose polarization direction differs for each viewpoint position and non-polarized images at different viewpoint positions, and proceeds to step ST15.
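Because the two non-polarized images share the same brightness characteristics, they suit ordinary stereo matching. As a sketch of one way a depth map generation unit such as 22a could estimate disparity (the patent does not specify the matcher), here is minimal SAD block matching over rectified images; depth then follows from f·B/disparity.

```python
import numpy as np

def disparity_sad(left, right, max_disp=8, win=2):
    """Minimal SAD block matching between two rectified grayscale images.
    Returns the per-pixel integer disparity (left-to-right convention)."""
    H, W = left.shape
    disp = np.zeros((H, W), dtype=int)
    for y in range(win, H - win):
        for x in range(win + max_disp, W - win):
            patch = left[y - win:y + win + 1, x - win:x + win + 1]
            costs = [np.abs(patch -
                            right[y - win:y + win + 1,
                                  x - d - win:x - d + win + 1]).sum()
                     for d in range(max_disp + 1)]
            disp[y, x] = int(np.argmin(costs))  # lowest-cost candidate wins
    return disp
```

Real systems would add subpixel refinement and occlusion handling, but the brute-force version is enough to see why equal-luminance inputs matter: SAD costs are only comparable when the two views are photometrically consistent.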
Next, the fourth embodiment will be described. The fourth embodiment describes the case where processing is performed using the acquired polarization characteristics of the desired viewpoint position.
FIG. 21 is a diagram illustrating the functional configuration of the fourth embodiment. As in the first embodiment, the image processing apparatus 10 includes the imaging unit 21, the depth map generation unit 22, the alignment unit 23, and the polarization characteristic acquisition unit 24. The image processing apparatus 10 of the fourth embodiment further includes a polarization characteristic utilization unit 27.
Next, the operation of the fourth embodiment will be described. FIG. 22 is a flowchart illustrating the operation of the fourth embodiment. As in the first embodiment, the imaging unit 21 generates polarization images in steps ST1 to ST4. For example, in step ST1 the imaging section 211-1 of the imaging unit 21 generates a first polarization image. In step ST2 the imaging section 211-2 generates a second polarization image. In step ST3 the imaging section 211-3 generates a third polarization image. In step ST4 the imaging section 211-4 generates a fourth polarization image. In this way, the image processing apparatus 10 generates polarization images whose polarization direction differs for each viewpoint position with the imaging sections 211-1 to 211-4, and proceeds to steps ST11 to ST14.
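One use of the acquired polarization characteristics is reflection-component adjustment. If the per-pixel luminance follows the fitted model I(θ) = i_mean + amp·cos(2θ − 2·phase), then i_mean − amp is the unpolarized (largely diffuse) floor and the 2·amp swing carries the polarized (largely specular) part. The sketch below either re-renders the image at a virtual polarizer angle or rescales the polarized component; all parameter names are illustrative, not from the patent.

```python
import numpy as np

def adjust_reflection(i_mean, amp, phase, specular_gain=0.0, theta=None):
    """Adjust reflection using the fitted per-pixel polarization model
    I(theta) = i_mean + amp*cos(2*theta - 2*phase).
    theta given:  render the image for a virtual polarizer at that angle.
    otherwise:    keep the unpolarized floor plus a rescaled swing."""
    if theta is not None:                               # virtual polarizer image
        return i_mean + amp * np.cos(2 * theta - 2 * phase)
    return (i_mean - amp) + specular_gain * 2 * amp     # floor + scaled swing
```

With specular_gain = 0 this approximates removing the reflection (the minimum over polarizer angles); gain 0.5 restores the original mean; gain above 1 exaggerates glossiness.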
Next, other embodiments will be described. The other embodiments exemplify the case where a normal map is generated without generating a depth map.
Next, an application example of the image processing apparatus (image processing method) will be described. FIG. 30 is a block diagram exemplifying the schematic configuration of a vehicle control system using the image processing apparatus of this technology. The vehicle control system 90 includes a plurality of control units and detection units connected via a communication network 920. In the example shown in FIG. 30, the vehicle control system 90 includes a drive system control unit 931, a body system control unit 932, a battery control unit 933, an outside-vehicle information detection unit 934, a wireless communication unit 935, and an integrated control unit 940. The communication network 920 may be an in-vehicle communication network conforming to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark). An input unit 951, an audio output unit 952, and a display unit 953 are connected to the integrated control unit 940.
(1) An image processing apparatus including: an alignment unit that, based on a depth map indicating distance information of a subject, aligns polarization images obtained by imaging the subject at a plurality of viewpoint positions through polarization filters whose polarization direction differs for each viewpoint position; and a polarization characteristic acquisition unit that acquires polarization characteristics of the subject from a desired viewpoint position using the polarization images aligned by the alignment unit.
(2) The image processing apparatus according to (1), further including a depth map generation unit that generates the depth map from images obtained by imaging the subject at the plurality of viewpoint positions.
(3) The image processing apparatus according to (2), in which the depth map generation unit generates a depth map for each combination of images among the images at the plurality of viewpoint positions and integrates the generated depth maps, and the alignment unit aligns the polarization images based on the integrated depth map obtained by the depth map generation unit.
(4) The image processing apparatus according to (2) or (3), in which the depth map generation unit uses the polarization images as the images obtained by imaging the subject at the plurality of viewpoint positions.
(5) The image processing apparatus according to (2) or (3), in which the depth map generation unit uses, as the images obtained by imaging the subject at the plurality of viewpoint positions, images captured without a polarization filter or through polarization filters having the same polarization direction.
(6) The image processing apparatus according to any one of (1) to (5), in which the polarization characteristic acquisition unit acquires polarization characteristics of the subject from the desired viewpoint position based on the polarization directions and luminances of the plurality of aligned polarization images.
(7) The image processing apparatus according to any one of (1) to (6), further including a normal map generation unit that generates a normal map of the subject at the desired viewpoint position based on the polarization characteristics calculated by the polarization characteristic acquisition unit.
(8) The image processing apparatus according to (7), in which the normal map generation unit generates a normal map in which the 180-degree indefiniteness in polarization analysis has been removed based on the depth map used in the alignment unit.
(9) The image processing apparatus according to (7) or (8), further including a high-precision depth map generation unit that integrates the depth map used in the alignment unit and the normal map generated by the normal map generation unit to generate a depth map with higher accuracy than the depth map used in the alignment unit.
(10) The image processing apparatus according to any one of (1) to (9), further including an imaging unit that generates polarization images by imaging the subject at the plurality of viewpoint positions through the polarization filters whose polarization direction differs for each viewpoint position.
(11) The image processing apparatus according to (10), in which the imaging unit is provided with an imaging section for each of the plurality of viewpoint positions, and the imaging sections are provided with polarization filters of different polarization directions to generate the polarization images for each of the plurality of viewpoint positions.
(12) The image processing apparatus according to (10), in which, in the imaging unit, a plurality of lenses are arranged on the light incident surface side of an image sensor in a direction orthogonal to the optical axis direction, and each lens is provided with a polarization filter of a different polarization direction to generate the polarization images for each of the plurality of viewpoint positions.
(13) The image processing apparatus according to (10), in which the imaging unit further includes an imaging section that generates images by imaging the subject at a plurality of viewpoint positions without a polarization filter or through polarization filters having the same polarization direction.
(14) The image processing apparatus according to any one of (1) to (12), further including a polarization characteristic utilization unit that processes an image using the polarization characteristics calculated by the polarization characteristic acquisition unit.
(15) The image processing apparatus according to (14), in which the polarization characteristic utilization unit uses the polarization characteristics calculated by the polarization characteristic acquisition unit to generate an image in which the reflection components of the image at the desired viewpoint position are adjusted.
(16) The image processing apparatus according to (14) or (15), in which the polarization characteristic utilization unit calculates image feature amounts using the polarization characteristics calculated by the polarization characteristic acquisition unit and performs processing that takes the surface shape of the subject into account using the image feature amounts.
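Configuration (6) above — polarization characteristics from the polarization directions and luminances of the aligned images — is commonly realized as a sinusoid fit. The sketch below fits I(θ) = a + b·cos(2θ) + c·sin(2θ) by least squares for arbitrary polarizer angles and reports the mean intensity, the azimuth of maximum transmission, and the degree of polarization; this is a generic formulation assumed for illustration, not the patent's exact equations.

```python
import numpy as np

def polarization_model(lum, angles):
    """Least-squares fit of I(theta) = a + b*cos(2*theta) + c*sin(2*theta)
    to luminances observed through polarizers at the given angles (radians).
    Returns (mean intensity, azimuth phi, degree of linear polarization).
    The azimuth is recoverable only modulo 180 degrees."""
    A = np.stack([np.ones_like(angles),
                  np.cos(2 * angles),
                  np.sin(2 * angles)], axis=1)
    a, b, c = np.linalg.lstsq(A, lum, rcond=None)[0]
    amp = np.hypot(b, c)                  # half peak-to-peak swing
    azimuth = 0.5 * np.arctan2(c, b)      # phase of maximum transmission
    return a, azimuth, amp / a            # mean, azimuth, DoLP
```

With four or more distinct angles the system is overdetermined, so the same fit tolerates noisy luminances; the residual can also flag pixels where the alignment was poor.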
21, 21a, 21-A to 21-E ... Imaging unit
22, 22a ... Depth map generation unit
23, 23a ... Alignment unit
24 ... Polarization characteristic acquisition unit
25, 25a ... Normal map generation unit
26 ... High-precision depth map generation unit
27 ... Processing unit
90 ... Vehicle control system
210-1 to 210-4, 210-n, 210-p, 223 ... Polarizing plate
211-1 to 211-6, 211-n, 211-p ... Imaging section
221 ... Image sensor
222 ... Lens
Claims (17)
- An image processing apparatus comprising: an alignment unit that, based on a depth map indicating distance information of a subject, aligns polarization images obtained by imaging the subject at a plurality of viewpoint positions through polarization filters whose polarization direction differs for each viewpoint position; and a polarization characteristic acquisition unit that acquires polarization characteristics of the subject from a desired viewpoint position using the polarization images aligned by the alignment unit.
- The image processing apparatus according to claim 1, further comprising a depth map generation unit that generates the depth map from images obtained by imaging the subject at a plurality of viewpoint positions.
- The image processing apparatus according to claim 2, wherein the depth map generation unit generates a depth map for each combination of images among the images at the plurality of viewpoint positions and integrates the generated depth maps, and the alignment unit aligns the polarization images based on the integrated depth map obtained by the depth map generation unit.
- The image processing apparatus according to claim 2, wherein the depth map generation unit uses the polarization images as the images obtained by imaging the subject at the plurality of viewpoint positions.
- The image processing apparatus according to claim 2, wherein the depth map generation unit uses, as the images obtained by imaging the subject at the plurality of viewpoint positions, images captured without a polarization filter or through polarization filters having the same polarization direction.
- The image processing apparatus according to claim 1, wherein the polarization characteristic acquisition unit acquires polarization characteristics of the subject from a desired viewpoint position based on the polarization directions and luminances of the plurality of aligned polarization images.
- The image processing apparatus according to claim 1, further comprising a normal map generation unit that generates a normal map of the subject at the desired viewpoint position based on the polarization characteristics calculated by the polarization characteristic acquisition unit.
- The image processing apparatus according to claim 7, wherein the normal map generation unit generates a normal map in which the 180-degree indefiniteness in polarization analysis has been removed based on the depth map used in the alignment unit.
- The image processing apparatus according to claim 7, further comprising a high-precision depth map generation unit that integrates the depth map used in the alignment unit and the normal map generated by the normal map generation unit to generate a depth map with higher accuracy than the depth map used in the alignment unit.
- The image processing apparatus according to claim 1, further comprising an imaging unit that generates polarization images by imaging the subject at a plurality of viewpoint positions through the polarization filters whose polarization direction differs for each viewpoint position.
- The image processing apparatus according to claim 10, wherein the imaging unit is provided with an imaging section for each of the plurality of viewpoint positions, and the imaging sections are provided with polarization filters of different polarization directions to generate the polarization images for each of the plurality of viewpoint positions.
- The image processing apparatus according to claim 10, wherein, in the imaging unit, a plurality of lenses are arranged on the light incident surface side of an image sensor in a direction orthogonal to the optical axis direction, and each lens is provided with a polarization filter of a different polarization direction to generate the polarization images for each of the plurality of viewpoint positions.
- The image processing apparatus according to claim 10, wherein the imaging unit further includes an imaging section that generates images by imaging the subject at a plurality of viewpoint positions without a polarization filter or through polarization filters having the same polarization direction.
- The image processing apparatus according to claim 1, further comprising a polarization characteristic utilization unit that processes an image using the polarization characteristics calculated by the polarization characteristic acquisition unit.
- The image processing apparatus according to claim 14, wherein the polarization characteristic utilization unit uses the polarization characteristics calculated by the polarization characteristic acquisition unit to generate an image in which the reflection components of the image at the desired viewpoint position are adjusted.
- The image processing apparatus according to claim 14, wherein the polarization characteristic utilization unit calculates image feature amounts using the polarization characteristics calculated by the polarization characteristic acquisition unit and performs processing that takes the surface shape of the subject into account using the image feature amounts.
- An image processing method including: a step, in an alignment unit, of aligning, based on a depth map indicating distance information of a subject, polarization images obtained by imaging the subject at a plurality of viewpoint positions through polarization filters whose polarization direction differs for each viewpoint position; and a step, in a polarization characteristic acquisition unit, of acquiring polarization characteristics of the subject from a desired viewpoint position using the aligned polarization images.
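Claim 8 above removes the 180-degree indefiniteness of the polarization azimuth using the depth map. A common sketch of the idea: of the two candidates φ and φ + π, keep the one closer to a coarse azimuth derived from the depth map's surface gradient. This is only an illustration of the principle, not the claimed procedure.

```python
import numpy as np

def resolve_ambiguity(azimuth, coarse_azimuth):
    """For each pixel, pick between the polarization azimuth phi and
    phi + pi, choosing the candidate closest (in wrapped angle) to a
    coarse azimuth obtained from depth-map gradients. 1-D arrays in."""
    cand = np.stack([azimuth, azimuth + np.pi])            # two candidates
    diff = np.angle(np.exp(1j * (cand - coarse_azimuth)))  # wrapped difference
    return cand[np.argmin(np.abs(diff), axis=0), np.arange(azimuth.size)]
```

The coarse normals can be of low spatial resolution; they only need to be accurate enough to break the two-way tie, which is why integrating them with the high-resolution polarization normals pays off.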
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201580064243.4A CN107003110B (zh) | 2014-12-01 | 2015-10-28 | 图像处理装置和图像处理方法 |
US15/515,260 US11206388B2 (en) | 2014-12-01 | 2015-10-28 | Image processing apparatus and image processing method for aligning polarized images based on a depth map and acquiring a polarization characteristic using the aligned polarized images |
JP2016562351A JP6652065B2 (ja) | 2014-12-01 | 2015-10-28 | 画像処理装置と画像処理方法 |
EP15865560.5A EP3228977A4 (en) | 2014-12-01 | 2015-10-28 | Image-processing device and image-processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-243299 | 2014-12-01 | ||
JP2014243299 | 2014-12-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016088483A1 true WO2016088483A1 (ja) | 2016-06-09 |
Family
ID=56091438
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/080380 WO2016088483A1 (ja) | 2014-12-01 | 2015-10-28 | 画像処理装置と画像処理方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US11206388B2 (ja) |
EP (1) | EP3228977A4 (ja) |
JP (1) | JP6652065B2 (ja) |
CN (1) | CN107003110B (ja) |
WO (1) | WO2016088483A1 (ja) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018207661A1 (ja) * | 2017-05-11 | 2018-11-15 | ソニー株式会社 | 光センサ、及び、電子機器 |
WO2019044123A1 (ja) * | 2017-08-30 | 2019-03-07 | ソニー株式会社 | 情報処理装置、情報処理方法、及び記録媒体 |
WO2019138678A1 (ja) * | 2018-01-15 | 2019-07-18 | キヤノン株式会社 | 情報処理装置及びその制御方法及びプログラム、並びに、車両の運転支援システム |
JP2019534515A (ja) * | 2016-11-03 | 2019-11-28 | クゥアルコム・インコーポレイテッドQualcomm Incorporated | 移動体デバイスのための向上した深度マップ画像 |
WO2019235019A1 (ja) | 2018-06-05 | 2019-12-12 | ソニー株式会社 | 情報生成装置と情報生成方法およびプログラム |
CN111465818A (zh) * | 2017-12-12 | 2020-07-28 | 索尼公司 | 图像处理设备、图像处理方法、程序和信息处理系统 |
CN112954281A (zh) * | 2019-12-10 | 2021-06-11 | 通用汽车环球科技运作有限责任公司 | 在配备有驾驶辅助系统的车辆中使用偏振相机生成三维点云 |
JP2022513847A (ja) * | 2018-12-14 | 2022-02-09 | スペクトラル エムディー,インコーポレイテッド | 高精度マルチアパーチャスペクトルイメージングのためのシステムおよび方法 |
US11676245B2 (en) | 2018-05-24 | 2023-06-13 | Sony Corporation | Information processing apparatus and method for processing information |
US11948300B2 (en) | 2018-12-14 | 2024-04-02 | Spectral Md, Inc. | Machine learning systems and methods for assessment, healing prediction, and treatment of wounds |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI694604B (zh) | 2015-07-23 | 2020-05-21 | 光澄科技股份有限公司 | 光偵測器 |
US10761599B2 (en) | 2015-08-04 | 2020-09-01 | Artilux, Inc. | Eye gesture tracking |
US10861888B2 (en) | 2015-08-04 | 2020-12-08 | Artilux, Inc. | Silicon germanium imager with photodiode in trench |
WO2017024121A1 (en) | 2015-08-04 | 2017-02-09 | Artilux Corporation | Germanium-silicon light sensing apparatus |
US10707260B2 (en) | 2015-08-04 | 2020-07-07 | Artilux, Inc. | Circuit for operating a multi-gate VIS/IR photodiode |
EP3783656B1 (en) | 2015-08-27 | 2023-08-23 | Artilux Inc. | Wide spectrum optical sensor |
US10739443B2 (en) | 2015-11-06 | 2020-08-11 | Artilux, Inc. | High-speed light sensing apparatus II |
US10418407B2 (en) | 2015-11-06 | 2019-09-17 | Artilux, Inc. | High-speed light sensing apparatus III |
US10886309B2 (en) | 2015-11-06 | 2021-01-05 | Artilux, Inc. | High-speed light sensing apparatus II |
US10254389B2 (en) | 2015-11-06 | 2019-04-09 | Artilux Corporation | High-speed light sensing apparatus |
US10741598B2 (en) | 2015-11-06 | 2020-08-11 | Atrilux, Inc. | High-speed light sensing apparatus II |
JP2018029280A (ja) * | 2016-08-18 | 2018-02-22 | ソニー株式会社 | 撮像装置と撮像方法 |
JP2019015575A (ja) * | 2017-07-05 | 2019-01-31 | 株式会社東芝 | 画像処理装置、測距装置および処理システム |
WO2019125427A1 (en) * | 2017-12-20 | 2019-06-27 | Olympus Corporation | System and method for hybrid depth estimation |
US11105928B2 (en) | 2018-02-23 | 2021-08-31 | Artilux, Inc. | Light-sensing apparatus and light-sensing method thereof |
TWI788246B (zh) | 2018-02-23 | 2022-12-21 | 美商光程研創股份有限公司 | 光偵測裝置 |
TWI780007B (zh) | 2018-04-08 | 2022-10-01 | 美商光程研創股份有限公司 | 光偵測裝置及其系統 |
TWI795562B (zh) | 2018-05-07 | 2023-03-11 | 美商光程研創股份有限公司 | 雪崩式之光電晶體 |
US10969877B2 (en) | 2018-05-08 | 2021-04-06 | Artilux, Inc. | Display apparatus |
CN110971889A (zh) * | 2018-09-30 | 2020-04-07 | 华为技术有限公司 | 一种获取深度图像的方法、摄像装置以及终端 |
EP3863278B1 (en) * | 2018-10-03 | 2023-04-19 | FUJIFILM Corporation | Imaging device |
KR20200097865A (ko) * | 2019-02-08 | 2020-08-20 | 삼성전자주식회사 | 깊이 측정을 위한 이미지 처리 시스템 및 이의 동작 방법 |
US20220375125A1 (en) * | 2021-05-07 | 2022-11-24 | Intrinsic Innovation Llc | Systems and methods for using computer vision to pick up small objects |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5028138A (en) * | 1989-05-23 | 1991-07-02 | Wolff Lawrence B | Method of and apparatus for obtaining object data by machine vision form polarization information |
WO2009147814A1 (ja) * | 2008-06-02 | 2009-12-10 | パナソニック株式会社 | 法線情報を生成する画像処理装置、方法、コンピュータプログラム、および、視点変換画像生成装置 |
JP2010256138A (ja) * | 2009-04-23 | 2010-11-11 | Canon Inc | 撮像装置及びその制御方法 |
JP2011171858A (ja) * | 2010-02-16 | 2011-09-01 | Sony Corp | 画像処理装置、画像処理方法、画像処理プログラムおよび撮像装置 |
JP2013030889A (ja) * | 2011-07-27 | 2013-02-07 | Dainippon Printing Co Ltd | 個体識別装置、個体識別対象物、個体識別方法、及びプログラム |
JP2013044597A (ja) * | 2011-08-23 | 2013-03-04 | Canon Inc | 画像処理装置および方法、プログラム |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1706725B1 (en) * | 2003-12-03 | 2019-12-25 | FPInnovations | Circularly polarized light method for determining wall thickness and orientations of fibrils of cellulosic fibres |
EP2120007A4 (en) | 2007-02-13 | 2010-12-01 | Panasonic Corp | IMAGE PROCESSING SYSTEM, METHOD AND APPARATUS AND IMAGE FORMAT |
WO2010131436A1 (ja) * | 2009-05-15 | 2010-11-18 | 株式会社ニコン | 測距装置および撮像装置 |
JP5440927B2 (ja) * | 2009-10-19 | 2014-03-12 | 株式会社リコー | 測距カメラ装置 |
US8760517B2 (en) * | 2010-09-27 | 2014-06-24 | Apple Inc. | Polarized images for security |
JP5831024B2 (ja) * | 2011-08-04 | 2015-12-09 | ソニー株式会社 | 画像処理装置、および画像処理方法、並びにプログラム |
JP6516429B2 (ja) * | 2014-09-16 | 2019-05-22 | キヤノン株式会社 | 距離計測装置、撮像装置、および距離計測方法 |
-
2015
- 2015-10-28 US US15/515,260 patent/US11206388B2/en active Active
- 2015-10-28 JP JP2016562351A patent/JP6652065B2/ja not_active Expired - Fee Related
- 2015-10-28 EP EP15865560.5A patent/EP3228977A4/en not_active Withdrawn
- 2015-10-28 WO PCT/JP2015/080380 patent/WO2016088483A1/ja active Application Filing
- 2015-10-28 CN CN201580064243.4A patent/CN107003110B/zh not_active Expired - Fee Related
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5028138A (en) * | 1989-05-23 | 1991-07-02 | Wolff Lawrence B | Method of and apparatus for obtaining object data by machine vision form polarization information |
WO2009147814A1 (ja) * | 2008-06-02 | 2009-12-10 | パナソニック株式会社 | 法線情報を生成する画像処理装置、方法、コンピュータプログラム、および、視点変換画像生成装置 |
JP2010256138A (ja) * | 2009-04-23 | 2010-11-11 | Canon Inc | 撮像装置及びその制御方法 |
JP2011171858A (ja) * | 2010-02-16 | 2011-09-01 | Sony Corp | 画像処理装置、画像処理方法、画像処理プログラムおよび撮像装置 |
JP2013030889A (ja) * | 2011-07-27 | 2013-02-07 | Dainippon Printing Co Ltd | 個体識別装置、個体識別対象物、個体識別方法、及びプログラム |
JP2013044597A (ja) * | 2011-08-23 | 2013-03-04 | Canon Inc | 画像処理装置および方法、プログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP3228977A4 * |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019534515A (ja) * | 2016-11-03 | 2019-11-28 | クゥアルコム・インコーポレイテッドQualcomm Incorporated | 移動体デバイスのための向上した深度マップ画像 |
JP7044107B2 (ja) | 2017-05-11 | 2022-03-30 | ソニーグループ株式会社 | 光センサ、及び、電子機器 |
JPWO2018207661A1 (ja) * | 2017-05-11 | 2020-06-18 | ソニー株式会社 | 光センサ、及び、電子機器 |
WO2018207661A1 (ja) * | 2017-05-11 | 2018-11-15 | ソニー株式会社 | 光センサ、及び、電子機器 |
WO2019044123A1 (ja) * | 2017-08-30 | 2019-03-07 | ソニー株式会社 | 情報処理装置、情報処理方法、及び記録媒体 |
CN111465818A (zh) * | 2017-12-12 | 2020-07-28 | 索尼公司 | 图像处理设备、图像处理方法、程序和信息处理系统 |
WO2019138678A1 (ja) * | 2018-01-15 | 2019-07-18 | キヤノン株式会社 | 情報処理装置及びその制御方法及びプログラム、並びに、車両の運転支援システム |
US11676245B2 (en) | 2018-05-24 | 2023-06-13 | Sony Corporation | Information processing apparatus and method for processing information |
WO2019235019A1 (ja) | 2018-06-05 | 2019-12-12 | ソニー株式会社 | 情報生成装置と情報生成方法およびプログラム |
JP2022513847A (ja) * | 2018-12-14 | 2022-02-09 | スペクトラル エムディー,インコーポレイテッド | 高精度マルチアパーチャスペクトルイメージングのためのシステムおよび方法 |
JP7186298B2 (ja) | 2018-12-14 | 2022-12-08 | スペクトラル エムディー,インコーポレイテッド | 高精度マルチアパーチャスペクトルイメージングのためのシステムおよび方法 |
US11631164B2 (en) | 2018-12-14 | 2023-04-18 | Spectral Md, Inc. | System and method for high precision multi-aperture spectral imaging |
US11948300B2 (en) | 2018-12-14 | 2024-04-02 | Spectral Md, Inc. | Machine learning systems and methods for assessment, healing prediction, and treatment of wounds |
CN112954281A (zh) * | 2019-12-10 | 2021-06-11 | 通用汽车环球科技运作有限责任公司 | 在配备有驾驶辅助系统的车辆中使用偏振相机生成三维点云 |
Also Published As
Publication number | Publication date |
---|---|
EP3228977A1 (en) | 2017-10-11 |
JP6652065B2 (ja) | 2020-02-19 |
US20170223339A1 (en) | 2017-08-03 |
EP3228977A4 (en) | 2018-07-04 |
CN107003110B (zh) | 2020-09-15 |
US11206388B2 (en) | 2021-12-21 |
JPWO2016088483A1 (ja) | 2017-09-07 |
CN107003110A (zh) | 2017-08-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016088483A1 (ja) | 画像処理装置と画像処理方法 | |
JP6708199B2 (ja) | 撮像装置と画像処理装置と画像処理方法 | |
WO2021004548A1 (zh) | 一种基于双目立体视觉系统的车辆智能测速方法 | |
US10460422B2 (en) | Image processing device and image processing method | |
WO2017159382A1 (ja) | 信号処理装置および信号処理方法 | |
US11302022B2 (en) | Three-dimensional measurement system and three-dimensional measurement method | |
WO2017057042A1 (ja) | 信号処理装置、信号処理方法、プログラム、および、物体検出システム | |
JP7040447B2 (ja) | 画像処理装置および情報生成装置と情報生成方法 | |
WO2017056821A1 (ja) | 情報取得装置と情報取得方法 | |
CN105551020B (zh) | 一种检测目标物尺寸的方法及装置 | |
JP2011007794A (ja) | デュアルステレオカメラを備えた距離測定装置 | |
JP2020506487A (ja) | シーンから深度情報を取得するための装置および方法 | |
JP6701532B2 (ja) | 画像処理装置および画像処理方法 | |
JP2016175586A (ja) | 車両周辺監視装置、車両周辺監視方法、及びプログラム | |
CN112455502B (zh) | 基于激光雷达的列车定位方法及装置 | |
US11663831B2 (en) | Image processing device and image processing method | |
JP2019074535A (ja) | 校正方法、校正装置、及びプログラム | |
JP6801666B2 (ja) | 画像処理装置と画像処理方法および車両制御システム | |
Gehrig et al. | 6D vision goes fisheye for intersection assistance | |
WO2021056185A1 (en) | Systems and methods for partially updating high-definition map based on sensor data matching | |
CN111465818B (zh) | 图像处理设备、图像处理方法、程序和信息处理系统 | |
Huang et al. | Wide-angle vision for road views | |
CN115523929B (zh) | 一种基于slam的车载组合导航方法、装置、设备及介质 | |
CN116362020A (zh) | 一种基于多模态信息的场景流估计方法、系统及应用 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15865560 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016562351 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15515260 Country of ref document: US |
|
REEP | Request for entry into the european phase |
Ref document number: 2015865560 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015865560 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |