US20230342963A1 - Imaging device, information processing device, imaging method, and information processing method
- Publication number
- US20230342963A1 (application US 17/756,776)
- Authority
- US
- United States
- Prior art keywords
- polarization
- subject
- image
- lights
- light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T7/586—Depth or shape recovery from multiple images from multiple light sources, e.g. photometric stereo
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G03B11/00—Filters or other obturators specially adapted for photographic purposes
- G03B15/05—Combinations of cameras with electronic flash apparatus; Electronic flash units
- G03B19/06—Roll-film cameras adapted to be loaded with more than one film, e.g. with exposure of one or the other at will
- G03B35/08—Stereoscopic photography by simultaneous recording
- H04N23/56—Cameras or camera modules comprising electronic image sensors, provided with illuminating means
- H04N23/60—Control of cameras or camera modules
- H04N25/10—Circuitry of solid-state image sensors [SSIS] for transforming different wavelengths into image signals
- G02B27/288—Filters employing polarising elements, e.g. Lyot or Solc filters
- G03B2215/0557—Multiple flash units, e.g. slave-unit
- G03B2215/0592—Diffusors, filters or refraction means installed in front of light emitter
- G06T2207/10152—Varying illumination
Definitions
- the present disclosure relates to an imaging device, an information processing device, an imaging method, and an information processing method.
- as one of the methods for measuring the three-dimensional shape of a subject, there is photometric stereo.
- in photometric stereo, light is sequentially and separately emitted to a subject from a plurality of directions, and the distance to the subject or the three-dimensional shape of the subject is measured from the differences in shading (see, for example, Patent Literature 1).
- because the subject is sequentially imaged by a camera while the lights are switched, in a case where the subject is a moving object, a positional shift due to movement of the subject occurs during the switching of the lights, and accurate measurement cannot be performed.
- the present disclosure proposes an imaging device, an information processing device, an imaging method, and an information processing method capable of accurately measuring a three-dimensional shape of a moving object.
- an imaging device includes an imaging unit, a separating unit, a calculating unit, and an estimation unit.
- the imaging unit includes a plurality of polarization lights that emit light with mutually different polarization directions to a subject, and a polarization sensor, and it captures an image of the subject that is simultaneously irradiated with the light from the plurality of polarization lights.
- the separating unit separates pixel signals corresponding to each of the polarization directions from the image captured by the imaging unit and generates an image for every polarization direction.
- the calculating unit calculates a normal line on a surface of the subject from the image for each of the polarization directions by photometric stereo.
- the estimation unit estimates the shape of the subject on the basis of the normal line calculated by the calculating unit.
- FIG. 1 is an explanatory diagram illustrating an overview of an imaging method and an information processing method according to the present disclosure.
- FIG. 2 is a block diagram illustrating an exemplary configuration of an imaging device according to the present disclosure.
- FIG. 3 is an explanatory diagram illustrating a relationship between an exposure amount and output luminance of a camera according to the present disclosure.
- FIG. 4 is an explanatory diagram of an acquisition method of shading data according to the present disclosure.
- FIG. 5 is an explanatory diagram of the acquisition method of shading data according to the present disclosure.
- FIG. 6 is an explanatory diagram of a detection method of a light source direction according to the present disclosure.
- FIG. 7 is an explanatory diagram of the detection method of a light source direction according to the present disclosure.
- FIG. 8 is an explanatory diagram of the detection method of a light source direction according to the present disclosure.
- FIG. 9 is an explanatory table illustrating an example of a light source direction and polarization direction correspondence table according to the present disclosure.
- FIG. 10 is an explanatory diagram of an imaging unit according to the present disclosure.
- FIG. 11 is an explanatory diagram of an imaging unit according to the present disclosure.
- FIG. 12 is an explanatory diagram of an imaging unit according to the present disclosure.
- FIG. 13 is a block diagram illustrating an exemplary configuration of a signal separating unit according to the present disclosure.
- FIG. 14 is an explanatory diagram of a polarization demosaic process according to the present disclosure.
- FIG. 15 is an explanatory graph illustrating an example of a polarization model according to the present disclosure.
- FIG. 16 is an explanatory diagram of a normal line calculation method according to the present disclosure.
- FIG. 17 is a flowchart illustrating processes executed by the imaging device according to the present disclosure.
- FIG. 18 is a flowchart illustrating processes executed by the imaging device according to the present disclosure.
- FIG. 19 is a flowchart illustrating processes executed by the imaging device according to the present disclosure.
- FIG. 20 is an explanatory diagram illustrating a modification of the camera according to the present disclosure.
- in basic photometric stereo, light is emitted to a subject while sequentially switching a plurality of lights having different light emission directions with respect to the subject, an image of the subject illuminated by each light is captured, and the three-dimensional shape of the subject is measured on the basis of differences in the shade of the subject in the images.
- in a wavelength multiplexing system, by contrast, the three-dimensional shape of the subject is measured by one-time imaging without switching the lights.
- specifically, the three-dimensional shape of the subject is measured by imaging the subject while simultaneously emitting light having different wavelengths (colors) to the subject from a plurality of lights, extracting each color component from the captured image, and obtaining the shade that would be obtained in a case where light emission is performed with one of the lights alone.
- in the wavelength multiplexing system, it is not necessary to switch lights, and the three-dimensional shape of a subject can be measured by one-time imaging; therefore, even in a case where the subject is a moving object, it is possible to measure the three-dimensional shape of the subject.
- in the wavelength multiplexing system, however, the color of light to be emitted to the subject is changed by applying different narrowband bandpass filters to the lights; the amount of light is therefore reduced by transmission through the narrowband bandpass filters and the S/N ratio deteriorates, which may degrade the measurement accuracy.
- an imaging device, an information processing device, an imaging method, and an information processing method according to the present disclosure accurately measure the three-dimensional shape of a moving object without increasing the cost.
- FIG. 1 is an explanatory diagram illustrating an overview of an imaging method and an information processing method according to the present disclosure.
- the imaging method and the information processing method according to the present disclosure enable measurement of the three-dimensional shape of a moving object by multiplexing lights using polarized light and simultaneously emitting light having different polarization directions from a plurality of directions to a subject at one time of imaging and thereby imaging the subject.
- one camera 10 and a plurality of (here, four) light sources L 1 , L 2 , L 3 , and L 4 are prepared.
- the light sources L 1 , L 2 , L 3 , and L 4 are arranged in different directions (hereinafter referred to as light source directions) S 1 , S 2 , S 3 , and S 4 with respect to a subject 100 .
- the number of light sources is not limited to four.
- the light sources L 1 , L 2 , L 3 , and L 4 include, at light emission units, polarization filters F 1 , F 2 , F 3 , and F 4 having different polarization directions, respectively, and emit light each having different polarization directions to the subject 100 .
- the polarization direction of light to emit and the light source direction are associated with each other in advance.
- the camera 10 includes a polarization sensor 11 .
- each polarization direction component is separated from image data acquired by the polarization sensor 11 by a signal separating process. Then, on the basis of the correspondence relationship between the polarization directions of light and the light source directions associated in advance, images I 1 , I 2 , I 3 , and I 4 , each of which would be obtained in a case where light is emitted from one of the light sources L 1 , L 2 , L 3 , and L 4 in the respective directions, are calculated.
- a normal line calculating process is performed on each of the images I 1 , I 2 , I 3 , and I 4 , thereby calculating a normal line image Is, and a distance estimating process is performed using the normal line image Is, thereby estimating the surface three-dimensional shape of the subject 100 .
- the three-dimensional shape of the moving object can be accurately measured.
- FIG. 2 is a block diagram illustrating an exemplary configuration of the imaging device according to the present disclosure.
- an imaging device 1 according to the present disclosure includes an imaging unit 2 and an information processing device 3 .
- the imaging unit 2 includes the light sources L 1 , L 2 , L 3 , and L 4 including the polarization filters F 1 , F 2 , F 3 , and F 4 , respectively, and the camera 10 illustrated in FIG. 1 .
- the imaging unit 2 causes the camera 10 to capture an image of the subject 100 in a state in which light having different polarization directions is simultaneously emitted from the light sources L 1 , L 2 , L 3 , and L 4 via the polarization filters F 1 , F 2 , F 3 , and F 4 , respectively, and outputs image data of the captured image to the information processing device 3 .
- the information processing device 3 includes, for example, a microcomputer including a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), and the like and various types of circuits.
- the information processing device 3 includes a calibration unit 4 , a signal separating unit 5 , a normal line calculating unit 6 , and a distance estimation unit 7 , each of which functions when the CPU executes an information processing program stored in the ROM using the RAM as a work area.
- the calibration unit 4 may be configured by hardware such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA).
- the signal separating unit 5 , the normal line calculating unit 6 , and the distance estimation unit 7 are not necessarily included in the information processing device 3 and may be provided on a cloud, for example.
- the calibration unit 4 , the signal separating unit 5 , the normal line calculating unit 6 , and the distance estimation unit 7 included in the information processing device 3 each implement or execute an action of information processing described below.
- the internal configuration of the information processing device 3 is not limited to the configuration illustrated in FIG. 2 and may be another configuration as long as the information processing described below is performed.
- the calibration unit 4 includes a storage unit and stores information regarding linearization of the output luminance of the camera 10 , information of shading data generated in a case where light is emitted by each of the light sources, information in which the light source directions and the polarization directions are associated with each other, and other information.
- FIG. 3 is an explanatory diagram illustrating a relationship between the exposure amount and the output luminance of the camera according to the present disclosure.
- the output luminance i′ in a captured image increases as the exposure amount i increases.
- the change in the output luminance i′ of the camera 10 accompanying a change in the exposure amount i is not linear.
- the calibration unit 4 measures the luminance of images sequentially captured while changing the exposure amount i of the camera 10 , calculates a characteristic function indicating the relationship between the exposure amount i and the output luminance i′ of the camera 10 indicated by the solid line in FIG. 3 , and stores an inverse transform function of the characteristic function.
- the inverse transform function is used when a preprocessing unit 51 (see FIG. 13 ) to be described later included in the signal separating unit 5 linearizes an image signal.
- the characteristic of the output luminance i′ of the camera 10 is corrected to be linear as indicated by a dotted line in FIG. 3 .
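The linearization step can be sketched numerically. This is an illustrative example only, not the patent's implementation: the characteristic function is assumed here to be a simple gamma curve, and the function names are hypothetical.

```python
import numpy as np

# Hypothetical camera characteristic: the output luminance i' rises
# non-linearly with the exposure amount i (the solid line in FIG. 3).
# A gamma curve stands in for the measured characteristic function.
def response(i):
    return np.power(i, 1.0 / 2.2)

# Stored inverse transform function of the characteristic: applying it to
# the camera output yields luminance that is linear in the exposure
# amount (the dotted line in FIG. 3).
def inverse_response(i_prime):
    return np.power(i_prime, 2.2)

exposure = np.linspace(0.0, 1.0, 5)
linearized = inverse_response(response(exposure))
```

In practice the characteristic function would be fitted from the luminance of images captured while varying the exposure amount, as described above.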
- the calibration unit 4 also acquires and stores shading data indicating the state of shade made when light is emitted by each of the light sources L 1 , L 2 , L 3 , and L 4 .
- FIGS. 4 and 5 are explanatory diagrams of an acquisition method of shading data according to the present disclosure.
- a gray plate 101 is disposed at a place where the subject 100 (see FIG. 1 ) is placed. Then, the gray plate 101 is sequentially irradiated with light by each of the light sources L 1 , L 2 , L 3 , and L 4 , and images of the gray plate 101 are captured by the camera 10 .
- the positions of the light sources L 1 , L 2 , L 3 , and L 4 with respect to the gray plate 101 are different from each other. Therefore, as illustrated in FIG. 4 , for example, while the light source L 1 is turned on, an image I 11 in which the shade becomes darker from the lower left toward the upper right of the image is captured, and while the light source L 2 is turned on, an image I 12 in which the shade becomes darker from the upper left toward the lower right of the image is captured.
- the calibration unit 4 acquires and stores image data of these images I 11 , I 12 , I 13 , and I 14 as shading data.
- FIGS. 6 to 8 are explanatory diagrams of a detection method of a light source direction according to the present disclosure.
- to detect a light source direction, for example, a spherical object 102 , the surface of which is mirror-finished and the shape of which is known, is disposed at a place where the subject 100 (see FIG. 1 ) is placed.
- the spherical object 102 is sequentially irradiated with light by each of the light sources L 1 , L 2 , L 3 , and L 4 , and images of the spherical object 102 are captured by the camera 10 .
- the positions of the light sources L 1 , L 2 , L 3 , and L 4 with respect to the spherical object 102 are different from each other. Therefore, as illustrated in FIG. 7 , for example, while the light source L 1 is turned on, an image I 21 in which a position on the lower left of the spherical object 102 appears as a highlight portion with a high luminance is captured, and while the light source L 2 is turned on, an image I 22 in which a position on the upper left of the spherical object 102 appears as a highlight portion is captured.
- a normal line direction n at the center of a highlight portion is obtained as a half vector of a line-of-sight direction (imaging direction of the camera 10 ) V and a light source direction S. Therefore, the calibration unit 4 can calculate the light source direction of each of the light sources L 1 , L 2 , L 3 , and L 4 by the following Equation (1).
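Equation (1) rests on n being the half vector of V and S, so S is the mirror reflection of V about n. A minimal numerical sketch of that relation (the function name is illustrative):

```python
import numpy as np

# Recover the light source direction S from the normal n at the center of
# a highlight and the line-of-sight direction V. Since n is the half
# vector of V and S, S is the reflection of V about n:
#   S = 2 (n . V) n - V
def light_direction(n, v):
    n = n / np.linalg.norm(n)
    v = v / np.linalg.norm(v)
    s = 2.0 * np.dot(n, v) * n - v
    return s / np.linalg.norm(s)
```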
- the calibration unit 4 associates the light source direction of each of the light sources L 1 , L 2 , L 3 , and L 4 with a polarization direction of emitted light.
- the gray plate 101 (see FIG. 4 ) is imaged by the camera 10 as in the case of acquiring shading data.
- the calibration unit 4 performs linearization and shading correction of the captured image data, estimates a polarization model to be described later to obtain a polarization angle φ i , sets the polarization angle φ i as the polarization direction of a polarization light i, associates the polarization angle φ i with the light source direction detected earlier, and stores the pair in a light source direction and polarization direction correspondence table.
- FIG. 9 is an explanatory table illustrating an example of a light source direction and polarization direction correspondence table according to the present disclosure.
- ⁇ 1 is associated as the polarization direction ( ⁇ i )
- S 1 is associated as the light source direction (S i ).
- ⁇ 2 is associated as the polarization direction ( ⁇ i )
- S 2 is associated as the light source direction (S i ).
- ⁇ 3 is associated as the polarization direction ( ⁇ i )
- S 3 is associated as the light source direction (S i ).
- ⁇ 4 is associated as the polarization direction ( ⁇ i )
- S 4 is associated as the light source direction (S i ).
- FIGS. 10 to 12 are explanatory diagrams of the imaging unit according to the present disclosure.
- the imaging unit 2 includes a plurality of polarization lights 21 , 22 , 23 , and 24 , the camera 10 , and an imaging control unit 12 .
- each of the polarization lights 21 , 22 , 23 , and 24 includes one of the light sources L 1 , L 2 , L 3 , and L 4 , which are white LEDs, and one of the polarization filters F 1 , F 2 , F 3 , and F 4 , which have different polarization directions.
- the polarization filters F 1 , F 2 , F 3 , and F 4 for example, selectively transmit light having polarization directions of 0°, 45°, 90°, and 135°, respectively.
- the camera 10 includes a polarization sensor 11 .
- the polarization sensor 11 includes a pixel array 13 in which a plurality of imaging elements is arranged in a matrix shape, a polarization filter 14 that selectively causes light having different polarization directions associated with the respective imaging elements to enter the imaging elements, and microlenses 15 provided to every imaging element.
- the imaging control unit 12 simultaneously turns on all of the plurality of polarization lights 21 , 22 , 23 , and 24 , then causes the camera 10 to image the subject 100 (see FIG. 1 ), and then turns off the polarization lights 21 , 22 , 23 , and 24 .
- the imaging control unit 12 causes the camera 10 to repeatedly perform imaging while all of the plurality of polarization lights 21 , 22 , 23 , and 24 are simultaneously turned on and continues imaging until there is an instruction to stop the imaging from a user.
- the imaging control unit 12 turns off the polarization lights 21 , 22 , 23 , and 24 after the moving image has been captured.
- the imaging control unit 12 acquires image data of the image that has been captured from the camera 10 and outputs the image data to the signal separating unit 5 in the subsequent stage.
- FIG. 13 is a block diagram illustrating an exemplary configuration of the signal separating unit according to the present disclosure.
- the signal separating unit 5 according to the present disclosure includes a preprocessing unit 51 , a polarization demosaic unit 52 , a polarization model estimating unit 53 , and a polarization luminance calculating unit 54 .
- the preprocessing unit 51 performs linearization of the output luminance i′ of the camera 10 in the image data input from the camera 10 and shading correction.
- the preprocessing unit 51 performs linearization of the output luminance i′ using the following Equation (2).
- the preprocessing unit 51 calculates linearized output luminance j′ x, y by applying an inverse transform function of a characteristic function of the camera 10 to output luminance j x, y of each pixel.
- the preprocessing unit 51 performs shading correction using the following Equation (3).
- the preprocessing unit 51 calculates shading-corrected output luminance j′′ x, y by dividing each linearized output luminance j′ x, y by the luminance l 1 , l 2 , . . . , l M of the corresponding pixels in the images I 11 , I 12 , I 13 , and I 14 illustrated in FIG. 5 .
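The two preprocessing steps (Equations (2) and (3)) can be sketched together. This assumes the inverse transform function and the per-light shading image are already available; the function name is illustrative.

```python
import numpy as np

# Linearize the camera output with the stored inverse transform function
# (Equation (2)), then divide by the shading image captured under the
# same light source (Equation (3)).
def preprocess(j, inverse_response, shading):
    j_lin = inverse_response(j)              # j'_{x,y}
    eps = 1e-8                               # guard against near-zero shading
    return j_lin / np.maximum(shading, eps)  # j''_{x,y}
```

For a surface whose image is the shading pattern scaled by a uniform albedo, the corrected output is that albedo everywhere.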
- the polarization demosaic unit 52 obtains, from the data in the directions of 0°, 45°, 90°, and 135° assigned to each pixel, data in these four directions for each pixel (j′′′ x, y (0), j′′′ x, y (45), j′′′ x, y (90), j′′′ x, y (135)).
- FIG. 14 is an explanatory diagram of a polarization demosaic process according to the present disclosure.
- the following Equations (4) are used.
- the polarization demosaic unit 52 calculates, by interpolation, the data of A, B, and C using the data a, b, c, and d of imaging elements that include a wire grid of 90° (see FIG. 12 ) and thereby obtains the data of an image I 31 having been subjected to the polarization demosaic process.
- the polarization demosaic unit 52 calculates data for every pixel by a similar method also for data in the polarization directions of 0°, 45°, and 135°. Referring back to FIG. 13 , the polarization demosaic unit 52 outputs the calculated data (j′′′ x, y (0), j′′′ x, y (45), j′′′ x, y (90), j′′′ x, y (135)) to the polarization model estimating unit 53 .
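The per-direction interpolation can be sketched as a normalized box filter over a 2×2 analyzer mosaic. The 0°/45°/135°/90° layout and the function name below are assumptions for illustration, not necessarily the sensor's actual pattern:

```python
import numpy as np

# Assumed 2x2 tiling of analyzer angles on the polarization sensor.
ANGLES = {0: (0, 0), 45: (0, 1), 135: (1, 0), 90: (1, 1)}

# For each direction, keep the same-direction samples and fill missing
# pixels with the average of same-direction samples in the 3x3
# neighbourhood (normalized convolution), analogous to interpolating
# A, B, and C from the samples a, b, c, and d in the text.
def demosaic(raw):
    h, w = raw.shape
    out = {}
    for ang, (r, c) in ANGLES.items():
        mask = np.zeros_like(raw)
        mask[r::2, c::2] = 1.0
        pad_s = np.pad(raw * mask, 1)
        pad_m = np.pad(mask, 1)
        num = sum(pad_s[i:i + h, j:j + w] for i in range(3) for j in range(3))
        den = sum(pad_m[i:i + h, j:j + w] for i in range(3) for j in range(3))
        out[ang] = num / np.maximum(den, 1e-8)
    return out
```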
- the polarization model estimating unit 53 estimates a polarization model indicating a correspondence relationship between the polarization angle and the luminance.
- FIG. 15 is an explanatory graph illustrating an example of a polarization model according to the present disclosure.
- the polarization model estimating unit 53 estimates the polarization model illustrated in FIG. 15 using the polarization sensor data (j′′′ x, y (0), j′′′ x, y (45), j′′′ x, y (90), j′′′ x, y (135)) obtained for each pixel.
- the signal separating unit 5 can estimate the luminance I(θ) for any polarization angle θ by using such a polarization model.
- the polarization model illustrated in FIG. 15 is expressed by a formula as the following Equation (5).
- the polarization model estimating unit 53 obtains I max , I min , and φ, which are the unknown parameters in the above Equation (5), from the imaging data I(θ 1 ), . . . , I(θ M ).
- Equation (5) is expressed by a matrix as the following Equation (6).
- Equation (6) is obtained as the following Equations (7).
- Equations (7) are transformed using an inverse matrix A −1 of the known matrix A, and the following Equation (8) is obtained.
- the polarization model estimating unit 53 can obtain the unknown parameters I max , I min , and ⁇ by the following Equations (9).
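The recovery of I max , I min , and φ can be sketched as a linear least-squares fit: expanding the cosine rewrites Equation (5) as a + b·cos 2θ + c·sin 2θ, mirroring the matrix form of Equations (6) to (9). Function names are illustrative:

```python
import numpy as np

# I(theta) = (Imax + Imin)/2 + (Imax - Imin)/2 * cos(2*theta - 2*phi)
# is linear in (a, b, c) with a = (Imax + Imin)/2, b = amp*cos(2*phi),
# c = amp*sin(2*phi), and amp = (Imax - Imin)/2.
def fit_polarization_model(thetas_deg, intensities):
    t = np.deg2rad(np.asarray(thetas_deg, dtype=float))
    A = np.stack([np.ones_like(t), np.cos(2 * t), np.sin(2 * t)], axis=1)
    a, b, c = np.linalg.lstsq(A, np.asarray(intensities, dtype=float),
                              rcond=None)[0]
    amp = np.hypot(b, c)
    phi = 0.5 * np.arctan2(c, b)      # polarization angle (radians)
    return a + amp, a - amp, phi      # Imax, Imin, phi

def model_intensity(theta_deg, i_max, i_min, phi):
    t = np.deg2rad(theta_deg)
    return (i_max + i_min) / 2 + (i_max - i_min) / 2 * np.cos(2 * t - 2 * phi)
```

With four samples at 0°, 45°, 90°, and 135° the system is overdetermined by one equation, so the least-squares solve also averages out a little noise.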
- the polarization luminance calculating unit 54 calculates the luminance of each pixel for each polarization direction by substituting the angle of each polarization direction in the light source direction and polarization direction correspondence table 55 into Equation (5) with the parameters I max , I min , and φ obtained by the polarization model estimating unit 53 .
- FIG. 16 is an explanatory diagram of the normal line calculation method according to the present disclosure.
- the normal line calculating unit 6 calculates a normal vector n for every pixel by calculating the following Equations (10) and (12) using a light source vector S i corresponding to the light source direction in the light source direction and polarization direction correspondence table 55 and the luminance i of each pixel input from the polarization luminance calculating unit 54 .
- the normal line calculating unit 6 calculates the normal vector n for every pixel by performing calculations of the following Equations (11) and (12).
- M in Equation (11) is 4 in this example.
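With the M = 4 light source vectors stacked into a matrix S, the per-pixel normal is the normalized least-squares solution of S n = i. A minimal sketch (the example light source matrix below is hypothetical):

```python
import numpy as np

# Photometric-stereo normal for one pixel: solve S b = i in the
# least-squares sense (b is the albedo-scaled normal), then normalize.
def normal_from_luminance(S, i):
    b = np.linalg.lstsq(np.asarray(S, dtype=float),
                        np.asarray(i, dtype=float), rcond=None)[0]
    return b / np.linalg.norm(b)

# Hypothetical light source directions (each row is a unit vector S_i).
S = np.array([[1.0, 0.0, 1.0],
              [-1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, -1.0, 1.0]]) / np.sqrt(2.0)
```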
- the distance estimation unit 7 calculates a distance Z from a certain reference point to a corresponding point on the subject for each pixel by using normal line information obtained for each pixel.
- the distance estimation unit 7 calculates the distance Z using, for example, Frankot-Chellappa Algorithm expressed by the following Equation (13) using a Fourier basis.
- Variables p and q in the above Equation (13) are the x component and the y component, respectively, of the normal vector n calculated by the normal line calculating unit 6 .
- F denotes a Fourier transform
- ω x denotes a spatial frequency (x)
- ω y denotes a spatial frequency (y).
- the distance estimation unit 7 sets a certain reference point in advance, integrates the gradient field from the reference point, and estimates the shape (distance Z) of the subject. At this point, the distance estimation unit 7 calculates the distance Z so that the gradient of the estimated shape matches the given gradient field.
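The Fourier-basis integration of Equation (13) can be sketched as follows: Z = F⁻¹[(−jω x F(p) − jω y F(q)) / (ω x² + ω y²)]. This is a minimal periodic-boundary sketch of the Frankot-Chellappa projection, not the distance estimation unit 7's exact implementation.

```python
import numpy as np

def frankot_chellappa(p, q):
    """Integrate a gradient field (p = dZ/dx, q = dZ/dy) into a surface Z
    by projecting onto the integrable Fourier basis (Frankot-Chellappa)."""
    h, w = p.shape
    wx = np.fft.fftfreq(w) * 2 * np.pi      # spatial frequencies omega_x
    wy = np.fft.fftfreq(h) * 2 * np.pi      # spatial frequencies omega_y
    WX, WY = np.meshgrid(wx, wy)
    denom = WX ** 2 + WY ** 2
    denom[0, 0] = 1.0                       # avoid division by zero at DC
    Fz = (-1j * WX * np.fft.fft2(p) - 1j * WY * np.fft.fft2(q)) / denom
    Fz[0, 0] = 0.0                          # DC term: reference height is arbitrary
    return np.real(np.fft.ifft2(Fz))
```

Zeroing the DC term corresponds to fixing the integration constant, which is why a reference point is needed to anchor the absolute distance.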
- FIGS. 17 to 19 are flowcharts illustrating processes executed by the imaging device according to the present disclosure.
- FIG. 17 is an example of a calibration process executed by the calibration unit 4 .
- Illustrated in FIG. 18 is an example of a three-dimensional shape measuring process performed by the imaging device.
- Illustrated in FIG. 19 is an example of a signal separating process in the three-dimensional shape measuring process.
- the calibration unit 4 first calculates and stores an inverse transform function of the camera characteristics (characteristic function indicating the relationship between the exposure amount and the output luminance of the camera 10 illustrated in FIG. 3 ) (Step S 101 ). Subsequently, the calibration unit 4 acquires and stores shading data (see FIG. 5 ) (Step S 102 ).
- the calibration unit 4 calculates the light source direction of each of the light sources (light sources L 1 , L 2 , L 3 , and L 4 ) (Step S 103 ) and calculates the polarization direction of each of the light sources (light sources L 1 , L 2 , L 3 , and L 4 ) (Step S 104 ). Then, the calibration unit 4 stores the light source directions and the polarization directions in association with each other as the light source direction and polarization direction correspondence table 55 (Step S 105 ) and ends the calibration process.
- the imaging unit 2 performs an imaging process (Step S 201 ).
- the imaging unit 2 simultaneously irradiates a three-dimensional measurement target with light having different polarization directions from a plurality of directions and captures an image of the measurement target by the polarization sensor 11 .
- the signal separating unit 5 performs the signal separating process of separating image signals corresponding to each of the polarization directions from the image captured by the imaging unit 2 (Step S 202 ).
- the signal separating unit 5 first performs preprocessing (Step S 301 ).
- the signal separating unit 5 linearizes the output luminance of the camera 10 in the captured image using the inverse transform function of the camera characteristic and performs shading correction of the captured image using the shading data.
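The two preprocessing steps can be sketched as below. The look-up-table representation of the inverse transform function and the division-based (flat-field) shading correction are assumptions of this sketch; the patent only specifies that the camera response is linearized and that the stored shading data is used.

```python
import numpy as np

def preprocess(raw, inverse_lut, shading):
    """Preprocessing sketch: linearize the camera response with an inverse
    look-up table, then divide out per-pixel shading.
    raw: uint8 captured image.
    inverse_lut: (256,) table mapping output luminance -> linear exposure.
    shading: same-shape reference image captured from a gray plate."""
    linear = inverse_lut[raw]                  # undo nonlinear camera response
    return linear / np.maximum(shading, 1e-6)  # flat-field (shading) correction
```

Dividing by the gray-plate image normalizes away the position-dependent illumination falloff recorded during calibration.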
- the signal separating unit 5 performs a polarization demosaic process (Step S 302 ).
- the signal separating unit 5 generates image data for every polarization direction by interpolating the shading-corrected captured image through the demosaic process (see FIG. 14 ).
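A minimal demosaic sketch follows, assuming a repeating 2×2 polarizer mosaic with 0° at (0,0), 45° at (0,1), 90° at (1,1), and 135° at (1,0) and even image dimensions; this layout and the nearest-neighbor upsampling are stand-ins for whichever mosaic pattern and interpolation filter the polarization demosaic unit 52 actually uses.

```python
import numpy as np

def polarization_demosaic(raw):
    """Split a 2x2 polarizer-mosaic image into one full-resolution image
    per polarization angle (nearest-neighbor interpolation sketch).
    raw: (H, W) array with H and W even."""
    layout = {0: (0, 0), 45: (0, 1), 90: (1, 1), 135: (1, 0)}  # assumed
    planes = {}
    for angle, (dy, dx) in layout.items():
        sub = raw[dy::2, dx::2]                       # one angle's samples
        # Upsample back to full resolution by pixel replication.
        planes[angle] = np.kron(sub, np.ones((2, 2), dtype=raw.dtype))
    return planes
```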
- the signal separating unit 5 performs a polarization model estimating process (Step S 303 ).
- the signal separating unit 5 estimates a polarization model by calculating unknown parameters (I max , I min , and ⁇ ) in the polarization model (Equation (5)) from the image data (luminance for every pixel) for every polarization direction.
- the signal separating unit 5 performs a polarization luminance calculating process (Step S 304 ).
- the signal separating unit 5 calculates the luminance of each pixel in the image for every light source direction on the basis of the polarization directions corresponding to the light source directions included in the light source direction and polarization direction correspondence table 55 and the polarization model and outputs the luminance to the normal line calculating unit 6 .
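Given the fitted per-pixel parameters, evaluating the model at each light source's polarization angle from the correspondence table can be sketched as follows; the dictionary-based table and function name are illustrative assumptions.

```python
import numpy as np

def per_light_luminance(I_max, I_min, phi, light_angles):
    """Evaluate the fitted polarization model at each light source's
    polarization angle (from the correspondence table).
    I_max, I_min, phi: per-pixel arrays (or scalars) of model parameters.
    light_angles: dict mapping light source name -> polarization angle [rad]."""
    mean = (I_max + I_min) / 2.0
    amp = (I_max - I_min) / 2.0
    # One luminance image per light source, as if that light shone alone.
    return {name: mean + amp * np.cos(2.0 * (theta - phi))
            for name, theta in light_angles.items()}
```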
- the normal line calculating unit 6 performs a normal line calculating process (Step S 203 ).
- the normal line calculating unit 6 calculates a normal vector on the surface of the measurement target for every pixel on the basis of the luminance of each pixel in the image for every light source direction calculated by the signal separating process and the known light source direction.
- the distance estimation unit 7 performs a distance estimating process (Step S 204 ).
- the distance estimation unit 7 measures the three-dimensional shape of the measurement target by calculating the distance from a predetermined reference point to a point on the measurement target for every pixel using the normal vector for every pixel calculated in the normal line calculating process.
- FIG. 20 is an explanatory diagram illustrating a modification of the camera according to the present disclosure.
- the camera according to the modification includes a polarization sensor 10 A illustrated in FIG. 20 .
- the polarization sensor 10 A includes beam splitters 15 a , 15 b , 15 c , and 15 d , image sensors 10 a , 10 b , 10 c , and 10 d , and polarization filters 11 a , 11 b , 11 c , and 11 d.
- the beam splitters 15 a , 15 b , 15 c , and 15 d divide incident light into a plurality of light beams.
- the image sensors 10 a , 10 b , 10 c , and 10 d receive respective light beams.
- the polarization filters 11 a , 11 b , 11 c , and 11 d are arranged between the image sensors 10 a , 10 b , 10 c , and 10 d and the beam splitters 15 a , 15 b , 15 c , and 15 d and have different polarization directions for each of the image sensors 10 a , 10 b , 10 c , and 10 d.
- the polarization filter 11 a selectively transmits, for example, light having a polarization angle of 0°.
- the polarization filter 11 b selectively transmits, for example, light having a polarization angle of 45°.
- the polarization filter 11 c selectively transmits, for example, light having a polarization angle of 90°.
- the polarization filter 11 d selectively transmits, for example, light having a polarization angle of 135°.
- the image sensor 10 a can capture an image of a subject to which only the light having the polarization angle of 0° is emitted.
- the image sensor 10 b can capture an image of the subject to which only the light having the polarization angle of 45° is emitted.
- the image sensor 10 c can capture an image of the subject to which only the light having the polarization angle of 90° is emitted.
- the image sensor 10 d can capture an image of the subject to which only the light having the polarization angle of 135° is emitted.
- the polarization demosaic process performed by the polarization demosaic unit 52 becomes unnecessary, and thus it becomes possible to reduce the processing load.
- the imaging device includes the imaging unit 2 , the signal separating unit 5 , the normal line calculating unit 6 , and the distance estimation unit 7 .
- the imaging unit 2 includes the plurality of polarization lights 21 , 22 , 23 , and 24 having different polarization directions of light emitted to the subject 100 and the polarization sensor 11 and captures an image of the subject 100 that is simultaneously irradiated with the light from the plurality of polarization lights 21 , 22 , 23 , and 24 .
- the signal separating unit 5 separates pixel signals corresponding to each of the polarization directions from the image captured by the imaging unit 2 and generates an image for every polarization direction.
- the normal line calculating unit 6 calculates a normal line on the surface of the subject from the image for every polarization direction by photometric stereo.
- the distance estimation unit 7 estimates the shape of the subject on the basis of the normal line calculated by the normal line calculating unit. As a result, the imaging device 1 can accurately measure the three-dimensional shape of a moving object.
- the imaging device 1 further includes the storage unit that stores the light source direction and polarization direction correspondence table 55 that is correspondence information in which the polarization lights 21 , 22 , 23 , and 24 , the polarization directions of light emitted by the polarization lights 21 , 22 , 23 , and 24 , and directions of the polarization lights 21 , 22 , 23 , and 24 with respect to the subject 100 are associated with each other.
- the signal separating unit 5 estimates a polarization model indicating a correspondence relationship between any polarization direction and the luminance of each pixel in an image of the subject irradiated with light having that polarization direction on the basis of the luminance of each pixel in the image for every polarization direction and calculates the luminance of each pixel in the image for each of the polarization lights 21 , 22 , 23 , and 24 on the basis of the polarization model and the correspondence information.
- the imaging device 1 can calculate accurate luminance of each pixel in the image for each of the polarization lights 21 , 22 , 23 , and 24 .
- the normal line calculating unit 6 calculates the normal line on the surface of the subject 100 on the basis of the luminance of each pixel in the image for each of the polarization lights 21 , 22 , 23 , and 24 and the correspondence information. As a result, the imaging device 1 can calculate a more accurate normal line.
- the polarization filters F 1 , F 2 , F 3 , and F 4 having different polarization directions are provided on the light emission surfaces of the light sources L 1 , L 2 , L 3 , and L 4 , respectively.
- the imaging device 1 can increase the number of light source directions without increasing the cost, for example, simply by increasing the number of light sources that emit white light and the number of polarization directions of light transmitted by the polarization filters provided on the light sources.
- the polarization sensor 11 includes the pixel array 13 and the polarization filter 14 .
- In the pixel array 13 , a plurality of imaging elements is arrayed in a matrix.
- the polarization filter 14 selectively causes light of different polarization directions associated with the imaging elements to enter the imaging elements.
- the imaging device 1 can capture an image for each of a plurality of light rays having different polarization directions by the single pixel array 13 .
- the polarization sensor 11 includes the beam splitters 15 a , 15 b , 15 c , and 15 d , the image sensors 10 a , 10 b , 10 c , and 10 d , and the polarization filters 11 a , 11 b , 11 c , and 11 d .
- the beam splitters 15 a , 15 b , 15 c , and 15 d divide incident light into a plurality of light beams.
- the image sensors 10 a , 10 b , 10 c , and 10 d receive respective light beams.
- the polarization filters 11 a , 11 b , 11 c , and 11 d are arranged between the image sensors 10 a , 10 b , 10 c , and 10 d and the beam splitters 15 a , 15 b , 15 c , and 15 d and have different polarization directions for each of the image sensors 10 a , 10 b , 10 c , and 10 d .
- the imaging device 1 does not need to perform the polarization demosaic process, and thus it becomes possible to reduce the processing load.
- the imaging method includes, by a computer: capturing, by the polarization sensor 11 , an image of the subject 100 that is simultaneously irradiated with light from a plurality of lights having different polarization directions; separating pixel signals corresponding to each of the polarization directions from the captured image and generating an image for each of the polarization directions; calculating a normal line on the surface of the subject from the image for each of the polarization directions by photometric stereo; and estimating the shape of the subject on the basis of the normal line.
- the imaging method can accurately measure the three-dimensional shape of a moving object.
- the information processing device 3 includes the storage unit, the signal separating unit 5 , the normal line calculating unit 6 , and the distance estimation unit 7 .
- the storage unit stores the light source direction and polarization direction correspondence table 55 that is correspondence information in which the plurality of polarization lights 21 , 22 , 23 , and 24 having different polarization directions of light emitted to the subject 100 , polarization directions of light emitted by the polarization lights 21 , 22 , 23 , and 24 , and directions of the polarization lights 21 , 22 , 23 , and 24 with respect to the subject are associated with each other.
- the signal separating unit 5 separates pixel signals corresponding to each of the polarization directions from the image in which the subject 100 simultaneously irradiated with light from the plurality of polarization lights 21 , 22 , 23 , and 24 is imaged by the polarization sensor 11 and generates an image for each of the lights on the basis of the correspondence information.
- the normal line calculating unit 6 calculates the normal line on the surface of the subject 100 from the image for each of the polarization lights 21 , 22 , 23 , and 24 by the photometric stereo.
- the distance estimation unit 7 estimates the shape of the subject 100 on the basis of the normal line calculated by the normal line calculating unit 6 . As a result, the information processing device 3 can accurately measure the three-dimensional shape of a moving object.
- the information processing method includes, by a computer: storing the light source direction and polarization direction correspondence table 55 which is correspondence information in which a plurality of lights 21 , 22 , 23 , and 24 having different polarization directions of light emitted to the subject 100 , the polarization directions of light emitted by the lights 21 , 22 , 23 , and 24 , and directions of the lights 21 , 22 , 23 , and 24 with respect to the subject 100 are associated with each other; separating pixel signals corresponding to each of the polarization directions from an image, in which the subject 100 simultaneously irradiated with light from the plurality of polarization lights 21 , 22 , 23 , and 24 is captured by the polarization sensor 11 , and generating an image for each of the polarization lights 21 , 22 , 23 , and 24 on the basis of the correspondence information; calculating a normal line on the surface of the subject 100 from the image for each of the polarization lights 21 , 22 , 23 , and 24 by photometric stereo; and estimating the shape of the subject 100 on the basis of the normal line.
- An imaging device including:
- the imaging device further including:
- the imaging device according to any one of (1) to (3),
- the imaging device according to any one of (1) to (4),
- the imaging device according to any one of (1) to (4),
- An imaging method including the steps of: by a computer,
- An information processing device including:
- An information processing method including, by a computer:
Abstract
An imaging device (1) according to the present disclosure includes an imaging unit (2), a signal separating unit (5), a normal line calculating unit (6), and a distance estimation unit (7). The imaging unit (2) includes a plurality of polarization lights (21 to 24) having different polarization directions of light emitted to a subject and a polarization sensor (11) and captures an image of the subject that is simultaneously irradiated with the light from the plurality of polarization lights (21 to 24). The signal separating unit (5) separates pixel signals corresponding to each of the polarization directions from the image captured by the imaging unit (2) and generates an image for every polarization direction. The normal line calculating unit (6) calculates a normal line on a surface of the subject from the image for each of the polarization directions by photometric stereo. The distance estimation unit (7) estimates the shape of the subject on the basis of the normal line calculated by the normal line calculating unit (6).
Description
- The present disclosure relates to an imaging device, an information processing device, an imaging method, and an information processing method.
- As one method for measuring a three-dimensional shape of a subject, there is photometric stereo. In photometric stereo, light is sequentially and separately emitted to a subject from a plurality of directions, and a distance to the subject or a three-dimensional shape of the subject is measured from the differences in shading (see, for example, Patent Literature 1).
-
- Patent Literature 1: JP 2017-72499 A
- However, in the above conventional technology, the subject is sequentially imaged by a camera while switching lights, and thus in a case where the subject is a moving object, a positional shift due to movement of the subject occurs during switching of the lights, and accurate measurement cannot be performed.
- Therefore, the present disclosure proposes an imaging device, an information processing device, an imaging method, and an information processing method capable of accurately measuring a three-dimensional shape of a moving object.
- According to the present disclosure, an imaging device is provided. The imaging device includes an imaging unit, a separating unit, a calculating unit, and an estimation unit. The imaging unit includes a plurality of polarization lights having different polarization directions of light emitted to a subject and a polarization sensor and captures an image of the subject that is simultaneously irradiated with the light from the plurality of polarization lights. The separating unit separates pixel signals corresponding to each of the polarization directions from the image captured by the imaging unit and generates an image for every polarization direction. The calculating unit calculates a normal line on a surface of the subject from the image for each of the polarization directions by photometric stereo. The estimation unit estimates the shape of the subject on the basis of the normal line calculated by the calculating unit.
-
FIG. 1 is an explanatory diagram illustrating an overview of an imaging method and an information processing method according to the present disclosure. -
FIG. 2 is a block diagram illustrating an exemplary configuration of an imaging device according to the present disclosure. -
FIG. 3 is an explanatory diagram illustrating a relationship between an exposure amount and output luminance of a camera according to the present disclosure. -
FIG. 4 is an explanatory diagram of an acquisition method of shading data according to the present disclosure. -
FIG. 5 is an explanatory diagram of the acquisition method of shading data according to the present disclosure. -
FIG. 6 is an explanatory diagram of a detection method of a light source direction according to the present disclosure. -
FIG. 7 is an explanatory diagram of the detection method of a light source direction according to the present disclosure. -
FIG. 8 is an explanatory diagram of the detection method of a light source direction according to the present disclosure. -
FIG. 9 is an explanatory table illustrating an example of a light source direction and polarization direction correspondence table according to the present disclosure. -
FIG. 10 is an explanatory diagram of an imaging unit according to the present disclosure. -
FIG. 11 is an explanatory diagram of an imaging unit according to the present disclosure. -
FIG. 12 is an explanatory diagram of an imaging unit according to the present disclosure. -
FIG. 13 is a block diagram illustrating an exemplary configuration of a signal separating unit according to the present disclosure. -
FIG. 14 is an explanatory diagram of a polarization demosaic process according to the present disclosure. -
FIG. 15 is an explanatory graph illustrating an example of a polarization model according to the present disclosure. -
FIG. 16 is an explanatory diagram of a normal line calculation method according to the present disclosure. -
FIG. 17 is a flowchart illustrating processes executed by the imaging device according to the present disclosure. -
FIG. 18 is a flowchart illustrating processes executed by the imaging device according to the present disclosure. -
FIG. 19 is a flowchart illustrating processes executed by the imaging device according to the present disclosure. -
FIG. 20 is an explanatory diagram illustrating a modification of the camera according to the present disclosure. - Hereinafter, embodiments of the present disclosure will be described in detail on the basis of the drawings. Note that in each of the following embodiments, the same parts are denoted by the same symbols, and redundant description will be omitted.
- [1. Problem Behind Present Disclosure]
- In the basic photometric stereo, light is emitted to a subject while sequentially switching a plurality of lights having different light emission directions with respect to the subject, an image of the subject illuminated by each light is captured, and the three-dimensional shape of the subject is measured on the basis of differences in the shade of the subject in the images.
- However, in this method, a positional shift due to movement of the subject occurs while the lights are switched, and accurate measurement cannot be performed. It is therefore difficult to apply the method to a subject that is a moving object, and an accurate three-dimensional shape can be measured only for a subject that is a stationary object.
- Therefore, there is a measurement method of a wavelength multiplexing system in which lights are multiplexed using differences in the color of light emitted to a subject, and the three-dimensional shape of the subject is measured by one-time imaging without switching the lights. In the wavelength multiplexing system, the three-dimensional shape of the subject is measured by imaging the subject while simultaneously emitting light having different wavelengths (colors) to the subject from a plurality of lights, extracting each color component from the captured image, and obtaining the shade that would be obtained if the light emission were performed with each of the lights alone.
- As described above, in the wavelength multiplexing system, it is not necessary to switch lights, and the three-dimensional shape of a subject can be measured by one-time imaging. Therefore, even in a case where the subject is a moving object, it is possible to measure the three-dimensional shape of the subject.
- However, in the wavelength multiplexing system, for example, the color of light to be emitted to the subject is changed by applying different narrowband bandpass filters to the lights. The amount of light is therefore reduced by transmission through the narrowband bandpass filters, and the S/N ratio deteriorates, which may degrade the measurement accuracy.
- In addition, in a case where it is difficult to discriminate between the color of a light and the color of the subject, a measurement error may occur. Furthermore, in a case where the number of colors of light to be emitted to a subject is increased, it is necessary to use narrower bandpass filters or to develop additional light emitting diodes (LEDs) having different colors, which increases the cost. Meanwhile, an imaging device, an information processing device, an imaging method, and an information processing method according to the present disclosure accurately measure the three-dimensional shape of a moving object without increasing the cost.
- [2. Overview of Imaging Method and Information Processing Method]
- First, an overview of an imaging method and an information processing method according to the present disclosure will be described with reference to
FIG. 1 . FIG. 1 is an explanatory diagram illustrating an overview of an imaging method and an information processing method according to the present disclosure. - The imaging method and the information processing method according to the present disclosure enable measurement of the three-dimensional shape of a moving object by multiplexing lights using polarized light: light beams having different polarization directions are simultaneously emitted to the subject from a plurality of directions, and the subject is imaged in a single capture.
- For example, as illustrated in
FIG. 1 , one camera 10 and a plurality of (here, four) light sources L 1 , L 2 , L 3 , and L 4 are prepared. The light sources L 1 , L 2 , L 3 , and L 4 are arranged in different directions (hereinafter referred to as light source directions) S 1 , S 2 , S 3 , and S 4 with respect to a subject 100 . Note that the number of light sources is not limited to four. - Moreover, the light sources L 1 , L 2 , L 3 , and L 4 include, at light emission units, polarization filters F 1 , F 2 , F 3 , and F 4 having different polarization directions, respectively, and each emit light having a different polarization direction to the
subject 100. In the light sources L1, L2, L3, and L4, the polarization direction of light to emit and the light source direction (position with respect to the subject) are associated with each other in advance. - The
camera 10 includes a polarization sensor 11 . In the imaging method and the information processing method according to the present disclosure, each polarization direction component is separated from image data acquired by the polarization sensor 11 by a signal separating process. Then, on the basis of the correspondence relationship between the polarization directions of light and the light source directions associated in advance, images I 1 , I 2 , I 3 , and I 4 , each of which would be obtained in a case where light is emitted from one of the light sources L 1 , L 2 , L 3 , and L 4 in the respective directions, are calculated.
- According to this method, in a case where the number of light sources is increased, it is only necessary to change the direction of the polarization filters provided in each of the lights, and thus the cost does not increase, and the amount of light does not decrease since no narrowband bandpass filters are used. Therefore, it is possible to improve the measurement accuracy of a three-dimensional shape.
- [3. Configuration of Imaging Device]
- Next, the configuration of the imaging device according to the present disclosure will be described with reference to
FIG. 2 .FIG. 2 is a block diagram illustrating an exemplary configuration of the imaging device according to the present disclosure. As illustrated inFIG. 2 , animaging device 1 according to the present disclosure includes animaging unit 2 and aninformation processing device 3. - The
imaging unit 2 includes the light sources L1, L2, L3, and L4 including the polarization filters F1, F2, F3, and F4, respectively, and thecamera 10 illustrated inFIG. 1 . Theimaging unit 2 causes thecamera 10 to capture an image of the subject 100 in a state in which light having different polarization directions is simultaneously emitted from the light sources L1, L2, L3, and L4 via the polarization filters F1, F2, F3, and F4, respectively, and outputs image data of the captured image to theinformation processing device 3. - The
information processing device 3 includes, for example, a microcomputer including a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), and the like and various types of circuits. - The
information processing device 3 includes acalibration unit 4 that functions when the CPU executes an information processing program stored in the ROM using the RAM as a work area, asignal separating unit 5, a normalline calculating unit 6, and a distance estimation unit 7. - Note that, some or all of the
calibration unit 4, thesignal separating unit 5, the normalline calculating unit 6, and the distance estimation unit 7 included in theinformation processing device 3 may be configured by hardware such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA). Furthermore, thesignal separating unit 5, the normalline calculating unit 6, and the distance estimation unit 7 are not necessarily included in theinformation processing device 3 and may be provided on a cloud, for example. - The
calibration unit 4, thesignal separating unit 5, the normalline calculating unit 6, and the distance estimation unit 7 included in theinformation processing device 3 each implement or execute an action of information processing described below. Note that the internal configuration of theinformation processing device 3 is not limited to the configuration illustrated inFIG. 2 and may be another configuration as long as the information processing described below is performed. - [4.1. Calibration Unit]
- The
calibration unit 4 includes a storage unit and stores information regarding linearization of the output luminance of thecamera 10, information of shading data generated in a case where light is emitted by each of the light sources, information in which the light source directions and the polarization directions are associated with each other, and other information. - [4.1.1. Linearization of Output Luminance of Camera]
-
FIG. 3 is an explanatory diagram illustrating a relationship between the exposure amount and the output luminance of the camera according to the present disclosure. As illustrated inFIG. 3 , in thecamera 10, the output luminance i′ in a captured image increases as the exposure amount i increases. However, as indicated by a solid line inFIG. 3 , the changes in the output luminance i′ of thecamera 10 accompanying the changes in the exposure amount i is not linear. - Therefore, the
calibration unit 4 measures the luminance of images sequentially captured while changing the exposure amount i of thecamera 10, calculates a characteristic function indicating the relationship between the exposure amount i and the output luminance i′ of thecamera 10 indicated by the solid line inFIG. 3 , and stores an inverse transform function of the characteristic function. The inverse transform function is used when a preprocessing unit 51 (seeFIG. 13 ) to be described later included in thesignal separating unit 5 linearizes an image signal. As a result, the characteristic of the output luminance i′ of thecamera 10 is corrected to be linear as indicated by a dotted line inFIG. 3 . - [4.1.2. Storing of Shading Data]
- The
calibration unit 4 also acquires and stores shading data indicating the state of shade made when light is emitted by each of the light sources L1, L2, L3, and L4. -
FIGS. 4 and 5 are explanatory diagrams of an acquisition method of shading data according to the present disclosure. As illustrated inFIG. 4 , in a case where shading data is acquired, for example, agray plate 101 is disposed at a place where the subject 100 (seeFIG. 1 ) is placed. Then, thegray plate 101 is sequentially irradiated with light by each of the light sources L1, L2, L3, and L4, and images of thegray plate 101 are captured by thecamera 10. - At this point, the positions of the light sources L1, L2, L3, and L4 with respect to the
gray plate 101 are different from each other. Therefore, as illustrated inFIG. 4 , for example, while the light source L1 is turned on, an image I11 in which the shade becomes darker from the lower left toward the upper right of the image is captured, and while the light source L2 is turned on, an image I12 in which the shade becomes darker from the upper left toward the lower right of the image is captured. - Likewise, while the light source L3 is turned on, an image I13 in which the shade becomes darker from the upper right toward the lower left of the image is captured, and while the light source L4 is turned on, an image I14 in which the shade becomes darker from the lower right toward the upper left of the image is captured. The
calibration unit 4 acquires and stores image data of these images I11, I12, I13, and I14 as shading data. - [4.1.3. Detection of Light Source Direction]
- The
calibration unit 4 also detects each of the light source directions. FIGS. 6 to 8 are explanatory diagrams of a detection method of a light source direction according to the present disclosure. As illustrated in FIG. 6, in a case where a light source direction is detected, for example, a spherical object 102, the surface of which is mirror-finished and the shape of which is known, is disposed at a place where the subject 100 (see FIG. 1) is placed. Then, the spherical object 102 is sequentially irradiated with light by each of the light sources L1, L2, L3, and L4, and images of the spherical object 102 are captured by the camera 10. - At this point, the positions of the light sources L1, L2, L3, and L4 with respect to the
spherical object 102 are different from each other. Therefore, as illustrated in FIG. 7, for example, while the light source L1 is turned on, an image I21 in which a position on the lower left of the spherical object 102 appears as a highlight portion with a high luminance is captured, and while the light source L2 is turned on, an image I22 in which a position on the upper left of the spherical object 102 appears as a highlight portion is captured. - In addition, while the light source L3 is turned on, an image I23 in which a position on the upper right of the
spherical object 102 appears as a highlight portion is captured, and while the light source L4 is turned on, an image I24 in which a position on the lower right of the spherical object 102 appears as a highlight portion is captured. - At this point, as illustrated in
FIG. 8, a normal line direction n at the center of a highlight portion is obtained as the half vector of a line-of-sight direction (the imaging direction of the camera 10) V and a light source direction S. Therefore, the calibration unit 4 can calculate the light source direction of each of the light sources L1, L2, L3, and L4 by the following Equation (1). -
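The equation image is not reproduced in this text; a minimal sketch of the computation follows, assuming the standard half-vector relation n = (V + S)/|V + S|, which solved for S gives the mirror-reflection formula S = 2(n·V)n − V. The function name is illustrative, not from the disclosure.

```python
import numpy as np

def light_source_direction(view_dir, normal):
    """Estimate the light source direction S from the line-of-sight
    direction V and the surface normal n at the center of a highlight.

    The half-vector relation n = (V + S) / |V + S|, solved for S,
    gives the mirror-reflection formula S = 2 (n . V) n - V.
    """
    V = np.asarray(view_dir, dtype=float)
    n = np.asarray(normal, dtype=float)
    V = V / np.linalg.norm(V)
    n = n / np.linalg.norm(n)
    S = 2.0 * np.dot(n, V) * n - V
    return S / np.linalg.norm(S)
```

Because the shape of the spherical object 102 is known, the normal n at the highlight center follows from the sphere geometry, so S can be computed in this way once per light source.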
- [4.1.4. Association Between Light Source Direction and Polarization Direction]
- In addition, the
calibration unit 4 associates the light source direction of each of the light sources L1, L2, L3, and L4 with a polarization direction of emitted light. - When a light source direction and a polarization direction are associated with each other, the gray plate 101 (see
FIG. 4) is imaged by the camera 10 as in the case of acquiring shading data. The calibration unit 4 performs linearization and shading correction of the captured image data, estimates a polarization model described later to obtain a polarization angle ϕi, sets the polarization angle ϕi as the polarization direction of a polarization light i, associates the polarization angle ϕi with the light source direction detected earlier, and stores the result as a light source direction and polarization direction correspondence table. -
FIG. 9 is an explanatory table illustrating an example of a light source direction and polarization direction correspondence table according to the present disclosure. In the example illustrated in FIG. 9, to the light source L1 having 1 as the light source number (i), ϕ1 is associated as the polarization direction (ϕi), and S1 is associated as the light source direction (Si). To the light source L2 having 2 as the light source number (i), ϕ2 is associated as the polarization direction (ϕi), and S2 is associated as the light source direction (Si). - Moreover, to the light source L3 having 3 as the light source number (i), ϕ3 is associated as the polarization direction (ϕi), and S3 is associated as the light source direction (Si). To the light source L4 having 4 as the light source number (i), ϕ4 is associated as the polarization direction (ϕi), and S4 is associated as the light source direction (Si).
- [4.1.5. Configuration of Imaging Unit]
- Here, before describing the
signal separating unit 5, the configuration of the imaging unit 2 will be described with reference to FIGS. 10 to 12. FIGS. 10 to 12 are explanatory diagrams of the imaging unit according to the present disclosure. As illustrated in FIG. 10, the imaging unit 2 includes a plurality of polarization lights 21, 22, 23, and 24, the camera 10, and an imaging control unit 12. - The polarization lights 21, 22, 23, and 24 include the light sources L1, L2, L3, and L4, which are white LEDs, and the polarization filters F1, F2, F3, and F4, which have different polarization directions, respectively. The polarization filters F1, F2, F3, and F4, for example, selectively transmit light having polarization directions of 0°, 45°, 90°, and 135°, respectively.
- The
camera 10 includes a polarization sensor 11. As illustrated in FIG. 11, the polarization sensor 11 includes a pixel array 13 in which a plurality of imaging elements is arranged in a matrix shape, a polarization filter 14 that selectively causes light having different polarization directions associated with the respective imaging elements to enter the imaging elements, and microlenses 15 provided for every imaging element. - In the
polarization filter 14, as illustrated in FIG. 12, regions in which wire grids having transmission polarization directions of 0°, 45°, 90°, and 135° are formed are arranged in association with the respective imaging elements of the pixel array 13. - Referring back to
FIG. 10, the imaging control unit 12 simultaneously turns on all of the plurality of polarization lights 21, 22, 23, and 24, then causes the camera 10 to image the subject 100 (see FIG. 1), and then turns off the polarization lights 21, 22, 23, and 24. - Furthermore, in a case where a moving image is captured, the
imaging control unit 12 causes the camera 10 to repeatedly perform imaging while all of the plurality of polarization lights 21, 22, 23, and 24 are simultaneously turned on and continues imaging until there is an instruction to stop the imaging from a user. The imaging control unit 12 turns off the polarization lights 21, 22, 23, and 24 after the moving image has been captured. The imaging control unit 12 acquires image data of the image that has been captured from the camera 10 and outputs the image data to the signal separating unit 5 in the subsequent stage. - [4.1.6. Signal Separating Unit]
- Next, the signal separating unit will be described with reference to
FIG. 13. FIG. 13 is a block diagram illustrating an exemplary configuration of the signal separating unit according to the present disclosure. As illustrated in FIG. 13, the signal separating unit 5 according to the present disclosure includes a preprocessing unit 51, a polarization demosaic unit 52, a polarization model estimating unit 53, and a polarization luminance calculating unit 54. - The preprocessing
unit 51 performs linearization of the output luminance i′ of the camera 10 in the image data input from the camera 10, as well as shading correction. The preprocessing unit 51 performs the linearization of the output luminance i′ using the following Equation (2). -
j′x, y = f⁻¹(jx, y) (2)
- As expressed in Equation (2), the preprocessing
unit 51 calculates the linearized output luminance j′x, y by applying the inverse transform function of the characteristic function of the camera 10 to the output luminance jx, y of each pixel. - In addition, the preprocessing
unit 51 performs shading correction using the following Equation (3). -
j″x, y = j′x, y/(l1 l2 . . . lM) (3)
- As expressed in Equation (3), the preprocessing
unit 51 calculates the output luminance j″x, y obtained by performing shading correction, by dividing each linearized output luminance j′x, y by the luminance l1, l2, . . . , lM of the corresponding pixels in the images I11, I12, I13, and I14 illustrated in FIG. 5. - The
polarization demosaic unit 52 obtains, from the data in the directions of 0°, 45°, 90°, and 135° assigned to each pixel, data in these four directions for each pixel (j′″x, y(0), j′″x, y(45), j′″x, y(90), j′″x, y(135)). -
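The per-direction interpolation described in this section can be sketched as follows. This is a hypothetical helper, not the unit's implementation: it assumes a repeating 2×2 cell of the four directions (the actual wire-grid layout is the one shown in FIG. 12) and fills in the missing same-direction samples by bilinear interpolation.

```python
import numpy as np

def demosaic_one_direction(mosaic, offset):
    """Recover a full-resolution image for one polarization direction from
    a sensor mosaic in which each 2x2 cell carries the four directions.

    `offset` is the (row, col) position of the chosen direction within the
    2x2 cell (an assumed layout). Missing samples are filled by bilinear
    interpolation of the sparse same-direction pixels.
    """
    h, w = mosaic.shape
    r0, c0 = offset
    sparse = np.zeros((h, w), dtype=float)
    weight = np.zeros((h, w), dtype=float)
    sparse[r0::2, c0::2] = mosaic[r0::2, c0::2]
    weight[r0::2, c0::2] = 1.0
    # Bilinear interpolation == normalized separable convolution with a tent kernel.
    k = np.array([0.5, 1.0, 0.5])
    for axis in (0, 1):
        sparse = np.apply_along_axis(lambda v: np.convolve(v, k, "same"), axis, sparse)
        weight = np.apply_along_axis(lambda v: np.convolve(v, k, "same"), axis, weight)
    return sparse / np.maximum(weight, 1e-12)
```

Running this once per offset yields the four per-pixel data planes (j′″x, y(0), j′″x, y(45), j′″x, y(90), j′″x, y(135)).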
FIG. 14 is an explanatory diagram of a polarization demosaic process according to the present disclosure. For example, as illustrated in FIG. 14, in a case where data of the polarization direction of 90° in an image I30 captured by the polarization sensor 11 is obtained, the following Equations (4) are used. -
- As expressed by Equation (4), the
polarization demosaic unit 52 calculates and complements the data of A, B, and C using data a, b, c, and d of imaging elements where a wire grid of 90° (see FIG. 12) is included and thereby calculates data of an image I31 having been subjected to the polarization demosaic process. - The
polarization demosaic unit 52 calculates data for every pixel by a similar method also for data in the polarization directions of 0°, 45°, and 135°. Referring back to FIG. 13, the polarization demosaic unit 52 outputs the calculated data (j′″x, y(0), j′″x, y(45), j′″x, y(90), j′″x, y(135)) to the polarization
model estimating unit 53 estimates a polarization model indicating a correspondence relationship between the polarization angle and the luminance. FIG. 15 is an explanatory graph illustrating an example of a polarization model according to the present disclosure. The polarization model estimating unit 53 estimates the polarization model illustrated in FIG. 15 using the polarization sensor data (j′″x, y(0), j′″x, y(45), j′″x, y(90), j′″x, y(135)) obtained for each pixel. The signal separating unit 5 can estimate the luminance I(α) for any polarization angle (α) by using such a polarization model. - The polarization model illustrated in
FIG. 15 is expressed by a formula as the following Equation (5). -
- The polarization
model estimating unit 53 obtains Imax, Imin, and ψ, which are unknown parameters in the above Equation (5), from I(α1), . . . , I(αm), which are imaging data. - The above Equation (5) is expressed by a matrix as the following Equation (6).
-
- Furthermore, let the known matrix in Equation (6) be A, the unknown parameter be x, and the imaging data be b, Equation (6) is obtained as the following Equations (7).
-
- Then, Equations (7) are transformed using an inverse matrix A−1 of the known matrix A, and the following (8) is obtained.
-
x = A⁻¹b (8)
- As a result, the polarization
model estimating unit 53 can obtain the unknown parameters Imax, Imin, and ψ by the following Equations (9). -
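The estimation of Equations (5) to (9) can be sketched in code. Since the equation images are not reproduced in this text, the sketch assumes the standard sinusoidal polarization model I(α) = (Imax + Imin)/2 + ((Imax − Imin)/2)cos(2α − 2ψ); rewriting it as I(α) = c0 + c1 cos 2α + c2 sin 2α makes it linear in the unknowns, so x = A⁻¹b (a least squares pseudo-inverse when there are more measurement angles than the three unknowns) recovers them. Function and parameter names are illustrative.

```python
import numpy as np

def fit_polarization_model(angles_deg, intensities):
    """Fit I(a) = c0 + c1*cos(2a) + c2*sin(2a) to measured intensities and
    recover Imax, Imin, and the phase psi (radians) from c0, c1, c2."""
    a = np.deg2rad(np.asarray(angles_deg, dtype=float))
    A = np.column_stack([np.ones_like(a), np.cos(2 * a), np.sin(2 * a)])
    c0, c1, c2 = np.linalg.lstsq(A, np.asarray(intensities, dtype=float), rcond=None)[0]
    amplitude = np.hypot(c1, c2)          # (Imax - Imin) / 2
    i_max, i_min = c0 + amplitude, c0 - amplitude
    psi = 0.5 * np.arctan2(c2, c1)
    return i_max, i_min, psi

def polarization_luminance(i_max, i_min, psi, phi_deg):
    """Evaluate the fitted model at a polarization direction phi (degrees),
    as the polarization luminance calculating unit 54 does per light."""
    phi = np.deg2rad(phi_deg)
    return 0.5 * (i_max + i_min) + 0.5 * (i_max - i_min) * np.cos(2 * phi - 2 * psi)
```

With the four measured angles 0°, 45°, 90°, and 135°, the model can then be evaluated at each ϕi taken from the light source direction and polarization direction correspondence table.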
- Using the unknown parameters Imax, Imin, ψ, and Equation (5) obtained by the polarization
model estimating unit 53, the polarization luminance calculating unit 54 obtains, for every pixel, the image luminance (i = I(ϕ1) to I(ϕ4)) in a case where light is emitted in the polarization direction corresponding to each of the light sources L1, L2, L3, and L4. At this point, the polarization luminance calculating unit 54 uses the angle of the polarization direction in the light source direction and polarization direction correspondence table 55. - [5. Normal Line Calculating Unit]
- Next, a normal line calculation method by the normal
line calculating unit 6 will be described with reference to FIG. 16. FIG. 16 is an explanatory diagram of the normal line calculation method according to the present disclosure. - As illustrated in
FIG. 16, for example, in a case where there are three light sources L1, L2, and L3, the normal line calculating unit 6 calculates a normal vector n for every pixel by calculating the following Equations (10) and (12) using a light source vector Si corresponding to the light source direction in the light source direction and polarization direction correspondence table 55 and the luminance i of each pixel input from the polarization luminance calculating unit 54. - Furthermore, in a case where there are M light sources L1, . . . , LM, the normal
line calculating unit 6 calculates the normal vector n for every pixel by performing calculations of the following Equations (11) and (12). -
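Equations (10) to (12) are not reproduced in this text; the following sketch assumes the usual Lambertian photometric-stereo formulation, in which stacking the M light source vectors into a matrix S and the per-pixel luminances into a vector i gives S(ρn) = i, solved exactly for M = 3 and in the least squares sense for M > 3. The function name is illustrative.

```python
import numpy as np

def photometric_stereo_normal(S, i):
    """Per-pixel normal from M >= 3 light source vectors S (M x 3) and
    luminances i (length M), assuming Lambertian reflectance:
    solve S g = i for g = albedo * normal, then normalize g."""
    S = np.asarray(S, dtype=float)
    i = np.asarray(i, dtype=float)
    g, *_ = np.linalg.lstsq(S, i, rcond=None)
    rho = np.linalg.norm(g)        # albedo is the length of g
    return g / rho, rho            # unit normal and albedo
```

Applying this to every pixel, with the light source vectors Si taken from the correspondence table 55, yields the per-pixel normal map used by the distance estimation unit 7.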
- Note that, in the present embodiment, since the four light sources L1, L2, L3, and L4 are used, M in Equation (11) is 4.
- [6. Distance Estimation Unit]
- Next, the distance estimation unit 7 will be described. The distance estimation unit 7 calculates a distance Z from a certain reference point to a corresponding point on the subject for each pixel by using normal line information obtained for each pixel. The distance estimation unit 7 calculates the distance Z using, for example, Frankot-Chellappa Algorithm expressed by the following Equation (13) using a Fourier basis.
-
- Variables p and q in the above Equation (13) are the x component and the y component, respectively, of the normal vector n calculated by the normal
line calculating unit 6. In addition, F denotes a Fourier transform, ξx denotes a spatial frequency (x), and ξy denotes a spatial frequency (y). - The distance estimation unit 7 sets a certain reference point in advance, integrates the gradient field from the reference point, and estimates the shape (distance Z) of the subject. At this point, the distance estimation unit 7 calculates the distance Z so that the derivative of the estimated shape matches the gradient field.
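Since the Equation (13) image is not reproduced in this text, the sketch below assumes the published form of the Frankot-Chellappa projection, Z = F⁻¹[−j(ξx F(p) + ξy F(q))/(ξx² + ξy²)] with j the imaginary unit, which integrates the gradient field over a Fourier basis in the least squares sense. Details such as the frequency convention and boundary handling are assumptions.

```python
import numpy as np

def frankot_chellappa(p, q):
    """Integrate the gradient fields p = dZ/dx and q = dZ/dy into a
    surface Z whose derivatives best match (p, q) in the least squares
    sense over a Fourier basis (periodic boundary assumption)."""
    h, w = p.shape
    wx = 2j * np.pi * np.fft.fftfreq(w)[None, :]   # spatial frequency (x)
    wy = 2j * np.pi * np.fft.fftfreq(h)[:, None]   # spatial frequency (y)
    denom = np.abs(wx) ** 2 + np.abs(wy) ** 2
    denom[0, 0] = 1.0                              # avoid division by zero at DC
    Fz = (np.conj(wx) * np.fft.fft2(p) + np.conj(wy) * np.fft.fft2(q)) / denom
    Fz[0, 0] = 0.0                                 # fix the free reference height
    return np.real(np.fft.ifft2(Fz))
```

The free constant at the zero frequency corresponds to the "certain reference point" mentioned above: the algorithm recovers shape only up to a global offset, which the reference point pins down.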
- [7. Processes Executed by Imaging Device]
- Next, an example of processes executed by the imaging device according to the present disclosure will be described with reference to
FIGS. 17 to 19. FIGS. 17 to 19 are flowcharts illustrating processes executed by the imaging device according to the present disclosure. - Note that illustrated in
FIG. 17 is an example of a calibration process executed by the calibration unit 4. Illustrated in FIG. 18 is an example of a three-dimensional shape measuring process performed by the imaging device. Illustrated in FIG. 19 is an example of a signal separating process in the three-dimensional shape measuring process. - In a case of performing the calibration process, as illustrated in
FIG. 17, the calibration unit 4 first calculates and stores an inverse transform function of the camera characteristics (the characteristic function indicating the relationship between the exposure amount and the output luminance of the camera 10 illustrated in FIG. 3) (Step S101). Subsequently, the calibration unit 4 acquires and stores shading data (see FIG. 5) (Step S102). - Then, the
calibration unit 4 calculates the light source direction of each of the light sources (light sources L1, L2, L3, and L4) (Step S103) and calculates the polarization direction of each of the light sources (light sources L1, L2, L3, and L4) (Step S104). Then, the calibration unit 4 stores each of the light source directions and the polarization directions in association with each other as the light source direction and polarization direction correspondence table 55 (Step S105) and ends the calibration process. - In a case where the three-dimensional shape measuring process is performed, as illustrated in
FIG. 18, first, the imaging unit 2 performs an imaging process (Step S201). In the imaging process, the imaging unit 2 simultaneously irradiates a three-dimensional measurement target with light having different polarization directions from a plurality of directions and captures an image of the measurement target by the polarization sensor 11. Subsequently, the signal separating unit 5 performs the signal separating process of separating image signals corresponding to each of the polarization directions from the image captured by the imaging unit 2 (Step S202). - Hereinafter, the signal separating process will be described with reference to
FIG. 19. In the signal separating process, as illustrated in FIG. 19, the signal separating unit 5 first performs preprocessing (Step S301). In the preprocessing, the signal separating unit 5 linearizes the output luminance of the camera 10 in the captured image using the inverse transform function of the camera characteristic and performs shading correction of the captured image using the shading data. - Subsequently, the
signal separating unit 5 performs a polarization demosaic process (Step S302). In the polarization demosaic process, the signal separating unit 5 generates image data for every polarization direction by interpolating the shading-corrected captured image through the demosaic process (see FIG. 14). - Thereafter, the
signal separating unit 5 performs a polarization model estimating process (Step S303). In the polarization model estimating process, the signal separating unit 5 estimates a polarization model by calculating the unknown parameters (Imax, Imin, and ψ) in the polarization model (Equation (5)) from the image data (luminance for every pixel) for every polarization direction. - Subsequently, the
signal separating unit 5 performs a polarization luminance calculating process (Step S304). In the polarization luminance calculating process, the signal separating unit 5 calculates the luminance of each pixel in the image for every light source direction on the basis of the polarization directions corresponding to the light source directions included in the light source direction and polarization direction correspondence table 55 and the polarization model and outputs the luminance to the normal line calculating unit 6. - Referring back to
FIG. 18, when the signal separating process is finished, the normal line calculating unit 6 performs a normal line calculating process (Step S203). In the normal line calculating process, the normal line calculating unit 6 calculates a normal vector on the surface of the measurement target for every pixel on the basis of the luminance of each pixel in the image for every light source direction calculated by the signal separating process and the known light source direction. - Subsequently, the distance estimation unit 7 performs a distance estimating process (Step S204). In the distance estimating process, the distance estimation unit 7 measures the three-dimensional shape of the measurement target by calculating the distance from a predetermined reference point to a point on the measurement target for every pixel using the normal vector for every pixel calculated in the normal line calculating process.
- [8. Modification of Camera]
- Next, a modification of the
camera 10 will be described with reference to FIG. 20. FIG. 20 is an explanatory diagram illustrating a modification of the camera according to the present disclosure. The camera according to the modification includes a polarization sensor 10A illustrated in FIG. 20. - The
polarization sensor 10A includes beam splitters, image sensors 10 a, 10 b, 10 c, and 10 d, and polarization filters 11 a, 11 b, 11 c, and 11 d. - The
beam splitters split incident light into a plurality of light beams and cause the respective light beams to enter the image sensors 10 a, 10 b, 10 c, and 10 d. The polarization filters 11 a, 11 b, 11 c, and 11 d are disposed between the beam splitters and the image sensors 10 a, 10 b, 10 c, and 10 d, respectively. - The
polarization filter 11 a selectively transmits, for example, light having a polarization angle of 0°. The polarization filter 11 b selectively transmits, for example, light having a polarization angle of 45°. The polarization filter 11 c selectively transmits, for example, light having a polarization angle of 90°. The polarization filter 11 d selectively transmits, for example, light having a polarization angle of 135°. - As a result, the
image sensor 10 a can capture an image of a subject to which only the light having the polarization angle of 0° is emitted. The image sensor 10 b can capture an image of the subject to which only the light having the polarization angle of 45° is emitted. The image sensor 10 c can capture an image of the subject to which only the light having the polarization angle of 90° is emitted. The image sensor 10 d can capture an image of the subject to which only the light having the polarization angle of 135° is emitted. - With the
imaging device 1 including the polarization sensor 10A illustrated in FIG. 20 instead of the polarization sensor 11, the polarization demosaic process performed by the polarization demosaic unit 52 becomes unnecessary, and thus it becomes possible to reduce the processing load. - [9. Effects]
- The imaging device includes the
imaging unit 2, the signal separating unit 5, the normal line calculating unit 6, and the distance estimation unit 7. The imaging unit 2 includes the plurality of polarization lights 21, 22, 23, and 24 having different polarization directions of light emitted to the subject 100 and the polarization sensor 11 and captures an image of the subject 100 that is simultaneously irradiated with the light from the plurality of polarization lights 21, 22, 23, and 24. The signal separating unit 5 separates pixel signals corresponding to each of the polarization directions from the image captured by the imaging unit 2 and generates an image for every polarization direction. The normal line calculating unit 6 calculates a normal line on the surface of the subject from the image for every polarization direction by photometric stereo. The distance estimation unit 7 estimates the shape of the subject on the basis of the normal line calculated by the normal line calculating unit. As a result, the imaging device 1 can accurately measure the three-dimensional shape of a moving object.
imaging device 1 further includes the storage unit that stores the light source direction and polarization direction correspondence table 55 that is correspondence information in which the polarization lights 21, 22, 23, and 24, the polarization directions of light emitted by the polarization lights 21, 22, 23, and 24, and directions of the polarization lights 21, 22, 23, and 24 with respect to the subject 100 are associated with each other. The signal separating unit 5 estimates a polarization model indicating a correspondence relationship between any polarization direction and the luminance of each pixel in an image of the subject irradiated with light having that polarization direction on the basis of the luminance of each pixel in the image for every polarization direction and calculates the luminance of each pixel in the image for each of the polarization lights 21, 22, 23, and 24 on the basis of the polarization model and the correspondence information. As a result, the imaging device 1 can calculate accurate luminance of each pixel in the image for each of the polarization lights 21, 22, 23, and 24.
line calculating unit 6 calculates the normal line on the surface of the subject 100 on the basis of the luminance of each pixel in the image for each of the polarization lights 21, 22, 23, and 24 and the correspondence information. As a result, the imaging device 1 can calculate a more accurate normal line. - In the plurality of polarization lights 21, 22, 23, and 24, the polarization filters F1, F2, F3, and F4 having different polarization directions are provided on the light emission surfaces of the light sources L1, L2, L3, and L4, respectively. As a result, the
imaging device 1 can increase the number of light source directions at little cost merely by, for example, increasing the number of light sources that emit white light and the number of polarization directions of light transmitted by the polarization filters provided to the light sources.
polarization sensor 11 includes the pixel array 13 and the polarization filter 14. In the pixel array 13, a plurality of imaging elements is arrayed in a matrix shape. The polarization filter 14 selectively causes light of different polarization directions associated with the imaging elements to enter the imaging elements. As a result, the imaging device 1 can capture an image for each of a plurality of light rays having different polarization directions by the single pixel array 13.
polarization sensor 10A according to the modification includes the beam splitters that split incident light into a plurality of light beams, the image sensors that receive the respective light beams, and the polarization filters disposed between the image sensors and the beam splitters, the polarization filters having a different polarization direction for each of the image sensors. As a result, the imaging device 1 does not need to perform the polarization demosaic process, and thus it becomes possible to reduce the processing load. - The imaging method includes: by a computer, capturing, by the
polarization sensor 11, an image of the subject 100 that is simultaneously irradiated with light from a plurality of lights that emit light having different polarization directions to the subject 100; separating a pixel signal corresponding to each of the polarization directions from the image captured by the polarization sensor 11 and generating an image for each of the polarization directions; calculating a normal line on the surface of the subject from the image for each of the polarization directions by photometric stereo; and estimating the shape of the subject on the basis of the normal line. As a result, the imaging method can accurately measure the three-dimensional shape of a moving object. - The
information processing device 3 includes the storage unit, the signal separating unit 5, the normal line calculating unit 6, and the distance estimation unit 7. The storage unit stores the light source direction and polarization direction correspondence table 55 that is correspondence information in which the plurality of polarization lights 21, 22, 23, and 24 having different polarization directions of light emitted to the subject 100, polarization directions of light emitted by the polarization lights 21, 22, 23, and 24, and directions of the polarization lights 21, 22, 23, and 24 with respect to the subject are associated with each other. The signal separating unit 5 separates pixel signals corresponding to each of the polarization directions from the image in which the subject 100 simultaneously irradiated with light from the plurality of polarization lights 21, 22, 23, and 24 is imaged by the polarization sensor 11 and generates an image for each of the lights on the basis of the correspondence information. The normal line calculating unit 6 calculates the normal line on the surface of the subject 100 from the image for each of the polarization lights 21, 22, 23, and 24 by photometric stereo. The distance estimation unit 7 estimates the shape of the subject 100 on the basis of the normal line calculated by the normal line calculating unit 6. As a result, the information processing device 3 can accurately measure the three-dimensional shape of a moving object. - The information processing method includes, by a computer: storing the light source direction and polarization direction correspondence table 55 which is correspondence information in which a plurality of
lights having different polarization directions of light emitted to the subject 100, the polarization directions of light emitted by the lights, and the directions of the lights with respect to the subject 100 are associated with each other; separating a pixel signal corresponding to each of the polarization directions from an image in which the subject 100 simultaneously irradiated with light from the plurality of lights is captured by the polarization sensor 11, and generating an image for each of the polarization lights 21, 22, 23, and 24 on the basis of the correspondence information; calculating a normal line on the surface of the subject 100 from the image for each of the polarization lights 21, 22, 23, and 24 by photometric stereo; and estimating the shape of the subject 100 on the basis of the normal line. Thus, the information processing method can accurately measure the three-dimensional shape of a moving object. - Note that the effects described herein are merely examples and are not limiting, and other effects may also be achieved.
- Note that the present technology can also have the following configurations.
- (1)
- An imaging device including:
-
- an imaging unit including a plurality of lights having different polarization directions of light emitted to a subject and a polarization sensor and captures an image of the subject that is simultaneously irradiated with the light from the plurality of lights;
- a separating unit that separates a pixel signal corresponding to each of the polarization directions from an image captured by the imaging unit and generates an image for each of the polarization directions;
- a calculating unit that calculates a normal line on a surface of the subject from the image for each of the polarization directions by photometric stereo; and an estimation unit that estimates a shape of the subject on a basis of the normal line calculated by the calculating unit.
(2)
- The imaging device according to (1), further including:
-
- a storage unit that stores correspondence information in which each of the lights, a polarization direction of light emitted by the light, and a direction of the light with respect to the subject are associated with each other,
- wherein the separating unit estimates a polarization model indicating a correspondence relationship between a polarization direction and luminance of each pixel in an image of the subject irradiated with light having the polarization direction on a basis of luminance of each pixel in the image for each of the polarization directions and calculates luminance of each pixel in an image for each of the lights on a basis of the polarization model and the correspondence information.
(3)
- The imaging device according to (2),
-
- wherein the calculating unit calculates the normal line on the surface of the subject on a basis of luminance of each pixel in the image for each of the lights and the correspondence information.
(4)
- The imaging device according to any one of (1) to (3),
-
- wherein the plurality of lights includes polarization filters having different polarization directions from each other on a light emission surface of a light source.
(5)
- The imaging device according to any one of (1) to (4),
-
- wherein the polarization sensor comprises:
- a pixel array in which a plurality of imaging elements is arrayed in a matrix shape; and
- a polarization filter that selectively causes light having different polarization directions associated with the imaging elements to enter the imaging elements.
(6)
- The imaging device according to any one of (1) to (4),
-
- wherein the polarization sensor comprises:
- a beam splitter that splits incident light into a plurality of light beams;
- an image sensor that receives each of the light beams; and
- a polarization filter disposed between the image sensor and the beam splitter, the polarization filter having a different polarization direction for each of the image sensors.
(7)
- An imaging method including the steps of: by a computer,
-
- capturing, by a polarization sensor, an image of a subject that is simultaneously irradiated with light from a plurality of lights, light emitted to a subject by which having different polarization directions;
- separating a pixel signal corresponding to each of the polarization directions from the image captured by the polarization sensor and generating an image for each of the polarization directions;
- calculating a normal line on a surface of the subject from the image for each of the polarization directions by photometric stereo; and
- estimating a shape of the subject on a basis of the normal line.
(8)
- An information processing device including:
-
- a storage unit that stores correspondence information in which a plurality of lights having different polarization directions of light emitted to a subject, the polarization directions of light emitted by the lights, and directions of the lights with respect to the subject are associated with each other;
- a separating unit that separates a pixel signal corresponding to each of the polarization directions from an image, in which the subject simultaneously irradiated with light from the plurality of lights is captured by a polarization sensor, and generates an image for each of the lights on a basis of the correspondence information;
- a calculating unit that calculates a normal line on a surface of the subject from the image for each of the lights by photometric stereo; and
- an estimation unit that estimates a shape of the subject on a basis of the normal line calculated by the calculating unit.
(9)
- An information processing method including, by a computer:
-
- storing correspondence information in which a plurality of lights having different polarization directions of light emitted to a subject, the polarization directions of light emitted by the lights, and directions of the lights with respect to the subject are associated with each other;
- separating a pixel signal corresponding to each of the polarization directions from an image, in which the subject simultaneously irradiated with light from the plurality of lights is captured by a polarization sensor, and generating an image for each of the lights on a basis of the correspondence information;
- calculating a normal line on a surface of the subject from the image for each of the lights by photometric stereo; and
- estimating a shape of the subject on a basis of the normal line.
Reference Signs List
- 1 IMAGING DEVICE
- 2 IMAGING UNIT
- 3 INFORMATION PROCESSING DEVICE
- 4 CALIBRATION UNIT
- 5 SIGNAL SEPARATING UNIT
- 6 NORMAL LINE CALCULATING UNIT
- 7 DISTANCE ESTIMATION UNIT
- 10 CAMERA
- 11 POLARIZATION SENSOR
- 12 IMAGING CONTROL UNIT
- 51 PREPROCESSING UNIT
- 52 POLARIZATION DEMOSAIC UNIT
- 53 POLARIZATION MODEL ESTIMATING UNIT
- 54 POLARIZATION LUMINANCE CALCULATING UNIT
- 55 LIGHT SOURCE DIRECTION AND POLARIZATION DIRECTION CORRESPONDENCE TABLE
Claims (9)
1. An imaging device comprising:
an imaging unit that comprises a plurality of lights having different polarization directions of light emitted to a subject and a polarization sensor, and that captures an image of the subject simultaneously irradiated with the light from the plurality of lights;
a separating unit that separates a pixel signal corresponding to each of the polarization directions from an image captured by the imaging unit and generates an image for each of the polarization directions;
a calculating unit that calculates a normal line on a surface of the subject from the image for each of the polarization directions by photometric stereo; and
an estimation unit that estimates a shape of the subject on a basis of the normal line calculated by the calculating unit.
2. The imaging device according to claim 1, further comprising:
a storage unit that stores correspondence information in which each of the lights, a polarization direction of light emitted by the light, and a direction of the light with respect to the subject are associated with each other,
wherein the separating unit estimates a polarization model indicating a correspondence relationship between a polarization direction and luminance of each pixel in an image of the subject irradiated with light having the polarization direction on a basis of luminance of each pixel in the image for each of the polarization directions and calculates luminance of each pixel in an image for each of the lights on a basis of the polarization model and the correspondence information.
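The polarization model of claim 2 is commonly a sinusoid in twice the polarizer angle, I(θ) = a0 + a1·cos 2θ + a2·sin 2θ. The per-pixel fit from the four per-direction images of a typical polarization sensor can be sketched as below; the function names and the 0°/45°/90°/135° sampling are assumptions, not details from the patent:

```python
import numpy as np

def fit_polarization_model(samples, angles_deg=(0.0, 45.0, 90.0, 135.0)):
    """Least-squares fit of I(theta) = a0 + a1*cos(2*theta) + a2*sin(2*theta).

    samples: (4, H, W) luminances observed behind polarizers at angles_deg.
    Returns coefficients with shape (3, H, W).
    """
    th = np.deg2rad(np.asarray(angles_deg))
    A = np.stack([np.ones_like(th), np.cos(2 * th), np.sin(2 * th)], axis=1)  # (4, 3)
    flat = np.asarray(samples, dtype=float).reshape(len(th), -1)
    coeffs, *_ = np.linalg.lstsq(A, flat, rcond=None)
    return coeffs.reshape((3,) + np.asarray(samples).shape[1:])

def eval_polarization_model(coeffs, theta_deg):
    """Luminance predicted by the fitted model at polarizer angle theta_deg."""
    a0, a1, a2 = coeffs
    th = np.deg2rad(theta_deg)
    return a0 + a1 * np.cos(2 * th) + a2 * np.sin(2 * th)
```

Evaluating the fitted model at each light's known polarization angle is what lets one captured frame be separated into one image per light.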
3. The imaging device according to claim 2,
wherein the calculating unit calculates the normal line on the surface of the subject on a basis of luminance of each pixel in the image for each of the lights and the correspondence information.
4. The imaging device according to claim 1,
wherein the plurality of lights includes polarization filters having different polarization directions from each other on a light emission surface of a light source.
5. The imaging device according to claim 1,
wherein the polarization sensor comprises:
a pixel array in which a plurality of imaging elements is arrayed in a matrix shape; and
a polarization filter that selectively passes, to each of the imaging elements, light having the polarization direction associated with that imaging element.
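Reading out such a sensor amounts to splitting the raw mosaic into one sub-image per polarization direction. A minimal sketch, assuming a repeating 2×2 unit of 0°/45°/135°/90° filters (real sensors vary, so the layout here is an illustrative assumption):

```python
import numpy as np

def split_polarization_mosaic(raw):
    """Split a raw frame from a four-directional polarization mosaic sensor
    into one quarter-resolution image per polarization direction.

    Assumed repeating 2x2 unit: [[0, 45], [135, 90]] degrees.
    """
    return {
        0:   raw[0::2, 0::2],
        45:  raw[0::2, 1::2],
        135: raw[1::2, 0::2],
        90:  raw[1::2, 1::2],
    }
```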
6. The imaging device according to claim 1,
wherein the polarization sensor comprises:
a beam splitter that splits incident light into a plurality of light beams;
an image sensor that receives each of the light beams; and
a polarization filter disposed between the image sensor and the beam splitter, the polarization filter having a different polarization direction for each of the image sensors.
7. An imaging method comprising the steps of:
by a computer,
capturing, by a polarization sensor, an image of a subject that is simultaneously irradiated with light from a plurality of lights, the lights emitting light having different polarization directions to the subject;
separating a pixel signal corresponding to each of the polarization directions from the image captured by the polarization sensor and generating an image for each of the polarization directions;
calculating a normal line on a surface of the subject from the image for each of the polarization directions by photometric stereo; and
estimating a shape of the subject on a basis of the normal line.
8. An information processing device comprising:
a storage unit that stores correspondence information in which a plurality of lights having different polarization directions of light emitted to a subject, the polarization directions of light emitted by the lights, and directions of the lights with respect to the subject are associated with each other;
a separating unit that separates a pixel signal corresponding to each of the polarization directions from an image, in which the subject simultaneously irradiated with light from the plurality of lights is captured by a polarization sensor, and generates an image for each of the lights on a basis of the correspondence information;
a calculating unit that calculates a normal line on a surface of the subject from the image for each of the lights by photometric stereo; and
an estimation unit that estimates a shape of the subject on a basis of the normal line calculated by the calculating unit.
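Combining the stored correspondence information with a fitted per-pixel polarization model, the separating unit's generation of one image per light might look like the following sketch. The table contents, identifiers, and model form are illustrative assumptions only:

```python
import numpy as np

# Hypothetical correspondence information: light id ->
# (polarization angle of that light's filter in degrees,
#  unit direction vector from the subject toward the light).
LIGHT_TABLE = {
    "light_a": (0.0, (0.0, 0.0, 1.0)),
    "light_b": (60.0, (0.707, 0.0, 0.707)),
    "light_c": (120.0, (0.0, 0.707, 0.707)),
}

def images_per_light(coeffs, table=LIGHT_TABLE):
    """Evaluate a fitted per-pixel model I(theta) = a0 + a1*cos(2t) + a2*sin(2t)
    at each light's polarization angle, producing one image per light."""
    a0, a1, a2 = coeffs
    images = {}
    for light_id, (angle_deg, _direction) in table.items():
        th = np.deg2rad(angle_deg)
        images[light_id] = a0 + a1 * np.cos(2 * th) + a2 * np.sin(2 * th)
    return images
```

The per-light images, together with the stored light directions, are exactly the inputs photometric stereo needs.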
9. An information processing method comprising, by a computer:
storing correspondence information in which a plurality of lights having different polarization directions of light emitted to a subject, the polarization directions of light emitted by the lights, and directions of the lights with respect to the subject are associated with each other;
separating a pixel signal corresponding to each of the polarization directions from an image, in which the subject simultaneously irradiated with light from the plurality of lights is captured by a polarization sensor, and generating an image for each of the lights on a basis of the correspondence information;
calculating a normal line on a surface of the subject from the image for each of the lights by photometric stereo; and
estimating a shape of the subject on a basis of the normal line.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019-225887 | 2019-12-13 | ||
JP2019225887 | 2019-12-13 | ||
PCT/JP2020/045246 WO2021117633A1 (en) | 2019-12-13 | 2020-12-04 | Imaging device, information processing device, imaging method, and information processing method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230342963A1 (en) | 2023-10-26 |
Family
ID=76330334
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/756,776 Pending US20230342963A1 (en) | 2019-12-13 | 2020-12-04 | Imaging device, information processing device, imaging method, and information processing method |
Country Status (4)
Country | Link |
---|---|
US (1) | US20230342963A1 (en) |
JP (1) | JP7447916B2 (en) |
CN (1) | CN114731368A (en) |
WO (1) | WO2021117633A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102558937B1 (en) * | 2020-01-27 | 2023-07-21 | 코그넥스코오포레이션 | Systems and method for vision inspection with multiple types of light |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3955616B2 (en) * | 2005-09-01 | 2007-08-08 | 松下電器産業株式会社 | Image processing method, image processing apparatus, and image processing program |
JP2007206797A (en) * | 2006-01-31 | 2007-08-16 | Omron Corp | Image processing method and image processor |
WO2007139067A1 (en) * | 2006-05-29 | 2007-12-06 | Panasonic Corporation | Image high-resolution upgrading device, image high-resolution upgrading method, image high-resolution upgrading program and image high-resolution upgrading system |
JP4435867B2 (en) * | 2008-06-02 | 2010-03-24 | パナソニック株式会社 | Image processing apparatus, method, computer program, and viewpoint conversion image generation apparatus for generating normal line information |
JP4469021B2 (en) * | 2008-07-08 | 2010-05-26 | パナソニック株式会社 | Image processing method, image processing apparatus, image processing program, image composition method, and image composition apparatus |
WO2010073547A1 (en) * | 2008-12-25 | 2010-07-01 | パナソニック株式会社 | Image processing device and pseudo-3d image creation device |
EP2223650A1 (en) * | 2009-02-25 | 2010-09-01 | The Provost, Fellows and Scholars of the College of the Holy and Undivided Trinity of Queen Elizabeth near Dublin | Method and apparatus for imaging tissue topography |
CN103037752B (en) * | 2010-09-24 | 2015-05-13 | 松下电器产业株式会社 | Image processing device |
JP2012143363A (en) * | 2011-01-11 | 2012-08-02 | Panasonic Corp | Image processing apparatus |
JP5503573B2 (en) * | 2011-02-21 | 2014-05-28 | 日本放送協会 | Imaging apparatus and image processing information generation program |
WO2012176944A1 (en) * | 2011-06-22 | 2012-12-27 | 동국대학교 경주캠퍼스 산학협력단 | Method and system for reliable 3d shape extraction of metal surface |
KR101087172B1 (en) | 2011-07-13 | 2011-11-28 | 동국대학교 경주캠퍼스 산학협력단 | Extraction system of 3 dimensional shape of high temperature metallic surface and method thereof |
JP5341285B1 (en) * | 2011-11-29 | 2013-11-13 | オリンパスメディカルシステムズ株式会社 | Polarized light observation equipment |
US9300931B2 (en) * | 2012-07-25 | 2016-03-29 | Panasonic Intellectual Property Management Co., Ltd. | Image pickup system |
US20170065178A1 (en) * | 2014-03-12 | 2017-03-09 | Sony Corporation | Measurement device and measurement method |
JP6432968B2 (en) * | 2014-06-26 | 2018-12-05 | 国立大学法人岐阜大学 | Object shape estimation apparatus and program |
JP6673327B2 (en) * | 2015-02-27 | 2020-03-25 | ソニー株式会社 | Image processing apparatus, image processing method, and imaging device |
US10444617B2 (en) * | 2015-04-30 | 2019-10-15 | Sony Corporation | Image processing apparatus and image processing method |
WO2017013913A1 (en) * | 2015-07-17 | 2017-01-26 | ソニー株式会社 | Gaze detection device, eyewear terminal, gaze detection method, and program |
JP6556013B2 (en) * | 2015-10-08 | 2019-08-07 | キヤノン株式会社 | PROCESSING DEVICE, PROCESSING SYSTEM, IMAGING DEVICE, PROCESSING METHOD, PROGRAM, AND RECORDING MEDIUM |
JP6679289B2 (en) * | 2015-11-30 | 2020-04-15 | キヤノン株式会社 | Processing device, processing system, imaging device, processing method, processing program, and recording medium |
WO2018017897A1 (en) * | 2016-07-20 | 2018-01-25 | Mura Inc. | Systems and methods for 3d surface measurements |
JP2018073122A (en) * | 2016-10-28 | 2018-05-10 | キヤノン株式会社 | Image processing device and image processing method |
CN108267896A (en) * | 2016-12-30 | 2018-07-10 | 深圳超多维光电子有限公司 | polarization imaging device and method |
US10108261B1 (en) * | 2017-07-05 | 2018-10-23 | Oculus Vr, Llc | Eye tracking based on light polarization |
US11189042B2 (en) * | 2017-07-26 | 2021-11-30 | Sony Corporation | Information processing device, information processing method, and computer program |
JP6892603B2 (en) * | 2017-12-07 | 2021-06-23 | 富士通株式会社 | Distance measuring device, distance measuring method and distance measuring program |
CN109285213A (en) * | 2018-07-18 | 2019-01-29 | 西安电子科技大学 | Comprehensive polarization three-dimensional rebuilding method |
2020
- 2020-12-04 US US17/756,776 patent/US20230342963A1/en active Pending
- 2020-12-04 WO PCT/JP2020/045246 patent/WO2021117633A1/en active Application Filing
- 2020-12-04 JP JP2021563921A patent/JP7447916B2/en active Active
- 2020-12-04 CN CN202080079371.7A patent/CN114731368A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2021117633A1 (en) | 2021-06-17 |
JPWO2021117633A1 (en) | 2021-06-17 |
JP7447916B2 (en) | 2024-03-12 |
CN114731368A (en) | 2022-07-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11423562B2 (en) | Device and method for obtaining distance information from views | |
JP6456156B2 (en) | Normal line information generating apparatus, imaging apparatus, normal line information generating method, and normal line information generating program | |
US9582889B2 (en) | Depth mapping based on pattern matching and stereoscopic information | |
US20110063437A1 (en) | Distance estimating device, distance estimating method, program, integrated circuit, and camera | |
JP2022120005A (en) | Depth-sensing computer vision system | |
US10713810B2 (en) | Information processing apparatus, method of controlling information processing apparatus, and storage medium | |
CN107860337B (en) | Structured light three-dimensional reconstruction method and device based on array camera | |
US10121246B2 (en) | Measurement apparatus that obtains information of a shape of a surface using a corrected image and measurement method | |
WO2020066637A1 (en) | Depth acquisition device, depth acquisition method, and program | |
US20150062302A1 (en) | Measurement device, measurement method, and computer program product | |
JP7168077B2 (en) | Three-dimensional measurement system and three-dimensional measurement method | |
US20210150744A1 (en) | System and method for hybrid depth estimation | |
US11803982B2 (en) | Image processing device and three-dimensional measuring system | |
US20230342963A1 (en) | Imaging device, information processing device, imaging method, and information processing method | |
US9217670B2 (en) | Object recognition apparatus using spectrometer and method thereof | |
JP2013024653A (en) | Distance measuring apparatus and program | |
KR20100138985A (en) | Method and apparatus for multiplexed image acquisition and processing | |
JP2018116032A (en) | Measurement device for measuring shape of target measurement object | |
JP6432968B2 (en) | Object shape estimation apparatus and program | |
US11132579B2 (en) | Contour recognition device, contour recognition system and contour recognition method | |
JP6524680B2 (en) | Imaging system, method of acquiring distance information, and method of producing distance information | |
KR101985287B1 (en) | Correcting device for color camera and thermal camera | |
US11610324B2 (en) | Three-dimensional shape measuring method and three-dimensional shape measuring device | |
CN109754418B (en) | Stereoscopic vision method for single camera | |
JP2002013918A (en) | Three-dimensional image forming device and three- dimensional image forming method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY GROUP CORPORATION, JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GOMI, SHINICHIRO;KURITA, TEPPEI;SIGNING DATES FROM 20220418 TO 20220511;REEL/FRAME:060080/0713
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |