WO2022054167A1 - Position estimation method, position estimation device, and program - Google Patents

Position estimation method, position estimation device, and program Download PDF

Info

Publication number
WO2022054167A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
polarization
subject
polarization angle
polarized
Prior art date
Application number
PCT/JP2020/034113
Other languages
French (fr)
Japanese (ja)
Inventor
志織 杉本
陽光 曽我部
隆行 黒住
英明 木全
Original Assignee
日本電信電話株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電信電話株式会社
Priority to JP2022548293A (patent JP7445176B2)
Priority to PCT/JP2020/034113
Publication of WO2022054167A1

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras

Definitions

  • The present invention relates to a position estimation method, a position estimation device, and a program.
  • As methods for estimating the three-dimensional position (depth) of a subject and the reflection parameters of the surface of the subject, there are, for example, the stereo method, the pattern projection method, and TOF (Time of Flight).
  • In the stereo method, a compound-eye camera or a plurality of cameras is used.
  • In the stereo method, corresponding points between multi-viewpoint images having parallax are obtained, and the three-dimensional position is estimated by triangulation.
  • In the pattern projection method, an illumination that projects pattern light and a camera are used.
  • In TOF, a pulse wave or a modulated wave is applied to the subject, and the reflection time of the pulse wave or modulated wave is measured (a simple worked relation is sketched below).
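As a simple illustration (not part of the original disclosure; the symbols are introduced here), the TOF principle amounts to converting the measured round-trip reflection time into a distance:

```latex
Z = \frac{c\,\tau}{2}
```

where c is the speed of light and τ is the measured round-trip time of the pulse or modulated wave.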
  • In the field of computational photography, a method called TCA (three color-filtered apertures) makes it possible to estimate the three-dimensional position of a subject based on an image captured by a monocular camera (see Non-Patent Document 1).
  • In TCA, red, green, and blue color filters (sub-apertures) are placed in the vicinity of the lens of a camera equipped with an RGB (Red, Green, Blue) image sensor, with their positions offset from each other.
  • When such a camera captures a subject, a multi-viewpoint image having a minute parallax between the red, green, and blue color channels is generated.
  • The three-dimensional position (depth) of the subject is estimated by performing a corresponding-point search, similar to the stereo method, on the multi-viewpoint image between the color channels (a minimal matching sketch follows).
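For illustration only, the following is a minimal sketch of the kind of per-channel block matching such a corresponding-point search can rely on; the image arrays, block size, and search range are hypothetical and not taken from the patent.

```python
import numpy as np

def block_matching_disparity(ref, other, block=8, max_disp=16):
    """Estimate a horizontal disparity map between two channel images
    (e.g. two color channels or two sub-aperture images) by brute-force
    block matching with a sum-of-squared-differences cost."""
    h, w = ref.shape
    disp = np.zeros((h // block, w // block), dtype=np.int32)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            patch = ref[y:y + block, x:x + block]
            best_cost, best_d = np.inf, 0
            for d in range(-max_disp, max_disp + 1):
                x2 = x + d
                if x2 < 0 or x2 + block > w:
                    continue
                cand = other[y:y + block, x2:x2 + block]
                cost = float(np.sum((patch - cand) ** 2))  # SSD matching cost
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[by, bx] = best_d
    return disp
```

The recovered disparity is what a stereo-style pipeline would subsequently convert into depth by triangulation.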
  • In the field of computational optics, there is also a method called DCA (Dual off-axis color-filtered aperture).
  • In DCA, the sub-apertures are arranged away from the optical axis so that the estimation accuracy of the three-dimensional position of the subject improves. Further, in DCA, the number of sub-apertures is reduced to two so that the amount of light per sub-aperture increases (see Non-Patent Document 2).
  • To prevent errors in the corresponding-point search, a red filter and a cyan filter may be prepared.
  • In this arrangement, the optical image of the subject is captured in both the green and blue channels via the cyan filter.
  • Ellipsometry is often used as a method for estimating the reflection parameters on the surface of a subject having specular reflection characteristics.
  • In this method, the subject is irradiated with light from a plurality of illuminations, and ellipsometry is performed for each illumination.
  • As a result, the reflection parameters of the surface of the subject are estimated based on the relationship between the incident angle and the reflection angle of the light.
  • However, ellipsometry alone cannot estimate the three-dimensional position of the surface of the subject. Therefore, it is often combined with a method such as the stereo method, the pattern projection method, or TOF in order to estimate the three-dimensional position of the surface of the subject.
  • When the three-dimensional position (depth) of the surface of the subject is estimated using sub-apertures, the estimation accuracy may decrease owing to errors in estimating the color shift.
  • Further, when the three-dimensional position of the subject is estimated using sub-apertures, the three-dimensional position of the surface of a subject having specular reflection characteristics cannot be estimated.
  • Furthermore, when polarization analysis is performed with TCA or DCA, the brightness changes greatly for each polarizing filter, so the three-dimensional position of the surface of a subject having specular reflection characteristics cannot be estimated. As described above, there is a problem that the three-dimensional position of the subject cannot always be estimated.
  • In view of the above circumstances, an object of the present invention is to provide a position estimation method, a position estimation device, and a program capable of suppressing a decrease in accuracy due to the influence of the subject or the environment in estimating the three-dimensional position of a subject in real space.
  • One aspect of the present invention is a position estimation method executed by a position estimation device, including: an image capturing step of generating, according to light reflected from a subject irradiated with polarized light of a reference polarization angle, a first observed image, which is a polarized image of a channel corresponding to a first polarization angle that is inclined from the reference in the negative direction and is not orthogonal to the reference, and a second observed image, which is a polarized image of a channel corresponding to a second polarization angle that is inclined from the reference in the positive direction and is not orthogonal to the reference; and a position estimation step of estimating the three-dimensional position of the subject based on the first observed image and the second observed image.
  • One aspect of the present invention is a position estimation device including: an image capturing unit that generates, according to light reflected from a subject irradiated with polarized light of a reference polarization angle, a first observed image, which is a polarized image of a channel corresponding to a first polarization angle that is inclined from the reference in the negative direction and is not orthogonal to the reference, and a second observed image, which is a polarized image of a channel corresponding to a second polarization angle that is inclined from the reference in the positive direction and is not orthogonal to the reference; and a position estimation unit that estimates the three-dimensional position of the subject based on the first observed image and the second observed image.
  • One aspect of the present invention is a program for operating a computer as the above-mentioned position estimation device.
  • According to the present invention, it is possible to suppress a decrease in accuracy due to the influence of the subject or the environment in estimating the three-dimensional position of a subject in real space.
  • FIG. 1 is a diagram showing a configuration example of the position estimation device 1a in the first embodiment.
  • the position estimation device 1a is a device that estimates the three-dimensional position (depth) of the surface of the subject. In FIG. 1, the position estimation device 1a estimates a three-dimensional position on the surface of the subject 100.
  • the position estimation device 1a includes a polarized lighting unit 10, an image imaging unit 11, and an image processing unit 12a.
  • the polarized illumination unit 10 irradiates the subject 100 (imaging target) with polarization.
  • the image capturing unit 11 (polarized camera) generates an image of the subject 100 by capturing the subject 100.
  • the image capturing unit 11 outputs the image of the subject 100 to the image processing unit 12a.
  • the image processing unit 12a acquires an image of the subject 100 from the image imaging unit 11.
  • the image processing unit 12a estimates the three-dimensional position (depth) of the surface of the subject 100 based on the image acquired from the image capturing unit 11.
  • FIG. 2 is a diagram showing a configuration example of the image capturing unit 11 in the first embodiment.
  • the image capturing unit 11 includes a lens 110, a mask 111, and a polarized image sensor 112.
  • the mask 111 is provided with a first opening region 113 and a second opening region 114 as each sub-opening.
  • The first aperture region 113 is provided with a polarizing filter having a first angle with respect to the polarization angle of the polarized light transmitted through the lens 110; the same applies to the second aperture region 114, which is provided with a polarizing filter having a second angle.
  • In FIG. 2, "Δc_x^eff" represents the effective aperture of the lens 110.
  • "Δc_x" represents the distance between the first aperture region 113 and the second aperture region 114.
  • "c_z" represents the distance between the lens 110 and the mask 111.
  • "f" represents the distance between the lens 110 and the polarization image sensor 112.
  • "Δx" represents the image spacing.
  • "Z" represents the distance between the polarization image sensor 112 and the subject 100.
  • "Z_0" represents the distance between the polarization image sensor 112 and the focal surface 200 (an approximate relation between these quantities is sketched below).
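For orientation only (this relation is not stated in the source; it follows from a thin-lens approximation, with the mask assumed to sit close to the lens so that c_z can be neglected), the image spacing can be related to depth roughly as

```latex
\Delta x \;\approx\; \Delta c_x \, f \left( \frac{1}{Z_0} - \frac{1}{Z} \right)
```

so the two sub-aperture images coincide for points on the focal surface (Z = Z_0) and separate increasingly with defocus; the exact proportionality constant depends on the optics.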
  • Reflected light is incident on the lens 110 from the surface of the subject 100 irradiated with polarized light.
  • the lens 110 transmits the incident light and sends the light transmitted through the lens 110 to the mask 111.
  • the first opening region 113 and the second opening region 114 of the mask 111 transmit a part of the light sent to the mask 111.
  • the first aperture region 113 and the second aperture region 114 project the transmitted light onto the polarized image sensor 112.
  • the light projected on the polarized image sensor 112 is observed as a polarized image of the subject 100.
  • the polarized image sensor 112 generates an image of the subject 100 according to the polarized image of the subject 100.
  • the polarized image sensor 112 transmits an image of the subject 100 to the image processing unit 12a.
  • Note that, in FIG. 2, of the light transmitted through the lens 110, light that is not blocked by any of the mask 111, the first aperture region 113, or the second aperture region 114 may be projected onto the polarization image sensor 112.
  • FIG. 3 is a diagram showing a configuration example of the image processing unit 12a in the first embodiment.
  • the image processing unit 12a includes an image input unit 120a and a position estimation unit 121a.
  • the image input unit 120a acquires an image of the subject 100 from the image imaging unit 11.
  • the image input unit 120a outputs the image of the subject 100 to the position estimation unit 121a.
  • the position estimation unit 121a estimates the three-dimensional position (depth) of the surface of the subject 100 based on the image of the subject 100.
  • the position estimation unit 121a outputs the estimation result of the three-dimensional position of the surface of the subject 100 to a predetermined external device (not shown).
  • FIG. 4 is a flowchart showing an operation example of the position estimation device 1a in the first embodiment.
  • the polarized illumination unit 10 irradiates the subject 100 to be imaged with polarized light (step S101).
  • The polarized light applied to the subject 100 is not limited to a specific polarization. In the following, linearly polarized light with a single polarization angle "θ" is applied to the subject 100.
  • the polarized light applied to the subject 100 is reflected on the surface of the subject 100 and passes through the lens 110.
  • a part of the light transmitted through the lens 110 is absorbed by the mask 111.
  • the unabsorbed light selectively passes through the first aperture region 113 or the second aperture region 114.
  • the first opening region 113 transmits polarized light having an angle component corresponding to the angle (polarization angle) of the polarizing filter provided in the first opening region 113.
  • the second opening region 114 transmits polarized light having an angle component corresponding to the angle (polarization angle) of the polarizing filter provided in the second opening region 114.
  • the first opening region 113 absorbs a part of the polarized light having an angular component orthogonal to the angle of the polarizing filter provided in the first opening region 113, and reflects the remaining polarized light.
  • the second opening region 114 absorbs a part of the polarized light having an angular component orthogonal to the angle of the polarizing filter provided in the second opening region 114, and reflects the remaining polarized light.
  • the angle (polarization angle) of the polarization filter in each aperture region is predetermined according to the relative positional relationship between the polarization illumination unit 10 and the polarization image sensor 112.
  • In the following, the first aperture region 113 includes a polarizing filter having a polarization angle of "ω_1".
  • The second aperture region 114 includes a polarizing filter having a polarization angle of "ω_2".
  • FIG. 5 is a diagram showing the relationship between the polarization angle and the brightness in the first embodiment.
  • the reflected light from the surface of the subject 100 is composed of unpolarized light due to diffuse reflection and linearly polarized light due to specular reflection.
  • the polarization angle of the linearly polarized light mirror-reflected in the subject 100 is equal to the polarization angle of the polarized light applied to the subject 100.
  • "I_max" represents the luminance (maximum luminance) at the angle at which the specular reflection is the strongest.
  • "I_min" represents the luminance (minimum luminance) at the angle at which the specular reflection is the weakest.
  • That is, "I_min" represents the luminance at an angle at which only diffusely reflected polarization is observed.
  • In general, the brightness of the reflected light is observed for four types of polarization angles shifted by "π/4" from each other (for example, "0, π/4, π/2, 3π/4"). Based on the parameters representing the polarization components (Stokes parameters), the maximum brightness "I_max", and the minimum brightness "I_min", the normal direction or the bidirectional reflectance distribution function (BRDF) of the surface of the subject 100 is estimated (a per-pixel sketch of this computation is given below).
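As an illustrative sketch (not taken from the patent), the linear Stokes components and the maximum and minimum luminance can be recovered per pixel from four such measurements in the standard way; the array names below are hypothetical.

```python
import numpy as np

def stokes_from_four(i0, i45, i90, i135):
    """Per-pixel linear Stokes parameters, I_max / I_min, and the angle of
    linear polarization from four intensity images taken through polarizing
    filters oriented pi/4 apart (0, pi/4, pi/2, 3*pi/4)."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)  # total intensity
    s1 = i0 - i90                       # 0 vs. pi/2 difference
    s2 = i45 - i135                     # pi/4 vs. 3*pi/4 difference
    lin = np.sqrt(s1 ** 2 + s2 ** 2)    # magnitude of the linearly polarized part
    i_max = 0.5 * (s0 + lin)
    i_min = 0.5 * (s0 - lin)
    aolp = 0.5 * np.arctan2(s2, s1)     # angle of linear polarization
    return s0, s1, s2, i_max, i_min, aolp
```

The difference I_max − I_min reflects the polarized (specular) part of the reflection, while I_min is the luminance at which only the diffuse part is observed, matching the separation described in the text.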
  • The polarizing filter provided in a polarization image sensor often has four types of polarization angles shifted by "π/4" from each other.
  • In the following, the polarization image sensor 112 includes polarizing filters of four types of polarization angles [θ − π/4, θ, θ + π/4, θ + π/2] defined with reference to the polarization angle "θ".
  • the intensity and polarization angle of the specularly reflected light on the surface of the subject change according to Fresnel's law.
  • Fresnel's law states that the intensity and polarization angle of light specularly reflected at the surface of a subject depend on the refractive indices of air and of the subject and on the incident angle of the light.
  • Since the incident angle and the reflection angle are equal, the specularly reflected light can be observed only from the direction of the illumination and from the direction of regular reflection, which are mirror images of each other across the subject surface.
  • the polarization angle and intensity of the specularly reflected light are substantially the same as the polarization angle and intensity of the polarized illumination.
  • When, as an example, the polarization angle "θ" of the polarized light applied to the subject 100 (the polarization angle of the polarizing filter of the polarization illumination unit 10) is "0 (radian)", the angle at which the specular reflection is the strongest is also 0. Therefore, the brightness observed by the polarization image sensor 112 for polarization transmitted through the polarizing filter whose polarization angle "−π/4" is shifted by "−π/4" from 0 is equal to the brightness observed by the polarization image sensor 112 for polarization transmitted through the polarizing filter whose polarization angle "π/4" is shifted by "+π/4" from 0 (a worked statement of this equality follows).
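This equality can be checked against the usual sinusoidal model of polarizer-filtered intensity (an assumed model, used here only as a worked step, not quoted from the patent):

```latex
I(\phi) = \frac{I_{\max}+I_{\min}}{2} + \frac{I_{\max}-I_{\min}}{2}\,\cos\!\bigl(2(\phi-\theta)\bigr),
\qquad
I\!\left(\theta-\tfrac{\pi}{4}\right) = I\!\left(\theta+\tfrac{\pi}{4}\right) = \frac{I_{\max}+I_{\min}}{2},
```

since cos(∓π/2) = 0 in both cases.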
  • That is, if the image capturing unit did not include the mask with the first and second aperture regions, then, in the image of the subject generated by the polarization image sensor according to the polarization transmitted through the lens and projected onto the sensor, the image of the channel corresponding to the polarization angle "−π/4" and the image of the channel corresponding to the polarization angle "π/4" would be substantially the same.
  • The polarized light selectively transmitted through the first aperture region 113 or the second aperture region 114 forms images on the polarization image sensor 112 at positions deviated from each other according to the direction in which the polarized light is incident on the polarization image sensor 112.
  • The image (optical image) formed in this way passes through the filters of the four polarization angles provided in the polarization image sensor 112 and is generated by the polarization image sensor 112 as a polarized image having four channels.
  • the generated polarized image may be recorded in a memory.
  • Since the polarizing filter having the polarization angle "θ − π/4" is provided in the first aperture region 113, polarized light whose polarization angle is orthogonal to "θ − π/4" is blocked by the polarizing filter of the first aperture region 113. Therefore, a polarized image of the channel corresponding to the polarization angle "θ − π/4" is generated according to the polarization transmitted through the first aperture region 113.
  • In the following, the polarized image of the channel corresponding to the polarization angle "θ − π/4" is referred to as the "first observed image".
  • Similarly, since the polarizing filter having the polarization angle "θ + π/4" is provided in the second aperture region 114, polarized light whose polarization angle is orthogonal to "θ + π/4" is blocked by the polarizing filter of the second aperture region 114. Therefore, a polarized image of the channel corresponding to the polarization angle "θ + π/4" is generated according to the polarization transmitted through the second aperture region 114.
  • In the following, the polarized image of the channel corresponding to the polarization angle "θ + π/4" is referred to as the "second observed image".
  • the image capturing unit 11 (polarized image sensor 112) generates the first observed image and the second observed image (step S102).
  • the image input unit 120a acquires the first observation image and the second observation image from the image imaging unit 11 (polarized image sensor 112).
  • the image input unit 120a outputs the first observation image and the second observation image to the position estimation unit 121a.
  • the image input unit 120a may record the first observation image and the second observation image in the memory (step S103).
  • the position estimation unit 121a estimates the position parameter of the subject 100 based on the first observation image and the second observation image.
  • the position estimation unit 121a outputs the position parameter of the surface of the subject 100 to a predetermined external device (not shown) (step S104).
  • the position parameter is not limited to a specific parameter as long as it is a parameter related to the position of the subject 100. Further, the unit used for estimating the position parameter is not limited to a specific unit.
  • the position vector indicating the three-dimensional position (depth) is determined for each pixel of the image as an example.
  • the position parameter may be a map parameter such as a parallax map or a depth map.
  • the position parameter may be set for each block or for each area where the image is divided.
  • the method of estimating the position parameter is not limited to a specific method.
  • the position estimation unit 121a estimates the position parameter by the stereo method using the first observation image and the second observation image.
  • the position estimation unit 121a estimates the parallax using the result of block matching between the images of each channel.
  • The position estimation unit 121a may convert the parallax into a three-dimensional position by using geometric information such as the distance "Δc_x" between the aperture regions (a minimal conversion sketch, consistent with the approximate relation given above, follows).
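A minimal sketch of such a conversion, assuming the approximate thin-lens relation sketched after the symbol definitions above; the function name and arguments are illustrative and not taken from the patent.

```python
def depth_from_spacing(dx, dcx, f, z0):
    """Invert dx ~= dcx * f * (1/z0 - 1/Z) to recover the depth Z.

    dx  : image spacing (parallax) measured between the two channel images
    dcx : distance between the two aperture regions
    f   : lens-to-sensor distance
    z0  : distance to the focal surface
    All quantities must be expressed in consistent length units; dx may be
    a scalar or a NumPy array (a per-pixel parallax map).
    """
    return 1.0 / (1.0 / z0 - dx / (dcx * f))
```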
  • the position estimation unit 121a may estimate the position parameter of the subject 100 based on the detection result of the feature point of the subject 100.
  • the position estimation unit 121a may estimate the position parameter of the subject 100 by using a marker placed within the angle of view of the image imaging unit 11 while the subject 100 is being imaged. For example, when the polarized light applied to the subject 100 is the pattern light, the position estimation unit 121a may estimate the position parameter on the surface of the subject 100 by using the pattern light applied to the subject 100.
  • As described above, the first observed image is a polarized image of the channel corresponding to the first polarization angle (for example, θ − π/4), which is a polarization angle that is tilted from the reference in the negative direction and is not orthogonal to the reference, and the second observed image is a polarized image of the channel corresponding to the second polarization angle (for example, θ + π/4), which is a polarization angle that is tilted from the reference in the positive direction and is not orthogonal to the reference.
  • the image capturing unit 11 generates a first observed image and a second observed image according to the light reflected from the subject 100 irradiated with the polarization of the reference polarization angle.
  • the position estimation unit 121a estimates the three-dimensional position of the subject 100 in the real space based on the first observation image and the second observation image.
  • the position estimation unit 121a estimates the three-dimensional position of the subject 100 in the real space by using, for example, a stereo method, a pattern projection method, or the like.
  • a plurality of sub-apertures are provided on the mask 111 in the vicinity of the lens 110 of the image capturing unit 11.
  • a polarization filter having a constant first angle with respect to the polarization angle of the polarization from the polarization illumination unit 10 is provided in the first opening region 113.
  • a polarization filter having a constant second angle with respect to the polarization angle of the polarization from the polarization illumination unit 10 is provided in the second opening region 114.
  • the polarized lighting unit 10 irradiates the subject 100 with polarized light.
  • the polarized light (reflected light) reflected from the subject 100 is observed by the polarized image sensor 112 using the lens 110 and the mask 111 (plural sub-apertures).
  • It is possible to cancel the change in brightness between the image (polarized image) of the channel corresponding to the polarization angle equal to the polarization angle "θ − π/4" of the polarizing filter provided in the first aperture region 113 and the image (polarized image) of the channel corresponding to the polarization angle equal to the polarization angle "θ + π/4" of the polarizing filter provided in the second aperture region 114. Therefore, it is possible to easily estimate the three-dimensional position of the surface of the subject 100. This makes it possible to suppress a decrease in accuracy due to the influence of the subject or the environment in estimating the three-dimensional position of the subject in real space.
  • the second embodiment differs from the first embodiment in that the image processing unit estimates the reflection parameter on the surface of the subject.
  • the differences from the first embodiment will be mainly described.
  • FIG. 6 is a diagram showing a configuration example of the position estimation device 1b in the second embodiment.
  • the position estimation device 1b is a device that estimates the three-dimensional position (depth) of the surface of the subject. In FIG. 6, the position estimation device 1b estimates the reflection parameter on the surface of the subject 100.
  • the position estimation device 1b includes a polarized lighting unit 10, an image imaging unit 11, and an image processing unit 12b.
  • the image capturing unit 11 outputs the image of the subject 100 to the image processing unit 12b.
  • the image processing unit 12b acquires an image of the subject 100 from the image imaging unit 11.
  • the image processing unit 12b estimates the three-dimensional position (depth) of the surface of the subject 100 based on the image acquired from the image capturing unit 11.
  • the image processing unit 12b estimates the reflection parameter of the surface of the subject 100 based on the three-dimensional position of the surface of the subject 100 and the image (observation image) of the subject 100.
  • FIG. 7 is a diagram showing a configuration example of the image processing unit 12b in the second embodiment.
  • the image processing unit 12b includes an image input unit 120b, a position estimation unit 121b, and a reflection component estimation unit 122.
  • the image input unit 120b outputs the image of the subject 100 to the position estimation unit 121b and the reflection component estimation unit 122.
  • the position estimation unit 121b estimates the three-dimensional position (depth) of the surface of the subject 100 based on the image of the subject 100.
  • the position estimation unit 121b outputs the estimation result of the three-dimensional position of the surface of the subject 100 to a predetermined external device (not shown) and the reflection component estimation unit 122.
  • the reflection component estimation unit 122 acquires an image of the subject 100 from the image input unit 120b.
  • the reflection component estimation unit 122 acquires the estimation result of the three-dimensional position of the surface of the subject 100 from the position estimation unit 121b.
  • the reflection component estimation unit 122 estimates the reflection parameter (reflection component) on the surface of the subject 100 based on the three-dimensional position of the subject 100 and the image of the subject 100.
  • the reflection component estimation unit 122 outputs the reflection parameter on the surface of the subject 100 to a predetermined external device (not shown).
  • FIG. 8 is a flowchart showing an operation example of the position estimation device 1b in the second embodiment.
  • the operation from step S201 to step S202 is the same as the operation from step S101 to step S102 shown in FIG.
  • In the following, the polarized image of the channel corresponding to the polarization angle "θ" is referred to as the "third observed image".
  • The polarized image of the channel corresponding to the polarization angle "θ + π/2" is referred to as the "fourth observed image".
  • the image input unit 120b acquires the first observation image, the second observation image, the third observation image, and the fourth observation image from the image imaging unit 11 (polarized image sensor 112).
  • the image input unit 120b outputs the first observation image, the second observation image, the third observation image, and the fourth observation image to the position estimation unit 121b and the reflection component estimation unit 122.
  • the image input unit 120b may record the first observation image, the second observation image, the third observation image, and the fourth observation image in the memory (step S203).
  • the position estimation unit 121b estimates the position parameter of the subject 100 based on the first observation image and the second observation image.
  • the position estimation unit 121b outputs the position parameter of the surface of the subject 100 to a predetermined external device (not shown) and the reflection component estimation unit 122 (step S204).
  • The reflection component estimation unit 122 estimates the reflection parameters of the surface of the subject 100 based on at least one of the first observed image, the second observed image, the third observed image, and the fourth observed image, and on the position parameter of the surface of the subject 100.
  • the reflection component estimation unit 122 outputs the reflection parameter on the surface of the subject 100 to a predetermined external device (not shown) (step S205).
  • the reflection parameter is not limited to a specific parameter as long as it is a parameter related to the reflection on the surface of the subject 100.
  • the reflection parameter is a Stokes parameter.
  • the reflection parameter may be a combination of the maximum luminance "I max " and the minimum luminance "I min " in an image such as the fourth observation image.
  • the reflection parameter may be a mirror reflection component and a diffuse reflection component separated in the reflected light, or may be a normal direction or bidirectional reflectance distribution (BRDF) function of the surface of the subject 100.
  • the reflection parameter can be obtained based on the observation results for four types of polarization angles for each pixel.
  • the brightness of the corresponding point of the first observed image and the brightness of the corresponding point of the second observed image match.
  • This matched brightness equals the average of the brightness of the corresponding point of the first observed image and the brightness of the corresponding point of the second observed image as the polarization angle varies from "θ − π/4" to "θ + 3π/4".
  • the reflection component estimation unit 122 acquires the first observation image, the second observation image, the third observation image, and the fourth observation image from the image input unit 120b.
  • the reflection component estimation unit 122 acquires the position parameter of the surface of the subject 100 from the position estimation unit 121b.
  • the reflection component estimation unit 122 estimates the reflection parameter on the surface of the subject 100 for each pixel of the first observed image.
  • The third observed image is an image generated by the polarization that is selectively transmitted through the first aperture region 113 or the second aperture region 114 and that passes through the polarizing filter having the polarization angle "θ" of the polarization image sensor 112.
  • That is, the third observed image is generated according to the superposition of the image corresponding to the polarization transmitted through the polarizing filter of the first aperture region 113 and the polarizing filter of polarization angle "θ" of the polarization image sensor 112, and the image corresponding to the polarization transmitted through the polarizing filter of the second aperture region 114 and the polarizing filter of polarization angle "θ" of the polarization image sensor 112.
  • Likewise, the fourth observed image is an image (polarized image) generated by the polarization that is selectively transmitted through the first aperture region 113 or the second aperture region 114 and that passes through the polarizing filter having the polarization angle "θ + π/2" of the polarization image sensor 112; it is generated according to the superposition of the image corresponding to the polarization transmitted through the polarizing filter of the first aperture region 113 and the polarizing filter of polarization angle "θ + π/2" of the polarization image sensor 112, and the image corresponding to the polarization transmitted through the polarizing filter of the second aperture region 114 and the polarizing filter of polarization angle "θ + π/2" of the polarization image sensor 112.
  • the fourth observation image is an image containing only the diffuse reflection component.
  • the method for estimating the reflection parameter is not limited to a specific method as long as it is a method for estimating the parameter related to the reflection of light on the surface of the subject 100.
  • For example, the reflection component estimation unit 122 estimates the minimum luminance "I_min", which is the diffuse reflection component of the fourth observed image, based on the fourth observed image and the position parameter. Further, based on the assumption that the brightness changes sinusoidally with the polarization angle around the average brightness "I_mean" of the fourth observed image, the reflection component estimation unit 122 estimates the maximum brightness "I_max" that would be observed when the specular reflection component becomes maximum (a worked consequence of this assumption is given below).
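Under that sinusoidal assumption, and reading "I_mean" as the mean of the sinusoid over the polarization angle (an interpretation offered here for illustration, not a statement of the patent's equations (1) to (3)), one simple worked consequence is

```latex
I_{\mathrm{mean}} = \frac{I_{\max}+I_{\min}}{2}
\quad\Longrightarrow\quad
I_{\max} = 2\,I_{\mathrm{mean}} - I_{\min}.
```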
  • Because the position parameter is known, the correspondence between the superposed component images can be known for the fourth observed image.
  • the minimum luminance "I min ", which is the diffuse reflection component of the fourth observed image, is expressed by the equation (1). Further, the equation (2) holds.
  • d represents a vector representing parallax to a polarized image at a position different from that of the first observed image, with reference to a polarized image at the same position as the first observed image.
  • In the equations, the minimum luminance "I_min" represents the diffuse reflection component of the fourth observed image.
  • "I_a" represents, among the polarized images constituting the fourth observed image, the polarized image generated by the polarization transmitted through the first aperture region 113 and through the polarizing filter having the polarization angle "π/2" of the polarization image sensor 112.
  • "I_b" represents, among the polarized images constituting the fourth observed image, the polarized image generated by the polarization transmitted through the second aperture region 114 and through the polarizing filter having the polarization angle "π/2" of the polarization image sensor 112.
  • the method for separating the polarized image from the fourth observation image is not limited to a specific method as long as the polarized image can be separated.
  • the reflection component estimation unit 122 estimates the image “I b ” corresponding to the polarization transmitted through the second opening region 114 by solving the optimization problem shown in the equation (3).
  • an arbitrary regularization term such as "total variation” may be added to the equation (3).
  • At least one of the third observed image, which is a polarized image of the channel corresponding to the third polarization angle (for example, θ), and the fourth observed image, which is a polarized image of the channel corresponding to the fourth polarization angle (for example, θ + π/2), may be generated.
  • The reflection component estimation unit 122 may estimate the reflection parameters of the subject based on at least one of the first observed image, the second observed image, the third observed image, and the fourth observed image, and on the position parameter.
  • the normal estimation unit 123 estimates the reflection parameter on the surface of the subject 100 based on the position parameter of the subject 100 and the fourth observation image.
  • the image processing unit 12b may estimate the three-dimensional shape (three-dimensional geometric information) of the subject 100 based on the three-dimensional position of the surface of the subject 100 and the reflection parameter of the surface of the subject 100.
  • the position estimation device 1b may include a plurality of polarized lighting units 10 and a plurality of image capturing units 11.
  • the plurality of polarized lighting units 10 and the plurality of image capturing units 11 may be arranged in a spherical shape centered on the subject 100 to be imaged. This makes it possible to estimate a complex bidirectional reflectance distribution (BRDF) function or the like based on a plurality of combinations of the incident angle and the reflected angle of the polarized light.
  • the polarization illumination unit 10 may time-modulate the polarization angle of the polarization.
  • By doing so, the polarization angle of reflected light that is not due to complete specular reflection can be brought to the polarization angle of the first aperture region 113 and to the polarization angle of the second aperture region 114.
  • Therefore, even when the polarization angle of the reflected light and the polarization angle of the illumination do not match because the device and the subject cannot be arranged so that complete specular reflection occurs, it is possible to estimate the three-dimensional position and the reflection parameters of the surface of the subject.
  • FIG. 9 is a diagram showing a hardware configuration example of the position estimation device 1 (position estimation device 1a, position estimation device 1b) common to each embodiment.
  • A part or all of the functional units of the position estimation device 1 is realized as software by a processor 2 such as a CPU (Central Processing Unit) executing a program stored in a storage device 3 having a non-volatile recording medium (non-transitory recording medium) and in a memory 4.
  • the program may be recorded on a computer-readable recording medium.
  • Computer-readable recording media are non-transitory recording media, for example portable media such as flexible disks, magneto-optical disks, ROM (Read Only Memory), and CD-ROM (Compact Disc Read Only Memory), and storage devices such as hard disks built into computer systems.
  • the display unit 5 displays an image.
  • A part or all of the functional units of the position estimation device 1 may be realized by using hardware including electronic circuits (circuitry), for example an LSI (Large Scale Integration circuit), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array).
  • the present invention is applicable to a device that estimates the three-dimensional shape of an imaged subject.

Abstract

This position estimation method is performed by a position estimation device and comprises: an image capture step for, in accordance with light reflected from a subject irradiated with polarized light having a reference polarization angle, generating both a first observed image, which is a polarized image of a channel corresponding to a first polarization angle that is inclined from a reference in a negative direction and is not orthogonal to the reference, and a second observed image, which is a polarized image of a channel corresponding to a second polarization angle that is inclined from the reference in a positive direction and is not orthogonal to the reference; and a position estimation step for estimating the three-dimensional position of the subject on the basis of the first observed image and the second observed image.

Description

Position estimation method, position estimation device, and program
The present invention relates to a position estimation method, a position estimation device, and a program.
As methods for estimating the three-dimensional position (depth) of a subject and the reflection parameters of the surface of the subject, there are, for example, the stereo method, the pattern projection method, and TOF (Time of Flight). In the stereo method, a compound-eye camera or a plurality of cameras is used; corresponding points between multi-viewpoint images having parallax are obtained, and the three-dimensional position is estimated by triangulation. In the pattern projection method, an illumination that projects pattern light and a camera are used. In TOF, a pulse wave or a modulated wave is applied to the subject, and the reflection time of the pulse wave or modulated wave is measured.
Further, in the field of computational photography, by using a method called TCA (three color-filtered apertures), it is possible to estimate the three-dimensional position of a subject based on an image of the subject captured by a monocular camera (see Non-Patent Document 1).
In TCA, red, green, and blue color filters (sub-apertures) are placed in the vicinity of the lens of a camera equipped with an RGB (Red, Green, Blue) image sensor, with their positions offset from each other. When such a camera captures a subject, a multi-viewpoint image having a minute parallax between the red, green, and blue color channels is generated. The three-dimensional position (depth) of the subject is estimated by performing a corresponding-point search, similar to the stereo method, on the multi-viewpoint image between the color channels.
In the field of computational optics, there is also a method called DCA (Dual off-axis color-filtered aperture). In DCA, the sub-apertures are arranged away from the optical axis so that the estimation accuracy of the three-dimensional position of the subject improves. Further, in DCA, the number of sub-apertures is reduced to two so that the amount of light per sub-aperture increases (see Non-Patent Document 2).
In both TCA and DCA, a corresponding-point search that takes the color shift between color channels into account must be executed. When the color of the subject is not known, it is difficult for the position estimation device to accurately estimate the color shift, so errors occur in the corresponding-point search.
To prevent errors in the corresponding-point search, a red filter and a cyan filter may be prepared. In the camera, the optical image of the subject is captured in both the green and blue channels via the cyan filter. By using the optical images acquired in both the green channel and the blue channel for estimating the three-dimensional position, the estimation accuracy of the three-dimensional position of the subject is improved (see Non-Patent Document 3).
However, in these methods, the brightness of the optical images acquired in the green and blue channels is lower than that of the optical image acquired in the red channel because they pass through the cyan filter, so accurate color reproduction becomes difficult. Further, in these methods, only diffuse reflection is assumed as the reflection characteristic of the surface of the subject. Therefore, these methods are not suitable for estimating the three-dimensional position of the surface of a subject having specular reflection characteristics (a subject whose brightness varies greatly depending on the direction of light reflection).
In recent years, polarization image sensors equipped with polarizing filters of four directions in place of the color filters of an RGB image sensor have become available. In order for a position estimation device to estimate the three-dimensional position of the surface of a subject, the subject may be observed with a camera equipped with such a polarization image sensor and sub-apertures, and polarization analysis may be performed using TCA or DCA. Even in this case, on the surface of a subject having specular reflection characteristics the brightness changes greatly for each polarizing filter, so it is difficult to estimate the three-dimensional position of the surface of the subject.
Ellipsometry is often used as a method for estimating the reflection parameters of the surface of a subject having specular reflection characteristics. In this method, the subject is irradiated with light from a plurality of illuminations, and ellipsometry is performed for each illumination. The reflection parameters of the surface of the subject are thereby estimated based on the relationship between the incident angle and the reflection angle of the light.
However, ellipsometry alone cannot estimate the three-dimensional position of the surface of the subject. Therefore, it is often combined with a method such as the stereo method, the pattern projection method, or TOF in order to estimate the three-dimensional position of the surface of the subject.
When the three-dimensional position (depth) of the surface of the subject is estimated using sub-apertures, the estimation accuracy may decrease owing to errors in estimating the color shift. Further, when the three-dimensional position of the subject is estimated using sub-apertures, the three-dimensional position of the surface of a subject having specular reflection characteristics cannot be estimated. Furthermore, when polarization analysis is performed with TCA or DCA, the brightness changes greatly for each polarizing filter, so the three-dimensional position of the surface of a subject having specular reflection characteristics cannot be estimated. As described above, there is a problem that the three-dimensional position of the subject cannot always be estimated.
In view of the above circumstances, an object of the present invention is to provide a position estimation method, a position estimation device, and a program capable of suppressing a decrease in accuracy due to the influence of the subject or the environment in estimating the three-dimensional position of a subject in real space.
One aspect of the present invention is a position estimation method executed by a position estimation device, including: an image capturing step of generating, according to light reflected from a subject irradiated with polarized light of a reference polarization angle, a first observed image, which is a polarized image of a channel corresponding to a first polarization angle that is inclined from the reference in the negative direction and is not orthogonal to the reference, and a second observed image, which is a polarized image of a channel corresponding to a second polarization angle that is inclined from the reference in the positive direction and is not orthogonal to the reference; and a position estimation step of estimating the three-dimensional position of the subject based on the first observed image and the second observed image.
One aspect of the present invention is a position estimation device including: an image capturing unit that generates, according to light reflected from a subject irradiated with polarized light of a reference polarization angle, a first observed image, which is a polarized image of a channel corresponding to a first polarization angle that is inclined from the reference in the negative direction and is not orthogonal to the reference, and a second observed image, which is a polarized image of a channel corresponding to a second polarization angle that is inclined from the reference in the positive direction and is not orthogonal to the reference; and a position estimation unit that estimates the three-dimensional position of the subject based on the first observed image and the second observed image.
One aspect of the present invention is a program for causing a computer to function as the above-described position estimation device.
According to the present invention, it is possible to suppress a decrease in accuracy due to the influence of the subject or the environment in estimating the three-dimensional position of a subject in real space.
FIG. 1 is a diagram showing a configuration example of the position estimation device in the first embodiment.
FIG. 2 is a diagram showing a configuration example of the image capturing unit in the first embodiment.
FIG. 3 is a diagram showing a configuration example of the image processing unit in the first embodiment.
FIG. 4 is a flowchart showing an operation example of the position estimation device in the first embodiment.
FIG. 5 is a diagram showing the relationship between the polarization angle and the brightness in the first embodiment.
FIG. 6 is a diagram showing a configuration example of the position estimation device in the second embodiment.
FIG. 7 is a diagram showing a configuration example of the image processing unit in the second embodiment.
FIG. 8 is a flowchart showing an operation example of the position estimation device in the second embodiment.
FIG. 9 is a diagram showing a hardware configuration example of the position estimation device common to each embodiment.
Embodiments of the present invention will be described in detail with reference to the drawings.
(First Embodiment)
FIG. 1 is a diagram showing a configuration example of the position estimation device 1a in the first embodiment. The position estimation device 1a is a device that estimates the three-dimensional position (depth) of the surface of a subject. In FIG. 1, the position estimation device 1a estimates the three-dimensional position of the surface of the subject 100.
The position estimation device 1a includes a polarized illumination unit 10, an image capturing unit 11, and an image processing unit 12a. The polarized illumination unit 10 irradiates the subject 100 (imaging target) with polarized light. The image capturing unit 11 (polarization camera) generates an image of the subject 100 by capturing the subject 100. The image capturing unit 11 outputs the image of the subject 100 to the image processing unit 12a.
The image processing unit 12a acquires the image of the subject 100 from the image capturing unit 11. The image processing unit 12a estimates the three-dimensional position (depth) of the surface of the subject 100 based on the image acquired from the image capturing unit 11.
FIG. 2 is a diagram showing a configuration example of the image capturing unit 11 in the first embodiment. The image capturing unit 11 includes a lens 110, a mask 111, and a polarization image sensor 112. The mask 111 is provided with a first aperture region 113 and a second aperture region 114 as sub-apertures.
The first aperture region 113 is provided with a polarizing filter having a first angle with respect to the polarization angle of the polarized light transmitted through the lens 110. The same applies to the second aperture region 114, which is provided with a polarizing filter having a second angle with respect to the polarization angle of the polarized light transmitted through the lens 110.
In FIG. 2, "Δc_x^eff" represents the effective aperture of the lens 110. "Δc_x" represents the distance between the first aperture region 113 and the second aperture region 114. "c_z" represents the distance between the lens 110 and the mask 111. "f" represents the distance between the lens 110 and the polarization image sensor 112. "Δx" represents the image spacing. "Z" represents the distance between the polarization image sensor 112 and the subject 100. "Z_0" represents the distance between the polarization image sensor 112 and the focal surface 200.
Reflected light from the surface of the subject 100 irradiated with polarized light is incident on the lens 110. The lens 110 transmits the incident light and sends the transmitted light to the mask 111. The first aperture region 113 and the second aperture region 114 of the mask 111 transmit a part of the light sent to the mask 111 and project the transmitted light onto the polarization image sensor 112.
The light projected onto the polarization image sensor 112 is observed as a polarized image of the subject 100. The polarization image sensor 112 generates an image of the subject 100 according to the polarized image of the subject 100 and transmits the image of the subject 100 to the image processing unit 12a.
Note that, in FIG. 2, of the light transmitted through the lens 110, light that is not blocked by any of the mask 111, the first aperture region 113, or the second aperture region 114 may be projected onto the polarization image sensor 112.
FIG. 3 is a diagram showing a configuration example of the image processing unit 12a in the first embodiment. The image processing unit 12a includes an image input unit 120a and a position estimation unit 121a.
The image input unit 120a acquires the image of the subject 100 from the image capturing unit 11 and outputs it to the position estimation unit 121a.
The position estimation unit 121a estimates the three-dimensional position (depth) of the surface of the subject 100 based on the image of the subject 100, and outputs the estimation result to a predetermined external device (not shown).
Next, an operation example of the position estimation device 1a will be described.
FIG. 4 is a flowchart showing an operation example of the position estimation device 1a in the first embodiment. The polarized illumination unit 10 irradiates the subject 100 to be imaged with polarized light (step S101). The polarized light applied to the subject 100 is not limited to any specific polarization. In the following, linearly polarized light with a single polarization angle "θ" is applied to the subject 100.
The polarized light applied to the subject 100 is reflected at the surface of the subject 100 and passes through the lens 110. Part of the light transmitted through the lens 110 is absorbed by the mask 111; the remaining light selectively passes through the first aperture region 113 or the second aperture region 114.
The first aperture region 113 transmits the polarization component whose angle matches the angle (polarization angle) of the polarizing filter provided in the first aperture region 113. The second aperture region 114 transmits the polarization component whose angle matches the angle (polarization angle) of the polarizing filter provided in the second aperture region 114.
The first aperture region 113 absorbs part of the polarized light whose angle is orthogonal to the angle of its polarizing filter and reflects the rest. The second aperture region 114 likewise absorbs part of the polarized light whose angle is orthogonal to the angle of its polarizing filter and reflects the rest.
The angle (polarization angle) of the polarizing filter in each aperture region is determined in advance according to the relative positional relationship between the polarized illumination unit 10 and the polarization image sensor 112. In the following, the first aperture region 113 is provided with a polarizing filter of polarization angle "ω_1", and the second aperture region 114 with a polarizing filter of polarization angle "ω_2".
FIG. 5 is a diagram showing the relationship between polarization angle and luminance in the first embodiment. In general, the light reflected from the surface of the subject 100 consists of unpolarized light due to diffuse reflection and linearly polarized light due to specular reflection. The polarization angle of the linearly polarized light specularly reflected by the subject 100 is equal to the polarization angle of the polarized light applied to the subject 100.
In general, when the luminance of the reflected light is observed through a rotating polarizer, the luminance varies sinusoidally with the polarization angle (the rotation angle of the polarizer). In FIG. 5, "I_max" denotes the luminance (maximum luminance) at the angle "φ" at which the specular reflection is strongest, and "I_min" denotes the luminance (minimum luminance) at the angle at which the specular reflection is weakest, that is, the luminance at the angle at which only diffusely reflected light is observed.
In ellipsometry, the luminance of the reflected light is observed at four polarization angles shifted by "π/4" from one another (for example, "0, π/4, π/2, −π/4"). The normal direction of the surface of the subject 100 or its bidirectional reflectance distribution function (BRDF) is then estimated from the parameters representing the polarization state (Stokes parameters) together with the maximum luminance "I_max" and the minimum luminance "I_min".
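As a concrete illustration of how I_max, I_min, and the phase φ can be recovered from four such measurements, the following sketch uses the standard linear Stokes-parameter relations for observations at 0, π/4, π/2, and 3π/4; the function names and the particular choice of angles are illustrative assumptions, not part of the specification.

```python
import numpy as np

def stokes_from_four_angles(i0, i45, i90, i135):
    """Recover linear Stokes parameters from luminances at 0, pi/4, pi/2, 3pi/4.

    Standard relations: S0 = (I0 + I45 + I90 + I135) / 2, S1 = I0 - I90, S2 = I45 - I135.
    """
    s0 = (i0 + i45 + i90 + i135) / 2.0
    s1 = i0 - i90
    s2 = i45 - i135
    return s0, s1, s2

def sinusoid_params(s0, s1, s2):
    """Convert Stokes parameters to I_max, I_min and the phase phi of the sinusoid."""
    amp = np.hypot(s1, s2) / 2.0      # half peak-to-peak amplitude of the sinusoid
    i_max = s0 / 2.0 + amp
    i_min = s0 / 2.0 - amp
    phi = 0.5 * np.arctan2(s2, s1)    # angle at which specular reflection is strongest
    return i_max, i_min, phi
```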
For this reason, the polarizing filters provided on a polarization image sensor are usually of four polarization angles shifted by "π/4" from one another. The polarization image sensor 112 is provided with polarizing filters of the four polarization angles [Ψ−π/4, Ψ, Ψ+π/4, Ψ+π/2], defined with respect to the reference polarization angle "Ψ".
The intensity and polarization angle of light specularly reflected at the surface of a subject change according to the Fresnel equations, which state that the intensity and polarization angle of the specularly reflected light depend on the refractive indices of air and the subject and on the angle of incidence. Since the angle of incidence equals the angle of reflection for specular reflection, the specularly reflected light can be observed only from the direction of regular reflection with respect to the illumination across the subject surface. For example, when the illumination and the camera can be regarded as lying in substantially the same direction with respect to the subject, the specularly reflected light is observed only from the portions of the subject surface that face the illumination and the camera. In this case, the polarization angle and intensity of the specularly reflected light are substantially the same as those of the polarized illumination.
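For reference, the standard Fresnel amplitude reflection coefficients for the s- and p-polarized components (a textbook relation, not reproduced in the specification) are:

```latex
r_s = \frac{n_1\cos\theta_i - n_2\cos\theta_t}{n_1\cos\theta_i + n_2\cos\theta_t},
\qquad
r_p = \frac{n_2\cos\theta_i - n_1\cos\theta_t}{n_2\cos\theta_i + n_1\cos\theta_t}
```

where n_1 and n_2 are the refractive indices of air and the subject, θ_i is the angle of incidence, and θ_t is the angle of refraction given by Snell's law.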
In this case, when the polarization angle "θ" of the polarized light applied to the subject 100 (the polarization angle of the polarizing filter of the polarized illumination unit 10) is, for example, "0 (radian)", the angle "φ" at which the specular reflection is strongest is 0. Therefore, the luminance observed by the polarization image sensor 112 through the polarizing filter of polarization angle "−π/4" (shifted by "−π/4" from 0) is equal to the luminance observed through the polarizing filter of polarization angle "π/4" (shifted by "π/4" from 0).
Therefore, if the image capturing unit did not include the mask with the first and second aperture regions, then among the channels of the polarized image of the subject 100 generated by the polarization image sensor from the light transmitted through the lens and projected onto the sensor, the image of the channel corresponding to the polarization angle "π/4" and the image of the channel corresponding to the polarization angle "−π/4" would be substantially identical.
In the image capturing unit 11, by contrast, the polarized light that has passed through the lens 110 and then selectively through the first aperture region 113 or the second aperture region 114 forms images at mutually shifted positions on the polarization image sensor 112, according to the direction in which the light is incident on the sensor. The optical images formed in this way pass through the filters of the four polarization angles provided on the polarization image sensor 112, and the polarization image sensor 112 generates a polarized image having four channels. The generated polarized image may be recorded in a memory.
In this case, because a polarizing filter of polarization angle "Ψ−π/4" is provided in the first aperture region 113, polarized light whose polarization angle differs from "Ψ−π/4" is blocked by the polarizing filter of the first aperture region 113, its polarization angle and the polarization angle of the filter being orthogonal to each other. The polarized image of the channel corresponding to the polarization angle "Ψ−π/4" is therefore generated from the polarized light transmitted through the first aperture region 113. Hereinafter, the polarized image of the channel corresponding to the polarization angle "Ψ−π/4" is referred to as the "first observation image".
Similarly, because a polarizing filter of polarization angle "Ψ+π/4" is provided in the second aperture region 114, polarized light whose polarization angle differs from "Ψ+π/4" is blocked by the polarizing filter of the second aperture region 114, its polarization angle and the polarization angle of the filter being orthogonal to each other. The polarized image of the channel corresponding to the polarization angle "Ψ+π/4" is therefore generated from the polarized light transmitted through the second aperture region 114. Hereinafter, the polarized image of the channel corresponding to the polarization angle "Ψ+π/4" is referred to as the "second observation image".
When "θ = Ψ", no color shift (luminance difference) arises between the polarized image of the channel corresponding to the polarization angle "Ψ−π/4" and that of the channel corresponding to "Ψ+π/4", for the reason described above.
Returning to FIG. 4, the description of the operation example of the position estimation device 1a is continued. As described above, the image capturing unit 11 (polarization image sensor 112) generates the first observation image and the second observation image (step S102).
The image input unit 120a acquires the first observation image and the second observation image from the image capturing unit 11 (polarization image sensor 112) and outputs them to the position estimation unit 121a. The image input unit 120a may record the first observation image and the second observation image in a memory (step S103).
The position estimation unit 121a estimates position parameters of the subject 100 based on the first observation image and the second observation image, and outputs the position parameters of the surface of the subject 100 to a predetermined external device (not shown) (step S104).
The position parameters are not limited to specific parameters as long as they relate to the position of the subject 100, and the unit for which the position parameters are estimated is not limited either. In the following, as an example, a position vector indicating the three-dimensional position (depth) is determined for each pixel of the image. The position parameters may be parameters of a map such as a disparity map or a depth map, and may be determined per block or per region into which the image is divided.
The method of estimating the position parameters is not limited to a specific method. For example, the position estimation unit 121a estimates the position parameters by a stereo method using the first observation image and the second observation image; for instance, it estimates the disparity from the result of block matching between the images of the two channels and may convert the disparity into a three-dimensional position using geometric information such as the distance "Δc_x" between the aperture regions. The position estimation unit 121a may also estimate the position parameters of the subject 100 based on detected feature points of the subject 100, or using a marker placed within the angle of view of the image capturing unit 11 while the subject 100 is being imaged. When the polarized light applied to the subject 100 is pattern light, the position estimation unit 121a may estimate the position parameters of the surface of the subject 100 using the projected pattern light.
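For illustration, a minimal sketch of such a block-matching disparity search between the two observation images follows; the SAD cost, window size, and search range are assumed choices rather than requirements of the specification, and each disparity value can then be converted into a depth with a geometric relation such as the one sketched after FIG. 2.

```python
import numpy as np

def block_matching_disparity(img1, img2, block=7, max_disp=32):
    """Estimate a per-pixel horizontal disparity map between the two channel images by SAD matching.

    img1, img2 : 2-D float arrays (first and second observation images).
    """
    h, w = img1.shape
    r = block // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(r, h - r):
        for x in range(r, w - r):
            ref = img1[y - r:y + r + 1, x - r:x + r + 1]
            best_cost, best_d = np.inf, 0
            for d in range(-max_disp, max_disp + 1):
                if x + d - r < 0 or x + d + r + 1 > w:
                    continue
                cand = img2[y - r:y + r + 1, x + d - r:x + d + r + 1]
                cost = np.abs(ref - cand).sum()   # sum of absolute differences
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```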
As described above, the first observation image is the polarized image of the channel corresponding to the first polarization angle (for example, Ψ−π/4), a polarization angle tilted from the reference (for example, Ψ = 0) in the negative direction and not orthogonal to the reference. The second observation image is the polarized image of the channel corresponding to the second polarization angle (for example, Ψ+π/4), a polarization angle tilted from the reference in the positive direction and not orthogonal to the reference. The image capturing unit 11 generates the first observation image and the second observation image from the light reflected from the subject 100 irradiated with polarized light of the reference polarization angle. The position estimation unit 121a estimates the three-dimensional position of the subject 100 in real space based on the first observation image and the second observation image, using, for example, a stereo method or a pattern projection method.
A plurality of sub-apertures (the first aperture region 113 and the second aperture region 114) are provided in the mask 111 in the vicinity of the lens 110 of the image capturing unit 11. A polarizing filter having a fixed first angle with respect to the polarization angle of the light from the polarized illumination unit 10 is provided in the first aperture region 113; similarly, a polarizing filter having a fixed second angle is provided in the second aperture region 114. The polarized illumination unit 10 irradiates the subject 100 with polarized light, and the polarization image sensor 112 observes the polarized light (reflected light) reflected from the subject 100 through the lens 110 and the mask 111 (the plurality of sub-apertures).
This makes it possible to cancel the luminance difference between the image (polarized image) of the channel corresponding to the polarization angle equal to the polarization angle "Ψ−π/4" of the polarizing filter of the first aperture region 113 and the image of the channel corresponding to the polarization angle equal to the polarization angle "Ψ+π/4" of the polarizing filter of the second aperture region 114. The three-dimensional position of the surface of the subject 100 can therefore be estimated easily, and a decrease in accuracy due to the influence of the subject or the environment can be suppressed when estimating the three-dimensional position of the subject in real space.
(Second Embodiment)
The second embodiment differs from the first embodiment in that the image processing unit estimates the reflection parameters of the surface of the subject. The following description focuses mainly on the differences from the first embodiment.
FIG. 6 is a diagram showing a configuration example of the position estimation device 1b in the second embodiment. The position estimation device 1b is a device that estimates the three-dimensional position (depth) of the surface of the subject. In FIG. 6, the position estimation device 1b also estimates the reflection parameters of the surface of the subject 100.
The position estimation device 1b includes the polarized illumination unit 10, the image capturing unit 11, and an image processing unit 12b. The image capturing unit 11 outputs the image of the subject 100 to the image processing unit 12b.
The image processing unit 12b acquires the image of the subject 100 from the image capturing unit 11 and estimates the three-dimensional position (depth) of the surface of the subject 100 based on the acquired image. The image processing unit 12b then estimates the reflection parameters of the surface of the subject 100 based on the three-dimensional position of the surface of the subject 100 and the image (observation image) of the subject 100.
FIG. 7 is a diagram showing a configuration example of the image processing unit 12b in the second embodiment. The image processing unit 12b includes an image input unit 120b, a position estimation unit 121b, and a reflection component estimation unit 122.
The image input unit 120b outputs the image of the subject 100 to the position estimation unit 121b and the reflection component estimation unit 122. The position estimation unit 121b estimates the three-dimensional position (depth) of the surface of the subject 100 based on the image of the subject 100, and outputs the estimation result to a predetermined external device (not shown) and to the reflection component estimation unit 122.
The reflection component estimation unit 122 acquires the image of the subject 100 from the image input unit 120b and the estimation result of the three-dimensional position of the surface of the subject 100 from the position estimation unit 121b. The reflection component estimation unit 122 estimates the reflection parameters (reflection components) of the surface of the subject 100 based on the three-dimensional position of the subject 100 and the image of the subject 100, and outputs them to a predetermined external device (not shown).
FIG. 8 is a flowchart showing an operation example of the position estimation device 1b in the second embodiment. The operations from step S201 to step S202 are the same as those from step S101 to step S102 shown in FIG. 4. Hereinafter, the polarized image of the channel corresponding to the polarization angle "Ψ" is referred to as the "third observation image", and the polarized image of the channel corresponding to the polarization angle "Ψ+π/2" as the "fourth observation image".
The image input unit 120b acquires the first, second, third, and fourth observation images from the image capturing unit 11 (polarization image sensor 112) and outputs them to the position estimation unit 121b and the reflection component estimation unit 122. The image input unit 120b may record the first to fourth observation images in a memory (step S203).
The position estimation unit 121b estimates the position parameters of the subject 100 based on the first observation image and the second observation image, and outputs the position parameters of the surface of the subject 100 to a predetermined external device (not shown) and to the reflection component estimation unit 122 (step S204).
The reflection component estimation unit 122 estimates the reflection parameters of the surface of the subject 100 based on at least one of the first to fourth observation images and on the position parameters of the surface of the subject 100, and outputs the reflection parameters to a predetermined external device (not shown) (step S205).
The reflection parameters are not limited to specific parameters as long as they relate to the reflection at the surface of the subject 100. For example, the reflection parameters are Stokes parameters, or the combination of the maximum luminance "I_max" and the minimum luminance "I_min" in an image such as the fourth observation image. The reflection parameters may also be the specular and diffuse reflection components separated from the reflected light, or the normal direction or the bidirectional reflectance distribution function (BRDF) of the surface of the subject 100.
In general, when a subject is observed with a polarization camera, the reflection parameters can be obtained from the observations at the four polarization angles for each pixel. In the present embodiment, the luminance at a corresponding point in the first observation image and the luminance at the corresponding point in the second observation image coincide. This common luminance equals the average of the luminance of the corresponding point in the first observation image and that in the second observation image as the polarization angle varies from "Ψ−π/4" to "Ψ+3π/4".
The reflection component estimation unit 122 acquires the first to fourth observation images from the image input unit 120b and the position parameters of the surface of the subject 100 from the position estimation unit 121b, and estimates the reflection parameters of the surface of the subject 100 for each pixel of the first observation image.
Here, the third observation image is the image (polarized image) generated from the polarized light that has selectively passed through the first aperture region 113 or the second aperture region 114 and then through the polarizing filter of polarization angle "Ψ" of the polarization image sensor 112. That is, the third observation image is generated as the superposition of the image formed by the light transmitted through the polarizing filter of the first aperture region 113 and the polarizing filter of polarization angle "Ψ" of the polarization image sensor 112, and the image formed by the light transmitted through the polarizing filter of the second aperture region 114 and the polarizing filter of polarization angle "Ψ" of the polarization image sensor 112.
The fourth observation image is the image (polarized image) generated from the polarized light that has selectively passed through the first aperture region 113 or the second aperture region 114 and then through the polarizing filter of polarization angle "Ψ+π/2" of the polarization image sensor 112. That is, the fourth observation image is generated as the superposition of the image formed by the light transmitted through the polarizing filter of the first aperture region 113 and the polarizing filter of polarization angle "Ψ+π/2" of the polarization image sensor 112, and the image formed by the light transmitted through the polarizing filter of the second aperture region 114 and the polarizing filter of polarization angle "Ψ+π/2" of the polarization image sensor 112. Moreover, when "θ = Ψ", the polarization angle "Ψ+π/2" of the filter corresponding to the fourth observation image is orthogonal to the polarization angle "θ" of the polarizing filter of the polarized illumination unit 10, so the fourth observation image contains only the diffuse reflection component.
The method of estimating the reflection parameters is not limited to a specific method as long as it estimates parameters relating to the reflection of light at the surface of the subject 100. For example, the reflection component estimation unit 122 estimates the minimum luminance "I_min", which is the diffuse reflection component of the fourth observation image, based on the fourth observation image and the position parameters. Further, based on the assumption that the luminance varies with the polarization angle as a sinusoid about the average luminance "I_mean" of the fourth observation image, the reflection component estimation unit 122 estimates the maximum luminance "I_max", the luminance observed when the specular reflection component is largest.
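As a minimal per-pixel sketch of this step, the following assumes the sinusoidal model directly: the common luminance of corresponding points in the first and second observation images is taken as the mean of the sinusoid, and the diffuse (minimum) luminance is taken from the fourth observation image, so that I_max follows from the symmetry of the sinusoid about its mean. This simple reading ignores the sub-aperture superposition treated below, and the function name is an illustrative assumption.

```python
def estimate_reflection_extremes(i_first, i_second, i_fourth):
    """Per-pixel (I_min, I_max) under the assumed sinusoidal luminance model."""
    i_mean = 0.5 * (i_first + i_second)   # mean of the sinusoid over polarization angle
    i_min = i_fourth                      # diffuse-only luminance (channel orthogonal to theta)
    i_max = 2.0 * i_mean - i_min          # symmetry of the sinusoid about its mean
    return i_min, i_max
```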
The correspondence between the position parameters of the surface of the subject 100 for the image formed by the light transmitted through the first aperture region 113 and those for the image formed by the light transmitted through the second aperture region 114 is known for the fourth observation image. The minimum luminance "I_min", which is the diffuse reflection component of the fourth observation image, is expressed by equation (1), and equation (2) holds.
[Equation (1): reproduced as an image in the original publication]
[Equation (2): reproduced as an image in the original publication]
Here, "d" denotes a vector representing the disparity from the polarized image at the same position as the first observation image to the polarized image at a different position. The minimum luminance "I_min" denotes the diffuse reflection component of the fourth observation image. "I_a" denotes, of the polarized images constituting the fourth observation image, the polarized image generated from the light that passed through the first aperture region 113 and then through the polarizing filter of polarization angle "π/2" of the polarization image sensor 112. "I_b" denotes, of the polarized images constituting the fourth observation image, the polarized image generated from the light that passed through the second aperture region 114 and then through the polarizing filter of polarization angle "π/2" of the polarization image sensor 112.
The method of separating these polarized images from the fourth observation image is not limited to a specific method as long as it can separate them. For example, the reflection component estimation unit 122 estimates the image "I_b" formed by the light transmitted through the second aperture region 114 by solving the optimization problem shown in equation (3). An arbitrary regularization term, such as total variation, may be added to equation (3).
[Equation (3): reproduced as an image in the original publication]
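Since equation (3) is reproduced only as an image, the exact objective is not available here. The following sketch assumes, as one plausible reading, that the fourth observation image is modeled as the sum of I_b and a disparity-shifted copy of I_b, and that I_b is recovered by a least-squares fit (to which a regularization term such as total variation could be added). The model, the solver, and all names are assumptions for illustration only.

```python
import numpy as np

def separate_sub_aperture_layer(i4, disp, iters=200, lr=0.25):
    """Estimate I_b from I_4 assuming I_4(x) ~ I_b(x) + I_b(x + d(x)) (assumed model).

    i4   : fourth observation image (2-D float array).
    disp : integer horizontal disparity map d (same shape as i4).
    """
    h, w = i4.shape
    xs = np.clip(np.arange(w)[None, :] + disp.astype(int), 0, w - 1)
    ys = np.repeat(np.arange(h)[:, None], w, axis=1)
    ib = 0.5 * i4.copy()                    # initial guess: split I_4 evenly
    for _ in range(iters):
        pred = ib + ib[ys, xs]              # forward model A(I_b)
        resid = pred - i4
        grad = resid.copy()
        np.add.at(grad, (ys, xs), resid)    # adjoint A^T applied to the residual
        ib -= lr * grad                     # gradient step on 1/2 * ||A(I_b) - I_4||^2
    return ib
```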
As described above, the image capturing unit 11 may further generate at least one of the third observation image, which is the polarized image of the channel corresponding to the third polarization angle equal to the reference (for example, Ψ = 0), and the fourth observation image, which is the polarized image of the channel corresponding to the fourth polarization angle (for example, Ψ+π/2) orthogonal to the reference. The reflection component estimation unit 122 may estimate the reflection parameters of the subject based on at least one of the first, second, third, and fourth observation images and on the position parameters. For example, when the fourth observation image has been generated, the reflection component estimation unit 122 estimates the reflection parameters of the surface of the subject 100 based on the position parameters of the subject 100 and the fourth observation image.
This makes it possible to estimate reflection parameters such as normal information or a BRDF using the polarized images of the channels corresponding to polarization angles (for example, "Ψ = 0" and "Ψ+π/2") that differ from both the polarization angle "Ψ−π/4" of the polarizing filter of the first aperture region 113 and the polarization angle "Ψ+π/4" of the polarizing filter of the second aperture region 114.
In this way, the three-dimensional position (depth) of the surface of the subject 100 and the reflection parameters of the surface of the subject 100 can be estimated as the three-dimensional shape (three-dimensional geometric information) of the subject 100 in real space.
The image processing unit 12b may also estimate the three-dimensional shape (three-dimensional geometric information) of the subject 100 based on the three-dimensional position of the surface of the subject 100 and the reflection parameters of the surface of the subject 100.
(First Modification)
The position estimation device 1b may include a plurality of polarized illumination units 10 and a plurality of image capturing units 11. The plurality of polarized illumination units 10 and the plurality of image capturing units 11 may be arranged on a sphere centered on the subject 100 to be imaged. This makes it possible to estimate a complex bidirectional reflectance distribution function (BRDF) and the like based on multiple combinations of incidence and reflection angles of the polarized light.
(Second Modification)
The polarized illumination unit 10 may temporally modulate the polarization angle of the polarized light. By changing the polarization angle of the polarized light applied to the subject 100, the polarization angle of reflected light that is not perfectly specular can be matched to the polarization angles of the first aperture region 113 and the second aperture region 114. This makes it possible to estimate the three-dimensional position and the reflection parameters of the surface of the subject even when the polarization angle of the reflected light does not match that of the illumination, for example because the device and the subject cannot be arranged so that perfect specular reflection occurs.
FIG. 9 is a diagram showing an example of the hardware configuration of the position estimation device 1 (position estimation device 1a, position estimation device 1b) common to the embodiments. Some or all of the functional units of the position estimation device 1 are realized as software by a processor 2 such as a CPU (Central Processing Unit) executing a program stored in a storage device 3 having a non-volatile recording medium (non-transitory recording medium) and in a memory 4. The program may be recorded on a computer-readable recording medium. The computer-readable recording medium is a non-transitory recording medium such as a portable medium, for example a flexible disk, a magneto-optical disk, a ROM (Read Only Memory), or a CD-ROM (Compact Disc Read Only Memory), or a storage device such as a hard disk built into a computer system. The display unit 5 displays images.
Some or all of the functional units of the position estimation device 1 may also be realized using hardware including an electronic circuit (or circuitry) using, for example, an LSI (Large Scale Integration circuit), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array).
Although the embodiments of the present invention have been described in detail with reference to the drawings, the specific configurations are not limited to these embodiments, and designs within the scope of the gist of the present invention are also included.
The present invention is applicable to a device that estimates the three-dimensional shape of an imaged subject.
1, 1a, 1b... position estimation device; 2... processor; 3... storage device; 4... memory; 5... display unit; 10... polarized illumination unit; 11... image capturing unit; 12a, 12b... image processing unit; 100... subject; 110... lens; 111... mask; 112... polarization image sensor; 113... first aperture region; 114... second aperture region; 120a, 120b... image input unit; 121a, 121b... position estimation unit; 122... reflection component estimation unit; 200... focal plane

Claims (4)

  1.  A position estimation method executed by a position estimation device, the method comprising:
     an image capturing step of generating a first observation image, which is a polarized image of a channel corresponding to a first polarization angle that is tilted from a reference in a negative direction and is not orthogonal to the reference, and a second observation image, which is a polarized image of a channel corresponding to a second polarization angle that is tilted from the reference in a positive direction and is not orthogonal to the reference, according to light reflected from a subject irradiated with polarized light of the reference polarization angle; and
     a position estimation step of estimating a three-dimensional position of the subject based on the first observation image and the second observation image.
  2.  The position estimation method according to claim 1, wherein the image capturing step further generates a third observation image, which is a polarized image of a channel corresponding to a third polarization angle equal to the reference, and a fourth observation image, which is a polarized image of a channel corresponding to a fourth polarization angle orthogonal to the reference, and
     the method further comprises a reflection component estimation step of estimating a reflection parameter of a surface of the subject based on at least one of the first observation image, the second observation image, the third observation image, and the fourth observation image.
  3.  A position estimation device comprising:
     an image capturing unit that generates a first observation image, which is a polarized image of a channel corresponding to a first polarization angle that is tilted from a reference in a negative direction and is not orthogonal to the reference, and a second observation image, which is a polarized image of a channel corresponding to a second polarization angle that is tilted from the reference in a positive direction and is not orthogonal to the reference, according to light reflected from a subject irradiated with polarized light of the reference polarization angle; and
     a position estimation unit that estimates a three-dimensional position of the subject based on the first observation image and the second observation image.
  4.  A program for causing a computer to function as the position estimation device according to claim 3.
PCT/JP2020/034113 2020-09-09 2020-09-09 Position estimation method, position estimation device, and program WO2022054167A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2022548293A JP7445176B2 (en) 2020-09-09 2020-09-09 Position estimation method, position estimation device and program
PCT/JP2020/034113 WO2022054167A1 (en) 2020-09-09 2020-09-09 Position estimation method, position estimation device, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/034113 WO2022054167A1 (en) 2020-09-09 2020-09-09 Position estimation method, position estimation device, and program

Publications (1)

Publication Number Publication Date
WO2022054167A1 true WO2022054167A1 (en) 2022-03-17

Family

ID=80631408

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/034113 WO2022054167A1 (en) 2020-09-09 2020-09-09 Position estimation method, position estimation device, and program

Country Status (2)

Country Link
JP (1) JP7445176B2 (en)
WO (1) WO2022054167A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002213931A (en) * 2001-01-17 2002-07-31 Fuji Xerox Co Ltd Instrument and method for measuring three-dimensional shape
JP2012143363A (en) * 2011-01-11 2012-08-02 Panasonic Corp Image processing apparatus
JP2019049457A (en) * 2017-09-08 2019-03-28 株式会社東芝 Image processing apparatus and ranging device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003194669A (en) 2001-12-27 2003-07-09 Seiko Epson Corp Inspection method and inspection device for liquid crystal device


Also Published As

Publication number Publication date
JPWO2022054167A1 (en) 2022-03-17
JP7445176B2 (en) 2024-03-07

Similar Documents

Publication Publication Date Title
CN108650447B (en) Image sensor, depth data measuring head and measuring system
US10571668B2 (en) Catadioptric projector systems, devices, and methods
JP6875081B2 (en) Parameter estimation method for display devices and devices using that method
JP6585006B2 (en) Imaging device and vehicle
JP5167279B2 (en) Method and apparatus for quantitative three-dimensional imaging
CN107533763B (en) Image processing apparatus, image processing method, and program
WO2022017257A1 (en) Method for identifying worn area of brake disk, and wear identification system
US9048153B2 (en) Three-dimensional image sensor
JP2010526481A (en) Single lens 3D imaging device using a diaphragm aperture mask encoded with polarization combined with a sensor sensitive to polarization
JP2004239886A (en) Three-dimensional image imaging apparatus and method
US20210150744A1 (en) System and method for hybrid depth estimation
JP2008096162A (en) Three-dimensional distance measuring sensor and three-dimensional distance measuring method
JP2010276433A (en) Imaging device, image processor, and distance measuring device
EP3707551A1 (en) Imaging method and apparatus using circularly polarized light
JP6933776B2 (en) Information processing device and subject information acquisition method
WO2022054167A1 (en) Position estimation method, position estimation device, and program
US6580557B2 (en) Single lens instantaneous 3D image taking device
US20220164971A1 (en) Methods for depth sensing using candidate images selected based on an epipolar line
WO2021093804A1 (en) Omnidirectional stereo vision camera configuration system and camera configuration method
KR20220078447A (en) Operation method of image restoration apparatus for restoring low-density area
JP6934565B2 (en) Information processing device and subject information acquisition method
US11972590B2 (en) Image processing apparatus, image pickup apparatus, image processing method, and storage medium
WO2022107530A1 (en) Signal processing device, signal processing method, and program
JP2005106820A5 (en)
US20220172388A1 (en) Image processing apparatus, image pickup apparatus, image processing method, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20953233

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022548293

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20953233

Country of ref document: EP

Kind code of ref document: A1