WO2022259457A1 - Shape estimation device, shape estimation method, and program - Google Patents

Shape estimation device, shape estimation method, and program

Info

Publication number
WO2022259457A1
WO2022259457A1 PCT/JP2021/022095 JP2021022095W
Authority
WO
WIPO (PCT)
Prior art keywords
image
polarization
refraction
shape
feature
Prior art date
Application number
PCT/JP2021/022095
Other languages
English (en)
Japanese (ja)
Inventor
裕之 石原
孝之 仲地
Original Assignee
日本電信電話株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電信電話株式会社 filed Critical 日本電信電話株式会社
Priority to JP2023526751A priority Critical patent/JPWO2022259457A1/ja
Priority to PCT/JP2021/022095 priority patent/WO2022259457A1/fr
Publication of WO2022259457A1 publication Critical patent/WO2022259457A1/fr

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01B — MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 — Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures

Definitions

  • the present invention relates to a shape estimation device, a shape estimation method, and a program for estimating the three-dimensional shape of transparent refraction layers such as water surfaces and air fluctuations from images.
  • Technology for estimating the 3D shape of an object in an image is particularly important in fields such as robot vision, augmented reality, and autonomous driving.
  • In conventional 3D shape estimation, multiple cameras are prepared and the 3D shape is estimated from differences in appearance caused by the different camera installation positions.
  • This conventional method assumes that the object to be estimated is an opaque, diffusely reflective surface; therefore, it cannot estimate the shape of a transparent refracting surface such as a water surface.
  • Non-Patent Document 1 theoretically and experimentally discloses that at least two cameras are required to estimate the three-dimensional shape of a scene in which one refraction occurs.
  • Non-Patent Document 2 discloses that it is possible to estimate the three-dimensional shape of a refracting surface with a single camera by taking into account the difference in appearance of the background due to the presence or absence of refraction.
  • Non-Patent Document 3 discloses a method of estimating the three-dimensional shape of a transparent surface using polarization information.
  • Non-Patent Document 1 requires at least two cameras, and requires alignment and time synchronization between the cameras.
  • In Non-Patent Document 2, the model formulated as an optimization problem is complicated, so shape estimation takes a long time and requires a large computational cost.
  • In Non-Patent Document 3, it is necessary to know a rough three-dimensional shape of the estimation target in advance.
  • Thus, the conventional technology requires multiple cameras, a large computational cost, or prior knowledge of the 3D shape, and there is no suitable technology for estimating the 3D shape of a transparent refracting surface.
  • The present invention has been made in view of this problem, and an object thereof is to provide a shape estimation device, a shape estimation method, and a program capable of estimating the three-dimensional shape of a transparent refracting surface with a single camera, low computational cost, and no prior learning data.
  • A shape estimation device according to one aspect of the present invention includes: a polarization camera that captures a first image of a subject when no transparent refraction layer is interposed and a second image of the subject when the refraction layer is interposed; a feature acquisition unit that acquires a distortion vector, which is a feature quantity representing the change in geometric appearance due to refraction, by applying a feature point tracking method between the first image and the second image; a first estimation unit that acquires, from the second image, sets of luminance values respectively corresponding to at least three different polarization angles and estimates polarization feature quantities; a second estimation unit that estimates two candidates for the normal vector of the refraction layer to be estimated using the polarization feature quantities; and a refraction surface generation unit that selects one of the normal vector candidates using the distortion vector and generates refraction surface three-dimensional shape information representing the surface shape of the refraction layer.
  • A method for estimating the three-dimensional shape of a refraction surface according to one aspect of the present invention is performed by the shape estimation device, wherein: the polarization camera captures a first image, which is an image of the subject when no transparent refraction layer is interposed, and a second image, which is an image of the subject when the refraction layer is interposed; the feature acquisition unit acquires a distortion vector, which is a feature quantity representing the change in geometric appearance due to refraction, by applying a feature point tracking method between the first image and the second image; the first estimation unit acquires, from the second image, sets of luminance values respectively corresponding to at least three different polarization angles and estimates polarization feature quantities; the second estimation unit estimates two candidates for the normal vector of the refraction layer to be estimated using the polarization feature quantities; and the refraction surface generation unit selects one of the normal vector candidates using the distortion vector and generates refraction surface three-dimensional shape information representing the surface shape of the refraction layer.
  • a program according to one aspect of the present invention is summarized as a program for causing a computer to function as the above-described refractive surface three-dimensional shape estimation device.
  • According to the present invention, it is possible to estimate the three-dimensional shape of a transparent refracting surface with a single camera, low computational cost, and no prior learning data.
  • FIG. 1 is a block diagram showing a functional configuration example of a shape estimation device according to an embodiment of the present invention.
  • FIG. 2 is a diagram schematically showing the relationship between the shape estimation device shown in FIG. 1, a subject, and a transparent refraction layer.
  • FIG. 3 is a diagram schematically showing modeling of changes in polarization, where (a) shows the normal vector and light reflected by the refraction layer, and (b) shows the relationship between the zenith angle and the degree of polarization.
  • FIG. 4 is a diagram schematically showing the relationship between the azimuth angle and image brightness.
  • FIG. 5 is a diagram schematically showing modeling of geometric changes.
  • FIG. 6 is a diagram schematically showing an example of refraction surface three-dimensional shape information.
  • FIG. 7 is a flowchart showing the processing procedure of the shape estimation method performed by the shape estimation device shown in FIG. 1.
  • FIG. 8 is a block diagram showing a configuration example of a general-purpose computer system.
  • FIG. 1 is a block diagram showing a functional configuration example of a shape estimation device according to an embodiment of the present invention.
  • The shape estimation device 100 shown in FIG. 1 is a device for estimating the three-dimensional shape of a transparent refraction layer interposed between the device and a subject.
  • the shape estimation device 100 includes a polarization camera 10 , a feature quantity acquisition unit 20 , a first estimation unit 30 , a second estimation unit 40 , and a refractive surface generation unit 50 .
  • Each functional component except for the polarization camera 10 can be realized by a computer including ROM, RAM, CPU, and the like. In that case, the content of the processing is described by the program.
  • the polarization camera 10 is a general polarization camera.
  • the polarization camera 10 incorporates, for example, polarizers (polarization filters) with four different polarization angles.
  • the polarization camera 10 captures a first image that is an image of the subject without a transparent refractive layer and a second image that is an image of the subject with a refractive layer interposed.
  • a transparent refraction layer is a surface of water, a fluctuation layer of air, or the like.
  • The feature acquisition unit 20 applies a feature point tracking method between the first image and the second image captured by the polarization camera 10 to acquire a distortion vector, which is a feature quantity representing the geometric change in appearance due to refraction.
  • the feature point tracking method is, for example, optical flow.
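As an illustrative sketch only (not part of the disclosed embodiment), the feature point tracking step can be approximated with a brute-force patch matcher standing in for a production optical-flow routine such as OpenCV's Lucas-Kanade or Farneback implementations; the function name, patch size, and search radius below are assumptions.

```python
import numpy as np

def distortion_vectors(img_ref, img_refr, patch=5, search=4):
    """Estimate per-pixel distortion vectors between a reference image
    (no refraction layer) and a refracted image by brute-force patch
    matching -- a minimal stand-in for an optical-flow method."""
    h, w = img_ref.shape
    r = patch // 2
    flow = np.zeros((h, w, 2))
    for y in range(r + search, h - r - search):
        for x in range(r + search, w - r - search):
            ref = img_ref[y - r:y + r + 1, x - r:x + r + 1]
            best, best_dxy = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    cand = img_refr[y + dy - r:y + dy + r + 1,
                                    x + dx - r:x + dx + r + 1]
                    cost = np.sum((ref - cand) ** 2)  # SSD matching cost
                    if cost < best:
                        best, best_dxy = cost, (dx, dy)
            flow[y, x] = best_dxy  # (dx, dy) displacement of pixel (y, x)
    return flow
```

In practice, a pyramidal optical-flow implementation would replace this exhaustive search for speed and sub-pixel accuracy.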
  • the first estimating unit 30 acquires sets of luminance values respectively corresponding to at least three different polarization angles from the second image and estimates the polarization feature amount.
  • the characteristic quantity of polarization is the degree of polarization. Details will be described later.
  • the second estimation unit 40 estimates two candidates for the normal vector of the refraction layer to be estimated using the polarization feature amount. In other words, the candidates for the normal vector of the change in polarization are narrowed down to two.
  • the refraction surface generation unit 50 selects one of the normal vector candidates using the distortion vector, and generates refraction surface three-dimensional shape information representing the surface shape of the refraction layer.
  • a normal vector is a vector orthogonal to a tangent to a pixel in the second image.
  • Refraction surface three-dimensional shape information representing the surface shape of the refraction layer can be generated from the normal vectors of the pixels in which the refraction layer is reflected in the second image.
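The normals-to-shape step can be illustrated with a deliberately naive integrator. This is a hedged sketch under stated assumptions: real systems would use a least-squares or Poisson-style integrator, and `surface_from_normals` is an assumed name, not taken from the patent.

```python
import numpy as np

def surface_from_normals(normals):
    """Naive height-map reconstruction from a per-pixel normal field of
    shape (H, W, 3): convert normals to surface gradients, then integrate
    by cumulative summation along one column and across each row."""
    p = -normals[..., 0] / normals[..., 2]   # dz/dx per pixel
    q = -normals[..., 1] / normals[..., 2]   # dz/dy per pixel
    # integrate down the first column, then across each row (height is
    # recovered only up to an additive constant)
    return np.cumsum(q[:, :1], axis=0) + np.cumsum(p, axis=1)
```

For a planar normal field the recovered height differences reproduce the plane's slopes exactly; for noisy fields a least-squares integrator is preferable because it distributes path-integration error.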
  • FIG. 2 is a diagram schematically showing the relationship between the shape estimation device 100, the subject (background), and the transparent refraction layer (refraction surface to be estimated).
  • FIG. 2 shows only an image sensor and a polarizing filter that constitute the polarizing camera 10 .
  • the strip-shaped image sensor shown in FIG. 2 is, for example, a CMOS image sensor with millions of pixels.
  • Pixel i on the image plane receives light through the polarizing filter from the background through the refractive layer.
  • a polarizing filter with four different polarization angles is placed in front of the image sensor. It is common to have four polarization angles.
  • The shape estimation device 100 estimates, for each pixel i, the azimuth angle φ and the elevation angle θ of the normal vector n i, which is a vector orthogonal to the tangent line at pixel i, with the direction of the incident light taken as the optical axis Zc, and thereby estimates the surface shape of the refraction layer.
  • Geometrically, the azimuth angle φ of the normal vector n i and the polarization angle have the same meaning, and both are hereinafter described by the same symbol.
  • the first estimating unit 30 obtains sets of luminance values respectively corresponding to at least three different polarization angles from the second image of the subject in the case where the refraction layer is interposed, and estimates the polarization feature amount.
  • The polarization feature quantities are I max, I min, and ψ.
  • a polarization feature can be estimated from three or more different sets of polarization angles and luminance values.
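Since the luminance of a polarized image varies sinusoidally with the polarizer angle, the estimation of I max, I min, and ψ from three or more (angle, luminance) samples reduces to linear least squares, as sketched below. This is a standard approach from the shape-from-polarization literature; the function name and interface are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def polarization_features(angles, intensities):
    """Least-squares fit of I(phi) = a0 + a1*cos(2*phi) + a2*sin(2*phi)
    to >= 3 (polarizer angle, luminance) samples, yielding I_max, I_min
    and the polarization angle psi at which luminance is maximal."""
    phi = np.asarray(angles, dtype=float)
    A = np.stack([np.ones_like(phi), np.cos(2 * phi), np.sin(2 * phi)], axis=1)
    a0, a1, a2 = np.linalg.lstsq(A, np.asarray(intensities, float), rcond=None)[0]
    amp = np.hypot(a1, a2)                    # sinusoid amplitude
    psi = 0.5 * np.arctan2(a2, a1) % np.pi    # psi is 180-degree periodic
    return a0 + amp, a0 - amp, psi            # I_max, I_min, psi
```

With four polarizer angles, as in a typical polarization camera, the system is overdetermined and the fit also averages out sensor noise. The degree of polarization then follows as (I_max − I_min) / (I_max + I_min).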
  • the Stokes vector s representing the polarization state can be expressed by the following equation.
  • Ts is the Fresnel transmission coefficient (component horizontal to the plane of incidence)
  • Tt is the Fresnel transmission coefficient (component perpendicular to the plane of incidence).
  • The degree of polarization ρ (Degree of Polarization), which indicates the degree of change in polarization, can be expressed by the following formula.
  • Equation (7) is seemingly complicated, but it is a monotonically increasing function and the inversion can be formulated as a convex optimization problem; therefore, the elevation angle θ of the normal vector n i can be uniquely estimated from the observed value of the degree of polarization.
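Because the degree of polarization is monotone in the elevation angle, inverting the observation needs no more than bisection. The closed form below is the widely used transmitted-light degree-of-polarization expression from the shape-from-polarization literature, assumed (not verified) to correspond to the patent's Equation (7); n = 1.33 models a water surface, and the function names are illustrative.

```python
import numpy as np

def dop_transmission(theta, n=1.33):
    """Degree of polarization of light transmitted through a refractive
    boundary as a function of the zenith/elevation angle theta, for
    relative refractive index n (standard SfP transmission formula)."""
    s = np.sin(theta)
    num = (n - 1.0 / n) ** 2 * s ** 2
    den = (2 + 2 * n ** 2 - (n + 1.0 / n) ** 2 * s ** 2
           + 4 * np.cos(theta) * np.sqrt(n ** 2 - s ** 2))
    return num / den

def theta_from_dop(rho, n=1.33, lo=0.0, hi=np.pi / 2 - 1e-6):
    """The monotonicity of dop_transmission on [0, pi/2) means the observed
    degree of polarization pins down theta uniquely; bisection suffices."""
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if dop_transmission(mid, n) < rho:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Monotonicity is what makes this per-pixel inversion cheap, supporting the low-computational-cost claim of the method.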
  • FIG. 3 is a diagram schematically showing the modeling of the change in polarization, where (a) shows the normal vector n i and the light reflected by the refraction layer, and (b) shows the relationship between the zenith angle and the degree of polarization.
  • s in represents light incident on the refractive surface (surface of the refractive layer), and s out represents light captured by the polarization camera 10 .
  • the horizontal axis represents the zenith angle and the vertical axis represents the degree of polarization.
  • When the elevation angle θ is known, the degree of polarization ρ is uniquely determined.
  • The second estimation unit 40 estimates two candidates for the normal vector n i of the refraction layer (the refraction surface, i.e., the surface of the refraction layer) to be estimated, using the polarization feature quantities.
  • The azimuth angle φ of the normal vector n i coincides with the polarization angle ψ (the polarization angle at which the luminance value is maximized).
  • When the polarizing filter is rotated through one full turn, there are two angles at which the luminance value is maximized, so an ambiguity of 180° remains.
  • FIG. 4 is a diagram schematically showing the relationship between the azimuth angle φ and the luminance value.
  • The horizontal axis of FIG. 4 indicates the azimuth angle φ, and the vertical axis indicates the image brightness I(φ).
  • The luminance value I(φ) has two maxima.
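Concretely, combining the elevation angle obtained from the degree of polarization with the two azimuth possibilities ψ and ψ + 180° yields the two normal-vector candidates. The function below is an illustrative sketch (name and conventions assumed, using the usual spherical parameterization):

```python
import numpy as np

def normal_candidates(theta, psi):
    """Build the two unit-normal candidates n_i+ and n_i- left by the
    180-degree azimuth ambiguity: elevation theta from the degree of
    polarization, azimuth phi equal to either psi or psi + pi."""
    def n(phi):
        return np.array([np.sin(theta) * np.cos(phi),
                         np.sin(theta) * np.sin(phi),
                         np.cos(theta)])
    return n(psi), n(psi + np.pi)
```

The two candidates share the same elevation (same z-component) and differ only by a sign flip of the in-image tilt, which is exactly the ambiguity the distortion vector later resolves.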
  • A feature point tracking method is applied between the first image, in which no transparent refraction layer is present, and the second image, in which the refraction layer is present, to acquire a distortion vector Δg, which is a feature quantity representing the geometric change in appearance due to refraction.
  • FIG. 5 is a diagram schematically showing modeling of geometric changes.
  • v f shown in FIG. 5 represents the ray on the polarization camera 10 side, and the model also uses the relative refractive index of the refraction layer.
  • v r is the direction vector of the refracted light, which can be expressed by the following equation.
  • A distortion vector Δg, which is a feature quantity representing the change in geometric appearance due to refraction, can be expressed by the following equation.
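The refracted-ray direction v r follows the vector form of Snell's law. The sketch below is a generic statement of that law under the assumption of unit input vectors and a relative refractive index η = n1/n2; it is not a reproduction of the patent's exact equations, whose symbols are not shown here.

```python
import numpy as np

def refract(v_in, n, eta):
    """Vector form of Snell's law: direction of the ray refracted at a
    surface with unit normal n (pointing toward the incident side), for
    relative refractive index eta = n1 / n2. v_in is the unit incident
    direction; returns None on total internal reflection."""
    cos_i = -np.dot(n, v_in)                 # cosine of incidence angle
    k = 1.0 - eta ** 2 * (1.0 - cos_i ** 2)  # 1 - sin^2 of refraction angle
    if k < 0:
        return None                          # total internal reflection
    return eta * v_in + (eta * cos_i - np.sqrt(k)) * n
```

The returned vector is again unit length, and its transverse component satisfies sin θt = η sin θi, which is what ties the observed distortion vector Δg to the surface normal.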
  • The refraction surface generation unit 50 selects one of the candidates n i + and n i − for the normal vector n i using the distortion vector Δg, and generates refraction surface three-dimensional shape information representing the surface shape of the refraction layer.
  • The refraction surface generation unit 50 generates the refraction surface three-dimensional shape information by solving the following equation for the optimum solution among the candidates n i + and n i − of the normal vector n i narrowed down by the polarization constraint.
  • The calculation that solves Equation (12) for the optimum solution is performed for all pixels i.
  • FIG. 6 is a diagram schematically showing an example of refraction surface three-dimensional shape information; three-dimensional shape information such as that shown in FIG. 6 can be generated.
  • FIG. 7 is a flowchart showing the processing procedure of the shape estimation method performed by the shape estimation device 100.
  • the polarization camera 10 captures a first image of the subject without a transparent refraction layer and a second image of the subject with a refraction layer (step S1).
  • The first estimation unit 30 acquires sets of luminance values respectively corresponding to at least three different polarization angles from the second image, and estimates the polarization feature quantities I max, I min, and ψ (step S2).
  • The second estimation unit 40 estimates two candidates (n i +, n i −) for the normal vector n i of the refraction layer to be estimated, using the polarization feature quantities (step S3).
  • The feature acquisition unit 20 applies a feature point tracking method between the first image and the second image to acquire a distortion vector Δg, which is a feature quantity representing the geometric change in appearance due to refraction (step S4).
  • The refraction surface generation unit 50 selects one of the candidates n i + and n i − for the normal vector n i using the distortion vector Δg, and generates refraction surface three-dimensional shape information (step S5). The processing of steps S2 to S5 is repeated until all pixels i have been processed (NO in step S6).
  • steps S2 to S5 can be easily parallelized because each pixel is processed independently. Parallelization enables faster 3D shape estimation.
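The same per-pixel independence also allows straightforward vectorization on whole-image arrays, as in the sketch below, which resolves the ambiguity for all H × W pixels at once using a simplified tilt-agreement criterion (an illustrative stand-in for the patent's per-pixel optimization; names and shapes are assumptions).

```python
import numpy as np

def select_normals_vectorized(n_plus, n_minus, delta_g):
    """Resolve the 180-degree ambiguity for every pixel at once.
    Shapes: n_plus, n_minus are (H, W, 3) candidate-normal fields;
    delta_g is the (H, W, 2) observed distortion-vector field."""
    # per-pixel dot product of the in-image tilt with the distortion vector
    score_p = np.einsum('hwk,hwk->hw', n_plus[..., :2], delta_g)
    score_m = np.einsum('hwk,hwk->hw', n_minus[..., :2], delta_g)
    keep_p = (score_p >= score_m)[..., None]   # (H, W, 1) selection mask
    return np.where(keep_p, n_plus, n_minus)
```

Because there is no coupling between pixels, the same loop body also maps cleanly onto multicore or GPU execution, which is what makes the faster 3D shape estimation mentioned above possible.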
  • As described above, the shape estimation device 100 according to the present embodiment includes: the polarization camera 10 that captures a first image of a subject when no transparent refraction layer is interposed and a second image of the subject when the refraction layer is interposed; the feature acquisition unit 20 that acquires a distortion vector Δg, which is a feature quantity representing the geometric change in appearance due to refraction, by applying a feature point tracking method between the first image and the second image; the first estimation unit 30 that acquires, from the second image, sets of luminance values respectively corresponding to at least three different polarization angles and estimates the polarization feature quantities I max, I min, and ψ; the second estimation unit 40 that estimates two candidates (n i +, n i −) for the normal vector n i of the refraction layer to be estimated using the polarization feature quantities; and the refraction surface generation unit 50 that selects one of the candidates n i + and n i − using the distortion vector Δg and generates refraction surface three-dimensional shape information representing the surface shape of the refraction layer.
  • The shape estimation method according to the present embodiment is performed by the shape estimation device 100, wherein: the polarization camera 10 captures the first image of the subject when no transparent refraction layer is interposed and the second image of the subject when the refraction layer is interposed; the feature acquisition unit 20 applies a feature point tracking method between the first image and the second image to acquire the distortion vector Δg, which is a feature quantity representing the geometric change in appearance due to refraction; the first estimation unit 30 acquires, from the second image, sets of luminance values respectively corresponding to at least three different polarization angles and estimates the polarization feature quantities I max, I min, and ψ; the second estimation unit 40 estimates two candidates (n i +, n i −) for the normal vector n i of the refraction layer to be estimated using the polarization feature quantities; and the refraction surface generation unit 50 selects one of the candidates n i + and n i − using the distortion vector Δg to generate refraction surface three-dimensional shape information representing the surface shape of the refraction layer.
  • The shape estimation device 100 can be realized by the general-purpose computer system shown in FIG. 8.
  • In a general-purpose computer system including a CPU 90, a memory 91, a storage 92, a communication unit 93, an input unit 94, and an output unit 95, the CPU 90 executes a predetermined program loaded into the memory 91, whereby each function of the shape estimation device 100 is realized.
  • The predetermined program can be recorded on a computer-readable recording medium such as an HDD, SSD, USB memory, CD-ROM, DVD-ROM, or MO, or can be distributed via a network.
  • As described above, the shape estimation device 100 and the shape estimation method according to the present embodiment enable estimation of the three-dimensional shape of a transparent refracting surface with a single camera, low computational cost, and no prior knowledge of the shape (no prior learning data).
  • the shape of the refraction surface can be obtained from the information obtained with only a single camera.
  • 10: Polarization camera, 20: Feature acquisition unit, 30: First estimation unit, 40: Second estimation unit, 50: Refraction surface generation unit, 100: Shape estimation device

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

This shape estimation device comprises: a polarization camera 10 for capturing a first image of a subject with no transparent refraction layer interposed between the camera 10 and the subject, and a second image of the subject with a refraction layer interposed between the camera 10 and the subject; a feature acquisition unit 20 for applying a feature point tracking method to the first image and the second image and thereby acquiring a distortion vector Δg, which is a feature quantity representing the change in geometric appearance resulting from refraction; a first estimation unit 30 for acquiring, from the second image, pairs of luminance values respectively corresponding to at least three different polarization angles and estimating the polarization feature quantities Imax, Imin, Ψ; a second estimation unit 40 for estimating two candidates (ni +, ni -) for the normal vectors ni of the refraction layer to be estimated using the polarization feature quantities; and a refraction surface generation unit 50 for using the distortion vector Δg to select one normal vector candidate from among the candidates ni +, ni - and generate refraction surface three-dimensional shape information representing the surface shape of the refraction layer.
PCT/JP2021/022095 2021-06-10 2021-06-10 Shape estimation device, shape estimation method, and program WO2022259457A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2023526751A JPWO2022259457A1 (fr) 2021-06-10 2021-06-10
PCT/JP2021/022095 WO2022259457A1 (fr) 2021-06-10 2021-06-10 Shape estimation device, shape estimation method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/022095 WO2022259457A1 (fr) 2021-06-10 2021-06-10 Shape estimation device, shape estimation method, and program

Publications (1)

Publication Number Publication Date
WO2022259457A1 true WO2022259457A1 (fr) 2022-12-15

Family

ID=84426025

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/022095 WO2022259457A1 (fr) 2021-06-10 2021-06-10 Shape estimation device, shape estimation method, and program

Country Status (2)

Country Link
JP (1) JPWO2022259457A1 (fr)
WO (1) WO2022259457A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008099589A1 * 2007-02-13 2008-08-21 Panasonic Corporation Image processing system, method, and device, and image format
JP2010279044A * 2008-12-25 2010-12-09 Panasonic Corp Image processing device and pseudo-stereoscopic image generation device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008099589A1 * 2007-02-13 2008-08-21 Panasonic Corporation Image processing system, method, and device, and image format
JP2010279044A * 2008-12-25 2010-12-09 Panasonic Corp Image processing device and pseudo-stereoscopic image generation device

Also Published As

Publication number Publication date
JPWO2022259457A1 (fr) 2022-12-15

Similar Documents

Publication Publication Date Title
EP3279803B1 Image display method and device
Kim et al. Robust radiometric calibration and vignetting correction
US10260866B2 (en) Methods and apparatus for enhancing depth maps with polarization cues
CN101673395B Image stitching method and device
US7948514B2 (en) Image processing apparatus, method and computer program for generating normal information, and viewpoint-converted image generating apparatus
WO2018153374A1 Camera calibration
Hughes et al. Equidistant fish-eye calibration and rectification by vanishing point extraction
Lee et al. Automatic upright adjustment of photographs with robust camera calibration
WO2018068719A1 Image stitching method and apparatus
JP2016133396A Normal information generation device, imaging device, normal information generation method, and normal information generation program
US20090214107A1 (en) Image processing apparatus, method, and program
JP6580761B1 Depth acquisition device using a polarization stereo camera, and method therefor
Tingdahl et al. A public system for image based 3d model generation
US8749652B2 (en) Imaging module having plural optical units in which each of at least two optical units include a polarization filter and at least one optical unit includes no polarization filter and image processing method and apparatus thereof
CN111080669A Image reflection separation method and device
Taamazyan et al. Shape from mixed polarization
Ying et al. Self-calibration of catadioptric camera with two planar mirrors from silhouettes
WO2022259457A1 Shape estimation device, shape estimation method, and program
CN109325912A Reflection separation method based on a polarized light field, and calibration and stitching system
Lyu et al. Physics-guided reflection separation from a pair of unpolarized and polarized images
Illgner et al. Lightfield imaging for industrial applications
JP6550102B2 Light source direction estimation device
JP5086120B2 Depth information acquisition method, depth information acquisition device, program, and recording medium
Georgopoulos Photogrammetric automation: is it worth?
US11651475B2 (en) Image restoration method and device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21945133

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023526751

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE