WO2018018363A1 - Structured light field three-dimensional imaging method and system - Google Patents

Structured light field three-dimensional imaging method and system

Info

Publication number
WO2018018363A1
WO2018018363A1 PCT/CN2016/091547 CN2016091547W
Authority
WO
WIPO (PCT)
Prior art keywords
light field
structured light
dimensional imaging
phase
depth
Prior art date
Application number
PCT/CN2016/091547
Other languages
English (en)
French (fr)
Inventor
刘晓利
蔡泽伟
彭翔
Original Assignee
深圳大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳大学 filed Critical 深圳大学
Priority to CN201680000610.9A priority Critical patent/CN106257995B/zh
Priority to PCT/CN2016/091547 priority patent/WO2018018363A1/zh
Publication of WO2018018363A1 publication Critical patent/WO2018018363A1/zh

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/254 Projection of a pattern, viewing through a pattern, e.g. moiré

Definitions

  • The present invention relates to the field of optical imaging, and in particular to a structured light field three-dimensional imaging method and system.
  • Optical three-dimensional imaging based on fringe projection is a non-contact, full-field measurement method that is widely used for its high imaging density, high imaging speed and high imaging accuracy.
  • A traditional fringe projection three-dimensional imaging method can acquire three-dimensional data in only one specific direction per acquisition.
  • Single-viewpoint three-dimensional imaging can no longer meet the three-dimensional imaging requirements of special objects and special performance, so there is an urgent need to develop an optical imaging method capable of one-shot acquisition, multi-directional three-dimensional imaging.
  • The object of the present invention is to provide a structured light field three-dimensional imaging method and system that overcome the single-viewpoint limitation of existing fringe projection three-dimensional imaging methods, so as to realize one-shot acquisition, multi-directional structured light field three-dimensional imaging.
  • The present invention provides a structured light field three-dimensional imaging method, the method comprising:
  • constructing multi-directional three-dimensional imaging using the direction recorded by each ray in the structured light field and the calculated depth values.
  • The structured light field is the light field of rays recorded under structured illumination; it records structured light information modulated by the scene depth, the structured light information including modulated phase information and ray direction information.
  • The reference plane of the structured light field is a datum plane in the measurement space of the structured light field three-dimensional imaging system, and the measured object is located between the structured light field three-dimensional imaging system and the datum plane.
  • the structured light field is expressed as:
  • L(u,s) = R_a(u,s) + R_b(u,s)·cos φ, where R_a(u,s) and R_b(u,s) respectively represent the background intensity and fringe modulation intensity of the structured light field, u represents the direction of ray l_us, s represents the position of ray l_us, L represents the radiance of the ray, and φ is the phase modulated by the scene depth.
  • Calculating the depth value of each ray using the phase-depth mapping relationship of the structured light field is expressed in terms of:
  • d_us, the depth value corresponding to ray l_us;
  • m_us and n_us, the phase-depth mapping coefficients of the structured light field;
  • Δφ, the phase difference between the modulated phase of the scene and the phase of the reference plane.
  • the step of constructing multi-directional three-dimensional imaging by using the direction recorded by each ray in the structured light field and the calculated depth value comprises:
  • Before the step of acquiring the structured light field recorded by the imaging device of the structured light field three-dimensional imaging system, the method further comprises: generating a standard sinusoidal fringe projection pattern and projecting it onto the surface of the object under test.
  • Before the depth values are calculated, the method further comprises: calibrating the structured light field three-dimensional imaging system to determine the phase-depth mapping coefficients of each recorded ray.
  • The specific calibration steps include:
  • using a displacement platform to move a planar target to a position at depth d_i relative to the reference plane, recording the structured light field at that position with the imaging device of the structured light field three-dimensional imaging system, and calculating the phase difference Δφ_us|i of each ray relative to the reference plane;
  • fitting the phase-depth mapping coefficients (m_us, n_us) of ray l_us from the series of depth and phase-difference values (d_i, Δφ_us|i), i = 1, 2, ..., N, so that a ray-indexed phase-depth mapping coefficient lookup table LUT_us{(m_us, n_us)} is generated.
  • The present invention also provides a structured light field three-dimensional imaging system, the system comprising:
  • a structured light illumination module for generating a fringe projection pattern, which is projected onto the object surface by a projection device of the structured light field three-dimensional imaging system;
  • a light field imaging module that records, in light field data form, the information of the light reflected from the object surface under structured light illumination, the recorded information including ray direction information and depth-modulated phase information;
  • a phase-depth calibration module for calibrating the structured light field three-dimensional imaging system to determine the phase-depth mapping coefficients of each recorded ray;
  • a three-dimensional imaging module for calculating the depth value of each ray from its phase-depth mapping coefficients, and constructing multi-directional three-dimensional imaging using the direction recorded by each ray in the structured light field and the calculated depth values.
  • the three-dimensional imaging module specifically includes:
  • a modeling sub-module for selecting, from the depth information of all rays, the depth information of the rays with direction u_i, obtaining a scene depth estimate for the selected direction, and establishing a three-dimensional imaging model;
  • The technical solution provided by the invention combines light field imaging with structured illumination to obtain a structured light field, derives the mapping relationship between phase and scene depth in the structured light field, and proposes a ray-based method and system for structured light field phase-depth mapping calibration and three-dimensional imaging.
  • It can realize one-shot acquisition, multi-directional three-dimensional imaging, which facilitates further research into the theory and applications of structured light field three-dimensional imaging and meets the requirements of multi-view three-dimensional digital imaging and measurement.
  • FIG. 1 is a flowchart of a structured light field three-dimensional imaging method according to an embodiment of the present invention;
  • FIG. 2 is a schematic plan view of the principle of the structured light field three-dimensional imaging method according to an embodiment of the present invention;
  • FIG. 3 shows three-dimensional imaging models of a plaster statue in several specific directions from an experiment according to an embodiment of the present invention;
  • FIG. 4 is a schematic diagram of the internal structure of a structured light field three-dimensional imaging system 10 according to an embodiment of the present invention.
  • The structured light field three-dimensional imaging method provided by the present invention is described in detail below.
  • FIG. 1 is a flowchart of a method for three-dimensional imaging of a structured light field according to an embodiment of the present invention.
  • In step S1, the structured light field recorded by the imaging device of the structured light field three-dimensional imaging system is acquired, and the phase of the structured light field is solved.
  • The structured light field is the light field of rays recorded under structured illumination; it records structured light information modulated by the scene depth, the structured light information including modulated phase information and ray direction information. The reference plane of the structured light field is a datum plane in the measurement space of the structured light field three-dimensional imaging system, and the measured object is located between the structured light field three-dimensional imaging system and the datum plane.
  • Before step S1 of acquiring the structured light field recorded by the imaging device of the structured light field three-dimensional imaging system, the structured light field three-dimensional imaging method further includes: generating a standard sinusoidal fringe projection pattern I(X) = A + B·cos(2πfX) by computer and projecting it onto the surface of the object under test, the deformed fringes modulated by the object surface being recorded by the imaging device.
  • The structured light field recorded by the imaging device of the structured light field three-dimensional imaging system is expressed as:
  • L(u,s) = R_a(u,s) + R_b(u,s)·cos φ, where L(u,s) represents the four-dimensional light field; u represents the direction of ray l_us; s represents the position of ray l_us; L represents the radiance of the ray; φ is the phase modulated by the scene depth, which can be solved by the phase-shift method; R_a(u,s) and R_b(u,s) respectively represent the background intensity and fringe modulation intensity of the structured light field, both of which depend on the ray direction.
  • In step S2, the solved phase of the structured light field is compared with the phase of the reference plane to obtain the phase difference between the two, and the depth value of each ray is calculated using the phase-depth mapping relationship of the structured light field.
  • The depth calculation uses:
  • d_us, the depth value corresponding to ray l_us;
  • m_us and n_us, the phase-depth mapping coefficients of the structured light field;
  • Δφ, the phase difference between the modulated phase of the scene and the phase of the reference plane.
  • In FIG. 2, the u-s parallel line pair represents the biplane-parameterized light field,
  • point P represents the projection center of the projection device, and the reference plane is located at Z = 0.
  • When there is no object, ray BC records the light reflected by projection ray PA at point A on the reference plane;
  • when there is an object, it records the light reflected by projection ray PD at point D on the object.
  • The solved phase of the structured light field is compared with the phase of the reference plane, and the phase difference between the two is expressed as Δφ = φ_obj - φ_ref = 2πf(X_E - X_A),
  • where φ_obj and φ_ref are the phases at the object point and on the reference plane, respectively, calculated in step S1 from the structured light fields L_obj and L_ref on the object and the reference plane,
  • f is the fringe spatial frequency,
  • and X_E and X_A are the X coordinates of points E and A.
  • The depth value d_us of each ray is calculated using the phase-depth mapping relationship of the structured light field, in which
  • (X_B, Z_B), (X_C, Z_C) and (X_P, Z_P) are the coordinates of points B, C and P, respectively, and m_us and n_us are the phase-depth mapping coefficients.
  • For each recorded ray, the coordinates of points B, C and P are fixed, so the phase-depth mapping relationship of each ray is determined by the mapping coefficients m_us and n_us.
  • Before step S2 of calculating the depth value of each ray using the phase-depth mapping relationship of the structured light field, the structured light field three-dimensional imaging method further comprises: calibrating the structured light field three-dimensional imaging system to determine the phase-depth mapping coefficients of each recorded ray;
  • the specific calibration steps include:
  • using a displacement platform to move a planar target to a position at depth d_i relative to the reference plane, recording the structured light field at that position with the imaging device of the structured light field three-dimensional imaging system, and calculating the phase difference Δφ_us|i of each ray relative to the reference plane;
  • fitting the phase-depth mapping coefficients (m_us, n_us) of ray l_us from the series (d_i, Δφ_us|i), i = 1, 2, ..., N, and generating a ray-indexed phase-depth mapping coefficient lookup table LUT_us{(m_us, n_us)}.
  • In step S3, multi-directional three-dimensional imaging is constructed using the direction recorded by each ray in the structured light field and the calculated depth values.
  • For calibration, a displacement platform and a calibration whiteboard are arranged within the imaging range of the structured light field three-dimensional imaging system, and the displacement platform moves the whiteboard in fixed steps;
  • at each position the imaging device captures the structured light field and solves its phase;
  • the position farthest from the imaging system is taken as the relative reference plane, and the remaining positions are compared with it to obtain the corresponding relative depths and phase differences;
  • a ray-indexed phase-depth mapping coefficient lookup table is generated;
  • the projection device then projects the fringe pattern onto the surface of the object;
  • the imaging device records the structured light field, solves its phase, and compares it with the phase of the reference plane to obtain the phase difference of the object relative to the reference plane.
  • FIG. 3 shows three-dimensional imaging models of a plaster statue in several specific directions.
  • The circle at the top left of each sub-figure represents the microlens, and the small square within the circle marks a particular ray direction.
  • The present invention provides a structured light field three-dimensional imaging method that combines light field imaging with structured illumination to obtain a structured light field, derives the mapping relationship between phase and scene depth in the structured light field, and proposes a ray-based method and system for structured light field phase-depth mapping calibration and three-dimensional imaging.
  • The method and system can realize one-shot acquisition, multi-directional three-dimensional imaging, which facilitates further research into the theory and applications of structured light field three-dimensional imaging and meets the requirements of multi-view three-dimensional digital imaging and measurement.
  • the structured light field three-dimensional imaging system 10 mainly includes a structured light illumination module M1, a light field imaging module M2, a phase-depth calibration module M3, and a three-dimensional imaging module M4.
  • the structured light illumination module M1 is configured to generate a fringe projection image projected onto the surface of the object by a projection device of the structured light field three-dimensional imaging system.
  • the specific generation method of the stripe projection map is as described in the related steps described above, and will not be repeatedly described herein.
  • the light field imaging module M2 records the information of the reflected light on the surface of the object under the structured light illumination using the light field data form, wherein the recorded information includes the direction information of the light and the phase information subjected to the depth modulation.
  • the method of information recording is as described in the related steps described above, and the description thereof will not be repeated here.
  • the phase-depth calibration module M3 is used to calibrate the structured light field three-dimensional imaging system to determine the phase-depth mapping coefficient of each recorded light.
  • the calibration method of the structured light field three-dimensional imaging system is as described in the related steps described above, and will not be repeatedly described herein.
  • a three-dimensional imaging module M4 configured to calculate the depth value of each ray from the phase-depth mapping coefficients of each recorded ray, and to construct multi-directional three-dimensional imaging using the direction recorded by each ray in the structured light field and the calculated depth values.
  • the method of calculating the depth value of each ray and the method of constructing the multi-directional three-dimensional imaging are as described in the related steps described above, and will not be repeatedly described herein.
  • the three-dimensional imaging module M4 specifically includes:
  • The modeling sub-module is configured to select, from the depth information of all rays, the depth information of the rays with direction u_i, obtain a scene depth estimate for the selected direction, and establish a three-dimensional imaging model.
  • The present invention further provides a structured light field three-dimensional imaging system 10 that combines light field imaging with structured illumination to obtain a structured light field, derives the mapping relationship between phase and scene depth in the structured light field, and proposes a ray-based method and system for structured light field phase-depth mapping calibration and three-dimensional imaging.
  • The method and system can realize one-shot acquisition, multi-directional three-dimensional imaging, which facilitates further research into the theory and applications of structured light field three-dimensional imaging and meets the requirements of multi-view three-dimensional digital imaging and measurement.
  • The units included in the above embodiments are divided merely according to functional logic, and the division is not limited thereto as long as the corresponding functions can be implemented; in addition, the specific names of the functional units are merely for ease of distinguishing them from one another and are not intended to limit the scope of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A structured light field three-dimensional imaging method, comprising: acquiring the structured light field recorded by the imaging device of a structured light field three-dimensional imaging system, and solving the phase of the structured light field (S1); comparing the solved phase of the structured light field with the phase of a reference plane to obtain the phase difference between the two, and calculating the depth value of each ray using the phase-depth mapping relationship of the structured light field (S2); and constructing multi-directional three-dimensional imaging using the direction recorded by each ray in the structured light field and the calculated depth values (S3). A structured light field three-dimensional imaging system is also provided. The structured light field is obtained by combining light field imaging with structured illumination, and the mapping relationship between phase and scene depth in the structured light field is derived. One-shot acquisition, multi-directional three-dimensional imaging can thus be realized, which facilitates further research into the theory and applications of structured light field three-dimensional imaging and meets the requirements of multi-view three-dimensional digital imaging and measurement.

Description

Structured light field three-dimensional imaging method and system

TECHNICAL FIELD
The present invention relates to the field of optical imaging, and in particular to a structured light field three-dimensional imaging method and system.
BACKGROUND
Optical three-dimensional imaging based on fringe projection is a non-contact, full-field measurement method that is widely used for its high imaging density, high imaging speed and high imaging accuracy.
A traditional fringe projection three-dimensional imaging method can acquire three-dimensional data in only one specific direction per acquisition. As the range of applications of fringe projection three-dimensional imaging expands, single-viewpoint three-dimensional imaging can no longer meet the three-dimensional imaging requirements of special objects and special performance. There is therefore an urgent need to develop an optical imaging method capable of one-shot acquisition, multi-directional three-dimensional imaging.
SUMMARY
In view of the above, the object of the present invention is to provide a structured light field three-dimensional imaging method and system that overcome the single-viewpoint limitation of existing fringe projection three-dimensional imaging methods, so as to realize one-shot acquisition, multi-directional structured light field three-dimensional imaging.
The present invention provides a structured light field three-dimensional imaging method, the method comprising:
acquiring the structured light field recorded by an imaging device of a structured light field three-dimensional imaging system, and solving the phase of the structured light field;
comparing the solved phase of the structured light field with the phase of a reference plane to obtain the phase difference between the two, and calculating the depth value of each ray using the phase-depth mapping relationship of the structured light field;
constructing multi-directional three-dimensional imaging using the direction recorded by each ray in the structured light field and the calculated depth values.
Preferably, the structured light field is the light field of rays recorded under structured illumination; it records structured light information modulated by the scene depth, the structured light information including modulated phase information and ray direction information. The reference plane of the structured light field is a datum plane in the measurement space of the structured light field three-dimensional imaging system, and the measured object is located between the structured light field three-dimensional imaging system and the datum plane.
Preferably, the structured light field is expressed as:
L(u,s) = R_a(u,s) + R_b(u,s)·cos φ, where R_a(u,s) and R_b(u,s) respectively represent the background intensity and fringe modulation intensity of the structured light field, u represents the direction of ray l_us, s represents the position of ray l_us, L represents the radiance of the ray, and φ is the phase modulated by the scene depth.
Preferably, calculating the depth value of each ray using the phase-depth mapping relationship of the structured light field is expressed as:
Figure PCTCN2016091547-appb-000001
where d_us is the depth value corresponding to ray l_us, m_us and n_us are the phase-depth mapping coefficients of the structured light field, and Δφ is the phase difference between the modulated phase of the scene and the phase of the reference plane.
Preferably, the step of constructing multi-directional three-dimensional imaging using the direction recorded by each ray in the structured light field and the calculated depth values comprises:
selecting a specific ray direction u_i from the ray directions of the recorded structured light field;
selecting, from the depth information of all rays, the depth information of the rays with direction u_i, obtaining a scene depth estimate for the selected direction, and establishing a three-dimensional imaging model
Figure PCTCN2016091547-appb-000002
repeating the above two steps to obtain three-dimensional imaging models for different directions
Figure PCTCN2016091547-appb-000003
so as to complete the construction of multi-directional three-dimensional imaging, where N is the angular resolution of the light field.
Preferably, before the step of acquiring the structured light field recorded by the imaging device of the structured light field three-dimensional imaging system, the method further comprises:
generating a standard sinusoidal fringe projection pattern by computer and projecting it onto the surface of the object under test with the projection device of the structured light field three-dimensional imaging system, the deformed fringes modulated by the surface of the object under test being recorded by the imaging device of the structured light field three-dimensional imaging system;
wherein the computer-generated standard sinusoidal fringe projection pattern is expressed as I(X) = A + B·cos(2πfX), where I is the normalized intensity of the fringe pattern, A and B are the user-designed fringe background intensity and modulation intensity, respectively, and f is the fringe spatial frequency of the pattern.
Preferably, before the step of calculating the depth value of each ray using the phase-depth mapping relationship of the structured light field, the method further comprises: calibrating the structured light field three-dimensional imaging system to determine the phase-depth mapping coefficients of each recorded ray;
wherein the specific calibration steps include:
using a displacement platform to move a planar target to a position at depth d_i relative to the reference plane, recording the structured light field at that position with the imaging device of the structured light field three-dimensional imaging system, and calculating the phase difference Δφ_us|i of each ray relative to the reference plane;
fitting the phase-depth mapping coefficients (m_us, n_us) of ray l_us from a series of depth and phase-difference values (d_i, Δφ_us|i), i = 1, 2, ..., N, and finally generating a ray-indexed phase-depth mapping coefficient lookup table LUT_us{(m_us, n_us)}.
In another aspect, the present invention also provides a structured light field three-dimensional imaging system, the system comprising:
a structured light illumination module for generating a fringe projection pattern, which is projected onto the object surface by a projection device of the structured light field three-dimensional imaging system;
a light field imaging module that records, in light field data form, the information of the light reflected from the object surface under structured light illumination, wherein the recorded information includes ray direction information and depth-modulated phase information;
a phase-depth calibration module for calibrating the structured light field three-dimensional imaging system and determining the phase-depth mapping coefficients of each recorded ray;
a three-dimensional imaging module for calculating the depth value of each ray from the phase-depth mapping coefficients of each recorded ray, and constructing multi-directional three-dimensional imaging using the direction recorded by each ray in the structured light field and the calculated depth values.
Preferably, the three-dimensional imaging module specifically includes:
a selection sub-module for selecting a specific ray direction u_i from the ray directions of the recorded structured light field;
a modeling sub-module for selecting, from the depth information of all rays, the depth information of the rays with direction u_i, obtaining a scene depth estimate for the selected direction, and establishing a three-dimensional imaging model
Figure PCTCN2016091547-appb-000004
a repetition sub-module for repeating the above two steps to obtain three-dimensional imaging models for different directions
Figure PCTCN2016091547-appb-000005
so as to complete the construction of multi-directional three-dimensional imaging, where N is the angular resolution of the light field.
The technical solution provided by the present invention combines light field imaging with structured illumination to obtain a structured light field, derives the mapping relationship between phase and scene depth in the structured light field, and proposes a ray-based method and system for structured light field phase-depth mapping calibration and three-dimensional imaging. The method and system can realize one-shot acquisition, multi-directional three-dimensional imaging, which facilitates further research into the theory and applications of structured light field three-dimensional imaging and meets the requirements of multi-view three-dimensional digital imaging and measurement.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a flowchart of a structured light field three-dimensional imaging method according to an embodiment of the present invention;
FIG. 2 is a schematic plan view of the principle of the structured light field three-dimensional imaging method according to an embodiment of the present invention;
FIG. 3 shows three-dimensional imaging models of a plaster statue in several specific directions from an experiment according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of the internal structure of a structured light field three-dimensional imaging system 10 according to an embodiment of the present invention.
DETAILED DESCRIPTION
To make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here merely explain the present invention and are not intended to limit it.
The structured light field three-dimensional imaging method provided by the present invention is described in detail below.
Referring to FIG. 1, which is a flowchart of the structured light field three-dimensional imaging method in an embodiment of the present invention.
In step S1, the structured light field recorded by the imaging device of the structured light field three-dimensional imaging system is acquired, and the phase of the structured light field is solved.
In this embodiment, the structured light field is the light field of rays recorded under structured illumination; it records structured light information modulated by the scene depth, the structured light information including modulated phase information and ray direction information. The reference plane of the structured light field is a datum plane in the measurement space of the structured light field three-dimensional imaging system, and the measured object is located between the structured light field three-dimensional imaging system and the datum plane.
In this embodiment, before step S1 of acquiring the structured light field recorded by the imaging device of the structured light field three-dimensional imaging system, the structured light field three-dimensional imaging method further includes:
generating a standard sinusoidal fringe projection pattern by computer and projecting it onto the surface of the object under test with the projection device of the structured light field three-dimensional imaging system, the deformed fringes modulated by the surface of the object under test being recorded by the imaging device of the structured light field three-dimensional imaging system;
wherein the computer-generated standard sinusoidal fringe projection pattern is expressed as I(X) = A + B·cos(2πfX), where I is the normalized intensity of the fringe pattern, A and B are the user-designed fringe background intensity and modulation intensity, respectively, and f is the fringe spatial frequency of the pattern.
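Generating such a pattern numerically is straightforward. The sketch below is an illustration, not the patent's own implementation; the image size and frequency value are arbitrary choices, and an optional phase offset is included so the same routine can also produce the shifted patterns that a phase-shift solver needs:

```python
import numpy as np

def fringe_pattern(width, height, A=0.5, B=0.5, f=1.0 / 16, phase=0.0):
    """Normalized sinusoidal fringe image I(X) = A + B*cos(2*pi*f*X + phase).

    A, B: fringe background and modulation intensity (user-designed, as in
    the text); f: spatial frequency in cycles per pixel column; phase: an
    extra offset for generating phase-shifted variants of the pattern.
    """
    x = np.arange(width)
    row = A + B * np.cos(2 * np.pi * f * x + phase)
    # The fringes vary only along X, so every image row is identical.
    return np.tile(row, (height, 1))
```

A projector would then display this array (scaled to its bit depth); the deformed fringes seen by the light field camera are what carry the depth-modulated phase.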
In this embodiment, the structured light field recorded by the imaging device of the structured light field three-dimensional imaging system is expressed as:
L(u,s) = R_a(u,s) + R_b(u,s)·cos φ, where L(u,s) represents the four-dimensional light field; u represents the direction of ray l_us; s represents the position of ray l_us; L represents the radiance of the ray; φ is the phase modulated by the scene depth, which can be solved by the phase-shift method; R_a(u,s) and R_b(u,s) respectively represent the background intensity and fringe modulation intensity of the structured light field, both of which depend on the ray direction.
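The phase-shift solution mentioned here can be sketched as follows. This is a generic N-step phase-shift estimator (an illustration; the patent does not fix the number of steps), applied sample-by-sample to the recorded light field, where each frame n is modulated as I_n = R_a + R_b·cos(φ + 2πn/N):

```python
import numpy as np

def solve_phase(frames):
    """Recover the wrapped phase from N phase-shifted fringe recordings.

    frames: array of shape (N, ...) where frame n carries
    I_n = Ra + Rb * cos(phi + 2*pi*n/N).
    Returns the wrapped phase phi in (-pi, pi] for every ray sample.
    """
    frames = np.asarray(frames, dtype=float)
    n = frames.shape[0]
    shifts = 2 * np.pi * np.arange(n) / n
    # Standard least-squares N-step estimator:
    # sum_n I_n sin(d_n) = -Rb*sin(phi)*N/2, sum_n I_n cos(d_n) = Rb*cos(phi)*N/2
    num = -np.tensordot(np.sin(shifts), frames, axes=(0, 0))
    den = np.tensordot(np.cos(shifts), frames, axes=(0, 0))
    return np.arctan2(num, den)
```

The same solver is used both on the reference plane and on the object, so that their phases can later be differenced.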
In step S2, the solved phase of the structured light field is compared with the phase of the reference plane to obtain the phase difference between the two, and the depth value of each ray is calculated using the phase-depth mapping relationship of the structured light field.
In this embodiment, calculating the depth value of each ray using the phase-depth mapping relationship of the structured light field is expressed as:
Figure PCTCN2016091547-appb-000006
where d_us is the depth value corresponding to ray l_us, m_us and n_us are the phase-depth mapping coefficients of the structured light field, and Δφ is the phase difference between the modulated phase of the scene and the phase of the reference plane.
FIG. 2 shows a schematic plan view of the structured light field three-dimensional imaging system provided by the present invention, in which the u-s parallel line pair represents the biplane-parameterized light field, point P represents the projection center of the projection device, and the reference plane is located at Z = 0. For ray BC, when there is no object it records the light reflected by projection ray PA at point A on the reference plane; when there is an object it records the light reflected by projection ray PD at point D on the object.
In this embodiment, the solved phase of the structured light field is compared with the phase of the reference plane, and the phase difference between the two is expressed as:
Δφ = φ_obj - φ_ref = 2πf(X_E - X_A)
where φ_obj and φ_ref are the phases at the object point and on the reference plane, respectively, calculated in step S1 from the structured light fields L_obj and L_ref on the object and the reference plane, f is the fringe spatial frequency, and X_E and X_A are the X coordinates of points E and A.
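In an implementation, both phases come from an arctangent-based solver and are therefore wrapped, so their difference is usually rewrapped into (-π, π] before use. This is standard practice and an assumption of this sketch; full phase unwrapping for larger depth ranges is outside its scope:

```python
import numpy as np

def phase_difference(phi_obj, phi_ref):
    """Delta-phi = phi_obj - phi_ref, rewrapped into (-pi, pi].

    Wrapped phases are only meaningful modulo 2*pi, so the raw difference
    is mapped back onto the principal interval via sin/cos.
    """
    d = np.asarray(phi_obj) - np.asarray(phi_ref)
    return np.arctan2(np.sin(d), np.cos(d))
```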
Further, in this embodiment, the depth value d_us of each ray calculated using the phase-depth mapping relationship of the structured light field is expressed as:
Figure PCTCN2016091547-appb-000007
where (X_B, Z_B), (X_C, Z_C) and (X_P, Z_P) are the coordinates of points B, C and P, respectively, and m_us and n_us are the phase-depth mapping coefficients. For each recorded ray, the coordinates of points B, C and P are fixed, so the phase-depth mapping relationship of each ray is determined by the mapping coefficients m_us and n_us. Once the phase-depth mapping coefficients (m_us, n_us) of the structured light field three-dimensional imaging system have been calibrated, multi-directional depth estimation can be performed, realizing multi-directional structured light field three-dimensional imaging.
In this embodiment, before step S2 of calculating the depth value of each ray using the phase-depth mapping relationship of the structured light field, the structured light field three-dimensional imaging method further includes: calibrating the structured light field three-dimensional imaging system to determine the phase-depth mapping coefficients of each recorded ray;
wherein the specific calibration steps include:
using a displacement platform to move a planar target to a position at depth d_i relative to the reference plane, recording the structured light field at that position with the imaging device of the structured light field three-dimensional imaging system, and calculating the phase difference Δφ_us|i of each ray relative to the reference plane;
fitting the phase-depth mapping coefficients (m_us, n_us) of ray l_us from a series of depth and phase-difference values (d_i, Δφ_us|i), i = 1, 2, ..., N, and finally generating a ray-indexed phase-depth mapping coefficient lookup table LUT_us{(m_us, n_us)}.
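The per-ray fitting step can be sketched as below. The patent's exact mapping equation appears only as an embedded image in this text, so the sketch assumes, purely for illustration, the rational form d = Δφ/(m + n·Δφ) that is common in fringe-projection phase-depth calibration; it linearizes to m·d + n·(d·Δφ) = Δφ and is solved by linear least squares over the calibration series (d_i, Δφ_us|i):

```python
import numpy as np

def fit_phase_depth(depths, dphis):
    """Fit one ray's phase-depth coefficients (m_us, n_us).

    Assumed illustrative mapping: d = dphi / (m + n * dphi), linearized as
    m*d + n*(d*dphi) = dphi and solved by least squares.
    depths: calibration depths d_i; dphis: phase differences dphi_us|i.
    """
    d = np.asarray(depths, dtype=float)
    dp = np.asarray(dphis, dtype=float)
    A = np.column_stack([d, d * dp])
    (m, n), *_ = np.linalg.lstsq(A, dp, rcond=None)
    return m, n

def build_lut(depth_series, dphi_series):
    """LUT_us{(m_us, n_us)}: one coefficient pair per recorded ray index.

    depth_series: (N,) calibration depths shared by all rays.
    dphi_series:  (N, R) phase differences, one column per ray l_us.
    """
    return [fit_phase_depth(depth_series, dphi_series[:, r])
            for r in range(dphi_series.shape[1])]
```

With more calibration positions than the two unknowns, the least-squares fit also averages out measurement noise in the phase differences.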
In step S3, multi-directional three-dimensional imaging is constructed using the direction recorded by each ray in the structured light field and the calculated depth values.
In this embodiment, step S3 of constructing multi-directional three-dimensional imaging using the direction recorded by each ray in the structured light field and the calculated depth values includes:
selecting a specific ray direction u_i from the ray directions of the recorded structured light field;
selecting, from the depth information of all rays, the depth information of the rays with direction u_i, obtaining a scene depth estimate for the selected direction, and establishing a three-dimensional imaging model
Figure PCTCN2016091547-appb-000008
repeating the above two steps to obtain three-dimensional imaging models for different directions
Figure PCTCN2016091547-appb-000009
so as to complete the construction of multi-directional three-dimensional imaging, where N is the angular resolution of the light field.
The execution of the above structured light field three-dimensional imaging method is illustrated by the following example:
First, the projection fringe pattern is computed and generated as the structured illumination for system calibration and scene reconstruction.
Then, a displacement platform and a calibration whiteboard are set up within the imaging range of the structured light field three-dimensional imaging system; the displacement platform moves the whiteboard in fixed steps, and at each position the projection device projects the fringe pattern while the imaging device captures the structured light field and solves its phase.
Then, given the resulting series of positions and phases, the position farthest from the imaging system is taken as the relative reference plane, and the remaining positions are compared with it to obtain the corresponding relative depths and phase differences.
Then, for each ray, its phase-depth mapping coefficients are fitted from its series of depths and phase differences, generating a ray-indexed phase-depth mapping coefficient lookup table.
Then, the projection device projects the fringe pattern onto the object surface; the imaging device records the structured light field and solves its phase, which is compared with the phase of the relative reference plane to obtain the phase difference of the object relative to the reference plane.
Then, for each recorded ray, the corresponding mapping coefficients are found in the phase-depth mapping coefficient lookup table and the depth value of that ray is computed.
Finally, all depth values with the same direction are combined into a depth map for that direction, and the three-dimensional model of the object in that direction is reconstructed.
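The last two steps convert each ray's phase difference to a depth using its fitted coefficients and then regroup the depths by angular direction into per-direction depth maps. A minimal sketch, again assuming the illustrative mapping d = Δφ/(m + n·Δφ) (the patent's own equation is an embedded image) and a light field stored as arrays whose leading axis indexes the direction u:

```python
import numpy as np

def depth_from_lut(dphi, m, n):
    """Per-ray depth from the phase difference and LUT coefficients,
    using the illustrative mapping d = dphi / (m + n * dphi)."""
    return dphi / (m + n * dphi)

def direction_depth_maps(depth_us):
    """Regroup a ray-indexed depth array into per-direction depth maps.

    depth_us: array (U, H, W) with axis 0 indexing the angular direction u_i.
    Returns {u_i: 2-D depth map}, i.e. one scene depth estimate per viewing
    direction, from which per-direction 3-D models can be built.
    """
    return {u: depth_us[u] for u in range(depth_us.shape[0])}
```

Triangulating each depth map into a mesh (the final per-direction 3-D model) is a separate, standard step not shown here.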
FIG. 3 shows three-dimensional imaging models of a plaster statue in several specific directions. The circle at the top left of each sub-figure represents the microlens, and the small square within the circle marks a particular ray direction.
The structured light field three-dimensional imaging method provided by the present invention combines light field imaging with structured illumination to obtain a structured light field, derives the mapping relationship between phase and scene depth in the structured light field, and proposes a ray-based method and system for structured light field phase-depth mapping calibration and three-dimensional imaging. The method and system can realize one-shot acquisition, multi-directional three-dimensional imaging, which facilitates further research into the theory and applications of structured light field three-dimensional imaging and meets the requirements of multi-view three-dimensional digital imaging and measurement.
Referring to FIG. 4, which is a schematic diagram of the structure of the structured light field three-dimensional imaging system 10 in an embodiment of the present invention. In this embodiment, the structured light field three-dimensional imaging system 10 mainly includes a structured light illumination module M1, a light field imaging module M2, a phase-depth calibration module M3 and a three-dimensional imaging module M4.
The structured light illumination module M1 is used to generate the fringe projection pattern, which is projected onto the object surface by the projection device of the structured light field three-dimensional imaging system.
In this embodiment, the specific method of generating the fringe projection pattern is as described in the related steps above and is not repeated here.
The light field imaging module M2 records, in light field data form, the information of the light reflected from the object surface under structured light illumination, wherein the recorded information includes ray direction information and depth-modulated phase information.
In this embodiment, the method of recording the information is as described in the related steps above and is not repeated here.
The phase-depth calibration module M3 is used to calibrate the structured light field three-dimensional imaging system and determine the phase-depth mapping coefficients of each recorded ray.
In this embodiment, the calibration method of the structured light field three-dimensional imaging system is as described in the related steps above and is not repeated here.
The three-dimensional imaging module M4 is used to calculate the depth value of each ray from the phase-depth mapping coefficients of each recorded ray, and to construct multi-directional three-dimensional imaging using the direction recorded by each ray in the structured light field and the calculated depth values.
In this embodiment, the method of calculating the depth value of each ray and the method of constructing multi-directional three-dimensional imaging are as described in the related steps above and are not repeated here.
In this embodiment, the three-dimensional imaging module M4 specifically includes:
a selection sub-module for selecting a specific ray direction u_i from the ray directions of the recorded structured light field;
a modeling sub-module for selecting, from the depth information of all rays, the depth information of the rays with direction u_i, obtaining a scene depth estimate for the selected direction, and establishing a three-dimensional imaging model
Figure PCTCN2016091547-appb-000010
a repetition sub-module for repeating the above two steps to obtain three-dimensional imaging models for different directions
Figure PCTCN2016091547-appb-000011
so as to complete the construction of multi-directional three-dimensional imaging, where N is the angular resolution of the light field.
The structured light field three-dimensional imaging system 10 provided by the present invention combines light field imaging with structured illumination to obtain a structured light field, derives the mapping relationship between phase and scene depth in the structured light field, and proposes a ray-based method and system for structured light field phase-depth mapping calibration and three-dimensional imaging. The method and system can realize one-shot acquisition, multi-directional three-dimensional imaging, which facilitates further research into the theory and applications of structured light field three-dimensional imaging and meets the requirements of multi-view three-dimensional digital imaging and measurement.
It should be noted that in the above embodiments the included units are divided merely according to functional logic, and the division is not limited thereto as long as the corresponding functions can be implemented; in addition, the specific names of the functional units are merely for ease of distinguishing them from one another and are not intended to limit the scope of protection of the present invention.
In addition, those of ordinary skill in the art will understand that all or part of the steps in the methods of the above embodiments may be implemented by instructing the relevant hardware through a program, and the corresponding program may be stored in a computer-readable storage medium, such as a ROM/RAM, magnetic disk or optical disc.
The above are merely preferred embodiments of the present invention and are not intended to limit the present invention; any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.

Claims (9)

  1. A structured light field three-dimensional imaging method, characterized in that the method comprises:
    acquiring the structured light field recorded by an imaging device of a structured light field three-dimensional imaging system, and solving the phase of the structured light field;
    comparing the solved phase of the structured light field with the phase of a reference plane to obtain the phase difference between the two, and calculating the depth value of each ray using the phase-depth mapping relationship of the structured light field;
    constructing multi-directional three-dimensional imaging using the direction recorded by each ray in the structured light field and the calculated depth values.
  2. The structured light field three-dimensional imaging method according to claim 1, characterized in that the structured light field is the light field of rays recorded under structured illumination; it records structured light information modulated by the scene depth, the structured light information including modulated phase information and ray direction information; the reference plane of the structured light field is a datum plane in the measurement space of the structured light field three-dimensional imaging system, and the measured object is located between the structured light field three-dimensional imaging system and the datum plane.
  3. The structured light field three-dimensional imaging method according to claim 2, characterized in that the structured light field is expressed as:
    L(u,s) = R_a(u,s) + R_b(u,s)·cos φ, where R_a(u,s) and R_b(u,s) respectively represent the background intensity and fringe modulation intensity of the structured light field, u represents the direction of ray l_us, s represents the position of ray l_us, L represents the radiance of the ray, and φ is the phase modulated by the scene depth.
  4. The structured light field three-dimensional imaging method according to claim 3, characterized in that calculating the depth value of each ray using the phase-depth mapping relationship of the structured light field is expressed as:
    Figure PCTCN2016091547-appb-100001
    where d_us is the depth value corresponding to ray l_us, m_us and n_us are the phase-depth mapping coefficients of the structured light field, and Δφ is the phase difference between the modulated phase of the scene and the phase of the reference plane.
  5. The structured light field three-dimensional imaging method according to claim 4, characterized in that the step of constructing multi-directional three-dimensional imaging using the direction recorded by each ray in the structured light field and the calculated depth values comprises:
    selecting a specific ray direction u_i from the ray directions of the recorded structured light field;
    selecting, from the depth information of all rays, the depth information of the rays with direction u_i, obtaining a scene depth estimate for the selected direction, and establishing a three-dimensional imaging model
    Figure PCTCN2016091547-appb-100002
    repeating the above two steps to obtain three-dimensional imaging models for different directions
    Figure PCTCN2016091547-appb-100003
    so as to complete the construction of multi-directional three-dimensional imaging, where N is the angular resolution of the light field.
  6. The structured light field three-dimensional imaging method according to claim 2, characterized in that, before the step of acquiring the structured light field recorded by the imaging device of the structured light field three-dimensional imaging system, the method further comprises:
    generating a standard sinusoidal fringe projection pattern by computer and projecting it onto the surface of the object under test with the projection device of the structured light field three-dimensional imaging system, the deformed fringes modulated by the surface of the object under test being recorded by the imaging device of the structured light field three-dimensional imaging system;
    wherein the computer-generated standard sinusoidal fringe projection pattern is expressed as I(X) = A + B·cos(2πfX), where I is the normalized intensity of the fringe pattern, A and B are the user-designed fringe background intensity and modulation intensity, respectively, and f is the fringe spatial frequency of the pattern.
  7. The structured light field three-dimensional imaging method according to claim 4, characterized in that, before the step of calculating the depth value of each ray using the phase-depth mapping relationship of the structured light field, the method further comprises: calibrating the structured light field three-dimensional imaging system to determine the phase-depth mapping coefficients of each recorded ray;
    wherein the specific calibration steps include:
    using a displacement platform to move a planar target to a position at depth d_i relative to the reference plane, recording the structured light field at that position with the imaging device of the structured light field three-dimensional imaging system, and calculating the phase difference Δφ_us|i of each ray relative to the reference plane;
    fitting the phase-depth mapping coefficients (m_us, n_us) of ray l_us from a series of depth and phase-difference values (d_i, Δφ_us|i), i = 1, 2, ..., N, and finally generating a ray-indexed phase-depth mapping coefficient lookup table LUT_us{(m_us, n_us)}.
  8. A structured light field three-dimensional imaging system, characterized in that the system comprises:
    a structured light illumination module for generating a fringe projection pattern, which is projected onto the object surface by a projection device of the structured light field three-dimensional imaging system;
    a light field imaging module that records, in light field data form, the information of the light reflected from the object surface under structured light illumination, wherein the recorded information includes ray direction information and depth-modulated phase information;
    a phase-depth calibration module for calibrating the structured light field three-dimensional imaging system and determining the phase-depth mapping coefficients of each recorded ray;
    a three-dimensional imaging module for calculating the depth value of each ray from the phase-depth mapping coefficients of each recorded ray, and constructing multi-directional three-dimensional imaging using the direction recorded by each ray in the structured light field and the calculated depth values.
  9. The structured light field three-dimensional imaging system according to claim 8, characterized in that the three-dimensional imaging module specifically includes:
    a selection sub-module for selecting a specific ray direction u_i from the ray directions of the recorded structured light field;
    a modeling sub-module for selecting, from the depth information of all rays, the depth information of the rays with direction u_i, obtaining a scene depth estimate for the selected direction, and establishing a three-dimensional imaging model
    Figure PCTCN2016091547-appb-100004
    a repetition sub-module for repeating the above two steps to obtain three-dimensional imaging models for different directions
    Figure PCTCN2016091547-appb-100005
    so as to complete the construction of multi-directional three-dimensional imaging, where N is the angular resolution of the light field.
PCT/CN2016/091547 2016-07-25 2016-07-25 一种结构光场三维成像方法及其系统 WO2018018363A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201680000610.9A CN106257995B (zh) 2016-07-25 2016-07-25 一种结构光场三维成像方法及其系统
PCT/CN2016/091547 WO2018018363A1 (zh) 2016-07-25 2016-07-25 一种结构光场三维成像方法及其系统

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/091547 WO2018018363A1 (zh) 2016-07-25 2016-07-25 一种结构光场三维成像方法及其系统

Publications (1)

Publication Number Publication Date
WO2018018363A1 true WO2018018363A1 (zh) 2018-02-01

Family

ID=57713818

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/091547 WO2018018363A1 (zh) 2016-07-25 2016-07-25 一种结构光场三维成像方法及其系统

Country Status (2)

Country Link
CN (1) CN106257995B (zh)
WO (1) WO2018018363A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108873363A (zh) * 2018-09-06 2018-11-23 深圳技术大学(筹) 基于结构光标记的三维光场成像系统及方法
CN113566709A (zh) * 2021-08-26 2021-10-29 苏州小优智能科技有限公司 一种结构光测量系统的标定方法、装置及电子设备

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107063122B (zh) * 2017-04-28 2019-11-29 西安工业大学 光学非球面面形的检测方法及其装置
CN106991715A (zh) * 2017-05-11 2017-07-28 中国科学院自动化研究所 基于光场采集的光栅棱柱三维显示渲染方法
CN107024850B (zh) * 2017-05-26 2019-11-01 清华大学 高速结构光三维成像系统
CN107562185B (zh) * 2017-07-14 2020-04-07 西安电子科技大学 一种基于头戴vr设备的光场显示系统及实现方法
KR102411661B1 (ko) 2017-07-31 2022-06-21 삼성전자주식회사 영상 처리 방법 및 디바이스
TWI647661B (zh) * 2017-08-10 2019-01-11 緯創資通股份有限公司 影像深度感測方法與影像深度感測裝置
CN107529096A (zh) * 2017-09-11 2017-12-29 广东欧珀移动通信有限公司 图像处理方法及装置
CN107886053A (zh) * 2017-10-27 2018-04-06 广东欧珀移动通信有限公司 眼镜佩戴状态检测方法、装置及电子装置
CN108120392B (zh) * 2017-11-30 2020-03-31 东南大学 气液两相流中气泡三维测量系统及方法
CN108924407B (zh) * 2018-06-15 2020-12-18 深圳奥比中光科技有限公司 一种深度成像方法及系统
CN109282757B (zh) * 2018-07-27 2020-09-25 深圳大学 一种光场度量标定方法及标定系统
CN108989682B (zh) * 2018-08-06 2020-06-05 深圳大学 一种主动光场深度成像方法及系统
CN108965853B (zh) * 2018-08-15 2021-02-19 张家港康得新光电材料有限公司 一种集成成像三维显示方法、装置、设备及存储介质
CN109945802B (zh) * 2018-10-11 2021-03-09 苏州深浅优视智能科技有限公司 一种结构光三维测量方法
CN110310365B (zh) * 2019-06-27 2021-01-05 四川大学 一种三维重建方法及装置
CN111583323B (zh) * 2020-04-30 2023-04-25 深圳大学 一种单帧结构光场三维成像方法和系统
CN112945141B (zh) * 2021-01-29 2023-03-14 中北大学 基于微透镜阵列的结构光快速成像方法及系统
CN113237437B (zh) * 2021-06-02 2023-11-10 苏州大学 一种基于位相编码元件的结构光三维形貌测量方法及装置
CN113724371B (zh) * 2021-08-13 2023-06-13 深圳技术大学 同轴照明光场的三维成像方法、系统、电子装置及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004325096A (ja) * 2003-04-22 2004-11-18 Fujitsu Ltd 格子パターン投影法における画像処理方法、画像処理装置及び計測装置
CN104331924A (zh) * 2014-11-26 2015-02-04 西安冉科信息技术有限公司 基于单摄像机sfs算法的三维重建方法
CN104567730A (zh) * 2015-01-15 2015-04-29 四川大学 一种时空二元编码产生正弦结构光场的方法
CN105627952A (zh) * 2015-12-25 2016-06-01 暨南大学 一种使用单像素探测器的物体三维形貌测量方法和装置

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102261896A (zh) * 2011-04-19 2011-11-30 长春东瑞科技发展有限公司 一种基于相位测量的物体三维形貌测量方法及系统
CN105157616B (zh) * 2015-07-31 2017-08-22 西安工业大学 一种阴影莫尔轮廓测量装置、其标定方法和测量方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004325096A (ja) * 2003-04-22 2004-11-18 Fujitsu Ltd 格子パターン投影法における画像処理方法、画像処理装置及び計測装置
CN104331924A (zh) * 2014-11-26 2015-02-04 西安冉科信息技术有限公司 基于单摄像机sfs算法的三维重建方法
CN104567730A (zh) * 2015-01-15 2015-04-29 四川大学 一种时空二元编码产生正弦结构光场的方法
CN105627952A (zh) * 2015-12-25 2016-06-01 暨南大学 一种使用单像素探测器的物体三维形貌测量方法和装置

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108873363A (zh) * 2018-09-06 2018-11-23 深圳技术大学(筹) 基于结构光标记的三维光场成像系统及方法
CN108873363B (zh) * 2018-09-06 2023-11-17 深圳技术大学(筹) 基于结构光标记的三维光场成像系统及方法
CN113566709A (zh) * 2021-08-26 2021-10-29 苏州小优智能科技有限公司 一种结构光测量系统的标定方法、装置及电子设备

Also Published As

Publication number Publication date
CN106257995A (zh) 2016-12-28
CN106257995B (zh) 2019-06-07

Similar Documents

Publication Publication Date Title
WO2018018363A1 (zh) 一种结构光场三维成像方法及其系统
CN110288642B (zh) 基于相机阵列的三维物体快速重建方法
CN111242990B (zh) 基于连续相位稠密匹配的360°三维重建优化方法
CN106548489B (zh) 一种深度图像与彩色图像的配准方法、三维图像采集装置
CN108759669B (zh) 一种室内自定位三维扫描方法及系统
US20120176478A1 (en) Forming range maps using periodic illumination patterns
US20120176380A1 (en) Forming 3d models using periodic illumination patterns
JP2012504771A (ja) 三次元及び距離の面間推定を与えるための方法及びシステム
WO2021140886A1 (ja) 三次元モデル生成方法、情報処理装置およびプログラム
JP2016075637A (ja) 情報処理装置およびその方法
CN112308963B (zh) 一种无感三维人脸重建方法及采集重建系统
CN109307483B (zh) 一种基于结构光系统几何约束的相位展开方法
Peng et al. Model and algorithms for point cloud construction using digital projection patterns
Garrido-Jurado et al. Simultaneous reconstruction and calibration for multi-view structured light scanning
Zhou et al. A novel way of understanding for calibrating stereo vision sensor constructed by a single camera and mirrors
CN113012277A (zh) 一种基于dlp面结构光多相机重建方法
CN107990846A (zh) 基于单帧结构光的主被动结合深度信息获取方法
Sansoni et al. 3-D optical measurements in the field of cultural heritage: the case of the Vittoria Alata of Brescia
CN114543787B (zh) 基于条纹投影轮廓术的毫米级室内建图定位方法
CN106767405B (zh) 相位映射辅助三维成像系统快速对应点匹配的方法及装置
CN107504919B (zh) 基于相位映射的折叠相位三维数字成像方法及装置
US10049461B2 (en) Motion evaluation system and method
CN103559710B (zh) 一种用于三维重建系统的标定方法
Orghidan et al. Modelling and accuracy estimation of a new omnidirectional depth computation sensor
Peng et al. Hybrid calibration procedure for structured light field system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16909944

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 17/06/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 16909944

Country of ref document: EP

Kind code of ref document: A1