WO2018107427A1 - Method and device for fast corresponding point matching in a phase-mapping-assisted three-dimensional imaging system - Google Patents

Method and device for fast corresponding point matching in a phase-mapping-assisted three-dimensional imaging system

Info

Publication number
WO2018107427A1
Authority
WO
WIPO (PCT)
Prior art keywords
point
phase
imaging device
dimensional
distribution map
Prior art date
Application number
PCT/CN2016/110082
Other languages
English (en)
French (fr)
Inventor
刘晓利
杨洋
蔡泽伟
彭翔
Original Assignee
深圳大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳大学 (Shenzhen University)
Priority to PCT/CN2016/110082
Publication of WO2018107427A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof

Definitions

  • The invention belongs to the technical field of optical three-dimensional digital imaging, and in particular relates to a method and a device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system.
  • The trinocular three-dimensional imaging system is a non-contact, full-field optical three-dimensional digital imaging system.
  • The system uses a projection device to project a set of sinusoidal or quasi-sinusoidal gratings onto the object surface, uses imaging devices to capture the fringe patterns modulated by the object's surface shape, and applies the phase-shifting technique to calculate the spatial phase value of each measured point.
  • The phase information is then used to find corresponding points on the image planes of the two imaging devices, and the three-dimensional information of the object surface is computed by triangulation.
  • The trinocular three-dimensional imaging system is widely used because of its high imaging density, high imaging speed, high measurement accuracy, and broad measurement applicability.
  • With the rapid development of three-dimensional imaging and measurement technology, shortening measurement time and improving measurement accuracy have become the main research directions; the most critical step in 3D imaging and measurement is the matching of corresponding points.
  • To shorten the matching time, the prior art proposes several corresponding-point matching methods.
  • Typical ones are: 1. a global search over the image plane using gray values or orthogonal phase values; 2. matching using gray values or single-direction phase values under the epipolar constraint.
  • In the first method, every valid point searches for its corresponding point over the entire image; in the second, every valid point searches along the epipolar line.
  • Both prior-art methods therefore require each valid point on one image plane to be searched globally on the other image plane or along the corresponding epipolar line, which leads to a large search range and a long search time and directly affects the time spent on corresponding-point matching in the three-dimensional imaging process.
  • The technical problem to be solved by the present invention is to provide a method and a device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system, aiming to solve the problems of a large search range and an excessively long search time in the three-dimensional corresponding-point matching process.
  • The present invention provides a method for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system, the three-dimensional imaging system comprising a first imaging device, a projection device, and a second imaging device, the first imaging device and the second imaging device being located on the two sides of the projection device, and the method includes:
  • Step S1: projecting light onto the surface of the measured object with the projection device, capturing an image containing the measured-object information with the first imaging device, and calculating from the image the phase value φ of each point of the measured object, so as to obtain a first phase distribution map composed of the phase values φ of the points of the measured object; and capturing an image containing the measured-object information with the second imaging device and calculating from the image the phase value φ of each point of the measured object, so as to obtain a second phase distribution map composed of the phase values φ of the points of the measured object.
  • Step S2: using the phase value φ of each point in the first phase distribution map and preset first calibration data, estimating the spatial three-dimensional point coordinates of the measured object by phase mapping.
  • Step S3: re-projecting the spatial three-dimensional point coordinates of the measured object onto the plane of the second phase distribution map, and obtaining, for each point in the first phase distribution map, a corresponding reference point on the plane of the second phase distribution map.
  • Step S4: determining the equation of the epipolar line on the plane of the second phase distribution map from the three-dimensional coordinates of a given point in the first phase distribution map and preset second calibration data, and, on the epipolar line within a one-pixel range centered on the reference point corresponding to the given point, finding the corresponding point of that point according to its phase value φ, thereby achieving corresponding-point matching.
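For orientation, the epipolar line referred to in step S4 is never written out explicitly in this excerpt; from the later statement that its three parameters are computed from the antisymmetric matrix of the translation T and the rotation R, it presumably takes the standard essential-matrix form (a reconstruction for the reader, not a formula quoted from the original):

$$a\,x + b\,y + c = 0, \qquad (a,\; b,\; c)^{T} = [T]_{\times}\, R\, (x_{norl},\; y_{norl},\; 1)^{T},$$

where $(x_{norl}, y_{norl}, 1)$ are the normalized-plane coordinates of the given point in the first (left) imaging device coordinate system.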
  • The phase mapping formula is: X = Σ_i a_i·φ^i, Y = Σ_i b_i·φ^i, Z = Σ_i c_i·φ^i (i = 0, 1, …, n),
  • where the values of the polynomial coefficients a_i, b_i, c_i are the first calibration data and φ in the formula is the phase value of each point in the first phase distribution map,
  • and X, Y, Z are the values of the estimated three-dimensional point coordinates of each point in the first imaging device coordinate system.
  • The second calibration data include the internal fixed parameters of the first imaging device and the second imaging device, and further include a rotation matrix R and a translation matrix T; the rotation matrix R and the translation matrix T are used for the mutual conversion between the first imaging device coordinate system and the second imaging device coordinate system.
  • Step S2 is specifically: using the phase value φ of each point in the first phase distribution map and the first calibration data a_i, b_i, c_i, the values X, Y, and Z of the spatial three-dimensional point coordinates of the measured object in the first imaging device coordinate system are estimated with the phase mapping formula.
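As an illustration of this phase-mapping step, the sketch below evaluates the polynomial mapping over a whole phase map; the polynomial degree, the NumPy representation, and the names `phase_to_xyz`, `a`, `b`, `c` are assumptions made for the example, not details fixed by the patent.

```python
import numpy as np

def phase_to_xyz(phi, a, b, c):
    """Estimate camera-frame 3D coordinates from phase by polynomial phase mapping.

    phi     : (H, W) absolute phase map of the first (left) camera
    a, b, c : calibrated polynomial coefficients, ordered [k0, k1, ..., kn]
              so that X = sum_i a[i] * phi**i, and likewise for Y and Z.
    Returns X, Y, Z arrays of shape (H, W) in the first camera coordinate system.
    """
    powers = np.stack([phi**i for i in range(len(a))], axis=0)   # (n+1, H, W)
    X = np.tensordot(np.asarray(a, float), powers, axes=1)
    Y = np.tensordot(np.asarray(b, float), powers, axes=1)
    Z = np.tensordot(np.asarray(c, float), powers, axes=1)
    return X, Y, Z
```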
  • Step S3 is specifically: the estimated spatial three-dimensional point coordinates X, Y, Z of the measured object are transformed into the second imaging device coordinate system with the rotation matrix R and the translation matrix T, and the coordinates (X_nor, Y_nor) of the reference point corresponding to each point in the first phase distribution map are then calculated.
  • The specific formulas are: (X_r, Y_r, Z_r)^T = R·(X, Y, Z)^T + T, X_nor = X_r / Z_r, Y_nor = Y_r / Z_r,
  • where X_r, Y_r, Z_r are the estimated three-dimensional point coordinates of the measured object in the second imaging device coordinate system.
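A matching sketch of the step-S3 re-projection, under the same assumptions (NumPy arrays; R and T taken to map left-camera coordinates into the right-camera frame; the function name is illustrative):

```python
import numpy as np

def reproject_to_right_plane(X, Y, Z, R, T):
    """Re-project left-camera 3D estimates onto the right camera's normalized plane.

    X, Y, Z : (H, W) coordinates in the first (left) camera frame, from phase mapping
    R, T    : rotation matrix (3, 3) and translation vector (3,) from the second
              calibration data, mapping left-camera coordinates to right-camera ones
    Returns (X_nor, Y_nor): reference corresponding-point coordinates on the
    normalized (distortion-free) plane of the second (right) camera.
    """
    P = np.stack([X, Y, Z], axis=-1)      # (H, W, 3) points in the left frame
    Pr = P @ R.T + T                      # rigid transform into the right frame
    X_nor = Pr[..., 0] / Pr[..., 2]       # perspective division onto the z = 1 plane
    Y_nor = Pr[..., 1] / Pr[..., 2]
    return X_nor, Y_nor
```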
  • The given point is the point in the first phase distribution map whose coordinates are (x_norl, y_norl) and whose phase value is φ.
  • The present invention also provides a device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system, the three-dimensional imaging system comprising a first imaging device, a projection device, and a second imaging device, the first imaging device and the second imaging device being located on the two sides of the projection device, the device comprising:
  • a phase information acquisition module, configured to project light onto the surface of the measured object with the projection device, capture an image containing the measured-object information with the first imaging device, and calculate from the image the phase value φ of each point of the measured object, so as to obtain a first phase distribution map composed of the phase values φ of the points of the measured object; and further configured to capture an image containing the measured-object information with the second imaging device and calculate from the image the phase value φ of each point of the measured object, so as to obtain a second phase distribution map composed of the phase values φ of the points of the measured object;
  • the plane of the first phase distribution map being the normalized plane obtained after the lens distortion of the image plane of the first imaging device is removed, and the plane of the second phase distribution map being the normalized plane obtained after the lens distortion of the image plane of the second imaging device is removed;
  • a spatial three-dimensional point coordinate estimation module, configured to estimate the spatial three-dimensional point coordinates of the measured object by phase mapping, using the phase value φ of each point in the first phase distribution map and preset first calibration data;
  • a reference corresponding point acquisition module, configured to re-project the spatial three-dimensional point coordinates of the measured object onto the plane of the second phase distribution map and obtain, for each point in the first phase distribution map, a corresponding reference point on the plane of the second phase distribution map;
  • a corresponding point matching module, configured to find, on the epipolar line within a one-pixel range centered on the reference point corresponding to a given point in the first phase distribution map, the corresponding point of that point according to its phase value φ, thereby achieving corresponding-point matching;
  • the epipolar line equation of the epipolar line being the epipolar line equation on the plane of the second phase distribution map determined from the three-dimensional coordinates of the given point in the first phase distribution map and the preset second calibration data.
  • The phase mapping formula is: X = Σ_i a_i·φ^i, Y = Σ_i b_i·φ^i, Z = Σ_i c_i·φ^i (i = 0, 1, …, n),
  • where the values of the polynomial coefficients a_i, b_i, c_i are the first calibration data and φ in the formula is the phase value of each point in the first phase distribution map,
  • and X, Y, Z are the values of the estimated three-dimensional point coordinates of each point in the first imaging device coordinate system.
  • The second calibration data include the internal fixed parameters of the first imaging device and the second imaging device, and further include a rotation matrix R and a translation matrix T; the rotation matrix R and the translation matrix T are used for the mutual conversion between the first imaging device coordinate system and the second imaging device coordinate system.
  • The spatial three-dimensional point coordinate estimation module is specifically configured to: using the phase value φ of each point in the first phase distribution map and the first calibration data a_i, b_i, c_i, estimate with the phase mapping formula the values X, Y, and Z of the spatial three-dimensional point coordinates of the measured object in the first imaging device coordinate system.
  • The reference corresponding point acquisition module is specifically configured to: transform the estimated spatial three-dimensional point coordinates X, Y, Z of the measured object into the second imaging device coordinate system with the rotation matrix R and the translation matrix T, and then calculate the coordinates (X_nor, Y_nor) of the reference point corresponding to each point in the first phase distribution map.
  • The specific formulas are: (X_r, Y_r, Z_r)^T = R·(X, Y, Z)^T + T, X_nor = X_r / Z_r, Y_nor = Y_r / Z_r,
  • where X_r, Y_r, Z_r are the estimated three-dimensional point coordinates of the measured object in the second imaging device coordinate system.
  • The given point is the point in the first phase distribution map whose coordinates are (x_norl, y_norl) and whose phase value is φ.
  • Compared with the prior art, the present invention has the following beneficial effects: the present invention provides a method and an apparatus for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system.
  • In the three-dimensional imaging system, the first imaging device is used to capture and compute the first phase distribution map of the measured object, the spatial three-dimensional point coordinates of the measured object are estimated from it, and the estimated spatial three-dimensional point coordinates are re-projected onto the plane of the second phase distribution map to obtain reference corresponding points; the corresponding-point search then combines these reference points with the traditional search method.
  • The invention searches within a one-pixel range centered on the reference corresponding point, combined with the classical method of finding corresponding points under the two conditions of phase information and the epipolar constraint, which greatly reduces the corresponding-point search range and shortens the corresponding-point search time,
  • thereby achieving fast corresponding-point matching for the three-dimensional imaging system and meeting the requirements of high-speed, high-accuracy, and highly universal three-dimensional digital imaging and measurement.
  • FIG. 1 is a schematic diagram of the process of a method for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system according to an embodiment of the present invention.
  • FIG. 2 is a schematic flowchart of a method for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system according to an embodiment of the present invention.
  • FIG. 3-1 is the left-camera phase distribution map of a measured-object model acquired by the left camera according to an embodiment of the present invention.
  • FIG. 3-2 is the right-camera phase distribution map of a measured-object model acquired by the right camera according to an embodiment of the present invention.
  • FIG. 4 is a three-dimensional rendering formed by the spatial three-dimensional point coordinates of the measured object estimated by phase mapping according to an embodiment of the present invention.
  • FIG. 5 is a three-dimensional digital image of the measured object reconstructed with the exact corresponding points found according to an embodiment of the present invention.
  • FIG. 6 is a schematic block diagram of a device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system according to an embodiment of the present invention.
  • On the basis of the classical epipolar-constrained sub-pixel phase corresponding-point search, the invention introduces the phase mapping method and thereby proposes a method and a device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system.
  • The method for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system is described in detail below with reference to FIG. 1 and FIG. 2.
  • The method is based on the three-dimensional imaging system, which has a trinocular structure comprising a first imaging device, a projection device, and a second imaging device, the first imaging device and the second imaging device being located on the two sides of the projection device.
  • In the three-dimensional imaging system provided by the embodiment of the invention, the imaging devices are cameras and the projection device is a projector.
  • In FIG. 1, the cameras are located on the left and right sides of the projector 1, namely a left camera 2 and a right camera 3, and the measured object is placed within the field of view and depth of field of the left camera 2 and the right camera 3.
  • The projector 1 in the middle, acting as the projection device, projects sinusoidal or quasi-sinusoidal fringe light onto the surface of the measured object, and the left camera 2 and the right camera 3, acting as imaging devices, capture the fringe patterns modulated by the surface shape of the measured object.
  • Before the fast corresponding-point matching method is executed, the three-dimensional imaging system is calibrated to obtain the first calibration data and the second calibration data; the first calibration data are substituted into the phase mapping formula to estimate the values X, Y, Z of the spatial three-dimensional point coordinates of the measured object in the first imaging device coordinate system.
  • R and T in the second calibration data are used to transform the spatial three-dimensional point coordinates X, Y, Z of the measured object into the second imaging device coordinate system and also to calculate the values of the parameters a, b, c in the epipolar line equation, while the internal fixed parameters in the second calibration data are used for distortion removal.
  • Specifically, the phase mapping formula is: X = Σ_i a_i·φ^i, Y = Σ_i b_i·φ^i, Z = Σ_i c_i·φ^i (i = 0, 1, …, n),
  • where the values of the polynomial coefficients a_i, b_i, c_i are the first calibration data; the polynomial coefficients a_i, b_i, c_i are calibrated so that the values of X, Y, Z can be computed from the formula. In the formula, φ is the phase value of each point in the first phase distribution map, and X, Y, Z are the values of the estimated spatial three-dimensional point coordinates of each point of the measured object in the first imaging device coordinate system.
  • The second calibration data include the internal fixed parameters of the first imaging device and the second imaging device, and also the spatial rigid-body transformation between the positions of the first imaging device and the second imaging device: the rotation matrix R and the translation matrix T,
  • which are used for the mutual conversion between the first imaging device coordinate system and the second imaging device coordinate system.
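For readers implementing the method, the two calibration data sets described above might be grouped as in the following container; this structure and its field names are illustrative only and are not prescribed by the patent.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class PhaseMappingCalibration:
    """Illustrative grouping of the calibration data used by the method."""
    a: np.ndarray           # first calibration data: polynomial coefficients for X(phi)
    b: np.ndarray           # first calibration data: polynomial coefficients for Y(phi)
    c: np.ndarray           # first calibration data: polynomial coefficients for Z(phi)
    K_left: np.ndarray      # internal fixed parameters (intrinsics) of the first camera
    K_right: np.ndarray     # internal fixed parameters (intrinsics) of the second camera
    dist_left: np.ndarray   # lens-distortion coefficients, used to undistort to the normalized plane
    dist_right: np.ndarray
    R: np.ndarray           # rotation, first-camera frame -> second-camera frame
    T: np.ndarray           # translation, first-camera frame -> second-camera frame
```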
  • the method includes:
  • Step S1: projecting light onto the surface of the measured object with the projection device, capturing an image containing the measured-object information with the first imaging device, and calculating from the image the phase value φ of each point of the measured object, so as to obtain a first phase distribution map composed of the phase values φ of the points of the measured object; and capturing an image containing the measured-object information with the second imaging device and calculating from the image the phase value φ of each point of the measured object, so as to obtain a second phase distribution map composed of the phase values φ of the points of the measured object.
  • Specifically, the plane of the first phase distribution map is the normalized plane obtained after the lens distortion of the image plane of the first imaging device is removed, and the plane of the second phase distribution map is the normalized plane obtained after the lens distortion of the image plane of the second imaging device is removed.
  • More specifically, in the embodiment of the invention, the first imaging device is the left camera and the second imaging device is the right camera; the image plane of the first imaging device is the left camera image plane and the image plane of the second imaging device is the right camera image plane.
  • The first phase distribution map and the first imaging device image plane are expressed in the first imaging device coordinate system, i.e. the left camera coordinate system, and the second phase distribution map and the second imaging device image plane are expressed in the second imaging device coordinate system, i.e. the right camera coordinate system.
  • A point on the first phase distribution map in the embodiment of the invention is obtained by applying the distortion-removal operation to a point of the first imaging device image plane; the resulting point lies on the normalized plane of the first imaging device, so the z value of its three-dimensional coordinates in the first imaging device coordinate system is 1. Similarly, a point on the second phase distribution map is obtained by applying the distortion-removal operation to a point of the second imaging device image plane and lies on the normalized plane of the second imaging device, so the z value of its three-dimensional coordinates in the second imaging device coordinate system is 1.
  • Specifically, the projector projects a series of sinusoidal or quasi-sinusoidal fringe light onto the surface of the measured object, and the left camera and the right camera synchronously capture a series of fringe images modulated by the measured object.
  • The series of fringe images captured by the left camera is then processed by phase demodulation to obtain the phase value φ of every point of the measured object seen by that camera,
  • and these phase values φ form the phase distribution map of the system's left camera; similarly, the series of fringe images captured by the right camera yields, by phase demodulation, the phase value φ of every point seen by that camera,
  • and these phase values φ form the phase distribution map of the system's right camera.
  • The left-camera phase distribution map IL and the right-camera phase distribution map IR of the measured-object model acquired by the left and right cameras in the embodiment of the invention are shown in FIG. 3-1 and FIG. 3-2, respectively.
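The patent does not fix a particular phase-shifting algorithm. As one common choice, a four-step phase shift could recover the wrapped phase as sketched below (an assumption for illustration; the absolute phase map used by the method still requires unwrapping, e.g. with multi-frequency fringes):

```python
import numpy as np

def four_step_phase(I1, I2, I3, I4):
    """Wrapped-phase retrieval with the standard 4-step phase-shifting formula.

    I1..I4 are the fringe images captured by one camera for phase shifts of
    0, pi/2, pi, and 3*pi/2. Returns the wrapped phase in (-pi, pi].
    """
    return np.arctan2(I4 - I2, I1 - I3)
```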
  • Further, the phase value of each point in the first phase distribution map may be used, by phase-mapping estimation, to obtain the three-dimensional point coordinates of each point in the first imaging device coordinate system; alternatively, the phase value of each point in the second phase distribution map
  • may be used, by phase-mapping estimation, to obtain the three-dimensional point coordinates of each point in the second imaging device coordinate system.
  • Step S2: using the phase value φ of each point in the first phase distribution map and the preset first calibration data, the spatial three-dimensional point coordinates of the measured object in the left camera coordinate system are estimated by phase mapping.
  • Specifically, as shown in FIG. 1, the spatial three-dimensional point coordinates of the measured object can be estimated from the phase distribution map captured and computed by either the left camera or the right camera; if the left-camera phase distribution map is used, the estimated spatial three-dimensional point coordinates are projected onto the right-camera phase distribution map as reference corresponding points, and if the right-camera phase distribution map is used, the camera projected onto is the left camera.
  • In the embodiment of the invention, the left-camera phase distribution map captured and computed by the left camera is used to estimate the spatial three-dimensional point coordinates; the planes of the first and second phase distribution maps in FIG. 1 are the normalized planes of the left and right cameras.
  • Step S2 is specifically: using the phase value φ of each point in the left-camera phase distribution map and the preset first calibration data a_i, b_i, c_i, with the phase mapping formula
  • the spatial three-dimensional point coordinates of the measured object corresponding to each point are estimated; all the estimated spatial three-dimensional points of the measured object can be fitted into a virtual model of the three-dimensional image of the measured object, as shown in FIG. 4.
  • Step S3: the spatial three-dimensional point coordinates of the measured object are re-projected onto the plane of the second phase distribution map, and for each point in the first phase distribution map
  • the corresponding reference point on the plane of the second phase distribution map is obtained.
  • Specifically, the re-projection in step S3 consists of two sub-steps: first, the estimated spatial three-dimensional points of the measured object are transformed into the second imaging device coordinate system; second, the estimated three-dimensional points of the object, now expressed in the second imaging device coordinate system, are projected onto the plane of the second phase distribution map.
  • Step S3 is specifically: the estimated spatial three-dimensional point coordinates X, Y, Z of the measured object are transformed into the second imaging device coordinate system with the rotation matrix R and the translation matrix T, and, using the positional relationship with the plane of the second phase distribution map, the estimated spatial points are projected onto the plane of the right-camera phase distribution map to obtain the reference corresponding points. The specific formulas are: (X_r, Y_r, Z_r)^T = R·(X, Y, Z)^T + T, X_nor = X_r / Z_r, Y_nor = Y_r / Z_r,
  • where X_r, Y_r, Z_r are the estimated three-dimensional point coordinates of the measured object in the second imaging device coordinate system and (X_nor, Y_nor) are the coordinates of the reference point corresponding to each point in the first phase distribution map.
  • Step S4: the equation of the epipolar line on the plane of the second phase distribution map is determined from the three-dimensional coordinates of a given point in the first phase distribution map and the preset second calibration data; on the epipolar line within a one-pixel range centered on the reference point corresponding to the given point, the corresponding point of that point is found according to its phase value φ, achieving corresponding-point matching.
  • Further, the found corresponding point corresponds to the given point in the first phase distribution map.
  • The reference corresponding point corresponds to the point in the first phase distribution map whose coordinates are (x_norl, y_norl) and whose phase value is φ.
  • Further, in the invention the corresponding-point search is not performed as in the classical search methods, over the entire image plane or along the entire epipolar line; instead, the corresponding point is searched for only on the portion of the epipolar line contained in a one-pixel range near the reference corresponding point.
  • Concretely, for a point on the plane of the left-camera phase distribution map with coordinates (x_norl, y_norl)
  • and phase value φ,
  • the search finds, along that epipolar-line segment, the point whose phase is closest to φ and interpolates along the epipolar line to obtain the coordinates of the closest point.
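A possible implementation sketch of this local epipolar search follows. The epipolar-line form l = [T]_× R (x, y, 1)^T, the sampling density, the `sample_right_phase` helper, and the linear phase interpolation are assumptions made for the example rather than details prescribed by the patent.

```python
import numpy as np

def skew(t):
    """Antisymmetric (cross-product) matrix [t]_x of a 3-vector t."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def find_corresponding_point(x_l, y_l, phi_l, x_ref, y_ref, sample_right_phase,
                             R, T, half_range=0.5, n_samples=11):
    """Search a short epipolar segment around the reference corresponding point.

    x_l, y_l, phi_l    : coordinates and phase of the given left-plane point (z_norl = 1)
    x_ref, y_ref       : reference corresponding point from the re-projection step
    sample_right_phase : assumed helper, sample_right_phase(x, y) -> interpolated phase
                         of the right phase map at normalized coordinates (x, y)
    R, T               : rotation and translation from the second calibration data
    half_range         : half of the one-pixel search range, in normalized-plane units
    Returns (x_norr, y_norr), the matched point on the right normalized plane.
    """
    a, b, c = skew(np.asarray(T, float).ravel()) @ np.asarray(R) @ np.array([x_l, y_l, 1.0])
    d = np.array([b, -a]) / np.hypot(a, b)                 # unit direction along the line
    p0 = np.array([x_ref, y_ref], dtype=float)
    p0 -= (a * p0[0] + b * p0[1] + c) / (a * a + b * b) * np.array([a, b])  # drop onto the line
    ts = np.linspace(-half_range, half_range, n_samples)
    pts = p0 + ts[:, None] * d                             # candidate points on the segment
    phases = np.array([sample_right_phase(x, y) for x, y in pts])
    k = int(np.argmin(np.abs(phases - phi_l)))             # sample with the closest phase
    # refine by linear interpolation toward a neighbour that brackets phi_l
    k2 = k + 1 if k + 1 < n_samples and (phases[k] - phi_l) * (phases[k + 1] - phi_l) <= 0 else k - 1
    if 0 <= k2 < n_samples and phases[k2] != phases[k]:
        w = (phi_l - phases[k]) / (phases[k2] - phases[k])
        return tuple(pts[k] + w * (pts[k2] - pts[k]))
    return tuple(pts[k])
```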
  • In the present invention, once the reference corresponding point obtained by estimating the spatial three-dimensional coordinates of the measured object by phase mapping and projecting them is available, it is only necessary to search, on the epipolar segment within a one-pixel range near the reference point, for a phase value close to φ; the coordinates of the corresponding point are then computed by phase interpolation along the epipolar line, which greatly reduces the search range and shortens the search time.
  • FIG. 5 shows the three-dimensional digital image of the measured object reconstructed from the measured-object model with the corresponding-point matching method of the embodiment of the present invention.
  • The present invention also provides a device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system, wherein the three-dimensional imaging system comprises a first imaging device, a projection device, and a second imaging device, the first imaging device and the second imaging device being located on the two sides of the projection device.
  • As shown in FIG. 6, the device includes: a phase information acquisition module 1, a spatial three-dimensional point coordinate estimation module 2, a reference corresponding point acquisition module 3, and a corresponding point matching module 4.
  • The phase information acquisition module 1 is configured to project light onto the surface of the measured object with the projection device, capture an image containing the measured-object information with the first imaging device, and calculate from the image the phase value φ of each point of the measured object,
  • so as to obtain a first phase distribution map composed of the phase values φ of the points of the measured object; and is further configured to capture an image containing the measured-object information with the second imaging device and calculate from the image the phase value φ of each point of the measured object, so as to obtain a second phase distribution map composed of the phase values φ of the points of the measured object.
  • The plane of the first phase distribution map is the normalized plane obtained after the lens distortion of the image plane of the first imaging device is removed, and the plane of the second phase distribution map is the normalized plane obtained after the lens distortion of the image plane of the second imaging device is removed.
  • The spatial three-dimensional point coordinate estimation module 2 is configured to estimate the spatial three-dimensional point coordinates of the measured object by phase mapping, using the phase value φ of each point in the first phase distribution map and preset first calibration data.
  • The phase mapping formula is: X = Σ_i a_i·φ^i, Y = Σ_i b_i·φ^i, Z = Σ_i c_i·φ^i (i = 0, 1, …, n),
  • where the values of the polynomial coefficients a_i, b_i, c_i are the first calibration data and φ in the formula is the phase value of each point in the first phase distribution map,
  • and X, Y, and Z are the values of the estimated spatial three-dimensional point coordinates of each point of the measured object in the first imaging device coordinate system.
  • The spatial three-dimensional point coordinate estimation module 2 is specifically configured to: using the phase value φ of each point in the first phase distribution map and the first calibration data a_i, b_i, c_i, estimate with the phase mapping formula
  • the values X, Y, and Z of the spatial three-dimensional point coordinates of the measured object in the first imaging device coordinate system.
  • The reference corresponding point acquisition module 3 is configured to re-project the spatial three-dimensional point coordinates of the measured object onto the plane of the second phase distribution map and obtain, for each point in the first phase distribution map, the corresponding reference point on the plane of the second phase distribution map.
  • The reference corresponding point acquisition module 3 is specifically configured to: transform the estimated spatial three-dimensional point coordinates X, Y, Z of the measured object into the second imaging device coordinate system with the rotation matrix R and the translation matrix T, and then calculate the coordinates (X_nor, Y_nor) of the reference point corresponding to each point in the first phase distribution map.
  • The specific formulas are: (X_r, Y_r, Z_r)^T = R·(X, Y, Z)^T + T, X_nor = X_r / Z_r, Y_nor = Y_r / Z_r,
  • where X_r, Y_r, Z_r are the estimated three-dimensional point coordinates of the measured object in the second imaging device coordinate system.
  • The corresponding point matching module 4 is configured to find, on the epipolar line within a one-pixel range centered on the reference point corresponding to a given point in the first phase distribution map, the corresponding point of that point according to its phase value φ, thereby achieving corresponding-point matching.
  • The epipolar line equation of the epipolar line is the epipolar line equation on the plane of the second phase distribution map determined from the three-dimensional coordinates of the given point in the first phase distribution map and the preset second calibration data.
  • The second calibration data include the internal fixed parameters of the first imaging device and the second imaging device, and further include the rotation matrix R and the translation matrix T between the first imaging device and the second imaging device;
  • the rotation matrix R and the translation matrix T are used for the mutual conversion between the first imaging device coordinate system and the second imaging device coordinate system.
  • The corresponding point matching module 4 is specifically configured to: on the epipolar line within a one-pixel range centered on the reference point corresponding to a given point in the first phase distribution map, search along the epipolar line for the point whose phase value is closest to the phase value φ of the given point and interpolate along the epipolar line to obtain its coordinates (x_norr, y_norr); this point is the found corresponding point, and its coordinate in the Z-axis direction of the second imaging device coordinate system is z_norr = 1.
  • The given point is the point in the first phase distribution map whose coordinates are (x_norl, y_norl) and whose phase value is φ.
  • In the invention, the most important part of three-dimensional imaging is corresponding-point matching: only when the two imaging devices find corresponding points with the same feature can corresponding-point matching be completed and the object model be reconstructed on a computer.
  • In the current classical corresponding-point matching methods, the search is carried out over the entire image plane or along the entire epipolar line, so the number of points to be examined is large and the range is wide.
  • In the embodiment of the invention, each camera has 1280 × 1024 pixels and the number of valid points is n. Three cases are discussed: 1. if each valid point searches for its corresponding point over the entire image, a binary-tree search needs on the order of log2(1280 × 1024) comparisons per point, i.e. on the order of n·log2(1280 × 1024) comparisons in total;
  • 2. if each valid point searches along the epipolar line, on which there are about 200–300 candidate points, a binary-tree search needs at least about log2(200) comparisons per point, i.e. at least about n·log2(200) comparisons in total; 3.
  • with the method of the invention, the corresponding point is found on the epipolar line within about one pixel of the reference corresponding point, and the total number of searches is at most 2n. The algorithmic advantage of the invention for corresponding-point matching in three-dimensional imaging is therefore evident: the number of corresponding-point searches is greatly reduced, the corresponding-point search time is saved, and fast corresponding-point matching of the three-dimensional imaging system is achieved.
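As a back-of-the-envelope check of the comparison above (these figures are computed here, not quoted from the patent): log2(1280 × 1024) ≈ 20.3 and log2(200) ≈ 7.6 to log2(300) ≈ 8.2, so the three strategies cost on the order of 20n, 8n, and at most 2n searches respectively, i.e. roughly a tenfold reduction relative to the full-image search and a fourfold reduction relative to the full epipolar-line search.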
  • The invention combines the classical approach of finding corresponding points under the two conditions of phase information and the epipolar constraint to create a method for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system, and thereby proposes an efficient, high-speed, high-accuracy
  • corresponding-point matching method for three-dimensional imaging systems.
  • The method not only inherits the high accuracy of the three-dimensional imaging system but also introduces phase mapping to estimate the three-dimensional point coordinates of the object space, thereby obtaining reference corresponding points that narrow the corresponding-point search range and greatly increase
  • the speed of corresponding-point matching in three-dimensional imaging and measurement.
  • All or part of the steps in the above embodiments may be completed by a program controlling the related hardware, and the program may be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention is applicable to the technical field of optical three-dimensional digital imaging and provides a method for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system, comprising: step S1, capturing images containing information of a measured object and computing from them a first phase distribution map and a second phase distribution map composed of the phase value φ of each captured point; step S2, estimating the spatial three-dimensional point coordinates of the measured object by phase mapping, using the phase value φ of each point in the first phase distribution map and preset first calibration data; step S3, re-projecting the spatial three-dimensional point coordinates onto the plane of the second phase distribution map to obtain reference corresponding points; step S4, on the epipolar line within a one-pixel range centered on the reference corresponding point associated with a given point in the first phase distribution map, finding the corresponding point of that point according to its phase value φ, thereby achieving corresponding-point matching. The method provided by the present invention narrows the search range and shortens the search time.

Description

Method and device for fast corresponding point matching in a phase-mapping-assisted three-dimensional imaging system

Technical Field

The present invention belongs to the technical field of optical three-dimensional digital imaging, and in particular relates to a method and a device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system.

Background

The trinocular three-dimensional imaging system is a non-contact, full-field optical three-dimensional digital imaging system. The system uses a projection device to project a set of sinusoidal or quasi-sinusoidal gratings onto the object surface, uses imaging devices to capture the fringe patterns modulated by the object's surface shape, and calculates the spatial phase value of each measured point with the phase-shifting technique; the phase information is then used to find corresponding points on the image planes of the two imaging devices, and the three-dimensional information of the object surface is obtained by triangulation. The trinocular three-dimensional imaging system is widely used because of its high imaging density, high imaging speed, high measurement accuracy, and broad measurement applicability.

With the rapid development of three-dimensional imaging and measurement technology, shortening measurement time and improving measurement accuracy have become the main research directions. The most critical step in three-dimensional imaging and measurement is the matching of corresponding points. To shorten the matching time, the prior art proposes several corresponding-point matching methods, the typical ones being: 1. a global search over the image plane using gray values or orthogonal phase values; 2. matching using gray values or single-direction phase values under the epipolar constraint. In the first method every valid point searches for its corresponding point over the entire image, and in the second every valid point searches along the epipolar line. Both prior-art methods require every valid point on one image plane to be searched globally on the other image plane or along the corresponding epipolar line, which leads to a large search range and a long search time and directly affects the corresponding-point matching time of the three-dimensional imaging process.

Summary of the Invention

The technical problem to be solved by the present invention is to provide a method and a device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system, aiming to solve the problems of a large search range and an excessively long search time in three-dimensional corresponding-point matching.
The present invention provides a method for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system, the three-dimensional imaging system comprising a first imaging device, a projection device, and a second imaging device, the first imaging device and the second imaging device being located on the two sides of the projection device, the method comprising:

step S1, projecting light onto the surface of a measured object with the projection device, capturing an image containing the measured-object information with the first imaging device, and calculating from the image the phase value φ of each point of the measured object, so as to obtain a first phase distribution map composed of the phase values φ of the points of the measured object; and capturing an image containing the measured-object information with the second imaging device and calculating from the image the phase value φ of each point of the measured object, so as to obtain a second phase distribution map composed of the phase values φ of the points of the measured object;

the plane of the first phase distribution map being the normalized plane obtained after the lens distortion of the image plane of the first imaging device is removed, and the plane of the second phase distribution map being the normalized plane obtained after the lens distortion of the image plane of the second imaging device is removed;

step S2, using the phase value φ of each point in the first phase distribution map and preset first calibration data, estimating the spatial three-dimensional point coordinates of the measured object by phase mapping;

step S3, re-projecting the spatial three-dimensional point coordinates of the measured object onto the plane of the second phase distribution map, and obtaining, for each point in the first phase distribution map, a corresponding reference point on the plane of the second phase distribution map;

step S4, determining the equation of the epipolar line on the plane of the second phase distribution map from the three-dimensional coordinates of a given point in the first phase distribution map and preset second calibration data, and, on the epipolar line within a one-pixel range centered on the reference point corresponding to the given point, finding the corresponding point of that point according to its phase value φ, thereby achieving corresponding-point matching.
Further, the phase mapping formula is:

X = Σ_i a_i·φ^i,  Y = Σ_i b_i·φ^i,  Z = Σ_i c_i·φ^i  (i = 0, 1, …, n),

where the values of the polynomial coefficients a_i, b_i, c_i are the first calibration data, φ is the phase value of each point in the first phase distribution map, and X, Y, Z are the values of the estimated spatial three-dimensional point coordinates of each point of the measured object in the first imaging device coordinate system; the second calibration data include the internal fixed parameters of the first imaging device and the second imaging device, and further include a rotation matrix R and a translation matrix T, the rotation matrix R and the translation matrix T being used for the mutual conversion between the first imaging device coordinate system and the second imaging device coordinate system.
Further, step S2 is specifically: using the phase value φ of each point in the first phase distribution map and the first calibration data a_i, b_i, c_i, estimating with the phase mapping formula X = Σ_i a_i·φ^i, Y = Σ_i b_i·φ^i, Z = Σ_i c_i·φ^i the values X, Y, Z of the spatial three-dimensional point coordinates of the measured object in the first imaging device coordinate system.
Further, step S3 is specifically: transforming the estimated spatial three-dimensional point coordinates X, Y, Z of the measured object into the second imaging device coordinate system with the rotation matrix R and the translation matrix T, and then calculating the coordinates (X_nor, Y_nor) of the reference point corresponding to each point in the first phase distribution map; the specific formulas are:

(X_r, Y_r, Z_r)^T = R·(X, Y, Z)^T + T,
X_nor = X_r / Z_r,
Y_nor = Y_r / Z_r,

where X_r, Y_r, Z_r are the estimated three-dimensional point coordinates of the measured object in the second imaging device coordinate system.
Further, in step S4 the equation of the epipolar line is:

a·x + b·y + c = 0,

and the three parameters of the epipolar line are computed from

(a, b, c)^T = [T]_×·R·(x_norl, y_norl, 1)^T,

where [T]_× is the antisymmetric matrix of the translation matrix T, R is the rotation matrix, and (x_norl, y_norl, 1) are the three-dimensional coordinates of the given point of the first phase distribution map in the first imaging device coordinate system, with z_norl = 1.

Step S4 is specifically: on the epipolar line within a one-pixel range centered on the reference point corresponding to the given point in the first phase distribution map, searching along the epipolar line for the point whose phase value is closest to the phase value φ of the given point and interpolating along the epipolar line to obtain its coordinates (x_norr, y_norr); this point is the found corresponding point of the given point, and its coordinate in the Z-axis direction of the second imaging device coordinate system is z_norr = 1.

The given point is the point in the first phase distribution map whose coordinates are (x_norl, y_norl) and whose phase value is φ.
The present invention also provides a device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system, the three-dimensional imaging system comprising a first imaging device, a projection device, and a second imaging device, the first imaging device and the second imaging device being located on the two sides of the projection device, the device comprising:

a phase information acquisition module, configured to project light onto the surface of a measured object with the projection device, capture an image containing the measured-object information with the first imaging device, and calculate from the image the phase value φ of each point of the measured object, so as to obtain a first phase distribution map composed of the phase values φ of the points of the measured object; and further configured to capture an image containing the measured-object information with the second imaging device and calculate from the image the phase value φ of each point of the measured object, so as to obtain a second phase distribution map composed of the phase values φ of the points of the measured object;

the plane of the first phase distribution map being the normalized plane obtained after the lens distortion of the image plane of the first imaging device is removed, and the plane of the second phase distribution map being the normalized plane obtained after the lens distortion of the image plane of the second imaging device is removed;

a spatial three-dimensional point coordinate estimation module, configured to estimate the spatial three-dimensional point coordinates of the measured object by phase mapping, using the phase value φ of each point in the first phase distribution map and preset first calibration data;

a reference corresponding point acquisition module, configured to re-project the spatial three-dimensional point coordinates of the measured object onto the plane of the second phase distribution map and obtain, for each point in the first phase distribution map, a corresponding reference point on the plane of the second phase distribution map;

a corresponding point matching module, configured to find, on the epipolar line within a one-pixel range centered on the reference point corresponding to a given point in the first phase distribution map, the corresponding point of that point according to its phase value φ, thereby achieving corresponding-point matching;

the epipolar line equation of the epipolar line being the epipolar line equation on the plane of the second phase distribution map determined from the three-dimensional coordinates of the given point in the first phase distribution map and preset second calibration data.
Further, the phase mapping formula is:

X = Σ_i a_i·φ^i,  Y = Σ_i b_i·φ^i,  Z = Σ_i c_i·φ^i  (i = 0, 1, …, n),

where the values of the polynomial coefficients a_i, b_i, c_i are the first calibration data, φ is the phase value of each point in the first phase distribution map, and X, Y, Z are the values of the estimated spatial three-dimensional point coordinates of each point of the measured object in the first imaging device coordinate system; the second calibration data include the internal fixed parameters of the first imaging device and the second imaging device, and further include a rotation matrix R and a translation matrix T, the rotation matrix R and the translation matrix T being used for the mutual conversion between the first imaging device coordinate system and the second imaging device coordinate system.
Further, the spatial three-dimensional point coordinate estimation module is specifically configured to: using the phase value φ of each point in the first phase distribution map and the first calibration data a_i, b_i, c_i, estimate with the phase mapping formula X = Σ_i a_i·φ^i, Y = Σ_i b_i·φ^i, Z = Σ_i c_i·φ^i the values X, Y, Z of the spatial three-dimensional point coordinates of the measured object in the first imaging device coordinate system.
Further, the reference corresponding point acquisition module is specifically configured to: transform the estimated spatial three-dimensional point coordinates X, Y, Z of the measured object into the second imaging device coordinate system with the rotation matrix R and the translation matrix T, and then calculate the coordinates (X_nor, Y_nor) of the reference point corresponding to each point in the first phase distribution map; the specific formulas are:

(X_r, Y_r, Z_r)^T = R·(X, Y, Z)^T + T,
X_nor = X_r / Z_r,
Y_nor = Y_r / Z_r,

where X_r, Y_r, Z_r are the estimated three-dimensional point coordinates of the measured object in the second imaging device coordinate system.
Further, in the corresponding point matching module the epipolar line equation is:

a·x + b·y + c = 0,

and the three parameters of the epipolar line are computed from

(a, b, c)^T = [T]_×·R·(x_norl, y_norl, 1)^T,

where [T]_× is the antisymmetric matrix of the translation matrix T, R is the rotation matrix, and (x_norl, y_norl, 1) are the three-dimensional coordinates of the given point of the first phase distribution map in the first imaging device coordinate system, with z_norl = 1.

The corresponding point matching module is specifically configured to: on the epipolar line within a one-pixel range centered on the reference point corresponding to the given point in the first phase distribution map, search along the epipolar line for the point whose phase value is closest to the phase value φ of the given point and interpolate along the epipolar line to obtain its coordinates (x_norr, y_norr); this point is the found corresponding point of the given point, and its coordinate in the Z-axis direction of the second imaging device coordinate system is z_norr = 1.

The given point is the point in the first phase distribution map whose coordinates are (x_norl, y_norl) and whose phase value is φ.
Compared with the prior art, the present invention has the following beneficial effects: the present invention provides a method and a device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system; in the three-dimensional imaging system, the first imaging device captures and computes the first phase distribution map of the measured object, the spatial three-dimensional point coordinates of the measured object are estimated from it, and the estimated spatial three-dimensional point coordinates are re-projected onto the plane of the second phase distribution map to obtain reference corresponding points, after which the corresponding-point search combines these reference points with the traditional search method.

Within a one-pixel range centered on the reference corresponding point, the invention searches using the classical method of finding corresponding points under the two conditions of phase information and the epipolar constraint, which greatly narrows the corresponding-point search range, shortens the corresponding-point search time, and achieves fast corresponding-point matching for the three-dimensional imaging system, meeting the requirements of high-speed, high-accuracy, and highly universal three-dimensional digital imaging and measurement.
Description of the Drawings

FIG. 1 is a schematic diagram of the process of a method for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of a method for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system according to an embodiment of the present invention;
FIG. 3-1 is the left-camera phase distribution map of a measured-object model acquired by the left camera according to an embodiment of the present invention;
FIG. 3-2 is the right-camera phase distribution map of a measured-object model acquired by the right camera according to an embodiment of the present invention;
FIG. 4 is a three-dimensional rendering formed by the spatial three-dimensional point coordinates of the measured object estimated by phase mapping according to an embodiment of the present invention;
FIG. 5 is a three-dimensional digital image of the measured object reconstructed with the exact corresponding points found according to an embodiment of the present invention;
FIG. 6 is a block diagram of a device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system according to an embodiment of the present invention.
Detailed Description

In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention and not to limit it.

On the basis of the classical epipolar-constrained sub-pixel phase corresponding-point search method, the present invention introduces the phase mapping method and thereby proposes a method and a device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system.

The method for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system is described below with reference to FIG. 1 and FIG. 2. The method is based on the three-dimensional imaging system, which has a trinocular structure comprising a first imaging device, a projection device, and a second imaging device, the first imaging device and the second imaging device being located on the two sides of the projection device. In the three-dimensional imaging system provided by the embodiment of the invention, the imaging devices are cameras and the projection device is a projector; in FIG. 1 the cameras are located on the left and right sides of the projector 1, namely a left camera 2 and a right camera 3, and the measured object is placed within the field of view and depth of field of the left camera 2 and the right camera 3; the projector 1 in the middle, acting as the projection device, projects sinusoidal or quasi-sinusoidal fringe light onto the surface of the measured object, and the left camera 2 and the right camera 3, acting as imaging devices, capture the fringe patterns modulated by the surface shape of the measured object.

Before this method of fast corresponding-point matching is executed, the three-dimensional imaging system is calibrated to obtain the first calibration data and the second calibration data. The first calibration data are substituted into the phase mapping formula to estimate the values X, Y, Z of the spatial three-dimensional point coordinates of the measured object in the first imaging device coordinate system; R and T in the second calibration data are used to transform the spatial three-dimensional point coordinates X, Y, Z of the measured object into the second imaging device coordinate system and to calculate the values of the parameters a, b, c of the epipolar line equation, and the internal fixed parameters in the second calibration data are used for distortion removal. Specifically, the phase mapping formula is:

X = Σ_i a_i·φ^i,  Y = Σ_i b_i·φ^i,  Z = Σ_i c_i·φ^i  (i = 0, 1, …, n),

where the values of the polynomial coefficients a_i, b_i, c_i are the first calibration data; the polynomial coefficients a_i, b_i, c_i are calibrated so that the values of X, Y, Z can be computed. In the formula, φ is the phase value of each point in the first phase distribution map, and X, Y, Z are the values of the estimated spatial three-dimensional point coordinates of each point of the measured object in the first imaging device coordinate system; the second calibration data include the internal fixed parameters of the first imaging device and the second imaging device, and also the spatial rigid-body transformation between the positions of the first imaging device and the second imaging device: the rotation matrix R and the translation matrix T, which are used for the mutual conversion between the first imaging device coordinate system and the second imaging device coordinate system.
With reference to FIG. 1 and FIG. 2, the method includes:

Step S1, projecting light onto the surface of the measured object with the projection device, capturing an image containing the measured-object information with the first imaging device, and calculating from the image the phase value φ of each point of the measured object, so as to obtain a first phase distribution map composed of the phase values φ of the points of the measured object; and capturing an image containing the measured-object information with the second imaging device and calculating from the image the phase value φ of each point of the measured object, so as to obtain a second phase distribution map composed of the phase values φ of the points of the measured object.

Specifically, the plane of the first phase distribution map is the normalized plane obtained after the lens distortion of the image plane of the first imaging device is removed, and the plane of the second phase distribution map is the normalized plane obtained after the lens distortion of the image plane of the second imaging device is removed.

More specifically, in the embodiment of the invention the first imaging device is the left camera, the second imaging device is the right camera, the image plane of the first imaging device is the left camera image plane, and the image plane of the second imaging device is the right camera image plane; the first phase distribution map and the first imaging device image plane are expressed in the first imaging device coordinate system, i.e. the left camera coordinate system, and the second phase distribution map and the second imaging device image plane are expressed in the second imaging device coordinate system, i.e. the right camera coordinate system. A point on the first phase distribution map in the embodiment of the invention is obtained by applying the distortion-removal operation to a point of the first imaging device image plane, and the resulting point lies on the normalized plane of the first imaging device, so the z value of its three-dimensional coordinates in the first imaging device coordinate system is 1; similarly, a point on the second phase distribution map is obtained by applying the distortion-removal operation to a point of the second imaging device image plane and lies on the normalized plane of the second imaging device, so the z value of its three-dimensional coordinates in the second imaging device coordinate system is 1.

Specifically, the projector projects a series of sinusoidal or quasi-sinusoidal fringe light onto the surface of the measured object, the left camera and the right camera synchronously capture a series of fringe images modulated by the measured object, and the series of fringe images captured by the left camera is then processed by phase demodulation to obtain the phase value φ of each point of the measured object captured by that camera; these phase values φ form the phase distribution map of the system's left camera. Similarly, the series of fringe images captured by the right camera is processed by phase demodulation to obtain the phase value φ of each point of the measured object captured by that camera, and these phase values φ form the phase distribution map of the system's right camera. The left-camera phase distribution map IL and the right-camera phase distribution map IR of the measured-object model acquired by the left and right cameras in the embodiment of the invention are shown in FIG. 3-1 and FIG. 3-2, respectively.

Further, the phase value of each point in the first phase distribution map may be used, by phase-mapping estimation, to obtain the three-dimensional point coordinates of each point in the first imaging device coordinate system; alternatively, the phase value of each point in the second phase distribution map may be used, by phase-mapping estimation, to obtain the three-dimensional point coordinates of each point in the second imaging device coordinate system.
Step S2, using the phase value φ of each point in the first phase distribution map and the preset first calibration data, estimating by phase mapping the spatial three-dimensional point coordinates of the measured object in the left camera coordinate system.

Specifically, as shown in FIG. 1, in the three-dimensional imaging system the spatial three-dimensional point coordinates of the measured object can be estimated from the phase distribution map captured and computed by either the left camera or the right camera. If the left-camera phase distribution map is used, the estimated spatial three-dimensional point coordinates are projected onto the right-camera phase distribution map as reference corresponding points; if the right-camera phase distribution map is used, the camera projected onto is the left camera. In the embodiment of the invention, the left-camera phase distribution map captured and computed by the left camera is used to estimate the spatial three-dimensional point coordinates, and the planes of the first and second phase distribution maps in FIG. 1 are the normalized planes of the left and right cameras.

Step S2 is specifically: using the phase value φ of each point in the left-camera phase distribution map and the preset first calibration data a_i, b_i, c_i, the values X, Y, Z of the spatial three-dimensional point coordinates of the measured object in the left camera coordinate system are estimated with the phase mapping formula X = Σ_i a_i·φ^i, Y = Σ_i b_i·φ^i, Z = Σ_i c_i·φ^i.

From each such point, the spatial three-dimensional point coordinates of the corresponding point of the measured object are estimated, and all the estimated spatial three-dimensional points of the measured object can be fitted into a virtual model of the three-dimensional image of the measured object; FIG. 4 shows the three-dimensional rendering of the measured object formed by the spatial three-dimensional point coordinates estimated by phase mapping for the measured-object model of the embodiment.
Step S3, re-projecting the spatial three-dimensional point coordinates of the measured object onto the plane of the second phase distribution map, and obtaining, for each point in the first phase distribution map, the corresponding reference point on the plane of the second phase distribution map.

Specifically, the re-projection in step S3 consists of two sub-steps: first, the estimated spatial three-dimensional points of the measured object are transformed into the second imaging device coordinate system; second, the estimated three-dimensional points of the measured object, expressed in the second imaging device coordinate system, are projected onto the plane of the second phase distribution map.

Step S3 is specifically: the estimated spatial three-dimensional point coordinates X, Y, Z of the measured object are transformed into the second imaging device coordinate system with the rotation matrix R and the translation matrix T, and, using the positional relationship with the plane of the second phase distribution map, the estimated spatial points are projected onto the plane of the right-camera phase distribution map to obtain the reference corresponding points; the specific formulas are:

(X_r, Y_r, Z_r)^T = R·(X, Y, Z)^T + T,
X_nor = X_r / Z_r,
Y_nor = Y_r / Z_r,

where X_r, Y_r, Z_r are the estimated three-dimensional point coordinates of the measured object in the second imaging device coordinate system and (X_nor, Y_nor) are the coordinates of the reference point corresponding to each point in the first phase distribution map.
Step S4, determining the equation of the epipolar line on the plane of the second phase distribution map from the three-dimensional coordinates of a given point in the first phase distribution map and the preset second calibration data, and, on the epipolar line within a one-pixel range centered on the reference point corresponding to the given point, finding the corresponding point of that point according to its phase value φ, thereby achieving corresponding-point matching.

Further, the found corresponding point corresponds to the given point in the first phase distribution map.

In step S4 above, the epipolar line equation is:

a·x + b·y + c = 0,

and the three parameters of the epipolar line are computed from

(a, b, c)^T = [T]_×·R·(x_norl, y_norl, 1)^T,

where [T]_× is the antisymmetric matrix of the translation matrix T, R is the rotation matrix, and (x_norl, y_norl, 1) are the three-dimensional coordinates of the given point of the first phase distribution map in the first imaging device coordinate system, with z_norl = 1.

The reference corresponding point corresponds to the point in the first phase distribution map whose coordinates are (x_norl, y_norl) and whose phase value is φ.

Further, in the present invention the corresponding point is not searched for as in the classical corresponding-point search methods, over the entire image plane or along the entire epipolar line, but on the portion of the epipolar line contained within a one-pixel range near the reference corresponding point. Specifically, for a point on the plane of the left-camera phase distribution map with coordinates (x_norl, y_norl) and phase value φ, the sought corresponding point (x_norr, y_norr) lies on the epipolar line on the plane of the right-camera phase distribution map computed from the above equation, its coordinate in the Z-axis direction of the second imaging device coordinate system (i.e. the right camera coordinate system) being z_norr = 1; the corresponding point is found by searching, along the epipolar line contained within a one-pixel range near the reference corresponding point, for the point whose phase is closest to φ and interpolating along the epipolar line to obtain the coordinates of that closest point.

In the present invention, with the reference corresponding point (X_nor, Y_nor) obtained by estimating the spatial three-dimensional coordinates of the measured object by phase mapping and projecting them, it is only necessary to search, on the epipolar segment within a one-pixel range near the reference point, for a phase value close to φ, and the coordinates of the corresponding point can then be computed by phase interpolation along the epipolar line, which greatly narrows the search range and shortens the search time. FIG. 5 shows the three-dimensional digital image of the measured object reconstructed from the measured-object model with the corresponding-point matching method of the embodiment of the present invention.
The present invention also provides a device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system, wherein the three-dimensional imaging system comprises a first imaging device, a projection device, and a second imaging device, the first imaging device and the second imaging device being located on the two sides of the projection device. As shown in FIG. 6, the device includes: a phase information acquisition module 1, a spatial three-dimensional point coordinate estimation module 2, a reference corresponding point acquisition module 3, and a corresponding point matching module 4.

The phase information acquisition module 1 is configured to project light onto the surface of the measured object with the projection device, capture an image containing the measured-object information with the first imaging device, and calculate from the image the phase value φ of each point of the measured object, so as to obtain a first phase distribution map composed of the phase values φ of the points of the measured object; and further configured to capture an image containing the measured-object information with the second imaging device and calculate from the image the phase value φ of each point of the measured object, so as to obtain a second phase distribution map composed of the phase values φ of the points of the measured object.

The plane of the first phase distribution map is the normalized plane obtained after the lens distortion of the image plane of the first imaging device is removed, and the plane of the second phase distribution map is the normalized plane obtained after the lens distortion of the image plane of the second imaging device is removed.
The spatial three-dimensional point coordinate estimation module 2 is configured to estimate the spatial three-dimensional point coordinates of the measured object by phase mapping, using the phase value φ of each point in the first phase distribution map and the preset first calibration data.

The phase mapping formula is:

X = Σ_i a_i·φ^i,  Y = Σ_i b_i·φ^i,  Z = Σ_i c_i·φ^i  (i = 0, 1, …, n),

where the values of the polynomial coefficients a_i, b_i, c_i are the first calibration data, φ is the phase value of each point in the first phase distribution map, and X, Y, Z are the values of the estimated spatial three-dimensional point coordinates of each point of the measured object in the first imaging device coordinate system.

The spatial three-dimensional point coordinate estimation module 2 is specifically configured to: using the phase value φ of each point in the first phase distribution map and the first calibration data a_i, b_i, c_i, estimate with the phase mapping formula the values X, Y, Z of the spatial three-dimensional point coordinates of the measured object in the first imaging device coordinate system.
The reference corresponding point acquisition module 3 is configured to re-project the spatial three-dimensional point coordinates of the measured object onto the plane of the second phase distribution map and obtain, for each point in the first phase distribution map, the corresponding reference point on the plane of the second phase distribution map.

The reference corresponding point acquisition module 3 is specifically configured to: transform the estimated spatial three-dimensional point coordinates X, Y, Z of the measured object into the second imaging device coordinate system with the rotation matrix R and the translation matrix T, and then calculate the coordinates (X_nor, Y_nor) of the reference point corresponding to each point in the first phase distribution map; the specific formulas are:

(X_r, Y_r, Z_r)^T = R·(X, Y, Z)^T + T,
X_nor = X_r / Z_r,
Y_nor = Y_r / Z_r,

where X_r, Y_r, Z_r are the estimated three-dimensional point coordinates of the measured object in the second imaging device coordinate system.
The corresponding point matching module 4 is configured to find, on the epipolar line within a one-pixel range centered on the reference point corresponding to a given point in the first phase distribution map, the corresponding point of that point according to its phase value φ, thereby achieving corresponding-point matching.

The epipolar line equation of the epipolar line is the epipolar line equation on the plane of the second phase distribution map determined from the three-dimensional coordinates of the given point in the first phase distribution map and the preset second calibration data.

The second calibration data include the internal fixed parameters of the first imaging device and the second imaging device, and further include the rotation matrix R and the translation matrix T between the first imaging device and the second imaging device; the rotation matrix R and the translation matrix T are used for the mutual conversion between the first imaging device coordinate system and the second imaging device coordinate system.

In the corresponding point matching module 4, the epipolar line equation is:

a·x + b·y + c = 0,

and the three parameters of the epipolar line are computed from

(a, b, c)^T = [T]_×·R·(x_norl, y_norl, 1)^T,

where [T]_× is the antisymmetric matrix of the translation matrix T, R is the rotation matrix, and (x_norl, y_norl, 1) are the three-dimensional coordinates of the given point of the first phase distribution map in the first imaging device coordinate system, with z_norl = 1.

The corresponding point matching module 4 is specifically configured to: on the epipolar line within a one-pixel range centered on the reference point corresponding to the given point in the first phase distribution map, search along the epipolar line for the point whose phase value is closest to the phase value φ of the given point and interpolate along the epipolar line to obtain its coordinates (x_norr, y_norr); this point is the found corresponding point of the given point, and its coordinate in the Z-axis direction of the second imaging device coordinate system is z_norr = 1.

The given point is the point in the first phase distribution map whose coordinates are (x_norl, y_norl) and whose phase value is φ.
In the present invention, the most important part of three-dimensional imaging is corresponding-point matching: only when the two imaging devices find corresponding points with the same feature can corresponding-point matching be completed and the object model be reconstructed on a computer. In the current classical corresponding-point matching methods, the corresponding point is searched for over the entire image plane or along the entire epipolar line, so the number of points to be examined is large and the range is wide. In the embodiment of the present invention, each camera has 1280 × 1024 pixels and the number of valid points is n. Three cases are discussed: 1. if each valid point searches for its corresponding point over the entire image, a binary-tree search needs on the order of log2(1280 × 1024) comparisons per point, i.e. on the order of n·log2(1280 × 1024) comparisons in total; 2. if each valid point searches along the epipolar line, on which there are about 200–300 candidate points, a binary-tree search needs at least about log2(200) comparisons per point, i.e. at least about n·log2(200) comparisons in total; 3. with the method of the present invention, the corresponding point is found on the epipolar line within about one pixel of the reference corresponding point, so the total number of searches is at most 2n. The algorithmic advantage of the present invention for corresponding-point matching in three-dimensional imaging is thus evident: the number of corresponding-point searches is greatly reduced and the corresponding-point search time is saved, so that fast corresponding-point matching of the three-dimensional imaging system is achieved.
The present invention combines the classical approach of finding corresponding points under the two conditions of phase information and the epipolar constraint to create a method for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system, and thereby proposes an efficient, high-speed, high-accuracy corresponding-point matching method for three-dimensional imaging systems. The method not only inherits the high accuracy of the three-dimensional imaging system but also introduces phase mapping to estimate the three-dimensional point coordinates of the object space, thereby obtaining reference corresponding points that narrow the corresponding-point search range, which greatly increases the speed of corresponding-point matching in three-dimensional imaging and measurement.

All or part of the steps in the above embodiment methods may be completed by a program controlling the related hardware, and the program may be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, or an optical disc.

The above are only preferred embodiments of the present invention and are not intended to limit the present invention; any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included in the scope of protection of the present invention.

Claims (10)

  1. A method for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system, characterized in that the three-dimensional imaging system comprises a first imaging device, a projection device, and a second imaging device, the first imaging device and the second imaging device being located on the two sides of the projection device, and the method comprises:
    step S1, projecting light onto the surface of a measured object with the projection device, capturing an image containing the measured-object information with the first imaging device, and calculating from the image the phase value φ of each point of the measured object, so as to obtain a first phase distribution map composed of the phase values φ of the points of the measured object; and capturing an image containing the measured-object information with the second imaging device and calculating from the image the phase value φ of each point of the measured object, so as to obtain a second phase distribution map composed of the phase values φ of the points of the measured object;
    the plane of the first phase distribution map being the normalized plane obtained after the lens distortion of the image plane of the first imaging device is removed, and the plane of the second phase distribution map being the normalized plane obtained after the lens distortion of the image plane of the second imaging device is removed;
    step S2, using the phase value φ of each point in the first phase distribution map and preset first calibration data, estimating the spatial three-dimensional point coordinates of the measured object by phase mapping;
    step S3, re-projecting the spatial three-dimensional point coordinates of the measured object onto the plane of the second phase distribution map, and obtaining, for each point in the first phase distribution map, a corresponding reference point on the plane of the second phase distribution map;
    step S4, determining the equation of the epipolar line on the plane of the second phase distribution map from the three-dimensional coordinates of a given point in the first phase distribution map and preset second calibration data, and, on the epipolar line within a one-pixel range centered on the reference point corresponding to the given point, finding the corresponding point of that point according to its phase value φ, thereby achieving corresponding-point matching.
  2. The method for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system according to claim 1, characterized in that the phase mapping formula is:
    X = Σ_i a_i·φ^i,  Y = Σ_i b_i·φ^i,  Z = Σ_i c_i·φ^i  (i = 0, 1, …, n),
    wherein the values of the polynomial coefficients a_i, b_i, c_i are the first calibration data, φ is the phase value of each point in the first phase distribution map, and X, Y, Z are the values of the estimated spatial three-dimensional point coordinates of each point of the measured object in the first imaging device coordinate system; the second calibration data comprise the internal fixed parameters of the first imaging device and the second imaging device, and further comprise a rotation matrix R and a translation matrix T, the rotation matrix R and the translation matrix T being used for the mutual conversion between the first imaging device coordinate system and the second imaging device coordinate system.
  3. The method for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system according to claim 2, characterized in that step S2 is specifically: using the phase value φ of each point in the first phase distribution map and the first calibration data a_i, b_i, c_i, estimating with the phase mapping formula X = Σ_i a_i·φ^i, Y = Σ_i b_i·φ^i, Z = Σ_i c_i·φ^i the values X, Y, Z of the spatial three-dimensional point coordinates of the measured object in the first imaging device coordinate system.
  4. The method for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system according to any one of claims 2 to 3, characterized in that step S3 is specifically: transforming the estimated spatial three-dimensional point coordinates X, Y, Z of the measured object into the second imaging device coordinate system with the rotation matrix R and the translation matrix T, and then calculating the coordinates (X_nor, Y_nor) of the reference point corresponding to each point in the first phase distribution map; the specific formulas being:
    (X_r, Y_r, Z_r)^T = R·(X, Y, Z)^T + T,
    X_nor = X_r / Z_r,
    Y_nor = Y_r / Z_r,
    wherein X_r, Y_r, Z_r are the estimated three-dimensional point coordinates of the measured object in the second imaging device coordinate system.
  5. The method for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system according to claim 4, characterized in that in step S4 the equation of the epipolar line is:
    a·x + b·y + c = 0,
    the three parameters of the epipolar line being computed from (a, b, c)^T = [T]_×·R·(x_norl, y_norl, 1)^T, wherein [T]_× is the antisymmetric matrix of the translation matrix T, R is the rotation matrix, and (x_norl, y_norl, 1) are the three-dimensional coordinates of the given point of the first phase distribution map in the first imaging device coordinate system, with z_norl = 1;
    step S4 is specifically: on the epipolar line within a one-pixel range centered on the reference point corresponding to the given point in the first phase distribution map, searching along the epipolar line for the point whose phase value is closest to the phase value φ of the given point and interpolating along the epipolar line to obtain its coordinates (x_norr, y_norr), this point being the found corresponding point of the given point, its coordinate in the Z-axis direction of the second imaging device coordinate system being z_norr = 1;
    the given point being the point in the first phase distribution map whose coordinates are (x_norl, y_norl) and whose phase value is φ.
  6. A device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system, characterized in that the three-dimensional imaging system comprises a first imaging device, a projection device, and a second imaging device, the first imaging device and the second imaging device being located on the two sides of the projection device, and the device comprises:
    a phase information acquisition module, configured to project light onto the surface of a measured object with the projection device, capture an image containing the measured-object information with the first imaging device, and calculate from the image the phase value φ of each point of the measured object, so as to obtain a first phase distribution map composed of the phase values φ of the points of the measured object; and further configured to capture an image containing the measured-object information with the second imaging device and calculate from the image the phase value φ of each point of the measured object, so as to obtain a second phase distribution map composed of the phase values φ of the points of the measured object;
    the plane of the first phase distribution map being the normalized plane obtained after the lens distortion of the image plane of the first imaging device is removed, and the plane of the second phase distribution map being the normalized plane obtained after the lens distortion of the image plane of the second imaging device is removed;
    a spatial three-dimensional point coordinate estimation module, configured to estimate the spatial three-dimensional point coordinates of the measured object by phase mapping, using the phase value φ of each point in the first phase distribution map and preset first calibration data;
    a reference corresponding point acquisition module, configured to re-project the spatial three-dimensional point coordinates of the measured object onto the plane of the second phase distribution map and obtain, for each point in the first phase distribution map, a corresponding reference point on the plane of the second phase distribution map;
    a corresponding point matching module, configured to find, on the epipolar line within a one-pixel range centered on the reference point corresponding to a given point in the first phase distribution map, the corresponding point of that point according to its phase value φ, thereby achieving corresponding-point matching;
    the epipolar line equation of the epipolar line being the epipolar line equation on the plane of the second phase distribution map determined from the three-dimensional coordinates of the given point in the first phase distribution map and preset second calibration data.
  7. The device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system according to claim 6, characterized in that the phase mapping formula is:
    X = Σ_i a_i·φ^i,  Y = Σ_i b_i·φ^i,  Z = Σ_i c_i·φ^i  (i = 0, 1, …, n),
    wherein the values of the polynomial coefficients a_i, b_i, c_i are the first calibration data, φ is the phase value of each point in the first phase distribution map, and X, Y, Z are the values of the estimated spatial three-dimensional point coordinates of each point of the measured object in the first imaging device coordinate system; the second calibration data comprise the internal fixed parameters of the first imaging device and the second imaging device, and further comprise a rotation matrix R and a translation matrix T, the rotation matrix R and the translation matrix T being used for the mutual conversion between the first imaging device coordinate system and the second imaging device coordinate system.
  8. The device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system according to claim 7, characterized in that the spatial three-dimensional point coordinate estimation module is specifically configured to: using the phase value φ of each point in the first phase distribution map and the first calibration data a_i, b_i, c_i, estimate with the phase mapping formula X = Σ_i a_i·φ^i, Y = Σ_i b_i·φ^i, Z = Σ_i c_i·φ^i the values X, Y, Z of the spatial three-dimensional point coordinates of the measured object in the first imaging device coordinate system.
  9. The device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system according to any one of claims 7 to 8, characterized in that the reference corresponding point acquisition module is specifically configured to: transform the estimated spatial three-dimensional point coordinates X, Y, Z of the measured object into the second imaging device coordinate system with the rotation matrix R and the translation matrix T, and then calculate the coordinates (X_nor, Y_nor) of the reference point corresponding to each point in the first phase distribution map; the specific formulas being:
    (X_r, Y_r, Z_r)^T = R·(X, Y, Z)^T + T,
    X_nor = X_r / Z_r,
    Y_nor = Y_r / Z_r,
    wherein X_r, Y_r, Z_r are the estimated three-dimensional point coordinates of the measured object in the second imaging device coordinate system.
  10. The device for fast corresponding-point matching in a phase-mapping-assisted three-dimensional imaging system according to claim 9, characterized in that in the corresponding point matching module the epipolar line equation is:
    a·x + b·y + c = 0,
    the three parameters of the epipolar line being computed from (a, b, c)^T = [T]_×·R·(x_norl, y_norl, 1)^T, wherein [T]_× is the antisymmetric matrix of the translation matrix T, R is the rotation matrix, and (x_norl, y_norl, 1) are the three-dimensional coordinates of the given point of the first phase distribution map in the first imaging device coordinate system, with z_norl = 1;
    the corresponding point matching module is specifically configured to: on the epipolar line within a one-pixel range centered on the reference point corresponding to the given point in the first phase distribution map, search along the epipolar line for the point whose phase value is closest to the phase value φ of the given point and interpolate along the epipolar line to obtain its coordinates (x_norr, y_norr), this point being the found corresponding point of the given point, its coordinate in the Z-axis direction of the second imaging device coordinate system being z_norr = 1;
    the given point being the point in the first phase distribution map whose coordinates are (x_norl, y_norl) and whose phase value is φ.
PCT/CN2016/110082 2016-12-15 2016-12-15 Method and device for fast corresponding point matching in a phase-mapping-assisted three-dimensional imaging system WO2018107427A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/110082 WO2018107427A1 (zh) 2016-12-15 2016-12-15 Method and device for fast corresponding point matching in a phase-mapping-assisted three-dimensional imaging system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/110082 WO2018107427A1 (zh) 2016-12-15 2016-12-15 Method and device for fast corresponding point matching in a phase-mapping-assisted three-dimensional imaging system

Publications (1)

Publication Number Publication Date
WO2018107427A1 true WO2018107427A1 (zh) 2018-06-21

Family

ID=62557887

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/110082 WO2018107427A1 (zh) 2016-12-15 2016-12-15 相位映射辅助三维成像系统快速对应点匹配的方法及装置

Country Status (1)

Country Link
WO (1) WO2018107427A1 (zh)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110033465A (zh) * 2019-04-18 2019-07-19 天津工业大学 Real-time three-dimensional reconstruction method applied to binocular endoscope medical images
CN111156928A (zh) * 2020-02-07 2020-05-15 武汉玄景科技有限公司 Moiré fringe elimination method for a DLP-projection grating three-dimensional scanner
CN112489109A (zh) * 2020-11-19 2021-03-12 广州视源电子科技股份有限公司 Three-dimensional imaging system method and device, and three-dimensional imaging system
CN112562007A (zh) * 2020-11-24 2021-03-26 北京航空航天大学 Fast stereo matching technique without wrapped-phase unwrapping based on trinocular constraints
CN113191963A (zh) * 2021-04-02 2021-07-30 华中科技大学 Full-field calibration method and device for projector residual distortion without additional operations
CN115063468A (zh) * 2022-06-17 2022-09-16 梅卡曼德(北京)机器人科技有限公司 Binocular stereo matching method, computer storage medium, and electronic device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104463863A (zh) * 2014-12-04 2015-03-25 深圳大学 Calibration method and system for a moving interference field based on temporal heterodyne projection
US20150260509A1 (en) * 2014-03-11 2015-09-17 Jonathan Kofman Three dimensional (3d) imaging by a mobile communication device
CN105547190A (zh) * 2015-12-14 2016-05-04 深圳先进技术研究院 Three-dimensional shape measurement method and device based on dual-angle single-frequency fringe projection
CN106164979A (zh) * 2015-07-13 2016-11-23 深圳大学 Three-dimensional face reconstruction method and system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150260509A1 (en) * 2014-03-11 2015-09-17 Jonathan Kofman Three dimensional (3d) imaging by a mobile communication device
CN104463863A (zh) * 2014-12-04 2015-03-25 深圳大学 Calibration method and system for a moving interference field based on temporal heterodyne projection
CN106164979A (zh) * 2015-07-13 2016-11-23 深圳大学 Three-dimensional face reconstruction method and system
CN105547190A (zh) * 2015-12-14 2016-05-04 深圳先进技术研究院 Three-dimensional shape measurement method and device based on dual-angle single-frequency fringe projection

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI, YONG ET AL.: "Calibration and Data Merging of Two-Camera Phase Measuring Profilometry System", ACTA OPTICA SINICA, vol. 26, no. 4, 30 April 2006 (2006-04-30), ISSN: 0253-2239 *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110033465A (zh) * 2019-04-18 2019-07-19 天津工业大学 Real-time three-dimensional reconstruction method applied to binocular endoscope medical images
CN110033465B (zh) * 2019-04-18 2023-04-25 天津工业大学 Real-time three-dimensional reconstruction method applied to binocular endoscope medical images
CN111156928A (zh) * 2020-02-07 2020-05-15 武汉玄景科技有限公司 Moiré fringe elimination method for a DLP-projection grating three-dimensional scanner
CN112489109A (zh) * 2020-11-19 2021-03-12 广州视源电子科技股份有限公司 Three-dimensional imaging system method and device, and three-dimensional imaging system
CN112562007A (zh) * 2020-11-24 2021-03-26 北京航空航天大学 Fast stereo matching technique without wrapped-phase unwrapping based on trinocular constraints
CN112562007B (zh) * 2020-11-24 2023-01-24 北京航空航天大学 Fast stereo matching method without wrapped-phase unwrapping based on trinocular constraints
CN113191963A (zh) * 2021-04-02 2021-07-30 华中科技大学 Full-field calibration method and device for projector residual distortion without additional operations
CN113191963B (zh) * 2021-04-02 2022-08-05 华中科技大学 Full-field calibration method and device for projector residual distortion without additional operations
CN115063468A (zh) * 2022-06-17 2022-09-16 梅卡曼德(北京)机器人科技有限公司 Binocular stereo matching method, computer storage medium, and electronic device
CN115063468B (zh) * 2022-06-17 2023-06-27 梅卡曼德(北京)机器人科技有限公司 Binocular stereo matching method, computer storage medium, and electronic device

Similar Documents

Publication Publication Date Title
WO2018107427A1 (zh) Method and device for fast corresponding point matching in a phase-mapping-assisted three-dimensional imaging system
CN110880185B (zh) High-precision dynamic real-time 360-degree omnidirectional point cloud acquisition method based on fringe projection
CN110288642B (zh) Fast three-dimensional object reconstruction method based on a camera array
WO2018127007A1 (zh) Depth map acquisition method and system
WO2017114507A1 (zh) Image positioning method and device based on ray-model three-dimensional reconstruction
WO2018119771A1 (zh) Efficient phase-to-three-dimensional mapping method and system based on fringe projection profilometry
CN106875435B (zh) Method and system for acquiring depth images
CN108519102B (zh) Binocular visual odometry method based on secondary projection
CN110672020A (zh) Standing-tree height measurement method based on monocular vision
JPWO2008078744A1 (ja) Three-dimensional shape measuring device, method, and program based on pattern projection
WO2018201677A1 (zh) Calibration method and device for a telecentric-lens three-dimensional imaging system based on bundle adjustment
CN112967342B (zh) High-precision three-dimensional reconstruction method, system, computer device, and storage medium
JP6580761B1 (ja) Depth acquisition device and method using a polarization stereo camera
CN110223355B (zh) Feature marker point matching method based on dual epipolar constraints
CN113008195B (zh) Three-dimensional curved-surface distance measurement method and system based on spatial point clouds
Ye et al. Accurate and dense point cloud generation for industrial Measurement via target-free photogrammetry
CN106767405B (zh) Method and device for fast corresponding point matching in a phase-mapping-assisted three-dimensional imaging system
CN111353997A (zh) Real-time three-dimensional surface-defect detection method based on fringe projection
CN110715646B (zh) Map revision surveying method and device
Liu et al. A novel phase unwrapping method for binocular structured light 3D reconstruction based on deep learning
Wang et al. Pixel-wise phase unwrapping with adaptive reference phase estimation for 3-D shape measurement
CN113074634B (zh) Fast phase matching method, storage medium, and three-dimensional measurement system
CN113393413B (zh) Water-area measurement method and system based on cooperative monocular and binocular vision
CN109741389A (zh) Local stereo matching method based on region-based matching
CN113551617B (zh) Binocular dual-frequency complementary three-dimensional surface measurement method based on fringe projection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16924129

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16924129

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 02.10.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 16924129

Country of ref document: EP

Kind code of ref document: A1