WO2018152929A1 - Three-dimensional scanning system and scanning method thereof - Google Patents

Three-dimensional scanning system and scanning method thereof

Info

Publication number
WO2018152929A1
WO2018152929A1 PCT/CN2017/079059
Authority
WO
WIPO (PCT)
Prior art keywords
stripe
speckle
dimensional
data
images
Prior art date
Application number
PCT/CN2017/079059
Other languages
English (en)
French (fr)
Inventor
王文斌
刘增艺
赵晓波
Original Assignee
先临三维科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 先临三维科技股份有限公司 filed Critical 先临三维科技股份有限公司
Priority to AU2017400983A (AU2017400983B2)
Priority to EP17897432.5A (EP3444560B1)
Priority to US16/094,210 (US10810750B1)
Priority to CA3021967A (CA3021967C)
Priority to JP2018560017A (JP6619893B2)
Priority to KR1020197028007A (KR102248944B1)
Publication of WO2018152929A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery
    • G06T7/521 - Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/254 - Projection of a pattern, viewing through a pattern, e.g. moiré
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery
    • G06T7/55 - Depth or shape recovery from multiple images
    • G06T7/593 - Depth or shape recovery from multiple images from stereo images
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518 - Projection by scanning of the object
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10028 - Range image; Depth image; 3D point clouds

Definitions

  • The invention relates to a three-dimensional scanning system and scanning method, and in particular to a three-dimensional digital imaging sensor, a three-dimensional scanning system and a scanning method for a handheld multi-stripe three-dimensional scanning system.
  • Three-dimensional digitization is an emerging interdisciplinary field of active international research in recent years, and is widely applied in many fields such as reverse engineering, cultural heritage preservation, industrial inspection and virtual reality.
  • Handheld portable 3D scanners are widely used in 3D scanning for their convenience and flexibility.
  • The principle of existing handheld 3D scanners is mainly active stereo vision based on structured light.
  • The structured light can take many forms, such as infrared laser speckle, DLP (Digital Light Processing) projected speckle, DLP-projected simulated laser stripes, and laser stripes.
  • Among these modes, handheld 3D scanners using DLP-projected simulated laser stripes or laser stripes as the structured light achieve the highest accuracy and the best scanning detail.
  • According to the calibration parameters of the scanning system, a 3D reconstruction algorithm is used to perform 3D reconstruction of the matched corresponding stripes and corresponding marker point centers.
  • During this scanning process, the matching of corresponding stripes on the left and right camera images is mainly guided by the light plane or stripe plane equations.
  • With this method, when the number of stripes exceeds 15, the mismatch rate of corresponding stripes on the left and right camera images rises significantly, which increases noise and reduces the accuracy of the scanned data.
  • When the number of stripes is below 15, scanning efficiency cannot be effectively improved. Therefore, under the inherent limit on the scanning frame rate, an effective way to improve scanning efficiency is to increase the number of stripes while improving the accuracy of stripe matching.
  • In addition, splicing in this scheme relies on marker points: a certain number of common marker points is required between every two frames, and the entire scanned model must be covered with them.
  • A three-dimensional scanning system is configured to acquire 3D stripe point cloud data of a measured object, and includes: a light source, configured to alternately project a plurality of speckle patterns and stripe patterns onto the measured object.
  • Left and right cameras are used to synchronously capture the left and right speckle images and the left and right stripe images of the measured object.
  • A speckle data and marker point data reconstruction module is configured to obtain speckle 3D data and marker point 3D data from the speckle images.
  • A stripe matching module is configured to back-project the speckle 3D data and marker point 3D data onto the left and right stripe images and guide the matching of stripes between the left and right stripe images.
  • A 3D reconstruction module is configured to reconstruct the matched corresponding stripes of the left and right stripe images into 3D stripe point cloud data.
  • The stripe pattern is a natural-light stripe pattern.
  • The number of stripes in the stripe pattern is greater than 15.
  • The three-dimensional scanning system is a handheld three-dimensional scanning system.
  • The three-dimensional scanning system further includes a speckle splicing module for performing ICP splicing on the point clouds of the common area of two successive frames of speckle data to calculate the rotation and translation matrices R, T between the two frames.
  • The three-dimensional scanning system further includes a data fusion module for applying the rotation and translation matrices R, T obtained from speckle splicing to the 3D stripe point cloud data for fusion, thereby realizing 3D scanning of the stripe images.
  • A three-dimensional scanning system for acquiring 3D stripe point cloud data of a measured object comprises:
  • a light source, configured to alternately project a plurality of speckle patterns and stripe patterns onto the measured object;
  • left and right cameras, used to synchronously capture the left and right speckle images and the left and right stripe images of the measured object;
  • a data processing unit, configured to obtain speckle 3D data and marker point 3D data from the speckle images, back-project the speckle 3D data and marker point 3D data onto the left and right stripe images to guide the matching of the stripes of the left and right stripe images, and reconstruct the matched corresponding stripes of the left and right stripe images into 3D stripe point cloud data.
  • A three-dimensional scanning method includes the following steps:
  • Device construction: construct a three-dimensional digital imaging sensor composed of two cameras and a light source;
  • System calibration: calibrate the left and right cameras to obtain calibration parameters;
  • Projection and image acquisition: alternately generate a speckle pattern and a stripe pattern and project them onto the measured object with the light source; the speckle pattern and stripe pattern are deformed by the height modulation of the measured object, producing a modulated speckle pattern and stripe pattern; the left and right cameras synchronously capture the modulated speckle pattern to obtain left and right speckle images, and synchronously capture the modulated stripe pattern to obtain left and right stripe images;
  • Speckle and marker point 3D data reconstruction: perform 3D reconstruction from the captured left and right speckle images to obtain speckle 3D data PtS and marker point 3D data PtM;
  • Stripe matching: back-project the speckle 3D data PtS and marker point 3D data PtM onto the left and right stripe images and guide the matching of the left and right stripe images;
  • 3D reconstruction: for the matched corresponding stripes of the left and right stripe images, use the epipolar geometric constraint between the left and right cameras to find the point-wise correspondences within corresponding stripe center-line segments, and then reconstruct the corresponding points into 3D stripe point cloud data according to the calibration parameters.
  • The system calibration further includes: calibrating the left and right cameras to obtain the intrinsic and extrinsic camera parameters and the rotation/translation matrix Mc corresponding to the relative position between the cameras.
  • The speckle and marker point 3D data reconstruction further includes a speckle 3D data reconstruction step: from the captured speckle images, at an image coordinate point pi on the left speckle image, take a rectangular sub-image of 5x5 to 11x11 pixels centered on pi; compute the corresponding epipolar line in the right speckle image from the calibration parameters, take sub-images of the same size centered on all coordinate points (q1~qn) on that epipolar line, and calculate the correlation coefficients C1~Cn between the sub-image at pi in the left speckle image and all sub-images along the epipolar line in the right speckle image; compare the correlation coefficients, define the largest as Cmax, and set a correlation coefficient threshold T; if Cmax is greater than T, the unique matching point pr of the left camera's pi on the right camera is determined; traverse all pixel coordinate points of the left speckle image to find the corresponding matching points in the right speckle image by the above method, and reconstruct the corresponding points into 3D data PtS according to the calibration parameters.
  • The speckle and marker point 3D data reconstruction further comprises a marker point 3D data reconstruction step: from the captured speckle images, extract all marker point centers on the left and right speckle images; find the corresponding matching pairs of marker point centers on the left and right speckle images according to the epipolar constraint; and then reconstruct the corresponding marker points into marker point 3D data PtM according to the calibration parameters.
  • The stripe matching further includes a marker point back-projection offset compensation step: extract the marker point centers on the left and right stripe images and record the marker point pairs PtMultiCoor; find the corresponding matching pairs of marker point centers on the left and right stripe images according to the epipolar constraint; back-project the marker point 3D data PtM from the speckle images onto the modulated left and right stripe images according to the respective calibration intrinsics of the left and right cameras, and record the 2D coordinate pairs PtMacthCoor; for each pair of back-projected marker point image coordinates in PtMacthCoor, calculate the pixel coordinate deviation from the nearest pair of marker point centers extracted on the stripe images; calculate the mean of the deviations on the left and right stripe images respectively, recording the left stripe image mean pixel deviation pixDivL and the right stripe image mean pixel deviation pixDivR.
  • The three-dimensional matching further includes the following steps: after reconstructing the speckle 3D data PtS and the corresponding marker point 3D data PtM, extract the center lines of the left and right stripe images; segment each center line into connected components to form a number of independent line segments, and then back-project the speckle 3D data PtS and the corresponding marker point 3D data PtM onto the left and right stripe images according to the respective calibration parameters of the left and right cameras; add the left stripe image mean pixel deviation pixDivL and the right stripe image mean pixel deviation pixDivR to the back-projection coordinates of the left and right stripe images respectively to realize the offset compensation; number the offset-compensated back-projection coordinate pairs, so that every corresponding point has a serial number, forming a lookup table between corresponding left and right stripe image coordinates;
  • traverse the serial numbers of every point of every stripe segment on the left stripe image; the matching stripe segment in the right stripe image can then be found directly from the lookup table, achieving accurate matching of the left and right stripe image segments.
  • The 3D reconstruction further comprises the following steps: for the matched corresponding stripe center-line segments of the left and right images, use the epipolar geometric constraint of the left and right cameras to find the point-wise correspondences within corresponding stripe center-line segments, and then reconstruct the corresponding point pairs into 3D stripe point cloud data according to the calibration parameters of the system.
  • The method includes a speckle splicing step: perform ICP splicing using the point clouds of the common area of two successive frames of speckle data, and calculate the rotation and translation matrices R, T between the two frames.
  • A data fusion step is also performed: the rotation and translation matrices R, T obtained from speckle splicing are applied to the 3D stripe point cloud data for fusion, thereby realizing 3D scanning of the stripe images.
  • In the three-dimensional scanner and scanning method, speckle 3D data and marker point 3D data are obtained from the speckle images of the measured object, and are then back-projected onto the left and right stripe images to guide stripe matching, yielding 3D stripe point cloud data.
  • Compared with conventional 3D scanners, the three-dimensional scanner has the following advantages: 1. The accuracy of stripe matching is high, so the scanning efficiency of the 3D scanning system can be improved by increasing the number of matched stripes; 2. Real-time splicing can be realized without marker points; 3. There is no need to calibrate the stripe light planes, i.e., no light plane is needed to guide the matching of the left and right images, so the installation accuracy required for the relative positions of the hardware is lower, which reduces system cost.
  • FIG. 1 is a schematic structural diagram of a three-dimensional scanning system according to an embodiment of the present invention.
  • FIG. 2 shows the left and right speckle images of the pattern projected by the three-dimensional scanner of FIG. 1 onto the measured object.
  • FIG. 3 shows the left and right stripe images captured by the left and right cameras of the three-dimensional scanning system of FIG. 1.
  • FIG. 4 shows the speckle 3D data obtained by 3D reconstruction from the speckle images captured with the system of FIG. 1.
  • FIG. 5 shows the back-projections obtained by back-projecting the speckle 3D data and marker point 3D data onto the left and right stripe images.
  • FIG. 6 is a schematic diagram of the epipolar geometric constraint.
  • Referring to FIG. 1, an embodiment of the present invention provides a three-dimensional scanning system for acquiring or collecting 3D stripe point cloud data of a measured object 105.
  • The type of the three-dimensional scanning system is not limited, and includes fixed three-dimensional scanning systems and handheld three-dimensional scanning systems.
  • Preferably, the three-dimensional scanning system is a handheld multi-stripe binocular three-dimensional scanning system. It will be appreciated that when the system is a handheld multi-stripe binocular system, images captured at different times may contain errors caused by jitter during operation.
  • The three-dimensional scanning system includes a light source 101, a left camera 102, a right camera 103 and a data processing unit 104.
  • The relative positions of the light source 101, the left camera 102 and the right camera 103 are not limited, as long as the measured object 105 can be illuminated and imaged; during operation, the positions of the light source 101, the left camera 102 and the right camera 103 are fixed relative to one another.
  • Preferably, the light source 101 is disposed midway between the left camera 102 and the right camera 103.
  • The light source 101 is configured to alternately project a plurality of speckle patterns and stripe patterns onto the measured object 105.
  • So-called alternate projection means that the first projected pattern is a speckle pattern and that the light source 101 projects one stripe pattern between every two speckle patterns; conversely, one speckle pattern is projected between every two stripe patterns, the first pattern being a speckle pattern.
  • The light source 101 may be a laser, a projector or another light source; when the light source 101 is a projector, the projector is a digital projector.
  • The stripe pattern includes simulated laser stripe patterns, laser stripe patterns and other stripe patterns.
  • Preferably, the light source 101 is a projector and the stripe pattern is a natural-light stripe pattern.
  • Compared with a laser stripe pattern, the natural-light stripe pattern is harmless to the human eye, and when it is projected onto a marker point the black edge region is not overly bright, which improves the accuracy of marker point center extraction and hence the splicing precision.
  • The number of stripes projected by the light source 101 is not limited, but to improve scanning efficiency it usually needs to be greater than 15; in this embodiment, the number of stripes is greater than 80.
  • The left camera 102 and the right camera 103 are configured to synchronously capture the left and right speckle images and the left and right stripe images of the measured object 105.
  • The types of the left and right cameras 102, 103 are not limited, as long as they can capture a two-dimensional image of the measured object 105. It will be appreciated that the speckle patterns and stripe patterns projected by the light source 101 onto the measured object 105 are deformed by the height modulation of the measured object 105, producing modulated speckle patterns and stripe patterns.
  • The left and right cameras 102, 103 obtain the left and right speckle images and the left and right stripe images by capturing the modulated patterns.
  • The data processing unit 104 is configured to obtain speckle 3D data and marker point 3D data from the speckle images, back-project the speckle 3D data and marker point 3D data onto the left and right stripe images to guide the matching of the stripes of the left and right stripe images, and reconstruct the matched corresponding stripes of the left and right stripe images into 3D stripe point cloud data.
  • Specifically, the data processing unit 104 includes a speckle data and marker point data reconstruction module, a stripe matching module and a 3D reconstruction module.
  • The speckle data and marker point data reconstruction module is configured to obtain speckle 3D data and marker point 3D data from the speckle images.
  • The stripe matching module is configured to back-project the speckle 3D data and marker point 3D data onto the left and right stripe images and guide the matching of the stripes of the left and right stripe images.
  • The 3D reconstruction module is configured to reconstruct the matched corresponding stripes of the left and right stripe images into 3D stripe point cloud data: it uses the epipolar geometric constraint of the left and right cameras to find the point-wise correspondences within the corresponding stripe center-line segments of the matched stripes, and then reconstructs the corresponding points into 3D stripe point cloud data according to the calibration parameters.
  • Of course, the manner or technique by which the 3D reconstruction module reconstructs the 3D stripe point cloud data is not limited, as long as the matched left and right stripe images can be reconstructed into 3D stripe point cloud data.
  • The data processing unit 104 may further include a speckle splicing module and a data fusion module; the speckle splicing module is configured to perform ICP splicing using the point clouds of the common area of two successive frames of speckle data to calculate the rotation and translation matrices R, T between the two frames.
  • The data fusion module is configured to apply the rotation and translation matrices R, T obtained from speckle splicing to the 3D stripe point cloud data for fusion, thereby realizing 3D scanning of the stripe images.
  • A scanning method for acquiring or collecting 3D point cloud data of the measured object 105 with the above three-dimensional scanning system comprises the following steps:
  • Device construction: a three-dimensional digital imaging sensor composed of two cameras and a light source is constructed.
  • System calibration: the left and right cameras are calibrated to obtain the intrinsic and extrinsic camera parameters and the rotation/translation matrix Mc corresponding to the relative position between the cameras.
  • Projection and image acquisition: a speckle pattern and a stripe pattern are alternately generated and projected by the light source onto the measured object; the speckle pattern and stripe pattern are deformed by the height modulation of the measured object, producing a modulated speckle pattern and stripe pattern; the left and right cameras synchronously capture the modulated speckle pattern to obtain left and right speckle images, and synchronously capture the modulated stripe pattern to obtain left and right stripe images.
  • Speckle and marker point 3D data reconstruction: 3D reconstruction is performed from the captured left and right speckle images to obtain speckle 3D data PtS and marker point 3D data PtM.
  • The speckle 3D data PtS reconstruction includes the following steps: from the captured speckle images, at an image coordinate point pi on the left speckle image, take a rectangular sub-image of 5x5 to 11x11 pixels centered on pi; compute the corresponding epipolar line in the right speckle image from the calibration parameters, take sub-images of the same size centered on all coordinate points (q1~qn) on that epipolar line, and calculate the correlation coefficients C1~Cn between the sub-image at pi in the left speckle image and all sub-images along the epipolar line in the right speckle image; compare the correlation coefficients, define the largest as Cmax, and set a correlation coefficient threshold T; if Cmax is greater than T, the unique matching point pr of the left camera's pi on the right camera is determined; traverse all pixel coordinate points of the left speckle image to find the corresponding matching points in the right speckle image by the above method, and reconstruct the corresponding points into 3D data PtS according to the calibration parameters.
  • The marker point 3D data PtM reconstruction comprises the following steps: from the captured speckle images, extract all marker point centers on the left and right speckle images; find the corresponding matching pairs of marker point centers on the left and right speckle images according to the epipolar constraint; and then reconstruct the corresponding marker points into marker point 3D data PtM according to the calibration parameters.
  • Stripe matching: the speckle 3D data PtS and the marker point 3D data PtM are back-projected onto the left and right stripe images to guide the matching of the left and right stripe images.
  • The stripe matching further includes a marker point back-projection offset compensation step: (a) extract the marker point centers on the left and right stripe images and record the marker point pairs PtMultiCoor; (b) find the corresponding matching pairs of marker point centers on the left and right stripe images according to the epipolar constraint; (c) back-project the marker point 3D data PtM from the speckle images onto the modulated left and right stripe images according to the respective calibration intrinsics of the left and right cameras, record the 2D coordinate pairs PtMacthCoor, calculate for each pair of back-projected marker point image coordinates in PtMacthCoor the pixel coordinate deviation from the nearest extracted marker point center pair on the stripe images, calculate the mean deviations on the left and right stripe images respectively, and record the left stripe image mean pixel deviation pixDivL and the right stripe image mean pixel deviation pixDivR; (d) extract the center lines of the left and right stripe images, segment each center line into connected components to form a number of independent line segments, and back-project the speckle 3D data PtS and the corresponding marker point 3D data PtM onto the left and right stripe images according to the respective calibration parameters of the left and right cameras; (e) add pixDivL and pixDivR to the back-projection coordinates of the left and right stripe images respectively to realize the offset compensation; (f) number the offset-compensated back-projection coordinate pairs, so that every corresponding point has a serial number, forming a lookup table between corresponding left and right stripe image coordinates; (g) traverse the serial numbers of every point of every stripe segment on the left stripe image; the matching stripe segment in the right stripe image can then be found directly from the lookup table, achieving accurate matching of the left and right stripe image segments.
  • 3D reconstruction: for the matched corresponding stripes of the left and right stripe images, use the epipolar geometric constraint of the left and right cameras to find the point-wise correspondences within the corresponding stripe center-line segments, and then reconstruct the corresponding points into 3D stripe point cloud data according to the calibration parameters.
  • The 3D reconstruction further comprises the following steps: for the matched corresponding stripe center-line segments of the left and right images, use the epipolar geometric constraint of the left and right cameras to find the point-wise correspondences within the corresponding stripe center-line segments, and then reconstruct the corresponding point pairs into 3D stripe point cloud data according to the calibration parameters of the system.
  • The three-dimensional scanning method further includes:
  • Speckle splicing: ICP splicing is performed using the point clouds of the common area of two successive frames of speckle data, and the rotation and translation matrices R, T between the two frames are calculated. Data fusion is then performed by applying R, T to the 3D stripe point cloud data, realizing 3D scanning of the stripe images.
  • In the three-dimensional scanner and scanning method, speckle 3D data and marker point 3D data are obtained from the speckle images of the measured object, and are then back-projected onto the left and right stripe images to guide stripe matching, yielding 3D stripe point cloud data.
  • Compared with conventional 3D scanners, the three-dimensional scanner has the following advantages: 1. The accuracy of stripe matching is high, so the scanning efficiency of the 3D scanning system can be improved by increasing the number of matched stripes; 2. Real-time splicing can be realized without marker points; 3. There is no need to calibrate the stripe light planes, i.e., no light plane is needed to guide the matching of the left and right images, so the installation accuracy required for the relative positions of the hardware is lower, which reduces system cost.
  • Referring to FIG. 1, the structure of an actually designed handheld multi-stripe binocular three-dimensional scanning system is shown in FIG. 1:
  • 101 is a digital projector,
  • 102 is the left camera,
  • 103 is the right camera,
  • 104 is a computer (the data processing unit),
  • 105 is the measured object.
  • The calibrated intrinsic parameters of the left camera are: K1 = [2271.084, 0, 645.632; 0, 2265.112, 511.553; 0, 0, 1].
  • The calibrated intrinsic parameters of the right camera are: K2 = [2275.181, 0, 644.405; 0, 2270.321, 510.053; 0, 0, 1].
  • The system structure parameters between the left camera and the right camera are: R = [8.749981e-001, 6.547051e-003, 4.840819e-001; -2.904034e-003, 9.999615e-001, -8.274993e-003; -4.841175e-001, 5.834813e-003, 8.749835e-001], T = [-1.778995e+002, -4.162821e-001, 5.074737e+001].
  • The digital speckle pattern is projected onto the measured object 105, and the left and right cameras 102, 103 synchronously capture the modulated speckle pattern to obtain the left and right speckle images shown in FIG. 2.
  • The digital multi-line stripe pattern is then projected, and the left and right cameras 102, 103 synchronously capture the modulated stripe pattern to obtain the left and right stripe images shown in FIG. 3.
  • 3D reconstruction is performed to obtain the speckle 3D data shown in FIG. 4 and the marker point 3D data, and the speckle 3D data and marker point 3D data are back-projected onto the left and right stripe images according to the calibration parameters, as shown in FIG. 5.
  • Serial numbers are assigned to the corresponding points on the left and right stripe images, forming a serial-number lookup table. The stripe centers on the left and right stripe images are extracted and segmented into connected components, and corresponding stripe segments are matched according to the serial-number lookup table. For the matched segment pairs, corresponding points are found according to the epipolar geometric constraint of the left and right cameras, as shown in FIG. 6, and 3D reconstruction is then performed according to the calibration parameters to generate 3D stripe point cloud data. The above steps are repeated, real-time splicing is performed using the speckle data, and the splicing matrices are applied to the 3D stripe point cloud data to realize real-time scanning of the stripe images.
  • In the three-dimensional scanner and scanning method, speckle 3D data and marker point 3D data are obtained from the speckle images of the measured object, and are then back-projected onto the left and right stripe images to guide stripe matching, yielding 3D stripe point cloud data.
  • Compared with conventional 3D scanners, the three-dimensional scanner has the following advantages: 1. The accuracy of stripe matching is high, so the scanning efficiency of the 3D scanning system can be improved by increasing the number of matched stripes; 2. Real-time splicing can be realized without marker points; 3. There is no need to calibrate the stripe light planes, i.e., no light plane is needed to guide the matching of the left and right images, so the installation accuracy required for the relative positions of the hardware is lower, which reduces system cost.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a three-dimensional scanning system for acquiring 3D stripe point cloud data of a measured object, comprising: a light source for alternately projecting a plurality of speckle patterns and stripe patterns onto the measured object; left and right cameras for synchronously capturing left and right speckle images and left and right stripe images of the measured object; a speckle data and marker point data reconstruction module for obtaining speckle 3D data and marker point 3D data from the speckle images; a stripe matching module for back-projecting the speckle 3D data and marker point 3D data onto the left and right stripe images and guiding the matching of stripes between the left and right stripe images; and a 3D reconstruction module for reconstructing the matched corresponding stripes of the left and right stripe images into 3D stripe point cloud data. The present invention also provides a scanning method for the three-dimensional scanning system.

Description

Three-dimensional scanning system and scanning method thereof

Technical Field
The present invention relates to a three-dimensional scanning system and scanning method, and in particular to a three-dimensional digital imaging sensor, three-dimensional scanning system and scanning method for a handheld multi-stripe three-dimensional scanning system.
Background Art
Three-dimensional digitization is an emerging interdisciplinary field of active international research in recent years, and is widely applied in many fields such as reverse engineering, cultural heritage preservation, industrial inspection and virtual reality. Handheld portable 3D scanners are widely used in the 3D scanning field for their convenience and flexibility. The principle of existing handheld 3D scanners is mainly active stereo vision based on structured light, and the structured light can take many forms, such as infrared laser speckle, DLP (Digital Light Processing) projected speckle, DLP-projected simulated laser stripes, and laser stripes. Among these structured-light modes, handheld 3D scanners using DLP-projected simulated laser stripes or laser stripes as the structured light achieve the highest accuracy and the best scanning detail.
The basic workflow, taking DLP-projected simulated laser stripes or laser stripes as the structured light, is:
(1) fit planes to the projected stripes;
(2) extract the marker points and stripe centers from the captured stripe images;
(3) segment the stripe centers into connected components, and match corresponding stripes on the left and right camera images according to the plane equations;
(4) find the corresponding marker point centers on the left and right camera images using the epipolar constraint between the two cameras;
(5) according to the calibration parameters of the scanning system, perform 3D reconstruction of the matched corresponding stripes and corresponding marker point centers with a 3D reconstruction algorithm;
(6) realize handheld 3D scanning through marker point splicing and rotation/translation of the stripe 3D points.
However, during this scanning process the matching of corresponding stripes on the left and right camera images is mainly guided by the light plane or stripe plane equations. With this method, when the number of stripes exceeds 15, the mismatch rate of corresponding stripes on the left and right camera images rises significantly, which increases noise and reduces the accuracy of the scanned data. When the number of stripes is below 15, scanning efficiency cannot be effectively improved. Therefore, under the inherent limit on the scanning frame rate, an effective way to improve scanning efficiency is to increase the number of stripes while improving the accuracy of stripe matching. In addition, splicing in this scheme relies on marker points: a certain number of common marker points is required between every two frames, and the entire scanned model must be covered with them.
Summary of the Invention
In view of this, it is necessary to provide a handheld multi-stripe three-dimensional scanning system and scanning method that solve the problem that existing handheld 3D scanning systems cannot achieve both high scanning efficiency and high scanning data accuracy, while requiring no marker points for splicing during scanning.
A three-dimensional scanning system for acquiring 3D stripe point cloud data of a measured object comprises: a light source for alternately projecting a plurality of speckle patterns and stripe patterns onto the measured object; left and right cameras for synchronously capturing left and right speckle images and left and right stripe images of the measured object; a speckle data and marker point data reconstruction module for obtaining speckle 3D data and marker point 3D data from the speckle images; a stripe matching module for back-projecting the speckle 3D data and marker point 3D data onto the left and right stripe images and guiding the matching of stripes between the left and right stripe images; and a 3D reconstruction module for reconstructing the matched corresponding stripes of the left and right stripe images into 3D stripe point cloud data.
The stripe pattern is a natural-light stripe pattern.
The number of stripes in the stripe pattern is greater than 15.
The three-dimensional scanning system is a handheld three-dimensional scanning system.
The three-dimensional scanning system further comprises a speckle splicing module for performing ICP splicing on the point clouds of the common area of two successive frames of speckle data and calculating the rotation and translation matrices R, T between the two frames.
The three-dimensional scanning system further comprises a data fusion module for applying the rotation and translation matrices R, T obtained from speckle splicing to the 3D stripe point cloud data for fusion, thereby realizing 3D scanning of the stripe images.
A three-dimensional scanning system for acquiring 3D stripe point cloud data of a measured object comprises: a light source for alternately projecting a plurality of speckle patterns and stripe patterns onto the measured object; left and right cameras for synchronously capturing left and right speckle images and left and right stripe images of the measured object; and a data processing unit for obtaining speckle 3D data and marker point 3D data from the speckle images, back-projecting the speckle 3D data and marker point 3D data onto the left and right stripe images to guide the matching of the stripes of the left and right stripe images, and reconstructing the matched corresponding stripes of the left and right stripe images into 3D stripe point cloud data.
A three-dimensional scanning method comprises the following steps:
(1) Device construction: construct a three-dimensional digital imaging sensor composed of two cameras and a light source;
(2) System calibration: calibrate the left and right cameras to obtain calibration parameters;
(3) Projection and image acquisition: alternately generate speckle patterns and stripe patterns and project them onto the measured object with the light source; the speckle patterns and stripe patterns are deformed by the height modulation of the measured object, producing modulated speckle patterns and stripe patterns; the left and right cameras synchronously capture the modulated speckle patterns to obtain left and right speckle images, and synchronously capture the modulated stripe patterns to obtain left and right stripe images;
(4) Speckle and marker point 3D data reconstruction: perform 3D reconstruction from the captured left and right speckle images to obtain speckle 3D data PtS and marker point 3D data PtM;
(5) Stripe matching: back-project the speckle 3D data PtS and marker point 3D data PtM onto the left and right stripe images and guide the matching of the left and right stripe images;
(6) 3D reconstruction: for the matched corresponding stripes of the left and right stripe images, use the epipolar geometric constraint between the left and right cameras to find the point-wise correspondences within corresponding stripe center-line segments, and then reconstruct the corresponding points into 3D stripe point cloud data according to the calibration parameters.
The system calibration further comprises: calibrating the left and right cameras to obtain the intrinsic and extrinsic camera parameters and the rotation/translation matrix Mc corresponding to the relative position between the cameras.
The speckle and marker point 3D data reconstruction further comprises a speckle 3D data reconstruction step: from the captured speckle images, at an image coordinate point pi on the left speckle image, take a rectangular sub-image of 5x5 to 11x11 pixels centered on pi; compute the corresponding epipolar line in the right speckle image from the calibration parameters, take sub-images of the same size centered on all coordinate points (q1~qn) on that epipolar line, and calculate the correlation coefficients C1~Cn between the sub-image at pi in the left speckle image and all sub-images along the epipolar line in the right speckle image; compare the correlation coefficients, define the largest as Cmax, and set a correlation coefficient threshold T; if Cmax is greater than T, the unique matching point pr of the left camera's pi on the right camera is determined; traverse all pixel coordinate points of the left speckle image to find the corresponding matching points in the right speckle image by the above method, and reconstruct the corresponding points into 3D data PtS according to the calibration parameters.
The speckle and marker point 3D data reconstruction further comprises a marker point 3D data reconstruction step: from the captured speckle images, extract all marker point centers on the left and right speckle images; find the corresponding matching pairs of marker point centers on the left and right speckle images according to the epipolar constraint; and then reconstruct the corresponding marker points into marker point 3D data PtM according to the calibration parameters.
The stripe matching further comprises a marker point back-projection offset compensation step: extract the marker point centers on the left and right stripe images and record the marker point pairs PtMultiCoor; find the corresponding matching pairs of marker point centers on the left and right stripe images according to the epipolar constraint; back-project the marker point 3D data PtM from the speckle images onto the modulated left and right stripe images according to the respective calibration intrinsics of the left and right cameras, and record the 2D coordinate pairs PtMacthCoor; for each pair of back-projected marker point image coordinates in PtMacthCoor, calculate the pixel coordinate deviation from the nearest pair of marker point centers extracted on the stripe images; calculate the mean of the deviations on the left and right stripe images respectively, recording the left stripe image mean pixel deviation pixDivL and the right stripe image mean pixel deviation pixDivR.
The three-dimensional matching further comprises the following steps: after reconstructing the speckle 3D data PtS and the corresponding marker point 3D data PtM, extract the center lines of the left and right stripe images; segment each center line into connected components to form a number of independent line segments, and then back-project the speckle 3D data PtS and the corresponding marker point 3D data PtM onto the left and right stripe images according to the respective calibration parameters of the left and right cameras; add the left stripe image mean pixel deviation pixDivL and the right stripe image mean pixel deviation pixDivR to the back-projection coordinates of the left and right stripe images respectively to realize the offset compensation; number the offset-compensated back-projection coordinate pairs, so that every corresponding point has a serial number, forming a lookup table between corresponding left and right stripe image coordinates; traverse the serial numbers of every point of every stripe segment on the left stripe image, whereby the matching stripe segment in the right stripe image can be found directly from the lookup table, thus achieving accurate matching of the left and right stripe image segments.
The 3D reconstruction further comprises the following steps: for the matched corresponding stripe center-line segments of the left and right images, use the epipolar geometric constraint of the two cameras to find the point-wise correspondences within corresponding stripe center-line segments, and then reconstruct the corresponding point pairs into 3D stripe point cloud data according to the calibration parameters of the system.
The method further comprises a speckle splicing step: perform ICP splicing using the point clouds of the common area of two successive frames of speckle data, and calculate the rotation and translation matrices R, T between the two frames.
The method further comprises a data fusion step: apply the rotation and translation matrices R, T obtained from speckle splicing to the 3D stripe point cloud data for fusion, thereby realizing 3D scanning of the stripe images.
Compared with the prior art, in the three-dimensional scanner and scanning method of the present invention, speckle 3D data and marker point 3D data are obtained from the speckle images of the measured object and are then back-projected onto the left and right stripe images to guide the matching of the stripes of the left and right stripe images, yielding 3D stripe point cloud data. Compared with conventional 3D scanners, this scanner has the following advantages: 1. The accuracy of stripe matching is high, so the scanning efficiency of the 3D scanning system can be improved by increasing the number of matched stripes; 2. Real-time splicing can be realized without marker points; 3. There is no need to calibrate the stripe light planes, i.e., no light plane is needed to guide the matching of the left and right images, so the installation accuracy required for the relative positions of the hardware is lower, which reduces system cost.
The above is only an overview of the technical solution of the present invention. In order that the technical means of the present invention may be understood more clearly and implemented according to the contents of the specification, and in order to make the above and other objects, features and advantages of the present invention more apparent, preferred embodiments are described in detail below with reference to the accompanying drawings.
Brief Description of the Drawings
Embodiments of the present invention are described below with reference to the accompanying drawings, in which:
FIG. 1 is a schematic structural diagram of a three-dimensional scanning system according to an embodiment of the present invention.
FIG. 2 shows the left and right speckle images of the pattern projected by the three-dimensional scanner of FIG. 1 onto the measured object.
FIG. 3 shows the left and right stripe images captured by the left and right cameras of the three-dimensional scanning system of FIG. 1.
FIG. 4 shows the speckle 3D data obtained by 3D reconstruction from the speckle images captured with the system of FIG. 1.
FIG. 5 shows the back-projections obtained by back-projecting the speckle 3D data and marker point 3D data onto the left and right stripe images.
FIG. 6 is a schematic diagram of the epipolar geometric constraint.
Detailed Description of the Embodiments
Specific embodiments of the present invention are described in further detail below with reference to the accompanying drawings. It should be understood that the specific embodiments described here serve only as examples and are not intended to limit the scope of the present invention.
Referring to FIG. 1, an embodiment of the present invention provides a three-dimensional scanning system for acquiring or collecting 3D stripe point cloud data of a measured object 105. The type of the three-dimensional scanning system is not limited and includes fixed three-dimensional scanning systems and handheld three-dimensional scanning systems; preferably, the three-dimensional scanning system is a handheld multi-stripe binocular three-dimensional scanning system. It will be appreciated that when the system is a handheld multi-stripe binocular system, images captured at different times may contain errors caused by jitter during operation.
The three-dimensional scanning system comprises a light source 101, a left camera 102, a right camera 103 and a data processing unit 104. The relative positions of the light source 101, the left camera 102 and the right camera 103 are not limited, as long as the measured object 105 can be illuminated and imaged, and during operation the positions of the light source 101, the left camera 102 and the right camera 103 are fixed relative to one another. Preferably, the light source 101 is disposed midway between the left camera 102 and the right camera 103.
The light source 101 is used to alternately project a plurality of speckle patterns and stripe patterns onto the measured object 105. Alternate projection means that the first projected pattern is a speckle pattern and that the light source 101 projects one stripe pattern between every two speckle patterns; conversely, one speckle pattern is projected between every two stripe patterns, the first pattern being a speckle pattern. The light source 101 may be a laser, a projector or another light source; when the light source 101 is a projector, the projector is a digital projector. The stripe pattern includes simulated laser stripe patterns, laser stripe patterns and other stripe patterns. Preferably, the light source 101 is a projector and the stripe pattern is a natural-light stripe pattern. Compared with a laser stripe pattern, the natural-light stripe pattern is harmless to the human eye, and when it is projected onto a marker point the black edge region is not overly bright, which improves the accuracy of marker point center extraction and hence the splicing precision. The number of stripes projected by the light source 101 is not limited, but to improve scanning efficiency it usually needs to be greater than 15; in this embodiment, the number of stripes is greater than 80.
The left camera 102 and the right camera 103 are used to synchronously capture the left and right speckle images and the left and right stripe images of the measured object 105. The types of the left and right cameras 102, 103 are not limited, as long as they can capture two-dimensional images of the measured object 105. It will be appreciated that the speckle patterns and stripe patterns projected by the light source 101 onto the measured object 105 are deformed by the height modulation of the measured object 105, producing modulated speckle patterns and stripe patterns, and that the left and right cameras 102, 103 obtain the left and right speckle images and the left and right stripe images by capturing the modulated patterns.
The data processing unit 104 is used to obtain speckle 3D data and marker point 3D data from the speckle images, back-project the speckle 3D data and marker point 3D data onto the left and right stripe images to guide the matching of the stripes of the left and right stripe images, and then reconstruct the matched corresponding stripes of the left and right stripe images into 3D stripe point cloud data.
Specifically, the data processing unit 104 comprises a speckle data and marker point data reconstruction module, a stripe matching module and a 3D reconstruction module. The speckle data and marker point data reconstruction module is used to obtain speckle 3D data and marker point 3D data from the speckle images. The stripe matching module is used to back-project the speckle 3D data and marker point 3D data onto the left and right stripe images and guide the matching of the stripes of the left and right stripe images. The 3D reconstruction module is used to reconstruct the matched corresponding stripes of the left and right stripe images into 3D stripe point cloud data: it uses the epipolar geometric constraint of the left and right cameras to find the point-wise correspondences within the corresponding stripe center-line segments of the matched stripes, and then reconstructs the corresponding points into 3D stripe point cloud data according to the calibration parameters. Of course, the manner or technique by which the 3D reconstruction module reconstructs the 3D stripe point cloud data is not limited, as long as the matched left and right stripe images can be reconstructed into 3D stripe point cloud data.
Further, the data processing unit 104 may also comprise a speckle splicing module and a data fusion module. The speckle splicing module is used to perform ICP splicing on the point clouds of the common area of two successive frames of speckle data and to calculate the rotation and translation matrices R, T between the two frames. The data fusion module is used to apply the rotation and translation matrices R, T obtained from speckle splicing to the 3D stripe point cloud data for fusion, thereby realizing 3D scanning of the stripe images.
A scanning method for acquiring or collecting 3D point cloud data of the measured object 105 with the above three-dimensional scanning system comprises the following steps:
(1) Device construction: construct a three-dimensional digital imaging sensor composed of two cameras and a light source.
(2) System calibration: calibrate the left and right cameras to obtain calibration parameters.
The system calibration further comprises: calibrating the left and right cameras to obtain the intrinsic and extrinsic camera parameters and the rotation/translation matrix Mc corresponding to the relative position between the cameras.
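The patent does not prescribe a specific calibration routine. As a minimal illustrative sketch (an assumption, not the patented procedure), a checkerboard-based stereo calibration in OpenCV recovers the intrinsics of both cameras and the rotation/translation between them, i.e. the data behind Mc; all function and variable names here are hypothetical:

    # Hypothetical sketch: stereo calibration yielding the camera intrinsics
    # and the left-to-right rotation/translation (the Mc of the method).
    import cv2
    import numpy as np

    def calibrate_stereo(obj_pts, img_pts_l, img_pts_r, image_size):
        # obj_pts: per-view (N,3) board corner coordinates in the board frame
        # img_pts_l / img_pts_r: matching (N,1,2) corner detections per camera
        _, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, img_pts_l, image_size, None, None)
        _, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, img_pts_r, image_size, None, None)
        # R, T map points from the left camera frame to the right camera frame
        _, K1, d1, K2, d2, R, T, _, _ = cv2.stereoCalibrate(
            obj_pts, img_pts_l, img_pts_r, K1, d1, K2, d2, image_size,
            flags=cv2.CALIB_FIX_INTRINSIC)
        return K1, d1, K2, d2, R, T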
(3) Projection and image acquisition: alternately generate speckle patterns and stripe patterns and project them onto the measured object with the light source; the speckle patterns and stripe patterns are deformed by the height modulation of the measured object, producing modulated speckle patterns and stripe patterns; the left and right cameras synchronously capture the modulated speckle patterns to obtain the left and right speckle images, and synchronously capture the modulated stripe patterns to obtain the left and right stripe images.
(4) Speckle and marker point 3D data reconstruction: perform 3D reconstruction from the captured left and right speckle images to obtain the speckle 3D data PtS and the marker point 3D data PtM.
The speckle 3D data PtS reconstruction includes the following steps: from the captured speckle images, at an image coordinate point pi on the left speckle image, take a rectangular sub-image of 5x5 to 11x11 pixels centered on pi; compute the corresponding epipolar line in the right speckle image from the calibration parameters, take sub-images of the same size centered on all coordinate points (q1~qn) on that epipolar line, and calculate the correlation coefficients C1~Cn between the sub-image at pi in the left speckle image and all sub-images along the epipolar line in the right speckle image; compare the correlation coefficients, define the largest as Cmax, and set a correlation coefficient threshold T; if Cmax is greater than T, the unique matching point pr of the left camera's pi on the right camera is determined; traverse all pixel coordinate points of the left speckle image to find the corresponding matching points in the right speckle image by the above method, and reconstruct the corresponding points into 3D data PtS according to the calibration parameters.
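The correlation search just described can be sketched as follows (a simplified illustration under stated assumptions: grayscale images as numpy arrays, candidate points q1~qn already sampled along the right epipolar line, e.g. with cv2.computeCorrespondEpilines; the window size and threshold values are examples, not values fixed by the patent):

    import numpy as np

    def zncc(a, b):
        # Zero-mean normalized cross-correlation of two equal-size patches
        a = a - a.mean(); b = b - b.mean()
        return float((a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12))

    def match_along_epiline(left, right, pi, candidates, half=5, t=0.8):
        # half=5 gives an 11x11 window (the method uses 5x5 to 11x11);
        # the unique match pr is returned only if the peak coefficient Cmax > T.
        x, y = pi
        ref = left[y - half:y + half + 1, x - half:x + half + 1].astype(np.float64)
        best, cmax = None, -1.0
        for qx, qy in candidates:  # q1..qn on the right epipolar line
            patch = right[qy - half:qy + half + 1, qx - half:qx + half + 1]
            if patch.shape != ref.shape:  # candidate too close to the border
                continue
            c = zncc(ref, patch.astype(np.float64))
            if c > cmax:
                cmax, best = c, (qx, qy)
        return best if cmax > t else None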
The marker point 3D data PtM reconstruction comprises the following steps: from the captured speckle images, extract all marker point centers on the left and right speckle images; find the corresponding matching pairs of marker point centers on the left and right speckle images according to the epipolar constraint; and then reconstruct the corresponding marker points into marker point 3D data PtM according to the calibration parameters.
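The epipolar pairing of marker centers used here (and again during stripe matching) can be sketched as follows, assuming the fundamental matrix F has been derived from the calibration and the marker centers are given as Nx2 pixel arrays; the one-pixel tolerance is an assumed example value:

    import cv2
    import numpy as np

    def match_marker_centers(centers_l, centers_r, F, tol=1.0):
        # Lines come back normalized (a^2 + b^2 = 1), so |ax + by + c| is the
        # point-to-epipolar-line distance in pixels.
        lines = cv2.computeCorrespondEpilines(
            centers_l.reshape(-1, 1, 2).astype(np.float64), 1, F).reshape(-1, 3)
        pairs = []
        for (a, b, c), pl in zip(lines, centers_l):
            d = np.abs(centers_r[:, 0] * a + centers_r[:, 1] * b + c)
            j = int(np.argmin(d))
            if d[j] < tol:
                pairs.append((pl, centers_r[j]))
        return pairs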
(5) Stripe matching: back-project the speckle 3D data PtS and the marker point 3D data PtM onto the left and right stripe images and guide the matching of the left and right stripe images.
The stripe matching further comprises a marker point back-projection offset compensation step: (a) extract the marker point centers on the left and right stripe images and record the marker point pairs PtMultiCoor; (b) find the corresponding matching pairs of marker point centers on the left and right stripe images according to the epipolar constraint; (c) back-project the marker point 3D data PtM from the speckle images onto the modulated left and right stripe images according to the respective calibration intrinsics of the left and right cameras, record the 2D coordinate pairs PtMacthCoor, calculate for each pair of back-projected marker point image coordinates in PtMacthCoor the pixel coordinate deviation from the nearest extracted marker point center pair on the stripe images, calculate the mean deviations on the left and right stripe images respectively, and record the left stripe image mean pixel deviation pixDivL and the right stripe image mean pixel deviation pixDivR; (d) extract the center lines of the left and right stripe images, segment each center line into connected components to form a number of independent line segments, and back-project the speckle 3D data PtS and the corresponding marker point 3D data PtM onto the left and right stripe images according to the respective calibration parameters of the left and right cameras; (e) add pixDivL and pixDivR to the back-projection coordinates of the left and right stripe images respectively to realize the offset compensation; (f) number the offset-compensated back-projection coordinate pairs, so that every corresponding point has a serial number, forming a lookup table between corresponding left and right stripe image coordinates; (g) traverse the serial numbers of every point of every stripe segment on the left stripe image; the matching stripe segment in the right stripe image can then be found directly from the lookup table, achieving accurate matching of the left and right stripe image segments.
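The offset compensation and serial-number lookup can be sketched as follows (a simplified illustration, not the patented implementation; the 2D point sets are numpy arrays and names such as back_l are hypothetical):

    import numpy as np

    def mean_offset(proj_centers, detected_centers):
        # Step (c): mean pixel deviation between back-projected marker centers
        # and the nearest centers extracted from one stripe image
        # (the pixDivL / pixDivR of the method).
        devs = []
        for p in proj_centers:
            d = detected_centers - p
            devs.append(detected_centers[np.argmin((d * d).sum(axis=1))] - p)
        return np.mean(devs, axis=0)

    def build_serial_lookup(back_l, back_r, pix_div_l, pix_div_r, shape):
        # Steps (e)-(f): shift the back-projections by the mean deviations and
        # give each corresponding point pair one serial number; the left image
        # stores the serial at each rounded pixel for direct lookup in step (g).
        table = -np.ones(shape, dtype=np.int64)
        right_of = {}
        for serial, (pl, pr) in enumerate(zip(back_l + pix_div_l, back_r + pix_div_r)):
            x, y = int(round(pl[0])), int(round(pl[1]))
            if 0 <= y < shape[0] and 0 <= x < shape[1]:
                table[y, x] = serial
            right_of[serial] = pr
        return table, right_of

In step (g), traversing a left stripe segment and reading table[y, x] at each center point yields serial numbers that identify the matching right-image segment directly.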
(6) 3D reconstruction: for the matched corresponding stripes of the left and right stripe images, use the epipolar geometric constraint between the left and right cameras to find the point-wise correspondences within corresponding stripe center-line segments, and then reconstruct the corresponding points into 3D stripe point cloud data according to the calibration parameters. The 3D reconstruction further comprises the following steps: for the matched corresponding stripe center-line segments of the left and right images, use the epipolar geometric constraint of the two cameras to find the point-wise correspondences within corresponding stripe center-line segments, and then reconstruct the corresponding point pairs into 3D stripe point cloud data according to the calibration parameters of the system.
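Given the calibration parameters, the reconstruction of matched point pairs can be sketched with OpenCV's linear triangulation (illustrative only; pts_l and pts_r are assumed to be Nx2 arrays of matched, undistorted stripe-center points):

    import cv2
    import numpy as np

    def triangulate_pairs(K1, K2, R, T, pts_l, pts_r):
        # Left camera frame taken as the world frame: P1 = K1 [I|0], P2 = K2 [R|T]
        P1 = K1 @ np.hstack([np.eye(3), np.zeros((3, 1))])
        P2 = K2 @ np.hstack([R, T.reshape(3, 1)])
        X = cv2.triangulatePoints(P1, P2, pts_l.T.astype(np.float64),
                                  pts_r.T.astype(np.float64))
        return (X[:3] / X[3]).T  # Nx3 stripe point cloud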
The three-dimensional scanning method further comprises:
(7) Speckle splicing: perform ICP splicing using the point clouds of the common area of two successive frames of speckle data, and calculate the rotation and translation matrices R, T between the two frames.
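A minimal point-to-point ICP sketch (nearest neighbors plus an SVD best-fit rigid transform) illustrates how R, T between two speckle frames could be estimated; a production system would add overlap filtering and outlier rejection:

    import numpy as np
    from scipy.spatial import cKDTree

    def best_fit_transform(A, B):
        # Least-squares rigid transform mapping point set A onto B (Kabsch/SVD)
        ca, cb = A.mean(axis=0), B.mean(axis=0)
        U, _, Vt = np.linalg.svd((A - ca).T @ (B - cb))
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:  # guard against a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        return R, cb - R @ ca

    def icp(src, dst, iters=30, tol=1e-6):
        # Returns R, T such that dst is approximately (R @ src.T).T + T
        tree = cKDTree(dst)
        cur, prev_err = src.copy(), np.inf
        R_tot, T_tot = np.eye(3), np.zeros(3)
        for _ in range(iters):
            dist, idx = tree.query(cur)  # nearest-neighbor correspondences
            R, T = best_fit_transform(cur, dst[idx])
            cur = (R @ cur.T).T + T
            R_tot, T_tot = R @ R_tot, R @ T_tot + T
            if abs(prev_err - dist.mean()) < tol:
                break
            prev_err = dist.mean()
        return R_tot, T_tot

The data fusion step that follows then simply applies the result to the stripe cloud: fused = (R @ stripe_points.T).T + T.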
(8) Data fusion: apply the rotation and translation matrices R, T obtained from speckle splicing to the 3D stripe point cloud data for fusion, thereby realizing 3D scanning of the stripe images.
Compared with the prior art, in the three-dimensional scanner and scanning method of the present invention, speckle 3D data and marker point 3D data are obtained from the speckle images of the measured object and are then back-projected onto the left and right stripe images to guide the matching of the stripes of the left and right stripe images, yielding 3D stripe point cloud data. Compared with conventional 3D scanners, this scanner has the following advantages: 1. The accuracy of stripe matching is high, so the scanning efficiency of the 3D scanning system can be improved by increasing the number of matched stripes; 2. Real-time splicing can be realized without marker points; 3. There is no need to calibrate the stripe light planes, i.e., no light plane is needed to guide the matching of the left and right images, so the installation accuracy required for the relative positions of the hardware is lower, which reduces system cost.
To further illustrate the three-dimensional scanning system and scanning method of the present invention, a specific embodiment is described below.
Referring to FIG. 1, the structure of the actually designed handheld multi-stripe binocular three-dimensional scanning system is shown in FIG. 1: 101 is a digital projector, 102 is the left camera, 103 is the right camera, 104 is a computer (the data processing unit), and 105 is the measured object.
The calibrated intrinsic parameters of the left camera are:
K1=[2271.084,0,645.632,
0,2265.112,511.553,
0,0,1]
The calibrated intrinsic parameters of the right camera are:
K2=[2275.181,0,644.405,
0,2270.321,510.053,
0,0,1]
The system structure parameters between the left camera and the right camera are:
R=[8.749981e-001,6.547051e-003,4.840819e-001,
-2.904034e-003,9.999615e-001,-8.274993e-003,
-4.841175e-001,5.834813e-003,8.749835e-001]
T=[-1.778995e+002,-4.162821e-001,5.074737e+001]
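As an illustrative use of the published parameters (not code from the patent), these numbers assemble directly into the projection matrices used for the triangulation sketched under step (6):

    import numpy as np

    K1 = np.array([[2271.084, 0.0, 645.632],
                   [0.0, 2265.112, 511.553],
                   [0.0, 0.0, 1.0]])
    K2 = np.array([[2275.181, 0.0, 644.405],
                   [0.0, 2270.321, 510.053],
                   [0.0, 0.0, 1.0]])
    R = np.array([[8.749981e-01, 6.547051e-03, 4.840819e-01],
                  [-2.904034e-03, 9.999615e-01, -8.274993e-03],
                  [-4.841175e-01, 5.834813e-03, 8.749835e-01]])
    T = np.array([-1.778995e+02, -4.162821e-01, 5.074737e+01])

    # Left camera frame as the world frame: P1 = K1 [I|0], P2 = K2 [R|T]
    P1 = K1 @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K2 @ np.hstack([R, T.reshape(3, 1)])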
Following the steps described above, a digital speckle pattern is projected onto the measured object 105, and the left and right cameras 102, 103 synchronously capture the modulated speckle pattern to obtain the left and right speckle images shown in FIG. 2. A digital multi-line stripe pattern is then projected, and the left and right cameras 102, 103 synchronously capture the modulated stripe pattern to obtain the left and right stripe images shown in FIG. 3. 3D reconstruction is performed from the captured digital speckle images to obtain the speckle 3D data shown in FIG. 4 and the marker point 3D data; the speckle 3D data and marker point 3D data are then back-projected onto the left and right stripe images according to the calibration parameters, as shown in FIG. 5, and serial numbers are assigned to the corresponding points on the left and right images, forming a serial-number lookup table. The stripe centers on the left and right stripe images are extracted and segmented into connected components, and corresponding stripe segments are matched according to the serial-number lookup table. For the matched segment pairs, corresponding points are found according to the epipolar geometric constraint of the left and right cameras, as shown in FIG. 6, and 3D reconstruction is then performed according to the calibration parameters to generate 3D stripe point cloud data. The above steps are repeated, real-time splicing is performed using the speckle data, and the splicing matrices are applied to the 3D stripe point cloud data to realize real-time scanning of the stripe images.
Compared with the prior art, in the three-dimensional scanner and scanning method of the present invention, speckle 3D data and marker point 3D data are obtained from the speckle images of the measured object and are then back-projected onto the left and right stripe images to guide the matching of the stripes of the left and right stripe images, yielding 3D stripe point cloud data. Compared with conventional 3D scanners, this scanner has the following advantages: 1. The accuracy of stripe matching is high, so the scanning efficiency of the 3D scanning system can be improved by increasing the number of matched stripes; 2. Real-time splicing can be realized without marker points; 3. There is no need to calibrate the stripe light planes, i.e., no light plane is needed to guide the matching of the left and right images, so the installation accuracy required for the relative positions of the hardware is lower, which reduces system cost.
The above are only preferred embodiments of the present invention and are not intended to limit the present invention. Any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.

Claims (16)

  1. A three-dimensional scanning system for acquiring 3D stripe point cloud data of a measured object, comprising:
    a light source for alternately projecting a plurality of speckle patterns and stripe patterns onto the measured object;
    left and right cameras for synchronously capturing left and right speckle images and left and right stripe images of the measured object;
    a speckle data and marker point data reconstruction module for obtaining speckle 3D data and marker point 3D data from the speckle images;
    a stripe matching module for back-projecting the speckle 3D data and marker point 3D data onto the left and right stripe images and guiding the matching of stripes between the left and right stripe images; and
    a 3D reconstruction module for reconstructing the matched corresponding stripes of the left and right stripe images into 3D stripe point cloud data.
  2. The three-dimensional scanning system according to claim 1, wherein the stripe pattern is a natural-light stripe pattern.
  3. The three-dimensional scanning system according to claim 2, wherein the number of stripes in the stripe pattern is greater than 15.
  4. The three-dimensional scanning system according to claim 1, wherein the three-dimensional scanning system is a handheld three-dimensional scanning system.
  5. The three-dimensional scanning system according to claim 1, further comprising a speckle splicing module for performing ICP splicing on the point clouds of the common area of two successive frames of speckle data and calculating the rotation and translation matrices R, T between the two frames.
  6. The three-dimensional scanning system according to claim 5, further comprising a data fusion module for applying the rotation and translation matrices R, T obtained from speckle splicing to the 3D stripe point cloud data for fusion, thereby realizing 3D scanning of the stripe images.
  7. A three-dimensional scanning system for acquiring 3D stripe point cloud data of a measured object, comprising:
    a light source for alternately projecting a plurality of speckle patterns and stripe patterns onto the measured object;
    left and right cameras for synchronously capturing left and right speckle images and left and right stripe images of the measured object;
    a data processing unit for obtaining speckle 3D data and marker point 3D data from the speckle images, back-projecting the speckle 3D data and marker point 3D data onto the left and right stripe images to guide the matching of the stripes of the left and right stripe images, and reconstructing the matched corresponding stripes of the left and right stripe images into 3D stripe point cloud data.
  8. A three-dimensional scanning method, comprising the following steps:
    (1) device construction: constructing a three-dimensional digital imaging sensor composed of two cameras and a light source;
    (2) system calibration: calibrating the left and right cameras to obtain calibration parameters;
    (3) projection and image acquisition: alternately generating speckle patterns and stripe patterns and projecting them onto a measured object with the light source, the speckle patterns and stripe patterns being deformed by the height modulation of the measured object to produce modulated speckle patterns and stripe patterns; synchronously capturing the modulated speckle patterns with the left and right cameras to obtain left and right speckle images, and synchronously capturing the modulated stripe patterns with the left and right cameras to obtain left and right stripe images;
    (4) speckle and marker point 3D data reconstruction: performing 3D reconstruction from the captured left and right speckle images to obtain speckle 3D data PtS and marker point 3D data PtM;
    (5) stripe matching: back-projecting the speckle 3D data PtS and marker point 3D data PtM onto the left and right stripe images and guiding the matching of the left and right stripe images;
    (6) 3D reconstruction: for the matched corresponding stripes of the left and right stripe images, using the epipolar geometric constraint between the left and right cameras to find the point-wise correspondences within corresponding stripe center-line segments, and then reconstructing the corresponding points into 3D stripe point cloud data according to the calibration parameters.
  9. The three-dimensional scanning method according to claim 8, wherein the system calibration further comprises: calibrating the left and right cameras to obtain the intrinsic and extrinsic camera parameters and the rotation/translation matrix Mc corresponding to the relative position between the cameras.
  10. The three-dimensional scanning method according to claim 8, wherein the speckle and marker point 3D data reconstruction further comprises a speckle 3D data reconstruction step:
    from the captured speckle images, at an image coordinate point pi on the left speckle image, taking a rectangular sub-image of 5x5 to 11x11 pixels centered on pi;
    computing the corresponding epipolar line in the right speckle image from the calibration parameters, taking sub-images of the same size centered on all coordinate points (q1~qn) on that epipolar line, and calculating the correlation coefficients C1~Cn between the sub-image at pi in the left speckle image and all sub-images along the epipolar line in the right speckle image;
    comparing the correlation coefficients, defining the largest as Cmax, and setting a correlation coefficient threshold T; if Cmax is greater than T, determining the unique matching point pr of the left camera's pi on the right camera;
    traversing all pixel coordinate points of the left speckle image to find the corresponding matching points in the right speckle image by the above method, and reconstructing the corresponding points into 3D data PtS according to the calibration parameters.
  11. The three-dimensional scanning method according to claim 10, wherein the speckle and marker point 3D data reconstruction further comprises a marker point 3D data reconstruction step:
    from the captured speckle images, extracting all marker point centers on the left and right speckle images;
    finding the corresponding matching pairs of marker point centers on the left and right speckle images according to the epipolar constraint;
    and then reconstructing the corresponding marker points into marker point 3D data PtM according to the calibration parameters.
  12. The three-dimensional scanning method according to claim 11, wherein the stripe matching further comprises a marker point back-projection offset compensation step, comprising:
    extracting the marker point centers on the left and right stripe images and recording the marker point pairs PtMultiCoor;
    finding the corresponding matching pairs of marker point centers on the left and right stripe images according to the epipolar constraint;
    back-projecting the marker point 3D data PtM from the speckle images onto the modulated left and right stripe images according to the respective calibration intrinsics of the left and right cameras, recording the 2D coordinate pairs PtMacthCoor, calculating for each pair of back-projected marker point image coordinates in PtMacthCoor the pixel coordinate deviation from the nearest pair of marker point centers extracted on the stripe images, calculating the mean of the deviations on the left and right stripe images respectively, and recording the left stripe image mean pixel deviation pixDivL and the right stripe image mean pixel deviation pixDivR.
  13. The three-dimensional scanning method according to claim 12, wherein the three-dimensional matching further comprises the following steps:
    after reconstructing the speckle 3D data PtS and the corresponding marker point 3D data PtM, extracting the center lines of the left and right stripe images;
    segmenting each center line into connected components to form a number of independent line segments, and then back-projecting the speckle 3D data PtS and the corresponding marker point 3D data PtM onto the left and right stripe images according to the respective calibration parameters of the left and right cameras;
    adding the left stripe image mean pixel deviation pixDivL and the right stripe image mean pixel deviation pixDivR to the back-projection coordinates of the left and right stripe images respectively to realize the offset compensation;
    numbering the offset-compensated back-projection coordinate pairs, so that every corresponding point has a serial number, forming a lookup table between corresponding left and right stripe image coordinates;
    traversing the serial numbers of every point of every stripe segment on the left stripe image, whereby the matching stripe segment in the right stripe image can be found directly from the lookup table, thus achieving accurate matching of the left and right stripe image segments.
  14. The three-dimensional scanning method according to claim 8, wherein the 3D reconstruction further comprises: for the matched corresponding stripe center-line segments of the left and right images, using the epipolar geometric constraint of the two cameras to find the point-wise correspondences within corresponding stripe center-line segments, and then reconstructing the corresponding point pairs into 3D stripe point cloud data according to the calibration parameters of the system.
  15. The three-dimensional scanning method according to claim 8, further comprising a speckle splicing step: performing ICP splicing using the point clouds of the common area of two successive frames of speckle data, and calculating the rotation and translation matrices R, T between the two frames.
  16. The three-dimensional scanning method according to claim 15, further comprising a data fusion step: applying the rotation and translation matrices R, T obtained from speckle splicing to the 3D stripe point cloud data for fusion, thereby realizing 3D scanning of the stripe images.
PCT/CN2017/079059 2017-02-24 2017-03-31 Three-dimensional scanning system and scanning method thereof WO2018152929A1 (zh)

Priority Applications (6)

Application Number Priority Date Filing Date Title
AU2017400983A AU2017400983B2 (en) 2017-02-24 2017-03-31 Three-dimensional scanning system and scanning method thereof
EP17897432.5A EP3444560B1 (en) 2017-02-24 2017-03-31 Three-dimensional scanning system and scanning method thereof
US16/094,210 US10810750B1 (en) 2017-02-24 2017-03-31 Three-dimensional scanning system and scanning method thereof
CA3021967A CA3021967C (en) 2017-02-24 2017-03-31 Three-dimensional scanning system and scanning method thereof
JP2018560017A JP6619893B2 (ja) 2017-02-24 2017-03-31 3次元走査システムおよびその走査方法
KR1020197028007A KR102248944B1 (ko) 2017-02-24 2017-03-31 3차원 스캐닝 시스템 및 그 스캐닝 방법

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710105838.0A CN106802138B (zh) 2017-02-24 2017-02-24 一种三维扫描系统及其扫描方法
CN201710105838.0 2017-02-24

Publications (1)

Publication Number Publication Date
WO2018152929A1 (zh) 2018-08-30

Family

ID=58988568

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/079059 WO2018152929A1 (zh) Three-dimensional scanning system and scanning method thereof

Country Status (8)

Country Link
US (1) US10810750B1 (zh)
EP (1) EP3444560B1 (zh)
JP (1) JP6619893B2 (zh)
KR (1) KR102248944B1 (zh)
CN (1) CN106802138B (zh)
AU (1) AU2017400983B2 (zh)
CA (1) CA3021967C (zh)
WO (1) WO2018152929A1 (zh)


Families Citing this family (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108269279B (zh) 2017-07-17 2019-11-08 先临三维科技股份有限公司 基于单目三维扫描系统的三维重构方法和装置
CN107680039B (zh) * 2017-09-22 2020-07-10 武汉中观自动化科技有限公司 一种基于白光扫描仪的点云拼接方法及系统
CN108269300B (zh) * 2017-10-31 2019-07-09 先临三维科技股份有限公司 牙齿三维数据重建方法、装置和系统
CN107869967A (zh) * 2017-11-02 2018-04-03 渭南领智三维科技有限公司 一种人体足部快速三维扫描方法
TWI651687B (zh) * 2017-11-24 2019-02-21 財團法人工業技術研究院 三維模型建構方法及其系統
CN108362220A (zh) * 2018-01-19 2018-08-03 中国科学技术大学 用于印制线路板的三维形貌测量及缺陷检测的方法
CN110070598B (zh) * 2018-01-22 2022-11-25 宁波盈芯信息科技有限公司 用于3d扫描重建的移动终端及其进行3d扫描重建方法
FI128523B (en) 2018-06-07 2020-07-15 Ladimo Oy Modeling of topography of a 3D surface
CN109373913B (zh) * 2018-09-04 2020-09-29 中国科学院力学研究所 一种非接触式受电弓弓头碳滑块厚度检测方法
CN109410318B (zh) 2018-09-30 2020-09-08 先临三维科技股份有限公司 三维模型生成方法、装置、设备和存储介质
CN109541875B (zh) * 2018-11-24 2024-02-13 深圳阜时科技有限公司 一种光源结构、光学投影模组、感测装置及设备
CN111508012B (zh) * 2019-01-31 2024-04-19 先临三维科技股份有限公司 线条纹误配检测和三维重建的方法、装置
CN110702025B (zh) * 2019-05-30 2021-03-19 北京航空航天大学 一种光栅式双目立体视觉三维测量系统及方法
CN110223388B (zh) * 2019-05-31 2022-12-16 中国科学院深圳先进技术研究院 基于空间结构光的三维重建方法、装置、终端设备及存储介质
CN110044266B (zh) * 2019-06-03 2023-10-31 易思维(杭州)科技有限公司 基于散斑投影的摄影测量系统
CN111102938B (zh) * 2019-06-20 2022-09-06 杭州光粒科技有限公司 一种物体三维形貌测量方法、系统和计算机可读存储介质
CN110595392B (zh) * 2019-09-26 2021-03-02 桂林电子科技大学 一种十字线结构光双目视觉扫描系统及方法
DE102020130877A1 (de) 2019-12-09 2021-06-10 Mitutoyo Corporation Messvorrichtung für dreidimensionale geometrien und messverfahren für dreidimensionale geometrien
CN111023970B (zh) * 2019-12-17 2021-11-16 杭州思看科技有限公司 多模式三维扫描方法及系统
CN111023999B (zh) * 2019-12-26 2020-12-01 北京交通大学 一种基于空间编码结构光的稠密点云生成方法
CN111145342B (zh) * 2019-12-27 2024-04-12 山东中科先进技术研究院有限公司 一种双目散斑结构光三维重建方法及系统
CN111325683B (zh) * 2020-01-23 2023-06-20 深圳市易尚展示股份有限公司 基于复合编码三维重建的散斑灰度校正方法和装置
CN111473744B (zh) * 2020-06-03 2022-01-14 北京航空航天大学 一种基于散斑嵌入相移条纹的三维形貌视觉测量方法及系统
CN111947600B (zh) * 2020-07-24 2022-05-20 南京理工大学 基于相位级次代价滤波的鲁棒立体相位展开方法
US20230298270A1 (en) * 2020-08-05 2023-09-21 Medit Corp. Method and device for acquiring three-dimensional data, and computer-readable storage medium storing program for performing method
CN111982025A (zh) * 2020-08-21 2020-11-24 苏州合之木智能科技有限公司 用于模具检测的点云数据获取方法及系统
KR20220026238A (ko) * 2020-08-25 2022-03-04 주식회사 메디트 복수의 이미지 센서를 구비하는 3차원 스캐너
CN112330732A (zh) * 2020-09-29 2021-02-05 先临三维科技股份有限公司 三维数据拼接方法及三维扫描系统、手持扫描仪
CN112489190A (zh) * 2020-11-18 2021-03-12 新拓三维技术(深圳)有限公司 一种全自动室内扫描方法、系统及计算机可读存储介质
CN112633293B (zh) * 2020-11-24 2022-05-20 北京航空航天大学青岛研究院 一种基于图分割的三维稀疏点云重建图像集分类方法
CN112754658B (zh) * 2020-12-31 2023-03-14 华科精准(北京)医疗科技有限公司 一种手术导航系统
CN113008164A (zh) * 2021-03-23 2021-06-22 南京理工大学 快速高精度三维面形重构方法
CN113137938B (zh) * 2021-04-13 2023-04-25 思看科技(杭州)股份有限公司 三维扫描系统、方法、计算机设备和存储介质
CN113129357B (zh) * 2021-05-10 2022-09-30 合肥工业大学 一种复杂背景下三维扫描测量光条中心提取方法
CN113971691A (zh) * 2021-09-16 2022-01-25 中国海洋大学 一种基于多视角双目结构光的水下三维重建方法
CN114332373A (zh) * 2021-12-29 2022-04-12 华侨大学 一种克服继电器金属表面反光的磁路落差检测方法及系统
CN116206069B (zh) * 2023-04-28 2023-10-13 思看科技(杭州)股份有限公司 三维扫描中的图像数据处理方法、装置和三维扫描仪
CN116576791B (zh) * 2023-06-09 2024-05-07 常州高常科技有限公司 三维全场应变测量分析系统
CN117579754B (zh) * 2024-01-16 2024-05-07 思看科技(杭州)股份有限公司 三维扫描方法、装置、计算机设备以及存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070211258A1 (en) * 2006-03-07 2007-09-13 Korea Advanced Institute Of Science And Technology Three-dimensional shape measurement apparatus and method for eliminating2pi ambiguity of moire principle and omitting phase shifting means
CN101089547A (zh) * 2007-07-11 2007-12-19 华中科技大学 一种基于彩色结构光的二维三频解相测量方法
CN101608908A (zh) * 2009-07-20 2009-12-23 杭州先临三维科技股份有限公司 数字散斑投影和相位测量轮廓术相结合的三维数字成像方法
CN101739717A (zh) * 2009-11-12 2010-06-16 天津汇信软件有限公司 三维彩色点云非接触扫描方法
CN101794461A (zh) * 2010-03-09 2010-08-04 深圳大学 一种三维建模方法及系统
CN103868524A (zh) * 2013-12-23 2014-06-18 西安新拓三维光测科技有限公司 一种基于散斑图案的单目测量系统标定方法及装置
CN104596439A (zh) * 2015-01-07 2015-05-06 东南大学 一种基于相位信息辅助的散斑匹配三维测量方法

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2003219433A1 (en) * 2003-04-25 2004-11-23 Ecole Polytechnique Federale De Lausanne (Epfl) Shape and deformation measurements of large objects by fringe projection
US20070057946A1 (en) 2003-07-24 2007-03-15 Dan Albeck Method and system for the three-dimensional surface reconstruction of an object
US7912673B2 (en) * 2005-03-11 2011-03-22 Creaform Inc. Auto-referenced system and apparatus for three-dimensional scanning
US8400494B2 (en) * 2005-10-11 2013-03-19 Primesense Ltd. Method and system for object reconstruction
CA2528791A1 (en) * 2005-12-01 2007-06-01 Peirong Jia Full-field three-dimensional measurement method
JP5049246B2 (ja) * 2008-10-29 2012-10-17 アイシン精機株式会社 物体形状評価装置
DK2438397T3 (en) * 2009-06-01 2019-01-28 Dentsply Sirona Inc Method and device for three-dimensional surface detection with a dynamic frame of reference
JP2013061248A (ja) * 2011-09-13 2013-04-04 Konica Minolta Advanced Layers Inc 情報処理装置および情報処理プログラム
TWI503618B (zh) * 2012-12-27 2015-10-11 Ind Tech Res Inst 深度影像擷取裝置、其校正方法與量測方法
US9349174B2 (en) * 2013-05-31 2016-05-24 Microsoft Technology Licensing, Llc Absolute phase measurement with secondary pattern-embedded fringe
TWI485361B (zh) * 2013-09-11 2015-05-21 Univ Nat Taiwan 三維形貌輪廓量測裝置及其方法
CN106662433B (zh) * 2014-06-27 2019-09-06 新加坡恒立私人有限公司 结构化光成像系统及方法
CN104132624B (zh) * 2014-08-14 2017-01-11 北京卫星环境工程研究所 基于散斑干涉和条纹投影测量航天器结构变形的装置及测量方法
CN105203044B (zh) * 2015-05-27 2019-06-11 珠海真幻科技有限公司 以计算激光散斑为纹理的立体视觉三维测量方法及系统
CN105241397A (zh) * 2015-06-29 2016-01-13 北航温州研究院 基于结构光的实时测量拼接方法及其设备
DE102016002398B4 (de) * 2016-02-26 2019-04-25 Gerd Häusler Optischer 3D-Sensor zur schnellen und dichten Formerfassung
CN105928472B (zh) * 2016-07-11 2019-04-16 西安交通大学 一种基于主动斑投射器的三维形貌动态测量方法
EP3680857B1 (en) * 2017-09-11 2021-04-28 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method and apparatus, electronic device and computer-readable storage medium
CN108088391B (zh) * 2018-01-05 2020-02-07 深度创新科技(深圳)有限公司 一种三维形貌测量的方法和系统
US11105754B2 (en) * 2018-10-08 2021-08-31 Araz Yacoubian Multi-parameter inspection apparatus for monitoring of manufacturing parts
EP3680607A1 (en) * 2019-01-08 2020-07-15 Rolls-Royce plc Surface roughness measurement


Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109583377B (zh) * 2018-11-30 2022-12-27 北京理工大学 一种管路模型重建的控制方法、装置及上位机
CN109583377A (zh) * 2018-11-30 2019-04-05 北京理工大学 一种管路模型重建的控制方法、装置及上位机
CN111833392A (zh) * 2019-04-16 2020-10-27 杭州思看科技有限公司 标记点多角度扫描方法、系统及装置
CN111008602A (zh) * 2019-12-06 2020-04-14 青岛海之晨工业装备有限公司 一种小曲率薄壁零件用二维和三维视觉结合的划线特征提取方法
CN111008602B (zh) * 2019-12-06 2023-07-25 青岛海之晨工业装备有限公司 一种小曲率薄壁零件用二维和三维视觉结合的划线特征提取方法
CN111883271A (zh) * 2020-06-03 2020-11-03 湖北工业大学 核反应堆压力容器自动检测平台精确定位方法及系统
CN111883271B (zh) * 2020-06-03 2022-08-16 湖北工业大学 核反应堆压力容器自动检测平台精确定位方法及系统
CN112325798A (zh) * 2020-10-28 2021-02-05 南京理工大学智能计算成像研究院有限公司 一种基于相位单调一致性的双目远心匹配方法
CN112815843A (zh) * 2021-01-07 2021-05-18 西安理工大学 一种3d打印过程中工件表面打印偏差的在线监测方法
CN112815843B (zh) * 2021-01-07 2023-12-29 西安理工大学 一种3d打印过程中工件表面打印偏差的在线监测方法
CN113793387A (zh) * 2021-08-06 2021-12-14 中国科学院深圳先进技术研究院 单目散斑结构光系统的标定方法、装置及终端
CN114463251A (zh) * 2021-12-13 2022-05-10 西安交通大学 一种测量航空发动机中介机匣内表面变形的方法及装置
CN114463251B (zh) * 2021-12-13 2024-03-15 西安交通大学 一种测量航空发动机中介机匣内表面变形的方法及装置
CN114993207A (zh) * 2022-08-03 2022-09-02 广东省智能机器人研究院 基于双目测量系统的三维重建方法
CN117629105A (zh) * 2023-12-06 2024-03-01 北京锐达仪表有限公司 物料三维形态测量系统

Also Published As

Publication number Publication date
CA3021967A1 (en) 2018-08-30
JP2019516983A (ja) 2019-06-20
AU2017400983A8 (en) 2019-12-12
EP3444560A4 (en) 2019-04-03
CN106802138A (zh) 2017-06-06
JP6619893B2 (ja) 2019-12-11
AU2017400983B2 (en) 2020-05-21
US10810750B1 (en) 2020-10-20
EP3444560B1 (en) 2020-06-17
US20200334840A1 (en) 2020-10-22
KR102248944B1 (ko) 2021-05-07
EP3444560A1 (en) 2019-02-20
CN106802138B (zh) 2019-09-24
CA3021967C (en) 2019-05-28
KR20190121359A (ko) 2019-10-25
AU2017400983A1 (en) 2019-10-17

Similar Documents

Publication Publication Date Title
WO2018152929A1 (zh) Three-dimensional scanning system and scanning method thereof
WO2018103152A1 (zh) Three-dimensional digital imaging sensor, three-dimensional scanning system and scanning method thereof
EP3457078B1 (en) Monocular three-dimensional scanning system based three-dimensional reconstruction method and apparatus
CN110288642B (zh) Fast three-dimensional object reconstruction method based on a camera array
US6781618B2 (en) Hand-held 3D vision system
CN106683173B (zh) Method for increasing the density of a 3D reconstruction point cloud based on neighborhood block matching
US7953271B2 (en) Enhanced object reconstruction
JP5583761B2 (ja) Method and apparatus for three-dimensional surface detection using a dynamic reference frame
CN104077804A (zh) Method for constructing a three-dimensional face model from multi-frame video images
CN110940295B (zh) Measurement method and system for highly reflective objects based on laser speckle limit-constrained projection
CN111047678B (zh) Three-dimensional face acquisition device and method
WO2018056802A1 (en) A method for estimating three-dimensional depth value from two-dimensional images
Li et al. Accurate and efficient 3D reconstruction system for the human body with color texture based on DIC
CN110363806A (zh) Method for modeling three-dimensional space using invisible-light projected features
CN115442584A (zh) Multi-sensor fusion dynamic projection method for irregular surfaces
Xu et al. A visual processing system for facial prediction

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 3021967

Country of ref document: CA

ENP Entry into the national phase

Ref document number: 2018560017

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2017897432

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2017897432

Country of ref document: EP

Effective date: 20181115

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17897432

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 20197028007

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2017400983

Country of ref document: AU

Date of ref document: 20170331

Kind code of ref document: A