CN104299211B - Free-moving type three-dimensional scanning method - Google Patents


Info

Publication number
CN104299211B
Authority
CN
China
Prior art keywords
measurement, point cloud, angle, frequency, coordinate system
Prior art date
Legal status
Active
Application number
CN201410494897.8A
Other languages
Chinese (zh)
Other versions
CN104299211A (en)
Inventor
李欢欢
郭家玉
李玉勤
王超
赵磊
杨涛
Current Assignee
Suzhou cartesan Testing Technology Co. Ltd
Original Assignee
Suzhou Dika Testing Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Dika Testing Technology Co ltd
Priority to CN201410494897.8A
Publication of CN104299211A
Application granted
Publication of CN104299211B
Status: Active

Abstract

The invention discloses a free-moving three-dimensional scanning method. It belongs to the field of optical detection and relates to a full-field three-dimensional contour measurement method, handheld measuring equipment, and a multi-angle splicing method based on texture, surface curvature, and normal direction. The method adopts tri-frequency color fringe projection: a projection device projects color fringes with frequency ratio 1:4:12 onto the surface of the measured object, and a camera captures the deformed fringe pattern in a single-frame exposure. On one hand, the high-frequency carrier unwrapped phase is computed from the deformed fringe pattern by a tri-frequency phase unwrapping method, and the three-dimensional point cloud of the measured object is obtained from the system calibration parameters. On the other hand, the texture map of the measured object's surface can be recovered from the deformed fringe pattern. Handheld scanning places higher demands on multi-angle splicing; the invention realizes marker-free splicing of multi-angle point clouds by combining color texture features with surface-curvature and normal-direction constraints.

Description

Free-moving type three-dimensional scanning method
Technical Field
The invention belongs to the field of optical detection, and relates to a full-field three-dimensional contour measuring method, handheld measuring equipment and a multi-angle splicing method based on textures, curved surface curvatures and normal directions.
Background
Some portable scanners currently on the market are small and portable, but each measurement requires multi-frame continuous exposure to acquire deformed fringe patterns before a three-dimensional point cloud can be reconstructed. The typical workflow is: select a position, fix the instrument, measure, move to a new position, fix, and measure again. This working mode has low measurement efficiency, cannot support handheld continuous scanning, and is unsuitable for scanning dynamic objects. Other portable scanners use handheld scanning together with stereoscopic vision, but marker points must be attached to the measured object before measurement, which lowers efficiency and increases workload; moreover, the machine-vision method can only reconstruct points associated with the markers, so a dense point cloud cannot be obtained. Laser three-dimensional scanners can also be made portable, but they likewise require the scanner to be kept still during measurement, so handheld scanning cannot be realized; in addition, laser scanners perform point or line scanning, so their scanning efficiency is low.
Relative to the above techniques, multi-angle stitching is also an important issue for handheld three-dimensional scanners. On one hand, splicing can be achieved by calibrating the scanner position at each measurement, or the position of the measured object; but calibration requires auxiliary equipment, such as a displacement table, which increases cost and limits the application occasions. On the other hand, splicing can be assisted by attaching marker points to the surface of the measured object, which lengthens the measurement period and complicates the measurement.
In view of the above, there is an urgent need for a scanning method that can meet the requirements of handheld measurement and dynamic measurement, and can well splice the measurement results of various angles.
Disclosure of Invention
The invention aims to realize a portable, convenient-to-operate handheld scanning method that is suitable for most measured objects, does not require a large number of marker points, and solves the splicing problem of multi-angle measurement.
The free-moving three-dimensional scanning method includes a color camera, a color projection device, and a controller. The camera and the projection device are arranged at an acute angle; a control button controls the start and stop of measurement; the projection device projects tri-frequency color sinusoidal fringes onto the surface of the measured object; and the controller triggers the camera, which captures the color deformed fringe pattern in a single-frame exposure. On one hand, the high-frequency carrier unwrapped phase is computed from the deformed fringe pattern by a tri-frequency phase unwrapping method, and the three-dimensional point cloud of the measured object is obtained from the system calibration parameters. On the other hand, the tri-frequency sinusoidal fringe components are obtained from the deformed fringe pattern by a frequency-domain minimization method, and subtracting them from the color deformed fringe pattern recovers the texture map of the measured object's surface.
From two adjacent frames of texture maps, two sets of pairwise-corresponding sparse corner points are obtained by the SIFT feature point extraction method; an initial coordinate transformation matrix p1 between the two adjacent frames of measurement point clouds is calculated from the spatial points corresponding to those corners, and the two three-dimensional point clouds are unified into the former point cloud's coordinate system by this coordinate transformation. Each measured three-dimensional point cloud of the measured object is divided into triangular patches, and the curvature and normal of each patch are calculated; after the coordinate systems are unified, a coordinate transformation matrix p2 that further splices the two point clouds is solved by maximizing the global curvature correlation and normal-direction correlation of the three-dimensional surfaces, and the coordinate transformation is applied. The two coordinate-transformed point clouds are then iterated with the ICP (Iterative Closest Point) method to find a coordinate transformation matrix p3 that splices them together still more accurately. The transformation for the (i+1)th frame is therefore P(i) = p3 · p2 · p1; left-multiplying the three-dimensional point cloud obtained by the (i+1)th exposure by P(1) · P(2) · ... · P(i) splices it together with the 1st-frame measurement result. When all measured point clouds have been spliced with the 1st-frame result, the complete splicing of the measurement is finished.
In the method of claim 1, the camera exposure is automatically adjusted according to the degree of fringe modulation. Specifically, the high-frequency component of the tri-frequency color fringes is obtained by empirical mode decomposition; its two-dimensional amplitude is extracted by the two-dimensional Hilbert transform; the maximum and minimum of the two-dimensional amplitude are computed; the difference between these and a set value is calculated; and the exposure time is adjusted so that this difference is minimized. The exposure is adjusted thereby.
In the method of claim 1, the scanner can be triggered only within the calibrated measuring distance; the operator should control the distance between the scanner and the measured object accordingly, and indicator lights 1 and 2 respectively indicate whether the measuring distance is within the measuring range. The measurement modes are divided into jog measurement and continuous measurement.
(1) Jog measurement: each press of the button completes one measurement. If multiple measurement results need to be spliced, the operator must ensure an overlap rate above 60% between two adjacent measurements of the measured object.
(2) Continuous measurement: the controller triggers the camera continuously while the operator continuously moves the scanner. The frame rate can be adjusted according to the size of the field of view and the moving speed so as to guarantee an overlap rate above 60%. For example, for a field of view of 0.5 m × 0.5 m, a moving speed of 1 m/s, and a signal-generator trigger rate of 4 frames/s, the overlap rate between two adjacent measurements can be kept above 60%. To avoid an excessive data volume caused by too much overlap, the trigger rate is set to 5-10 frames/s for handheld measurement.
(3) Triggering is possible only within the calibrated measuring distance; the operator needs to control the distance between the scanner and the measured object accordingly.
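The overlap requirement in (2) fixes a minimum trigger rate for a given field of view and hand speed. A minimal sketch of that arithmetic (the 60% target and the 0.5 m field come from the text; the function name is our own):

```python
def min_frame_rate(field_m: float, speed_m_s: float, overlap: float) -> float:
    """Minimum trigger rate (frames/s) so that two consecutive frames of a
    `field_m`-wide view overlap by at least `overlap` (0..1) when the
    scanner moves at `speed_m_s`."""
    # the displacement allowed between frames is the non-overlapping strip
    max_step_m = (1.0 - overlap) * field_m
    return speed_m_s / max_step_m

# For the 0.5 m x 0.5 m field and 1 m/s hand speed quoted above, a 60%
# overlap requires at least 5 frames/s, consistent with the 5-10 frames/s
# trigger rate recommended in the text.
fps = min_frame_rate(field_m=0.5, speed_m_s=1.0, overlap=0.60)
print(fps)
```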
In the method of claim 1, dynamic object measurement can be realized: with the measured object within the measuring range, the scanner fixed, and the frame rate set, the position and shape of the measured surface at each moment can be recorded using the continuous measurement mode, and the data can be used for motion-state analysis and motion-deformation analysis.
In the freely movable three-dimensional scanning method of claim 1, the projection device projects a tri-frequency color fringe pattern generated by the computer: the three RGB color channels carry the sinusoidal fringes of the low, middle, and high carrier frequencies respectively, and the frequency ratio of the tri-frequency color sinusoidal fringes is 1:4:12.
In the handheld 3D real-time scanner of claim 1, the decomposition into high, medium, and low frequency components proceeds as follows: the color channel containing the high-frequency fringes is subtracted from the color channel containing the medium-frequency fringes in the deformed color fringe pattern to reduce the background light intensity, yielding a composite fringe pattern; the high and medium carrier-frequency components are then separated by Fourier transform. The medium and low carrier-frequency components are decomposed in the same way. Finally, a variable-precision unwrapping algorithm completes the wrapped-phase expansion using the low, medium, and high carrier-frequency components in sequence, giving the unwrapped phase of the high-frequency carrier term. The object height can then be recovered from the unwrapped phase using the calibration result.
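The low-to-high unwrapping cascade described above can be sketched numerically. Assuming, as the text implies, that each wrapped phase is known on (−π, π] and the frequency ratio is 1:4:12, the unwrapped phase of the lower carrier, scaled by the frequency ratio, predicts the 2π fringe order of the next carrier (a sketch on synthetic data, not the patent's variable-precision algorithm):

```python
import numpy as np

def unwrap_with_reference(phi_wrapped, phi_ref_unwrapped, ratio):
    """Temporal phase unwrapping: the unwrapped phase of a lower carrier,
    times the frequency ratio, selects the 2*pi fringe order of a higher,
    wrapped carrier phase."""
    prediction = phi_ref_unwrapped * ratio
    order = np.round((prediction - phi_wrapped) / (2 * np.pi))
    return phi_wrapped + 2 * np.pi * order

# synthetic 1:4:12 example along one scan line; the unit-frequency phase
# spans a single fringe and is therefore unambiguous by itself
x = np.linspace(0.0, 1.0, 512)
phase_unit = 2 * np.pi * x
phase_mid_true = 4 * phase_unit
phase_high_true = 12 * phase_unit
wrap = lambda p: np.angle(np.exp(1j * p))   # wrap into (-pi, pi]

phase_mid = unwrap_with_reference(wrap(phase_mid_true), phase_unit, 4)
phase_high = unwrap_with_reference(wrap(phase_high_true), phase_mid, 3)  # 12/4 = 3
print(np.allclose(phase_high, phase_high_true))
```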
The invention has the advantages that:
1) Three-dimensional point cloud measurement is completed with a single-frame exposure by projecting a tri-frequency color sinusoidal fringe pattern, and the resulting point cloud is dense and highly accurate;
2) the camera exposure time is adjusted automatically according to the color, texture, reflectivity, and other properties of the measured object;
3) two measurement modes, jog measurement and continuous measurement, are available; in the continuous mode the scanner exposes continuously at a set frame rate to realize continuous measurement;
4) single-angle three-dimensional measurement of a moving surface can be realized;
5) the tri-frequency sinusoidal fringe components are obtained by a frequency-domain minimization method and subtracted from the color deformed fringe pattern to recover the texture map of the measured object's surface;
6) splicing of multi-angle, multi-frame exposure measurements is realized from the texture map recovered from the deformed fringe pattern together with surface curvature, normal direction, and other information.
Drawings
Fig. 1 shows a technical roadmap for a handheld three-dimensional scanning method;
FIG. 2 illustrates the projected color sine-fringe pattern (R:G:B = 1:4:12);
FIG. 3 illustrates a three-frequency color sinusoidal fringe profilometry and color texture recovery technique route;
FIG. 4 illustrates a method for stitching two adjacent angle measurements;
FIG. 5 illustrates a multi-angle scanning result stitching method;
FIG. 6 illustrates a six-angle scan model of a human face;
FIG. 7 illustrates a multi-angle presentation of the stitched model.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings.
1. General technical route map
The handheld three-dimensional scanning method described in this patent applies to a mobile, handheld, portable three-dimensional scanner that scans the measured object from multiple angles and splices the results into a complete three-dimensional model. The method has two measurement modes, jog measurement and continuous measurement. The measurement process and technical scheme are the same in both modes; the general technical route is shown in Fig. 1.
(1) Select a measurement mode, jog measurement or continuous measurement. In jog measurement, a measurement angle is selected and one measurement is performed per trigger. In continuous measurement, the camera exposes continuously while the scanner is moved continuously; for example, for a field of view of 0.5 m × 0.5 m, a moving speed of 1 m/s, and a signal-generator trigger rate of 4 frames/s, the overlap rate between two adjacent measurements can be kept above 60%. To avoid an excessive data volume caused by too much overlap, the trigger rate is set to 5-10 frames/s for handheld measurement.
(2) Selecting a measurement angle of a measured object, wherein the coincidence rate of adjacent angles is more than 60%;
(3) the camera automatically adjusts the exposure time according to the illumination, reflectivity, and other information at the different angles; the adjustment method is detailed in Section 2 below;
(4) scanning measurement is performed at the angle selected in step (2), and the three-dimensional point cloud and texture map of each angle are obtained by tri-frequency color phase-shift profilometry and color fringe decoupling; the detailed steps are shown in Fig. 3;
(5) the three-dimensional point clouds of the different angles obtained in step (4) are spliced into a complete model; the splicing method is shown in Figs. 4 and 5.
2. Automatic exposure adjustment
The high-frequency component of the tri-frequency color fringes is obtained by empirical mode decomposition; its two-dimensional amplitude is extracted by the two-dimensional Hilbert transform; the maximum and minimum of the two-dimensional amplitude are computed; the difference between these and a set value is calculated; and the exposure time is adjusted to minimize this difference, thereby adjusting the exposure.
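The amplitude criterion above can be sketched in one dimension. The 2-D empirical mode decomposition and 2-D Hilbert transform of the text are replaced here, for brevity, by a per-row analytic-signal envelope computed with an FFT; the set-value comparison is one reading of the criterion in the text, and all names are our own:

```python
import numpy as np

def fringe_amplitude(row):
    """Envelope of a (near-)sinusoidal fringe profile via the analytic signal."""
    n = row.size
    spec = np.fft.fft(row - row.mean())
    h = np.zeros(n)                      # analytic-signal weights
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.abs(np.fft.ifft(spec * h))

def exposure_error(image, set_value):
    """|(max - min) of the 2-D amplitude map - set value|; a controller
    would step the exposure time in the direction that shrinks this."""
    amp = np.apply_along_axis(fringe_amplitude, 1, image)
    return abs((amp.max() - amp.min()) - set_value)
```

For a pure fringe `0.5 + 0.3*cos(...)` with an integer number of periods per row, `fringe_amplitude` returns 0.3 everywhere, so a uniformly modulated image gives a (max − min) of zero.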
3. Tri-frequency color phase shift profilometry and texture recovery
The handheld three-dimensional scanning method described in this patent adopts color grating projection: the tri-frequency color sinusoidal fringe pattern shown in Fig. 2, with an RGB channel frequency ratio of 1:4:12, is projected onto the surface of the measured object; a color camera shoots the deformed fringe pattern on the object's surface; the high, medium, and low carrier frequencies are separated by Fourier transform; the three frequency phases are then extracted and phase unwrapping is performed to recover the high-frequency carrier phase; and finally the three-dimensional point cloud is obtained through the parameters calibrated by the system. The tri-frequency sinusoidal fringe components are obtained from the deformed fringe pattern by a frequency-domain minimization method and subtracted from the color deformed fringe pattern to recover the texture map of the measured object's surface. The specific steps for obtaining the three-dimensional point cloud are shown in Fig. 3:
(1) the projection equipment projects the color tri-frequency color stripes onto the surface of the measured object, as shown in the attached figure 2;
(2) shooting a deformed stripe image of the surface of a measured object by a color camera;
(3) the high, medium, and low carrier frequencies are separated by Fourier transform, giving three deformed fringe patterns (high, medium, low);
(4) HHT (Hilbert-Huang transform) is used for adaptive fringe analysis and phase extraction, giving the high, medium, and low frequency wrapped phases;
(5) obtaining a high-precision absolute phase by a time domain variable precision unwrapping method;
(6) the three-dimensional point cloud of the measured object is obtained from the system calibration parameters according to the absolute phase obtained in step (5).
The steps for obtaining the texture map are as follows:
(7) the high, medium, and low frequency components obtained in step (3) are subtracted from the color deformed fringe pattern obtained in step (2) to obtain the background component;
(8) the background component obtained in step (7) is separated by texture-illumination separation to recover the original texture map of the measured object.
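Steps (7)-(8) amount to removing the three sinusoidal carriers from the color frame so that only the background (texture) survives. A minimal sketch on synthetic data; the crude per-channel notch filter below stands in for the frequency-domain minimization named in the text, and all frequencies are illustrative:

```python
import numpy as np

n = 600
x = np.arange(n) / n
texture = 0.4 + 0.1 * np.cos(2 * np.pi * 1 * x)   # slowly varying surface texture
carriers = (4, 16, 48)                            # R,G,B fringe frequencies, ratio 1:4:12
frame = np.stack([texture + 0.25 * np.cos(2 * np.pi * f * x) for f in carriers])

def strip_carrier(channel, bin_):
    """Remove a known sinusoidal carrier from one color channel by zeroing
    its frequency bin (a stand-in for frequency-domain minimization)."""
    spec = np.fft.rfft(channel)
    spec[bin_] = 0.0
    return np.fft.irfft(spec, channel.size)

background = np.stack([strip_carrier(c, f) for c, f in zip(frame, carriers)])
print(np.allclose(background, np.stack([texture] * 3)))  # carriers removed, texture survives
```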
4. Embodiments of automated splicing
The handheld three-dimensional scanning method described in this invention realizes automatic, seamless splicing without marker points. The splicing relies mainly on the texture information recovered from the deformed fringe pattern, with the curvature and normal direction of the surface meshed from the measured point cloud as constraints; the positional relation between two adjacent frames of measurement results is found, and global splicing is finally achieved.
The coordinate system of the point cloud scanned at each angle coincides with the camera coordinate system of the scanning position at that time, so point clouds scanned at different angles are not in the same coordinate system; converting point clouds in different coordinate systems into one common coordinate system is precisely the splicing of the point clouds. Assume the world coordinate system coincides with the camera coordinate system at the 1st angle; converting the point clouds obtained at the n measured angles into the world coordinate system realizes multi-angle splicing. The world coordinate system is written (X_w, Y_w, Z_w, O_w), the camera coordinate system at the ith angle is written (X_i, Y_i, Z_i, O_i), and (X_w, Y_w, Z_w, O_w) = (X_1, Y_1, Z_1, O_1). A point of the point cloud is written

M_{i,j}^l = (x_{i,j}^l, y_{i,j}^l, z_{i,j}^l)^T,

where i denotes the point cloud obtained by the ith measurement, j indexes the points in that point cloud, and l denotes coordinates expressed in the lth angular coordinate system.
(1) Splicing results of two adjacent angle scans
For the splicing of the exposure measurement results at two adjacent angles, the splicing algorithm of the point clouds obtained by two adjacent scans is described below, taking the three-dimensional point clouds measured at the ith and (i+1)th angles as an example. The point cloud obtained by the ith measurement is written

{M_{i,1}, M_{i,2}, ..., M_{i,k}},

where k is the number of points in point cloud i; the point cloud obtained by the (i+1)th measurement is written

{M_{i+1,1}, M_{i+1,2}, ..., M_{i+1,s}},

where s is the number of points in point cloud i+1. The algorithm flow is shown in Fig. 4:
a) SIFT (Scale-Invariant Feature Transform), a feature point extraction method, is used to find sparse corresponding feature points in texture maps i and i+1, denoted {a1, a2, ..., am} and {b1, b2, ..., bm}, where m is the number of corresponding points found in the texture maps;
b) The RANSAC (Random Sample Consensus) method is used to remove points with large matching errors; the retained points are denoted {a1, a2, ..., ar} and {b1, b2, ..., br}, where r is the number of retained corresponding points found in the texture maps. The spatial point coordinates {A1, A2, ..., Ar} and {B1, B2, ..., Br} corresponding to the matched feature points are then found, where A_t = M_{i,t} denotes a feature point in the ith point cloud and B_t = M_{i+1,t} denotes the corresponding point in the (i+1)th point cloud, with r ≤ s, k.
c) Equation (1) is solved iteratively to obtain p1, where p1 represents the pose matrix of the (i+1)th coordinate system (X_{i+1}, Y_{i+1}, Z_{i+1}, O_{i+1}) in the ith coordinate system (X_i, Y_i, Z_i, O_i); (X_0, Y_0, Z_0) indicates the position and (γ, θ, ψ) the attitude, i.e. the angles between the coordinate axes of the (i+1)th coordinate system (X_{i+1}, Y_{i+1}, Z_{i+1}, O_{i+1}) and those of the ith coordinate system (X_i, Y_i, Z_i, O_i).

f(p1) = Σ_{t=1}^{r} ||A_t − p1 · B_t||² = min    (Equation 1)

where p1 = [R T], T = (X_0, Y_0, Z_0)^T, R = RotX · RotY · RotZ, with

RotX = [[1, 0, 0], [0, cos γ, −sin γ], [0, sin γ, cos γ]],
RotY = [[cos θ, 0, sin θ], [0, 1, 0], [−sin θ, 0, cos θ]],
RotZ = [[cos ψ, −sin ψ, 0], [sin ψ, cos ψ, 0], [0, 0, 1]].
d) All points in point cloud i+1 are left-multiplied by the matrix p1 to transform them into the coordinate system of point cloud i, i.e.

M_{i+1,j} = p1 · M_{i+1,j},  j = 1, 2, ..., s.
e) Solve the spatial coordinate matrix p2 based on normal-direction and curvature-consistency correlation (p2 has the same meaning as p1) by the following steps:
Step 1: initialize p2 to the identity transformation;
Step 2: perform triangular patch meshing on point cloud i and point cloud i+1 respectively, i.e. construct triangular mesh models with connected topology; point cloud i and its patches are denoted model i, written M_i, and likewise point cloud i+1 and its patches are denoted model i+1, written M_{i+1};
Step 3: apply a coordinate transformation with matrix p2 to model i+1, i.e. M_{i+1} = p2 · M_{i+1}, and compute the curvature and normal direction of each point after the transformation, written {r_{i,1}, r_{i,2}, r_{i,3}, ..., r_{i,k}}, {r_{i+1,1}, r_{i+1,2}, r_{i+1,3}, ..., r_{i+1,s}} and {N_{i,1}, N_{i,2}, N_{i,3}, ..., N_{i,k}}, {N_{i+1,1}, N_{i+1,2}, N_{i+1,3}, ..., N_{i+1,s}};
Step 4: for each point of surface model i+1, find the corresponding point on surface model i along its normal direction {N_{i+1,1}, N_{i+1,2}, ..., N_{i+1,s}}. Define Q = M_{i+1,j} as a point on surface model i+1. If a point Q' on surface model i satisfies the following conditions, it is taken as the corresponding point on surface model i:

i. the line joining Q' and Q makes a very small angle with the normal N_{i+1,j} of Q, i.e. angle[(Q' − Q), N_{i+1,j}] ≤ ω;
ii. the distance between Q' and Q is below a threshold, i.e. |Q' − Q| ≤ d;
iii. among the points satisfying conditions i and ii, the point with the closest curvature is selected as the matching point, i.e. f(Q') = min |r_{Q'} − r_{i+1,j}|.

If no point satisfying conditions i and ii is found, point Q is considered to have no matching point.
The corresponding point pairs found between model i+1 and model i are denoted {Q_1, Q_2, ..., Q_v} and {Q'_1, Q'_2, ..., Q'_v}, where v is the number of corresponding point pairs found; their curvatures are denoted {r_{Q_1}, r_{Q_2}, ..., r_{Q_v}} and {r_{Q'_1}, r_{Q'_2}, ..., r_{Q'_v}}.
step five, calculating the curvature variance of the corresponding points
Figure BSA0000108572590000104
Step 6: if g ≤ σ, output p2; if g > σ, update p2 and repeat Steps 2 through 6.
f) The splicing result is refined iteratively by the ICP (Iterative Closest Point) method: the optimization parameter p3 in Equation (2) is solved, where p3 has the same meaning as p1, and the point cloud i+1 obtained in e) undergoes coordinate transformation once more;

f(p3) = Σ_j ||M_{i,j} − p3 · M_{i+1,j}||² = min    (Equation 2)
At this point we have the transformation matrix P(i) = p3 · p2 · p1 that splices the (i+1)th point cloud with the ith point cloud.
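Equations (1) and (2) are both least-squares rigid alignments: with known correspondences they admit a closed-form SVD (Kabsch) solution, and ICP simply re-estimates the correspondences by nearest neighbour between solves. A self-contained numpy sketch under these standard formulations (helper names are our own, not the patent's implementation):

```python
import numpy as np

def rigid_fit(B, A):
    """Least-squares R, T with A ~ R @ B + T for 3xN corresponding point
    sets (closed-form SVD solution of an Equation (1)/(2)-style cost)."""
    cb, ca = B.mean(axis=1, keepdims=True), A.mean(axis=1, keepdims=True)
    U, _, Vt = np.linalg.svd((B - cb) @ (A - ca).T)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # forbid reflections
    R = Vt.T @ D @ U.T
    return R, ca - R @ cb

def icp(src, dst, iters=20):
    """Point-to-point ICP refinement: returns a 4x4 matrix p3 with
    dst ~ p3 . src, re-pairing points by nearest neighbour each pass."""
    p3, cur = np.eye(4), src.copy()
    for _ in range(iters):
        # brute-force closest point in dst for every point of cur
        d2 = ((cur.T[:, None, :] - dst.T[None, :, :]) ** 2).sum(-1)
        R, T = rigid_fit(cur, dst[:, d2.argmin(axis=1)])
        cur = R @ cur + T
        step = np.eye(4)
        step[:3, :3], step[:3, 3:] = R, T
        p3 = step @ p3
    return p3

# a slightly misaligned copy of a cloud snaps back onto the original
rng = np.random.default_rng(1)
dst = rng.standard_normal((3, 40))
c, s = np.cos(0.02), np.sin(0.02)
R0 = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
src = R0.T @ (dst - np.array([[0.02], [-0.01], [0.03]]))
p3 = icp(src, dst)
aligned = p3[:3, :3] @ src + p3[:3, 3:]
print(np.abs(aligned - dst).max() < 1e-6)
```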
(2) Stitching of multiple exposure measurements
The implementation steps are shown in Fig. 5.
Following the method described in (1), we obtain P(1), P(2), P(3), ..., P(n−1). For the ith point cloud to be spliced with the 1st point cloud obtained by the first exposure measurement, the required coordinate transformation is

M_{i,j}^1 = H_i · M_{i,j}^i,

where M_{i,j}^1 denotes the ith angle measurement result transformed into the world coordinate system (X_w, Y_w, Z_w, O_w) (i.e. the 1st measurement angle coordinate system), M_{i,j}^i denotes its coordinates in the original coordinate system (the ith measurement angle coordinate system), and H_i = P(1) · P(2) · P(3) · ··· · P(i−1).
Converting the 2nd to nth angle measurement results into the world coordinate system according to the above steps completes the splicing of the multi-angle measurements and reconstructs a complete three-dimensional model.
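The bookkeeping of the global splice is just a running matrix product. A sketch with 4×4 homogeneous matrices, where P[k] maps the (k+1)th cloud into the kth frame (pure translations stand in for full poses; the numbers are illustrative):

```python
import numpy as np

def translation(tx, ty, tz):
    """4x4 homogeneous pure-translation matrix."""
    m = np.eye(4)
    m[:3, 3] = (tx, ty, tz)
    return m

# pairwise transforms P(1)..P(n-1) between consecutive angles
P = [translation(1, 0, 0), translation(0, 2, 0), translation(0, 0, 3)]

def H(i):
    """H_i = P(1) @ P(2) @ ... @ P(i-1): maps the i-th cloud into the world frame."""
    m = np.eye(4)
    for k in range(i - 1):
        m = m @ P[k]
    return m

pt4 = np.array([0.0, 0.0, 0.0, 1.0])   # a point of the 4th cloud (homogeneous)
print((H(4) @ pt4)[:3])                 # [1. 2. 3.]
```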
Experimental case:
The handheld three-dimensional scanning equipment described in this invention was used to scan a human face. Because of its complex coloring, a face has uneven reflectivity; the automatic exposure adjustment technique of the invention handles this problem well. Fig. 6 shows a three-dimensional scan of a person from six angles, and Fig. 7 shows a multi-angle presentation of the automatically stitched result.

Claims (4)

1. A free-moving three-dimensional scanning method is characterized by comprising the following steps:
(1) selecting a measurement mode, and performing inching measurement or continuous measurement;
the inching measurement is: selecting a measuring angle, and measuring once when exposure is triggered;
the continuous measurement is: the camera is continuously exposed, and the scanner is continuously moved to realize continuous measurement;
(2) selecting a measurement angle of a measured object, wherein the coincidence rate of adjacent angles is more than 60%;
(3) the camera automatically adjusts exposure time according to illumination and reflectivity information of different angles;
(4) selecting good angles in the step (2) to carry out scanning measurement, and obtaining three-dimensional point cloud and color texture maps of all the angles by adopting a three-frequency color phase-shift profilometry and color fringe decoupling method;
the method for obtaining the three-dimensional point cloud and the color texture map in the step (4) comprises the following steps:
(a) the projection equipment projects the three-frequency color stripes onto the surface of the measured object;
(b) shooting a deformed stripe image of the surface of a measured object by a color camera;
(c) separating the high, medium and low carrier frequencies of the deformed fringe pattern by Fourier transform to obtain three deformed fringe patterns of high, medium and low frequency;
(d) performing stripe self-adaptive analysis and phase extraction by using Hilbert-Huang transform HHT to obtain high, medium and low frequency wrapping phases;
(e) obtaining a high-precision absolute phase by a time domain variable precision unwrapping method;
(f) obtaining three-dimensional point cloud of the measured object by system calibration parameters according to the absolute phase obtained in the step (e);
(g) obtaining a texture map; subtracting the high, medium and low frequency components obtained in the step (c) from the deformed fringe pattern obtained in the step (b) to obtain background components; separating and recovering the obtained background components through texture illumination to obtain an original texture map of the measured object;
(5) splicing the obtained three-dimensional point clouds with different measurement angles into a complete model;
the splicing method of the step (5) comprises the following steps:
assuming that the world coordinate system coincides with the camera coordinate system at the 1st angle, converting the point clouds obtained at the n measured angles into the world coordinate system realizes multi-angle splicing; the world coordinate system is expressed as (X_W, Y_W, Z_W, O_W), the camera coordinate system at the nth angle is expressed as (X_n, Y_n, Z_n, O_n), and (X_W, Y_W, Z_W, O_W) = (X_1, Y_1, Z_1, O_1); a point of the point cloud is expressed as

M_{i,j}^n = (x_{i,j}^n, y_{i,j}^n, z_{i,j}^n)^T,

wherein i represents the point cloud obtained by the ith measurement, j indexes the points in that point cloud, and n represents the nth angle;
step 1), splicing of the scanning results of two adjacent angles;
for the stitching of the exposure measurement results of two adjacent angles, the stitching of the three-dimensional point clouds measured at the nth and (n+1)th angles is taken as an example to describe in detail the stitching method of the point clouds obtained by two adjacent scans; the point cloud obtained by the ith measurement is expressed as

{M_{i,1}, M_{i,2}, ..., M_{i,k}},

wherein k represents the number of points in the point cloud obtained by the ith measurement; the point cloud obtained by the (i+1)th measurement is expressed as

{M_{i+1,1}, M_{i+1,2}, ..., M_{i+1,s}},

wherein s represents the number of points in the point cloud obtained by the (i+1)th measurement;
the method comprises the following specific steps:
a) using the scale-invariant feature transform (SIFT) feature point extraction method, finding corresponding feature points in the texture map obtained at the ith measurement and the texture map obtained at the (i+1)th measurement, denoted {a1, a2, …, am} and {b1, b2, …, bm}, wherein m is the number of corresponding point pairs found in the texture maps;
b) removing points with large epipolar errors by the random sample consensus (RANSAC) method; the retained points are denoted {a1, a2, …, ar} and {b1, b2, …, br}, wherein r is the number of corresponding points retained from the texture maps;
finding the spatial point coordinates {A1, A2, …, Ar} and {B1, B2, …, Br} corresponding to the matched feature points, wherein {A1, A2, …, Ar} are the corresponding feature points in the point cloud obtained at the ith measurement, {B1, B2, …, Br} are the corresponding feature points in the point cloud obtained at the (i+1)th measurement, and r ≤ s, k;
c) solving formula (1) by an iterative method to obtain the spatial coordinate matrix p1, wherein p1 represents the transformation of the (n+1)th angle coordinate system (X_(n+1), Y_(n+1), Z_(n+1), O_(n+1)) into the nth angle coordinate system (X_n, Y_n, Z_n, O_n); (X_0, Y_0, Z_0) denotes the position; γ, θ, ψ denote the attitude, i.e. the included angles between the coordinate axes of the (n+1)th angle coordinate system and those of the nth angle coordinate system;

A_t = p1 · B_t (in homogeneous coordinates), t = 1, 2, …, r    (1)

wherein p1 = [R T], R = RotX · RotY · RotZ, T = [X_0, Y_0, Z_0, 1]^T, and RotX, RotY, RotZ are the rotation matrices about the X, Y and Z axes through the angles γ, θ and ψ respectively;
d) all points P_(i+1)^j in the point cloud obtained at the (i+1)th measurement are left-multiplied by the spatial coordinate matrix p1, transforming them into the coordinate system of the point cloud obtained at the ith measurement, i.e. P'_(i+1)^j = p1 · P_(i+1)^j;
e) solving a spatial coordinate matrix p2 based on curvature consistency and the normal-direction constraint, wherein p2 has the same meaning as p1;
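As a non-claimed sketch of solving for p1 from the retained correspondences {A1..Ar}, {B1..Br}: the claim specifies an iterative solution of formula (1); shown here instead is the closed-form least-squares (Kabsch/SVD) rigid fit, which yields the same matrix for noise-free correspondences. Function names are illustrative.

```python
import numpy as np

def rigid_transform(B, A):
    """Least-squares rigid motion p1 with A ~ R*B + T (Kabsch/SVD solution).

    B, A: (r, 3) arrays of matched 3-D feature points from the (i+1)-th
    and i-th measurements. Returns a 4x4 homogeneous matrix [R T; 0 1].
    """
    cb, ca = B.mean(axis=0), A.mean(axis=0)
    H = (B - cb).T @ (A - ca)                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T    # proper rotation, det = +1
    T = ca - R @ cb
    p1 = np.eye(4)
    p1[:3, :3], p1[:3, 3] = R, T
    return p1

def apply_transform(p1, cloud):
    """Step d): left-multiply every point of a cloud by p1 (homogeneous)."""
    pts = np.c_[cloud, np.ones(len(cloud))]
    return (p1 @ pts.T).T[:, :3]
```

The curvature/normal refinement of step e) would then adjust this estimate using the full clouds, not only the SIFT correspondences.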
step 2), splicing the results of multiple exposure measurements;
the spatial coordinate matrices p(1), p(2), p(3), …, p(n-1) are obtained according to step 1); to splice the point cloud obtained at the nth measurement with the point cloud obtained at the 1st exposure measurement, the required coordinate transformation is

P_W^n = H_n · P^n

wherein P_W^n denotes the nth angle measurement result transformed into the world coordinate system (X_W, Y_W, Z_W, O_W), i.e. the 1st angle coordinate system; P^n denotes the coordinates of the nth angle measurement result in its original coordinate system, i.e. the nth angle coordinate system; and H_n = p(1) · p(2) · p(3) · … · p(n-1);
step 3), transforming the 2nd to nth angle measurement results into the world coordinate system according to step 2), completing the multi-angle splicing and reconstructing the complete three-dimensional model.
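Steps 2) and 3) above amount to accumulating the pairwise matrices p(1)…p(n-1) into a running product and applying it to each cloud. A minimal sketch with hypothetical helper names, using 4x4 homogeneous matrices as in step c):

```python
import numpy as np

def to_world(clouds, pairwise):
    """Map point clouds from every angle into the 1st-angle (world) frame.

    clouds:   list of (m_i, 3) point arrays, one per angle, angle 1 first.
    pairwise: list of 4x4 matrices; pairwise[k] maps angle k+2 into angle k+1
              (the p(1), p(2), ... of step 1)).
    Returns one stacked (sum m_i, 3) array in world coordinates.
    """
    world = [clouds[0]]                     # angle 1 is the world frame
    H = np.eye(4)
    for cloud, p in zip(clouds[1:], pairwise):
        H = H @ p                           # running product p(1)·p(2)·…
        pts = np.c_[cloud, np.ones(len(cloud))]
        world.append((H @ pts.T).T[:, :3])
    return np.vstack(world)

# toy example: second angle offset from the first by a unit X translation
p21 = np.eye(4)
p21[:3, 3] = [1.0, 0.0, 0.0]
merged = to_world([np.zeros((2, 3)), np.ones((2, 3))], [p21])
```

Accumulating the product left-to-right matches the claim's H = p(1)·p(2)·…·p(n-1) ordering.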
2. The free-moving three-dimensional scanning method of claim 1, wherein: when the measurement is performed by a handheld operator, the trigger rate of continuous measurement is set to 5-10 frames/s.
3. The free-moving three-dimensional scanning method of claim 1,
wherein the three-frequency color phase-shift profilometry and color fringe decoupling method in step (4) comprises the following steps:
projecting a three-frequency color sinusoidal fringe pattern onto the surface of the measured object by a color grating projection method; setting the RGB three-channel frequency ratio, and shooting the deformed fringe pattern projected on the object surface with a color camera; separating the high, medium and low carrier frequencies (i.e. the three frequencies) of the deformed fringe pattern by Fourier transform;
unwrapping each of the three-frequency phases and recovering the high-frequency carrier phase;
obtaining the three-dimensional point cloud through the calibrated system parameters;
and obtaining the three-frequency sinusoidal fringe components from the deformed fringe pattern by a frequency-domain minimization method, and subtracting them from the color deformed fringe pattern, thereby recovering the texture map of the surface of the measured object.
4. The free-moving three-dimensional scanning method of claim 3, wherein the RGB three-channel frequency ratio is 1:4:12.
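A non-claimed sketch of the 1:4:12 tri-frequency pattern of claims 3 and 4: one row of the RGB fringe is synthesized and each channel's carrier is confirmed by its FFT peak. The resolution, 8-bit scaling and helper names are illustrative assumptions, not part of the claims.

```python
import numpy as np

N = 600                                  # pixels across the projected row
f_low, f_mid, f_high = 1, 4, 12          # the claimed 1:4:12 frequency ratio
x = np.arange(N) / N

def fringe(f):
    # 8-bit sinusoidal fringe: mean 127.5, amplitude 127.5, f cycles per row
    return 127.5 + 127.5 * np.cos(2 * np.pi * f * x)

# R, G and B carry the low, medium and high frequency respectively
pattern = np.stack([fringe(f_low), fringe(f_mid), fringe(f_high)], axis=-1)

def carrier_frequency(channel):
    # dominant non-DC frequency bin of one channel from its FFT magnitude
    spec = np.abs(np.fft.rfft(channel - channel.mean()))
    return int(np.argmax(spec))

freqs = [carrier_frequency(pattern[:, c]) for c in range(3)]
```

The same FFT-peak separation, applied per channel to the camera image, is what isolates the three carriers before phase unwrapping.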
CN201410494897.8A 2014-09-25 2014-09-25 Free-moving type three-dimensional scanning method Active CN104299211B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410494897.8A CN104299211B (en) 2014-09-25 2014-09-25 Free-moving type three-dimensional scanning method

Publications (2)

Publication Number Publication Date
CN104299211A CN104299211A (en) 2015-01-21
CN104299211B true CN104299211B (en) 2020-12-29

Family

ID=52318933

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410494897.8A Active CN104299211B (en) 2014-09-25 2014-09-25 Free-moving type three-dimensional scanning method

Country Status (1)

Country Link
CN (1) CN104299211B (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106568394A (en) * 2015-10-09 2017-04-19 西安知象光电科技有限公司 Hand-held three-dimensional real-time scanning method
US10317199B2 (en) 2016-04-08 2019-06-11 Shining 3D Tech Co., Ltd. Three-dimensional measuring system and measuring method with multiple measuring modes
KR102207725B1 (en) * 2016-04-08 2021-01-26 샤이닝 쓰리디 테크 컴퍼니., 리미티드. 3D survey system and survey method in multiple survey mode
DE102016120406A1 (en) * 2016-10-26 2018-04-26 Deutsche Post Ag A method of determining a charge for sending a shipment
CN108065913A (en) * 2016-11-16 2018-05-25 财团法人工业技术研究院 Handheld three-dimensional scanning means
CN106600531B (en) * 2016-12-01 2020-04-14 深圳市维新登拓医疗科技有限公司 Handheld scanner, and handheld scanner point cloud splicing method and device
CN106643563B (en) * 2016-12-07 2019-03-12 西安知象光电科技有限公司 A kind of Table top type wide view-field three-D scanning means and method
CN109506591A (en) * 2018-09-14 2019-03-22 天津大学 A kind of adaptive illumination optimization method being adapted to complex illumination scene
CN109472741B (en) * 2018-09-30 2023-05-30 先临三维科技股份有限公司 Three-dimensional splicing method and device
CN110030944B (en) * 2019-04-03 2021-09-21 中国科学院光电技术研究所 Large-gradient free-form surface measuring method
CN110415361B (en) * 2019-07-26 2020-05-15 北京罗森博特科技有限公司 Method and device for splicing broken objects
CN110866944A (en) * 2019-12-06 2020-03-06 民航成都物流技术有限公司 Consigned luggage measurement and identification method and system
CN111060023B (en) * 2019-12-12 2020-11-17 天目爱视(北京)科技有限公司 High-precision 3D information acquisition equipment and method
CN111637850B (en) * 2020-05-29 2021-10-26 南京航空航天大学 Self-splicing surface point cloud measuring method without active visual marker
CN112164084B (en) * 2020-12-01 2021-03-26 南京智谱科技有限公司 Method and device for generating two-dimensional laser point cloud picture based on laser scanning
CN112665529B (en) * 2021-01-19 2022-06-24 浙江理工大学 Object three-dimensional shape measuring method based on stripe density area segmentation and correction
CN113140042B (en) * 2021-04-19 2023-07-25 思看科技(杭州)股份有限公司 Three-dimensional scanning splicing method and device, electronic device and computer equipment

Citations (1)

Publication number Priority date Publication date Assignee Title
GB2389500A (en) * 2002-04-20 2003-12-10 Virtual Mirrors Ltd Generating 3D body models from scanned data

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
CN1301480C (en) * 2002-03-27 2007-02-21 深圳市特得维技术有限公司 Computerized 3D visual color scan system and its scanning mode
CN101739717B (en) * 2009-11-12 2011-11-16 天津汇信软件有限公司 Non-contact scanning method for three-dimensional colour point clouds
CN101936718B (en) * 2010-03-23 2012-07-18 上海复蝶智能科技有限公司 Sine stripe projection device and three-dimensional profile measuring method
CN101813461B (en) * 2010-04-07 2011-06-22 河北工业大学 Absolute phase measurement method based on composite color fringe projection
CN101986098B (en) * 2010-09-21 2012-02-22 东南大学 Tricolor projection-based Fourier transform three-dimensional measuring method
CN102322822B (en) * 2011-08-08 2013-04-17 西安交通大学 Three-dimensional measurement method for triple-frequency color fringe projection
CN102628676B (en) * 2012-01-19 2014-05-07 东南大学 Adaptive window Fourier phase extraction method in optical three-dimensional measurement
CN103162643A (en) * 2013-04-02 2013-06-19 北京博维恒信科技发展有限责任公司 Monocular structured light three-dimensional scanner


Non-Patent Citations (1)

Title
Tri-frequency color fringe projection profilometry based on empirical mode decomposition; Zou Haihua et al.; Acta Optica Sinica; Aug. 2011; Vol. 31, No. 8; pp. 0812009-1 to 0812009-9 *


Similar Documents

Publication Publication Date Title
CN104299211B (en) Free-moving type three-dimensional scanning method
CN110288642B (en) Three-dimensional object rapid reconstruction method based on camera array
CN107767442B (en) Foot type three-dimensional reconstruction and measurement method based on Kinect and binocular vision
CN110514143B (en) Stripe projection system calibration method based on reflector
Pieraccini et al. 3D digitizing of cultural heritage
CN109242954B (en) Multi-view three-dimensional human body reconstruction method based on template deformation
DK3144881T3 Method for 3D panorama mosaic creation of a scene
US20100328308A1 (en) Three Dimensional Mesh Modeling
CN104266587A (en) Three-dimensional measurement system and method for obtaining actual 3D texture point cloud data
EP3382645B1 (en) Method for generation of a 3d model based on structure from motion and photometric stereo of 2d sparse images
EP2909575A1 (en) Systems and methods for marking images for three-dimensional image generation
CN104240289A (en) Three-dimensional digitalization reconstruction method and system based on single camera
CN111288925A (en) Three-dimensional reconstruction method and device based on digital focusing structure illumination light field
CN115345822A (en) Automatic three-dimensional detection method for surface structure light of aviation complex part
CN106568394A (en) Hand-held three-dimensional real-time scanning method
Zagorchev et al. A paintbrush laser range scanner
CN114170284B (en) Multi-view point cloud registration method based on active landmark point projection assistance
CN115205360A (en) Three-dimensional outer contour online measurement and defect detection method of composite stripe projection steel pipe and application
Marrugo et al. Fourier transform profilometry in labview
CN116433841A (en) Real-time model reconstruction method based on global optimization
Wong et al. 3D object model reconstruction from image sequence based on photometric consistency in volume space
CN113450460A (en) Phase-expansion-free three-dimensional face reconstruction method and system based on face shape space distribution
Sert et al. Three stepped calibration of structured light system with adaptive thresholding for 3D measurements
CN108921908B (en) Surface light field acquisition method and device and electronic equipment
CN112325799A (en) High-precision three-dimensional face measurement method based on near-infrared light projection

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
DD01 Delivery of document by public notice

Addressee: Zhou Xiang

Document name: Notification of Passing Preliminary Examination of the Application for Invention

DD01 Delivery of document by public notice

Addressee: Zhou Xiang

Document name: Notification of Publication of the Application for Invention

C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20180824

Address after: 215505 2 Jianye Road, Changshou City, Jiangsu

Applicant after: Suzhou cartesan Testing Technology Co. Ltd

Address before: 710000 Xi'an, Shaanxi, south 2nd Ring Road East Section 2, 3012

Applicant before: Zhou Xiang

TA01 Transfer of patent application right
CB03 Change of inventor or designer information

Inventor after: Li Huanhuan

Inventor after: Guo Jiayu

Inventor after: Li Yuqin

Inventor after: Wang Chao

Inventor after: Zhao Lei

Inventor after: Yang Tao

Inventor before: Zhou Xiang

Inventor before: Li Huanhuan

Inventor before: Guo Jiayu

Inventor before: Li Yuqin

Inventor before: Wang Chao

Inventor before: Zhao Lei

Inventor before: Yang Tao

CB03 Change of inventor or designer information
GR01 Patent grant
GR01 Patent grant