CN114234852B - Multi-view structured light three-dimensional measurement method and system based on optimal mapping point set matching - Google Patents

Multi-view structured light three-dimensional measurement method and system based on optimal mapping point set matching

Info

Publication number
CN114234852B
CN114234852B
Authority
CN
China
Prior art keywords
image
projector
point set
phase
representing
Prior art date
Legal status: Active
Application number
CN202111567224.7A
Other languages
Chinese (zh)
Other versions
CN114234852A (en)
Inventor
庄逸钟
邓海祥
郑卓鋆
张揽宇
高健
Current Assignee
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date
Filing date
Publication date
Application filed by Guangdong University of Technology filed Critical Guangdong University of Technology
Priority to CN202111567224.7A
Publication of CN114234852A
Application granted
Publication of CN114234852B

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B 11/254: Projection of a pattern, viewing through a pattern, e.g. moiré
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85: Stereo camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a multi-view structured light three-dimensional measurement method and system based on optimal mapping point set matching. Matching is completed by comparing the feature information of a local point set against that of candidate mapping point sets, so as to find the optimal mapping point set. From the conversion relation between the mapping point sets of the right camera image and of the projector image, the mapping point set in the projector image corresponding to the optimal mapping point set is obtained, and its absolute phase is solved by the mathematical relation between projector image pixel coordinates and absolute phase. Finally, a three-dimensional point cloud is reconstructed by triangulation to build a three-dimensional model of the measured object and complete its three-dimensional measurement.

Description

Multi-view structured light three-dimensional measurement method and system based on optimal mapping point set matching
Technical Field
The invention relates to the technical field of optical three-dimensional measurement, in particular to a multi-view structured light three-dimensional measurement method and system based on optimal mapping point set matching.
Background
The structured light measurement technology has the advantages of non-contact, full-field lossless measurement, high speed, high precision and the like, and is widely applied to the fields of industrial detection, machine vision, cultural relic digitization, medicine and the like. In the existing multi-view structured light measurement system, a multi-view structured light three-dimensional measurement system composed of two cameras and one projector is widely used due to the advantages of small number of projection patterns, high absolute phase solving speed, high point cloud reconstruction efficiency and the like. In a typical dual-camera structured light three-dimensional measurement system, a projection device projects stripe patterns onto the surface of a measured object in the measurement process, a left camera and a right camera are used for collecting the stripe patterns which are deformed by the height modulation of the measured object, then phase solving is carried out on collected stripe images, pixel matching is completed by using phase information of the left camera and the right camera, and finally three-dimensional information of the measured object is obtained by using a trigonometric principle according to the phase information, a matching result and calibrated system parameters.
Methods for solving the absolute phase of multi-view structured light by phase matching can be divided into the speckle matching method, the dual-frequency fringe matching method, and the geometric-constraint matching method. The speckle matching method can achieve high spatial resolution and measurement accuracy, but requires projecting additional speckle patterns. The dual-frequency fringe matching method projects fringe patterns of two frequencies, converts the high-frequency fringes into low-frequency fringes by the dual-frequency heterodyne method, and performs phase matching combined with a height constraint, but it too must project additional fringe patterns. The geometric-constraint matching method uses the geometric relation among the left camera, the right camera and the projector, combined with phase matching, to uniquely determine three-dimensional points in space; it needs no extra patterns and resolves quickly, but because a single pixel carries little feature information and feature differences are not obvious, its matching accuracy is low and it is only suitable for low-frequency fringes. Solving the absolute phase of multi-view structured light thus still suffers from the need to project extra fringe patterns, low matching accuracy, or low resolving speed. Therefore, to realize high-speed, high-precision three-dimensional measurement, it is important to improve phase matching accuracy and calculation speed without increasing the number of projections.
Disclosure of Invention
The invention aims to provide, against the defects described in the background, a multi-view structured light three-dimensional measurement method and system based on optimal mapping point set matching. Compared with the speckle matching method and the dual-frequency fringe matching method, it projects no extra coding patterns, which reduces projection time and calculation time; compared with the geometric-constraint matching method, it can complete phase matching of medium- and high-frequency fringes, which improves matching accuracy and reconstruction accuracy.
In order to achieve the purpose, the invention adopts the following technical scheme:
a multi-view structured light three-dimensional measurement method based on optimal mapping point set matching comprises the following steps:
Step A: generating 3 sinusoidal fringe patterns according to the three-step phase shift method;
Step B: projecting the 3 fringe patterns onto the surface of the object to be measured with the projector, the left camera and the right camera respectively capturing the 3 fringe patterns deformed by the object surface;
Step C: solving the wrapped phase of the 3 fringe patterns generated for projection and of the 3 acquired fringe patterns respectively, according to the three-step phase shift method;
Step D: according to the wrapped phase of the left camera image, combined with the calibration parameters, finding a plurality of mapping point sets corresponding to the local point set of the left camera image in the wrapped phase of the projector image, using the epipolar constraint and the phase constraint;
Step E: according to the local point set in the left camera image and the plurality of mapping point sets in the projector image, combined with the calibration parameters, finding a plurality of mapping point sets corresponding to the local point set of the left camera image in the right camera image;
Step F: according to the features of the local point set in the left camera image, finding the optimal mapping point set among the plurality of mapping point sets of the right camera image, and determining the corresponding mapping point set of the optimal mapping point set in the projector image;
then solving the absolute phase of that mapping point set in the projector image using the relation between projector image pixel coordinates and absolute phase, thereby determining the absolute phase of the local point set in the left camera image and completing the absolute phase solution;
Step G: reconstructing a three-dimensional point cloud by triangulation to build a three-dimensional model of the object to be measured.
Preferably, step A includes generating the 3 sinusoidal fringe patterns according to the three-step phase shift method, based on formula one;
$$I_n(u,v)=A(u,v)+B(u,v)\cos\left(\varphi(u,v)+\frac{2\pi (n-1)}{3}\right),\quad n=0,1,2 \qquad \text{(formula one)}$$

wherein:
$I_n$ represents the nth fringe pattern projected by the projector, n = 0, 1, 2;
(u, v) represents the pixel coordinates of the fringe pattern projected by the projector;
A represents the mean intensity of the fringe pattern projected by the projector;
B represents the modulation intensity of the fringe pattern projected by the projector;
$\varphi(u,v)$ represents the wrapped phase of the fringe pattern projected by the projector.
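The fringe generation of step A can be sketched as follows. This is a minimal NumPy sketch assuming a 1024×768 projector, a 32-pixel fringe period, and phase shifts of −2π/3, 0, +2π/3 (the usual three-step convention; the patent does not spell out its exact shift indexing):

```python
import numpy as np

def make_fringe_patterns(width=1024, height=768, period=32, A=127.5, B=127.5):
    """Generate the 3 sinusoidal fringe patterns of a three-step phase-shift
    sequence (formula one), with vertical fringes varying along u."""
    u = np.arange(width)
    phi = 2 * np.pi * u / period           # wrapped phase of the projected pattern
    patterns = []
    for n in range(3):                      # phase shifts -2*pi/3, 0, +2*pi/3
        row = A + B * np.cos(phi + 2 * np.pi * (n - 1) / 3)
        patterns.append(np.tile(row, (height, 1)))
    return patterns

fringes = make_fringe_patterns()
```

With A = B = 127.5 the patterns span the full 8-bit intensity range [0, 255], which is the usual choice for a projector.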
Preferably, in step B, the projector projects the 3 fringe patterns onto the surface of the object to be measured, and the left camera and the right camera respectively capture the 3 fringe patterns deformed by the object surface;
the fringe patterns acquired by the left camera are expressed by formula two, and those acquired by the right camera by formula three;
$$I_n^{\mathrm{left}}(x,y)=A^{\mathrm{left}}(x,y)+B^{\mathrm{left}}(x,y)\cos\left(\varphi^{\mathrm{left}}(x,y)+\frac{2\pi (n-1)}{3}\right) \qquad \text{(formula two)}$$

$$I_n^{\mathrm{right}}(x,y)=A^{\mathrm{right}}(x,y)+B^{\mathrm{right}}(x,y)\cos\left(\varphi^{\mathrm{right}}(x,y)+\frac{2\pi (n-1)}{3}\right) \qquad \text{(formula three)}$$

wherein:
(x, y) represents the pixel coordinates of the acquired fringe pattern;
$I_n^{\mathrm{left}}$ represents the nth fringe pattern acquired by the left camera, n = 0, 1, 2;
$I_n^{\mathrm{right}}$ represents the nth fringe pattern acquired by the right camera, n = 0, 1, 2;
$A^{\mathrm{left}}$ represents the mean intensity of the fringe pattern acquired by the left camera;
$A^{\mathrm{right}}$ represents the mean intensity of the fringe pattern acquired by the right camera;
$B^{\mathrm{left}}$ represents the modulation intensity of the fringe pattern acquired by the left camera;
$B^{\mathrm{right}}$ represents the modulation intensity of the fringe pattern acquired by the right camera;
$\varphi^{\mathrm{left}}(x,y)$ represents the wrapped phase of the fringe pattern acquired by the left camera;
$\varphi^{\mathrm{right}}(x,y)$ represents the wrapped phase of the fringe pattern acquired by the right camera.
Preferably, step C includes, according to the three-step phase shift method, solving the wrapped phase of the 3 fringe patterns generated for projection and of the 3 acquired fringe patterns respectively: the wrapped phase of the projected fringe patterns is solved based on formula four, that of the fringe patterns acquired by the left camera based on formula five, and that of the fringe patterns acquired by the right camera based on formula six;
$$\varphi(u,v)=\arctan\left[\frac{\sqrt{3}\,\bigl(I_0(u,v)-I_2(u,v)\bigr)}{2I_1(u,v)-I_0(u,v)-I_2(u,v)}\right] \qquad \text{(formula four)}$$

$$\varphi^{\mathrm{left}}(x,y)=\arctan\left[\frac{\sqrt{3}\,\bigl(I_0^{\mathrm{left}}(x,y)-I_2^{\mathrm{left}}(x,y)\bigr)}{2I_1^{\mathrm{left}}(x,y)-I_0^{\mathrm{left}}(x,y)-I_2^{\mathrm{left}}(x,y)}\right] \qquad \text{(formula five)}$$

$$\varphi^{\mathrm{right}}(x,y)=\arctan\left[\frac{\sqrt{3}\,\bigl(I_0^{\mathrm{right}}(x,y)-I_2^{\mathrm{right}}(x,y)\bigr)}{2I_1^{\mathrm{right}}(x,y)-I_0^{\mathrm{right}}(x,y)-I_2^{\mathrm{right}}(x,y)}\right] \qquad \text{(formula six)}$$

wherein:
$\varphi(u,v)$ represents the wrapped phase of the fringe pattern projected by the projector;
$I_n$ represents the nth fringe pattern projected by the projector, n = 0, 1, 2;
(u, v) represents the pixel coordinates of the fringe pattern projected by the projector;
$\varphi^{\mathrm{left}}(x,y)$ represents the wrapped phase of the fringe pattern acquired by the left camera;
(x, y) represents the pixel coordinates of the acquired fringe pattern;
$I_n^{\mathrm{left}}$ represents the nth fringe pattern acquired by the left camera, n = 0, 1, 2;
$\varphi^{\mathrm{right}}(x,y)$ represents the wrapped phase of the fringe pattern acquired by the right camera;
$I_n^{\mathrm{right}}$ represents the nth fringe pattern acquired by the right camera, n = 0, 1, 2.
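Formulas four to six share the same arctangent form, which is best evaluated with a quadrant-aware arctan2 so the wrapped phase lands in (−π, π]. A small self-checking sketch, with synthetic intensities standing in for real captures:

```python
import numpy as np

def wrapped_phase(I0, I1, I2):
    """Formulas four to six: recover the wrapped phase from three
    phase-shifted fringe images; arctan2 resolves the quadrant."""
    return np.arctan2(np.sqrt(3.0) * (I0 - I2), 2.0 * I1 - I0 - I2)

# Synthetic check: build three shifted fringes from a known phase, recover it.
phi_true = np.linspace(-3.0, 3.0, 500)
I0, I1, I2 = (127.5 + 100.0 * np.cos(phi_true + 2 * np.pi * (n - 1) / 3)
              for n in range(3))
phi_rec = wrapped_phase(I0, I1, I2)
```

The identity behind it: I0 − I2 = √3·B·sin φ and 2I1 − I0 − I2 = 3B·cos φ, so the ratio recovers φ independently of the mean and modulation intensities.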
Preferably, in step D, according to the wrapped phase of the left camera image, in combination with the calibration parameters, using epipolar constraint and phase constraint to find a plurality of mapping point sets corresponding to the local point set of the left camera image in the wrapped phase of the projector image, the method includes:
Step D1: based on formula seven, obtaining the epipolar line in the projector image corresponding to a pixel point in the left camera image, using the fundamental matrix calibrated between the left camera and the projector, according to the epipolar constraint;
Step D2: based on formulas eight and nine, combining the epipolar constraint and the phase constraint, searching along the epipolar line in the projector image for the point corresponding to the pixel point in the left camera image;
Step D3: the local point set in the left camera image thereby finds a plurality of mapping point sets in the projector image;
$$C(u,v)=p_l(x,y,1)\,F\,p_p(u,v,1)^{T} \qquad \text{(formula seven)}$$

$$D(u,v)=\mathrm{fabs}\left[\varphi(u,v)-\varphi^{\mathrm{left}}(x,y)\right] \qquad \text{(formula eight)}$$

$$P_n(u,v)=p_p(u,v)\ \ \text{if}\ \ 0\le D\le T \qquad \text{(formula nine)}$$

wherein:
C(u, v) represents the position identifier of the corresponding point;
$p_l(x,y,1)$ represents a pixel point in the left camera image, in homogeneous coordinates;
F represents the fundamental matrix obtained by calibration;
$p_p(u,v,1)$ represents a pixel point in the projector image, in homogeneous coordinates;
$F$ maps the pixel point $p_l$ in the left camera image to an epipolar line in the projector image; if the pixel point $p_p$ lies on that epipolar line, then C(u, v) = 0;
D(u, v) represents the phase error identifier of the corresponding point, i.e. the difference between the phase of the corresponding point in the projector image and the phase of the pixel point in the left camera image;
fabs[·] represents the absolute value function;
$\varphi(u,v)$ represents the wrapped phase of the fringe pattern projected by the projector;
$\varphi^{\mathrm{left}}(x,y)$ represents the wrapped phase of the fringe pattern acquired by the left camera;
$P_n(u,v)$ represents the set of all corresponding points in the projector image that satisfy both the phase constraint and the epipolar constraint;
T represents a threshold; D falls within the range [0, T].
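Steps D1 to D3 (formulas seven to nine) can be sketched as a brute-force scan of the projector image. The tolerance `eps`, the toy fundamental matrix `F`, and the synthetic phase map below are illustrative stand-ins, not the patent's calibration data:

```python
import numpy as np

def find_candidates(p_l, phi_l, phi_proj, F, T=0.05, eps=0.5):
    """For one left-camera pixel p_l = (x, y), keep projector pixels that lie
    on its epipolar line (formula seven) and whose wrapped phase differs from
    phi_l by at most T (formulas eight and nine)."""
    x, y = p_l
    a, b, c = F @ np.array([x, y, 1.0])     # epipolar line a*u + b*v + c = 0
    norm = np.hypot(a, b)
    candidates = []
    h, w = phi_proj.shape
    for v in range(h):
        for u in range(w):
            C = abs(a * u + b * v + c) / norm   # distance to the epipolar line
            if C > eps:
                continue
            D = abs(phi_proj[v, u] - phi_l)      # phase constraint, formula eight
            if D <= T:
                candidates.append((u, v))
    return candidates

# Toy demo: F chosen so the epipolar line of (x, y) is the row v = y.
F = np.array([[0., 0., 0.],
              [0., 0., -1.],
              [0., 1., 0.]])
phi_proj = np.tile(np.arange(8) * 0.1, (4, 1))   # phase grows along u
cands = find_candidates((3, 2), 0.5, phi_proj, F, T=0.05, eps=0.5)
```

A real implementation would vectorize this and step only along the epipolar line, but the acceptance test is the same pair of constraints.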
Preferably, in the step E, according to the local point set in the left camera image and the plurality of mapping point sets in the projector image, in combination with the calibration parameters, finding a plurality of mapping point sets corresponding to the local point set of the left camera image in the right camera image includes:
Step E1: acquiring the plurality of three-dimensional point sets in the world coordinate system corresponding to the plurality of mapping point sets of the projector image, from the calibration parameters of the left camera and the projector together with the local point set of the left camera image and the mapping point sets of the projector image;
Step E2: according to the calibration parameters of the projector and the right camera, converting the plurality of three-dimensional point sets in the world coordinate system into a plurality of mapping point sets in the right camera image coordinate system, and determining the conversion relation between the mapping point sets in the projector image and those in the right camera image.
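Steps E1 and E2 amount to triangulating each left-camera/projector correspondence into a world point and reprojecting it into the right camera. A minimal linear (DLT) triangulation sketch under toy 3×4 projection matrices (identity intrinsics and pure translations; all numbers are illustrative, not the patent's calibration):

```python
import numpy as np

def triangulate(P_a, P_b, pt_a, pt_b):
    """Linear triangulation (step E1) of one correspondence into a world
    point. P_a, P_b are 3x4 projection matrices; pt_a, pt_b are pixels."""
    x, y = pt_a
    u, v = pt_b
    A = np.vstack([
        x * P_a[2] - P_a[0],
        y * P_a[2] - P_a[1],
        u * P_b[2] - P_b[0],
        v * P_b[2] - P_b[1],
    ])
    _, _, Vt = np.linalg.svd(A)        # null vector of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]

def reproject(P, X):
    """Project the world point into another view (step E2)."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Toy rig: left camera at origin, projector shifted -1 in x, right camera +1 in x.
P_l = np.hstack([np.eye(3), np.zeros((3, 1))])
P_p = np.hstack([np.eye(3), np.array([[-1.], [0.], [0.]])])
P_r = np.hstack([np.eye(3), np.array([[1.], [0.], [0.]])])
X = triangulate(P_l, P_p, (0.0, 0.0), (-0.2, 0.0))   # world point (0, 0, 5)
pt_r = reproject(P_r, X)
```

Running this over every point of every candidate mapping point set yields the plurality of mapping point sets in the right camera image.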
Preferably, in the step F, the method includes:
based on a formula ten and a formula eleven, finding an optimal mapping point set in the plurality of mapping point sets of the right camera image according to the characteristics of the local point set in the left camera image;
$$P_n=\mathrm{Feature}(p_{n1},p_{n2},p_{n3},\ldots,p_{nm}) \qquad \text{(formula ten)}$$

$$P_{\mathrm{best}}(p_{b1},p_{b2},p_{b3},\ldots,p_{bm})=\mathrm{FindMin}(P_l-P_1,\,P_l-P_2,\,P_l-P_3,\ldots,P_l-P_n) \qquad \text{(formula eleven)}$$

wherein:
$P_n$ represents the feature representation of the nth mapping point set in the right camera image;
$p_{nm}$ represents the mth point in the nth mapping point set in the right camera image;
Feature(·) represents the function that computes the feature information of a mapping point set;
$P_{\mathrm{best}}$ represents the optimal mapping point set;
$p_{bm}$ represents the mth point in the optimal mapping point set;
$P_l$ represents the feature representation of the local point set of the left camera image;
FindMin(·) represents the function that returns the mapping point set corresponding to the minimum feature difference.
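Formulas ten and eleven leave Feature(·) open. As an illustrative stand-in, the sketch below uses the vector of wrapped-phase values at the points of a set as its feature and picks the candidate set with the smallest feature difference; the grid and point sets are toy data:

```python
import numpy as np

def feature(point_set, phase_map):
    """Stand-in for Feature(.) of formula ten: the wrapped-phase values
    at the points (u, v) of the set."""
    return np.array([phase_map[v, u] for (u, v) in point_set])

def find_best(local_feature, candidate_sets, phase_map):
    """FindMin(.) of formula eleven: the candidate mapping point set whose
    feature differs least from the local set's feature."""
    diffs = [np.linalg.norm(local_feature - feature(s, phase_map))
             for s in candidate_sets]
    return candidate_sets[int(np.argmin(diffs))]

phase_map = np.arange(12, dtype=float).reshape(3, 4)
local_feat = np.array([0.0, 1.0])               # feature of the left-camera local set
candidates = [[(0, 0), (1, 0)], [(2, 2), (3, 2)]]
best = find_best(local_feat, candidates, phase_map)
```

Any other point-set feature (intensity, gradient, neighborhood statistics) drops into the same FindMin structure.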
Preferably, in the step F, the method includes:
determining a mapping point set corresponding to the optimal mapping point set in the projector image according to the conversion relation based on the formula twelve;
$$P_p(p_1,p_2,p_3,\ldots,p_m)=\mathrm{FuncP}\left[(p_{b1},p_{b2},p_{b3},\ldots,p_{bm})\right] \qquad \text{(formula twelve)}$$

wherein:
$P_p$ represents the returned mapping point set in the projector image;
$p_m$ represents the mth point in the returned mapping point set of the projector image;
FuncP[·] represents the function that returns the mapping point set in the projector image corresponding to the mapping point set in the right camera image;
$p_{bm}$ represents the mth point in the optimal mapping point set.
Preferably, in the step F, the method includes:
solving, based on formula thirteen, the absolute phase of the mapping point set in the projector image using the mathematical relation between projector image pixel coordinates and absolute phase, determining the absolute phase of the local point set in the left camera image, and completing the absolute phase solution;
$$\Phi(p_m)=\frac{2\pi\,\mathrm{FuncX}(p_m)}{f} \qquad \text{(formula thirteen)}$$

wherein:
$\Phi(p_m)$ represents the absolute phase of the mth point in the mapping point set of the projector image;
FuncX(·) represents the function that returns the x coordinate of the pixel point;
f represents the number of pixels contained in a single fringe period;
$p_m$ represents the mth point in the returned mapping point set of the projector image.
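Since the projected fringes vary only along the projector's x axis, formula thirteen reduces the absolute phase of a mapped projector point to 2π·x/f. A minimal sketch (the point coordinates and the 32-pixel period are illustrative):

```python
import math

def absolute_phase(points, f):
    """Formula thirteen: absolute phase of each mapped projector point
    p_m = (u, v) is 2*pi*u/f, where f is pixels per fringe period."""
    return [2 * math.pi * u / f for (u, v) in points]

phases = absolute_phase([(32, 10), (16, 40)], f=32)   # one full period, half a period
```

These values are assigned back to the matched local point set of the left camera, which completes the absolute phase solution without any extra projected patterns.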
A multi-view structured light three-dimensional measurement system based on optimal mapping point set matching, which applies any one of the above multi-view structured light three-dimensional measurement methods based on optimal mapping point set matching, the system comprising:
a first unit for generating 3 sinusoidal fringe patterns according to a three-step phase shift method;
the second unit, used for projecting the 3 fringe patterns onto the surface of the object to be measured with the projector, the left camera and the right camera respectively capturing the 3 fringe patterns deformed by the object surface;
the third unit, used for respectively solving, according to the three-step phase shift method, the wrapped phase of the 3 fringe patterns generated for projection and of the 3 acquired fringe patterns;
the fourth unit is used for finding a plurality of mapping point sets corresponding to the local point set of the left camera image in the wrapping phase of the projector image by utilizing polar line constraint and phase constraint according to the wrapping phase of the left camera image in combination with the calibration parameters;
a fifth unit, configured to find, according to the local point set in the left camera image and the multiple mapping point sets in the projector image, multiple mapping point sets corresponding to the local point set of the left camera image in the right camera image in combination with the calibration parameter;
A sixth unit, configured to find an optimal mapping point set from the multiple mapping point sets of the right camera image according to a feature of the local point set in the left camera image, and determine a mapping point set corresponding to the optimal mapping point set in the projector image;
solving the absolute phase of a mapping point set in the projector image by utilizing the relation between the pixel coordinate and the absolute phase of the projector image, determining the absolute phase of a local point set in the left camera image, and completing the solution of the absolute phase;
and the seventh unit is used for reconstructing the three-dimensional point cloud according to the triangular distance measurement to build a three-dimensional model of the object to be measured.
The technical effect realized by the technical scheme of the invention is as follows:
the invention provides a multi-objective structured light three-dimensional measurement method and a system based on optimal mapping point set matching. Compared with a speckle matching method and a dual-frequency fringe matching method, the method does not need to project extra coding patterns in the aspect of projecting the number of patterns, and reduces the projection time and the calculation time; compared with a geometric constraint matching method, the phase matching of the medium-high frequency stripes can be completed in the aspect of stripe frequency, and the matching precision and the reconstruction precision are improved.
Drawings
FIG. 1 is a flow chart of a multi-view structured light three-dimensional measurement method based on optimal mapping point set matching according to one embodiment of the present invention;
FIG. 2 is a schematic diagram of the evolution of one embodiment of the present invention;
FIG. 3 is a block diagram of a multi-view structured light three-dimensional measurement system based on optimal mapping point set matching according to one embodiment of the present invention.
Detailed Description
The technical scheme of the invention is further explained by the specific implementation mode in combination with the attached drawings.
In the invention, firstly, 3 sinusoidal stripe patterns are generated according to the requirements of a three-step phase shift method, a projector projects the generated sinusoidal stripe patterns to the surface of a measured object, and a left camera and a right camera acquire the stripe patterns deformed on the surface of the measured object. And solving the wrapping phase for the 3 projected fringe patterns of the projector and the 3 fringe images collected by the left camera and the right camera by using a three-step phase shift method.
Further, according to the wrapped-phase images of the left camera, the right camera and the projector, combined with the calibration parameters and using the geometric constraint and the phase constraint, the local point set in the left camera image first finds a plurality of mapping point sets in the projector image. Then, a plurality of mapping point sets in the right camera image are found through the conversion relation between the mapping point sets in the projector image and those in the right camera image. The optimal mapping point set is selected among the mapping point sets in the right camera image according to the features of the local point set in the left camera image. Finally, the mapping point set corresponding to the optimal mapping point set is determined in the projector image from that conversion relation, and its absolute phase is solved using the relation between projector pixel coordinates and absolute phase; this absolute phase is the absolute phase of the local point set of the left camera, which completes the absolute phase solution.
The technical solution set forth above will be explained in detail with specific embodiments below;
specifically, as shown in fig. 1 and fig. 2, a multi-view structured light three-dimensional measurement method based on optimal mapping point set matching includes the following steps:
Step A: generating 3 sinusoidal fringe patterns according to the three-step phase shift method;
Step B: projecting the 3 fringe patterns onto the surface of the object to be measured with the projector, the left camera and the right camera respectively capturing the 3 fringe patterns deformed by the object surface;
Step C: solving the wrapped phase of the 3 fringe patterns generated for projection and of the 3 acquired fringe patterns respectively, according to the three-step phase shift method;
Step D: according to the wrapped phase of the left camera image, combined with the calibration parameters, finding a plurality of mapping point sets corresponding to the local point set of the left camera image in the wrapped phase of the projector image, using the epipolar constraint and the phase constraint;
Step E: according to the local point set in the left camera image and the plurality of mapping point sets in the projector image, combined with the calibration parameters, finding a plurality of mapping point sets corresponding to the local point set of the left camera image in the right camera image;
Step F: according to the features of the local point set in the left camera image, finding the optimal mapping point set among the plurality of mapping point sets of the right camera image, and determining the corresponding mapping point set of the optimal mapping point set in the projector image;
then solving the absolute phase of that mapping point set in the projector image using the relation between projector image pixel coordinates and absolute phase, thereby determining the absolute phase of the local point set in the left camera image and completing the absolute phase solution;
Step G: reconstructing a three-dimensional point cloud by triangulation to build a three-dimensional model of the object to be measured.
Preferably, step A includes generating the 3 sinusoidal fringe patterns according to the three-step phase shift method, based on formula one;
$$I_n(u,v)=A(u,v)+B(u,v)\cos\left(\varphi(u,v)+\frac{2\pi (n-1)}{3}\right),\quad n=0,1,2 \qquad \text{(formula one)}$$

wherein:
$I_n$ represents the nth fringe pattern projected by the projector, n = 0, 1, 2;
(u, v) represents the pixel coordinates of the fringe pattern projected by the projector;
A represents the mean intensity of the fringe pattern projected by the projector;
B represents the modulation intensity of the fringe pattern projected by the projector;
$\varphi(u,v)$ represents the wrapped phase of the fringe pattern projected by the projector.
Preferably, in step B, the projector projects the 3 fringe patterns onto the surface of the object to be measured, and the left camera and the right camera respectively capture the 3 fringe patterns deformed by the object surface;
the fringe patterns acquired by the left camera are expressed by formula two, and those acquired by the right camera by formula three;
$$I_n^{\mathrm{left}}(x,y)=A^{\mathrm{left}}(x,y)+B^{\mathrm{left}}(x,y)\cos\left(\varphi^{\mathrm{left}}(x,y)+\frac{2\pi (n-1)}{3}\right) \qquad \text{(formula two)}$$

$$I_n^{\mathrm{right}}(x,y)=A^{\mathrm{right}}(x,y)+B^{\mathrm{right}}(x,y)\cos\left(\varphi^{\mathrm{right}}(x,y)+\frac{2\pi (n-1)}{3}\right) \qquad \text{(formula three)}$$

wherein:
(x, y) represents the pixel coordinates of the acquired fringe pattern;
$I_n^{\mathrm{left}}$ represents the nth fringe pattern acquired by the left camera, n = 0, 1, 2;
$I_n^{\mathrm{right}}$ represents the nth fringe pattern acquired by the right camera, n = 0, 1, 2;
$A^{\mathrm{left}}$ represents the mean intensity of the fringe pattern acquired by the left camera;
$A^{\mathrm{right}}$ represents the mean intensity of the fringe pattern acquired by the right camera;
$B^{\mathrm{left}}$ represents the modulation intensity of the fringe pattern acquired by the left camera;
$B^{\mathrm{right}}$ represents the modulation intensity of the fringe pattern acquired by the right camera;
$\varphi^{\mathrm{left}}(x,y)$ represents the wrapped phase of the fringe pattern acquired by the left camera;
$\varphi^{\mathrm{right}}(x,y)$ represents the wrapped phase of the fringe pattern acquired by the right camera.
Preferably, step C includes, according to the three-step phase shift method, solving the wrapped phase of the 3 fringe patterns generated for projection and of the 3 acquired fringe patterns respectively: the wrapped phase of the projected fringe patterns is solved based on formula four, that of the fringe patterns acquired by the left camera based on formula five, and that of the fringe patterns acquired by the right camera based on formula six;
$$\varphi(u,v)=\arctan\left[\frac{\sqrt{3}\,\bigl(I_0(u,v)-I_2(u,v)\bigr)}{2I_1(u,v)-I_0(u,v)-I_2(u,v)}\right] \qquad \text{(formula four)}$$

$$\varphi^{\mathrm{left}}(x,y)=\arctan\left[\frac{\sqrt{3}\,\bigl(I_0^{\mathrm{left}}(x,y)-I_2^{\mathrm{left}}(x,y)\bigr)}{2I_1^{\mathrm{left}}(x,y)-I_0^{\mathrm{left}}(x,y)-I_2^{\mathrm{left}}(x,y)}\right] \qquad \text{(formula five)}$$

$$\varphi^{\mathrm{right}}(x,y)=\arctan\left[\frac{\sqrt{3}\,\bigl(I_0^{\mathrm{right}}(x,y)-I_2^{\mathrm{right}}(x,y)\bigr)}{2I_1^{\mathrm{right}}(x,y)-I_0^{\mathrm{right}}(x,y)-I_2^{\mathrm{right}}(x,y)}\right] \qquad \text{(formula six)}$$

wherein:
$\varphi(u,v)$ represents the wrapped phase of the fringe pattern projected by the projector;
$I_n$ represents the nth fringe pattern projected by the projector, n = 0, 1, 2;
(u, v) represents the pixel coordinates of the fringe pattern projected by the projector;
$\varphi^{\mathrm{left}}(x,y)$ represents the wrapped phase of the fringe pattern acquired by the left camera;
(x, y) represents the pixel coordinates of the acquired fringe pattern;
$I_n^{\mathrm{left}}$ represents the nth fringe pattern acquired by the left camera, n = 0, 1, 2;
$\varphi^{\mathrm{right}}(x,y)$ represents the wrapped phase of the fringe pattern acquired by the right camera;
$I_n^{\mathrm{right}}$ represents the nth fringe pattern acquired by the right camera, n = 0, 1, 2.
Preferably, in step D, according to the wrapped phase of the left camera image, in combination with the calibration parameters, using epipolar constraint and phase constraint to find a plurality of mapping point sets corresponding to the local point set of the left camera image in the wrapped phase of the projector image, the method includes:
Step D1: based on formula seven, obtaining the epipolar line in the projector image corresponding to a pixel point in the left camera image, using the fundamental matrix calibrated between the left camera and the projector, according to the epipolar constraint;
Step D2: based on formulas eight and nine, combining the epipolar constraint and the phase constraint, searching along the epipolar line in the projector image for the point corresponding to the pixel point in the left camera image;
Step D3: the local point set in the left camera image thereby finds a plurality of mapping point sets in the projector image;
C(u,v) = p_p(u,v,1)^T · F · p_l(x,y,1) --- formula seven;
D(u,v) = fabs[φ^p(u,v) − φ^l(x,y)] --- formula eight;
P_n(u,v) = p_p(u,v), if 0 ≤ D ≤ T --- formula nine;
wherein:
C(u,v) represents the position identification of the corresponding point;
p_l(x,y,1) represents a pixel point in the left camera image in homogeneous coordinates;
F represents the fundamental matrix obtained by calibration;
p_p(u,v,1) represents a pixel point in the projector image in homogeneous coordinates;
F · p_l(x,y,1) represents the epipolar line in the projector image onto which the pixel point p_l is mapped; if the pixel point p_p lies on this epipolar line, then C(u,v) = 0;
D(u,v) represents the phase error of the corresponding point, i.e. the absolute difference between the phase of the corresponding point in the projector image and the phase of the pixel point in the left camera image;
fabs[.] represents the absolute value function;
φ^p(u,v) represents the wrapped phase of the fringe pattern projected by the projector;
φ^l(x,y) represents the wrapped phase of the fringe pattern acquired by the left camera;
P_n(u,v) represents the set of all corresponding points in the projector image that satisfy both the phase constraint and the epipolar constraint;
T represents a threshold; D falls within the range [0, T].
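As a sketch of steps D1 and D2 (formulas seven to nine), one can scan the projector's wrapped-phase map along the epipolar line of a left-camera pixel and keep every pixel whose phase error is within the threshold T. The function name and the pixel-distance tolerance `line_tol` are illustrative, not from the patent:

```python
import numpy as np

def candidate_points(p_l, phi_l, F, phi_p, T=0.1, line_tol=0.5):
    # Epipolar line of p_l in the projector image: (a, b, c) = F @ (x, y, 1)
    a, b, c = F @ np.array([p_l[0], p_l[1], 1.0])
    v, u = np.indices(phi_p.shape)            # pixel grid (v = row, u = column)
    # Point-to-line distance |a*u + b*v + c| / sqrt(a^2 + b^2): epipolar constraint
    dist = np.abs(a * u + b * v + c) / np.hypot(a, b)
    # Phase error D = fabs[phi_p - phi_l]: phase constraint (formula eight)
    D = np.abs(phi_p - phi_l)
    # Keep the points satisfying both constraints (formula nine)
    mask = (dist <= line_tol) & (D <= T)
    return np.column_stack([u[mask], v[mask]])

# Toy projector phase map whose column u = 3 matches the query phase
phi_p = np.full((5, 6), 9.0)
phi_p[:, 3] = 0.5
F = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, -3.0]])             # maps p_l = (1, 0) to the line u = 3
cands = candidate_points((1.0, 0.0), 0.5, F, phi_p)
```

Because the wrapped phase repeats every fringe period, several disjoint segments of the epipolar line can pass the test, which is exactly why a plurality of candidate mapping point sets arises.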
Preferably, in the step E, finding, in the right camera image, a plurality of mapping point sets corresponding to the local point set of the left camera image according to the local point set in the left camera image and the plurality of mapping point sets in the projector image, in combination with the calibration parameters, includes:
step E1: obtaining a plurality of three-dimensional point sets in the world coordinate system from the local point set of the left camera image and the plurality of mapping point sets of the projector image, according to the calibration parameters of the left camera and the projector;
step E2: converting the plurality of three-dimensional point sets in the world coordinate system into a plurality of mapping point sets in the right camera image coordinate system according to the calibration parameters of the projector and the right camera, and determining the conversion relation between the mapping point sets in the projector image and those in the right camera image.
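Steps E1 and E2 amount to triangulating each left-camera/projector pixel pair into a world point and reprojecting it into the right camera. A minimal sketch using linear (DLT) triangulation, assuming 3×4 projection matrices P_l, P_p, P_r obtained from calibration (the matrices and the function name are illustrative assumptions):

```python
import numpy as np

def transfer_points(pts_l, pts_p, P_l, P_p, P_r):
    # Step E1: triangulate each matched pair to a world point; step E2:
    # reproject the world point with the right camera's projection matrix.
    out = []
    for (xl, yl), (up, vp) in zip(pts_l, pts_p):
        # Linear (DLT) triangulation: stack the four homogeneous equations
        A = np.array([
            xl * P_l[2] - P_l[0],
            yl * P_l[2] - P_l[1],
            up * P_p[2] - P_p[0],
            vp * P_p[2] - P_p[1],
        ])
        _, _, Vt = np.linalg.svd(A)
        X = Vt[-1]                      # world point, homogeneous (null vector)
        x = P_r @ X                     # reproject into the right camera
        out.append(x[:2] / x[2])
    return np.array(out)

# Toy rig: left camera at the origin, projector shifted by (1,0,0),
# right camera shifted by (-1,0,0); a point at depth 5 on the optical axis
P_l = np.hstack([np.eye(3), np.zeros((3, 1))])
P_p = np.array([[1.0, 0, 0, 1.0], [0, 1.0, 0, 0], [0, 0, 1.0, 0]])
P_r = np.array([[1.0, 0, 0, -1.0], [0, 1.0, 0, 0], [0, 0, 1.0, 0]])
right_pts = transfer_points([(0.0, 0.0)], [(0.2, 0.0)], P_l, P_p, P_r)
```

Applying this to every candidate set of the projector image yields the plurality of mapping point sets in the right camera image, with the pairing recorded so the conversion relation can be inverted later.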
Preferably, in the step F, the method includes:
finding, based on formula ten and formula eleven, an optimal mapping point set among the plurality of mapping point sets of the right camera image according to the features of the local point set in the left camera image;
P_n = Feature(p_n1, p_n2, p_n3, …, p_nm) --- formula ten;
P_best(p_b1, p_b2, p_b3, …, p_bm) = FindMin(P_l − P_1, P_l − P_2, P_l − P_3, …, P_l − P_n) --- formula eleven;
wherein:
P_n represents the feature representation of the nth mapping point set in the right camera image;
p_nm represents the mth point in the nth mapping point set in the right camera image;
Feature(.) represents a function for calculating the feature information of a point set, the feature information including grey level, modulation intensity, mean intensity, phase and the like;
P_best represents the optimal mapping point set;
p_bm represents the mth point in the optimal mapping point set;
P_l represents the feature representation of the local point set of the left camera image;
FindMin(.) represents a function that returns the mapping point set corresponding to the minimum feature difference.
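Formulas ten and eleven reduce to picking the candidate whose feature representation lies closest to the local set's. A sketch in which each point's features (grey level, modulation, mean intensity, phase) are stacked row-wise and the difference is measured with a Euclidean norm; the norm choice is an assumption, since the patent only requires a minimum of the feature difference:

```python
import numpy as np

def find_best_set(P_l, candidates):
    # FindMin over ||P_l - P_n||: return the index and the feature
    # representation of the optimal mapping point set (formula eleven)
    diffs = [np.linalg.norm(P_l - P_n) for P_n in candidates]
    best = int(np.argmin(diffs))
    return best, candidates[best]

# Local point set features vs. three candidate mapping point sets;
# columns: grey level, modulation, mean intensity, phase (one row per point)
P_l = np.array([[0.8, 0.3, 0.5, 1.2],
                [0.7, 0.4, 0.5, 1.3]])
cands = [P_l + 0.5, P_l + 0.01, P_l - 1.0]
best, P_best = find_best_set(P_l, cands)
```

Because the local set and each mapping set are in one-to-one point correspondence, the difference is taken element-wise, which is the reliability advantage over single-pixel or neighbourhood matching claimed by the invention.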
Preferably, in the step F, the method includes:
determining a mapping point set corresponding to the optimal mapping point set in the projector image according to the conversion relation based on the formula twelve;
P_p(p_1, p_2, p_3, …, p_m) = FuncP[(p_b1, p_b2, p_b3, …, p_bm)] --- formula twelve;
wherein:
P_p represents the returned mapping point set in the projector image;
p_m represents the mth point in the returned mapping point set of the projector image;
FuncP[.] represents a function that returns the mapping point set in the projector image corresponding to the mapping point set in the right camera image;
p_bm represents the mth point in the optimal mapping point set.
Preferably, in the step F, the method includes:
solving the absolute phase of a mapping point set in the projector image by utilizing the mathematical relation between the image pixel coordinate and the absolute phase of the projector based on the formula thirteen, determining the absolute phase of a local point set in the left camera image, and finishing the solution of the absolute phase;
Φ^p_m = 2π · FuncX(p_m) / f --- formula thirteen;
wherein:
Φ^p_m represents the absolute phase of the mth point in the mapping point set of the projector image;
FuncX(.) represents a function returning the x coordinate of a pixel point;
f represents the number of pixels contained in a single fringe period;
p_m represents the mth point in the returned mapping point set of the projector image.
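Formula thirteen is a direct proportion between the projector x coordinate and the absolute phase, since the projected pattern is known exactly. A one-line sketch:

```python
import numpy as np

def absolute_phase(x_p, f):
    # Formula thirteen: a fringe period of f pixels spans 2*pi of phase,
    # so the absolute phase grows linearly with the projector x coordinate.
    return 2.0 * np.pi * x_p / f

# With 16-pixel fringes, projector column 24 lies 1.5 periods in
phase_24 = absolute_phase(24, 16)
```

This is what lets the method skip projecting extra unwrapping patterns: once the optimal mapping point set is traced back to projector pixels, their absolute phase follows directly from the pixel coordinates.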
A multi-view structured light three-dimensional measurement system based on optimal mapping point set matching, applying any one of the above multi-view structured light three-dimensional measurement methods based on optimal mapping point set matching; as shown in fig. 3, the system includes:
a first unit for generating 3 sinusoidal fringe patterns according to a three-step phase shift method;
a second unit for projecting the 3 fringe patterns onto the surface of the object to be measured with the projector, and acquiring the 3 fringe patterns deformed by the surface of the object to be measured with the left camera and the right camera respectively;
the third unit is used for respectively solving the wrapping phase of the 3 fringe patterns generated by projection and the 3 collected fringe patterns according to a three-step phase shift method;
the fourth unit is used for finding a plurality of mapping point sets corresponding to the local point set of the left camera image in the wrapping phase of the projector image by utilizing polar line constraint and phase constraint according to the wrapping phase of the left camera image in combination with the calibration parameters;
a fifth unit, configured to find, according to the local point set in the left camera image and the multiple mapping point sets in the projector image, multiple mapping point sets corresponding to the local point set of the left camera image in the right camera image in combination with the calibration parameter;
a sixth unit, configured to find an optimal mapping point set from the multiple mapping point sets of the right camera image according to a feature of the local point set in the left camera image, and determine a mapping point set corresponding to the optimal mapping point set in the projector image;
solving the absolute phase of a mapping point set in the projector image by utilizing the relation between the pixel coordinate and the absolute phase of the projector image, determining the absolute phase of a local point set in the left camera image, and completing the solution of the absolute phase;
And the seventh unit is used for reconstructing the three-dimensional point cloud according to the triangular distance measurement to build a three-dimensional model of the object to be measured.
The invention provides a multi-view structured light three-dimensional measurement method based on optimal mapping point set matching. Based on the method, for a local point set in the left camera image, a plurality of mapping point sets are found in the projector image; these are then converted into a plurality of mapping point sets in the right camera image, so that the local point set in the left camera image corresponds to the plurality of mapping point sets in the right camera image. Matching is completed using the feature information of the local point set and the mapping point sets, where the point set feature information consists of grey level, modulation degree, mean intensity, phase and the like, so as to find the optimal mapping point set. The mapping point set in the projector image corresponding to the optimal mapping point set is then obtained from the conversion relation between the mapping point sets of the right camera image and those of the projector image, and its absolute phase is solved using the mathematical relation between the projector image pixel coordinates and the absolute phase. Finally, a three-dimensional point cloud is reconstructed by triangulation to build a three-dimensional model of the object to be measured and complete its three-dimensional measurement.
With the multi-view structured light three-dimensional measurement method and system based on optimal mapping point set matching, no extra fringe patterns need to be projected, which reduces both projection time and phase-solving time. In terms of matching, a point set composed of multiple pixel points carries more feature information than a single pixel point; moreover, unlike the neighbourhood of a single pixel point, all points of the local point set and of each mapping point set are in one-to-one correspondence, which improves the reliability of the point set features. The adopted matching method is therefore well founded: only the local point set in the left camera image and its corresponding mapping point sets in the right camera image need to be obtained, the optimal mapping point set can be found from the feature information of the local point set and the mapping point sets, and the solution of the absolute phase is completed.
The technical principles of the present invention have been described above with reference to specific embodiments. The description is made for the purpose of illustrating the principles of the invention and should not be construed in any way as limiting the scope of the invention. Based on the explanations herein, those skilled in the art will be able to conceive of other embodiments of the present invention without inventive effort, which would fall within the scope of the present invention.

Claims (10)

1. A multi-view structured light three-dimensional measurement method based on optimal mapping point set matching, characterized by comprising the following steps:
step A: generating 3 sinusoidal fringe patterns according to a three-step phase shift method;
step B: projecting the 3 fringe patterns onto the surface of the object to be measured with the projector, and acquiring the 3 fringe patterns deformed by the surface of the object to be measured with the left camera and the right camera respectively;
step C: solving the wrapped phase of the 3 projected fringe patterns and of the acquired fringe patterns respectively according to the three-step phase shift method;
step D: finding, in the wrapped phase of the projector image, a plurality of mapping point sets corresponding to the local point set of the left camera image by using epipolar constraint and phase constraint according to the wrapped phase of the left camera image, in combination with the calibration parameters;
step E: finding, in the right camera image, a plurality of mapping point sets corresponding to the local point set of the left camera image according to the local point set in the left camera image and the plurality of mapping point sets in the projector image, in combination with the calibration parameters;
step F: finding an optimal mapping point set among the plurality of mapping point sets of the right camera image according to the features of the local point set in the left camera image, and determining the mapping point set corresponding to the optimal mapping point set in the projector image;
solving the absolute phase of the mapping point set in the projector image by using the relation between the projector image pixel coordinates and the absolute phase, determining the absolute phase of the local point set in the left camera image, and completing the solution of the absolute phase;
step G: reconstructing a three-dimensional point cloud by triangulation to build a three-dimensional model of the object to be measured.
2. The method for the three-dimensional measurement of the multi-view structured light based on the optimal mapping point set matching according to claim 1, wherein:
in the step A, the 3 sinusoidal fringe patterns are generated according to the three-step phase shift method based on formula one;
I_n(u,v) = A + B · cos(φ^p(u,v) + 2πn/3) --- formula one;
wherein:
I_n represents the nth fringe pattern, n = 0, 1, 2;
(u,v) represents the pixel coordinates of the generated fringe pattern;
A represents the mean intensity of the fringe pattern projected by the projector;
B represents the modulation intensity of the fringe pattern projected by the projector;
φ^p(u,v) represents the wrapped phase of the fringe pattern projected by the projector.
3. The method for the three-dimensional measurement of the multi-view structured light based on the optimal mapping point set matching according to claim 1, wherein:
in the step B, the projector projects the 3 fringe patterns onto the surface of the object to be measured, and the left camera and the right camera respectively acquire the 3 fringe patterns deformed by the surface of the object to be measured;
the fringe pattern acquired by the left camera is expressed based on formula two, and the fringe pattern acquired by the right camera is expressed based on formula three;
I^l_n(x,y) = A^l(x,y) + B^l(x,y) · cos(φ^l(x,y) + 2πn/3) --- formula two;
I^r_n(x,y) = A^r(x,y) + B^r(x,y) · cos(φ^r(x,y) + 2πn/3) --- formula three;
wherein:
(x,y) represents the pixel coordinates of the acquired fringe pattern;
I^l_n represents the nth fringe pattern acquired by the left camera, n = 0, 1, 2;
I^r_n represents the nth fringe pattern acquired by the right camera, n = 0, 1, 2;
A^l(x,y) represents the mean intensity of the fringe pattern acquired by the left camera;
A^r(x,y) represents the mean intensity of the fringe pattern acquired by the right camera;
B^l(x,y) represents the modulation intensity of the fringe pattern acquired by the left camera;
B^r(x,y) represents the modulation intensity of the fringe pattern acquired by the right camera;
φ^l(x,y) represents the wrapped phase of the fringe pattern acquired by the left camera;
φ^r(x,y) represents the wrapped phase of the fringe pattern acquired by the right camera.
4. The method for the three-dimensional measurement of the multi-view structured light based on the optimal mapping point set matching according to claim 1, wherein:
in the step C, the wrapped phases of the 3 projected fringe patterns and of the acquired fringe patterns are solved respectively according to the three-step phase shift method: the wrapped phase of the projected fringe patterns is solved based on formula four, the wrapped phase of the fringe patterns acquired by the left camera is solved based on formula five, and the wrapped phase of the fringe patterns acquired by the right camera is solved based on formula six;
φ^p(u,v) = arctan[√3(I_2(u,v) − I_1(u,v)) / (2I_0(u,v) − I_1(u,v) − I_2(u,v))] --- formula four;
φ^l(x,y) = arctan[√3(I^l_2(x,y) − I^l_1(x,y)) / (2I^l_0(x,y) − I^l_1(x,y) − I^l_2(x,y))] --- formula five;
φ^r(x,y) = arctan[√3(I^r_2(x,y) − I^r_1(x,y)) / (2I^r_0(x,y) − I^r_1(x,y) − I^r_2(x,y))] --- formula six;
wherein:
φ^p(u,v) represents the wrapped phase of the fringe pattern projected by the projector;
I_n represents the nth fringe pattern projected by the projector, n = 0, 1, 2;
(u,v) represents the pixel coordinates of the fringe pattern projected by the projector;
φ^l(x,y) represents the wrapped phase of the fringe pattern acquired by the left camera;
(x,y) represents the pixel coordinates of the acquired fringe pattern;
I^l_n represents the nth fringe pattern acquired by the left camera, n = 0, 1, 2;
φ^r(x,y) represents the wrapped phase of the fringe pattern acquired by the right camera;
I^r_n represents the nth fringe pattern acquired by the right camera, n = 0, 1, 2.
5. The method for the three-dimensional measurement of the multi-view structured light based on the optimal mapping point set matching according to claim 1, wherein:
in the step D, finding, in the wrapped phase of the projector image, a plurality of mapping point sets corresponding to the local point set of the left camera image by using epipolar constraint and phase constraint according to the wrapped phase of the left camera image, in combination with the calibration parameters, includes:
step D1: obtaining, based on formula seven, the epipolar line in the projector image corresponding to a pixel point in the left camera image by using the fundamental matrix calibrated between the left camera and the projector, according to the epipolar constraint;
step D2: searching, based on formula eight and formula nine, for the corresponding point of the pixel point of the left camera image along the epipolar line direction in the projector image, combining the epipolar constraint and the phase constraint;
step D3: finding the plurality of mapping point sets in the projector image from the local point set in the left camera image;
C(u,v) = p_p(u,v,1)^T · F · p_l(x,y,1) --- formula seven;
D(u,v) = fabs[φ^p(u,v) − φ^l(x,y)] --- formula eight;
P_n(u,v) = p_p(u,v), if 0 ≤ D ≤ T --- formula nine;
wherein:
C(u,v) represents the position identification of the corresponding point;
p_l(x,y,1) represents a pixel point in the left camera image in homogeneous coordinates;
F represents the fundamental matrix obtained by calibration;
p_p(u,v,1) represents a pixel point in the projector image in homogeneous coordinates;
F · p_l(x,y,1) represents the epipolar line in the projector image onto which the pixel point p_l is mapped; if the pixel point p_p lies on this epipolar line, then C(u,v) = 0;
D(u,v) represents the phase error of the corresponding point, i.e. the absolute difference between the phase of the corresponding point in the projector image and the phase of the pixel point in the left camera image;
fabs[.] represents the absolute value function;
φ^p(u,v) represents the wrapped phase of the fringe pattern projected by the projector;
φ^l(x,y) represents the wrapped phase of the fringe pattern acquired by the left camera;
P_n(u,v) represents the set of all corresponding points in the projector image that satisfy both the phase constraint and the epipolar constraint;
T represents a threshold; D falls within the range [0, T].
6. The method for the three-dimensional measurement of the multi-view structured light based on the optimal mapping point set matching according to claim 1, wherein:
in the step E, finding, in the right camera image, a plurality of mapping point sets corresponding to the local point set of the left camera image according to the local point set in the left camera image and the plurality of mapping point sets in the projector image, in combination with the calibration parameters, includes:
step E1: obtaining a plurality of three-dimensional point sets in the world coordinate system from the local point set of the left camera image and the plurality of mapping point sets of the projector image, according to the calibration parameters of the left camera and the projector;
step E2: converting the plurality of three-dimensional point sets in the world coordinate system into a plurality of mapping point sets in the right camera image coordinate system according to the calibration parameters of the projector and the right camera, and determining the conversion relation between the mapping point sets in the projector image and those in the right camera image.
7. The method according to claim 6, wherein the method comprises:
in the step F, the method includes:
finding, based on formula ten and formula eleven, an optimal mapping point set among the plurality of mapping point sets of the right camera image according to the features of the local point set in the left camera image;
P_n = Feature(p_n1, p_n2, p_n3, …, p_nm) --- formula ten;
P_best(p_b1, p_b2, p_b3, …, p_bm) = FindMin(P_l − P_1, P_l − P_2, P_l − P_3, …, P_l − P_n) --- formula eleven;
wherein:
P_n represents the feature representation of the nth mapping point set in the right camera image;
p_nm represents the mth point in the nth mapping point set in the right camera image;
Feature(.) represents a function for calculating the feature information of a point set;
P_best represents the optimal mapping point set;
p_bm represents the mth point in the optimal mapping point set;
P_l represents the feature representation of the local point set of the left camera image;
FindMin(.) represents a function that returns the mapping point set corresponding to the minimum feature difference.
8. The method for the three-dimensional measurement of the multi-view structured light based on the optimal mapping point set matching according to claim 1, wherein:
in the step F, the method includes:
determining a mapping point set corresponding to the optimal mapping point set in the projector image according to the conversion relation based on the formula twelve;
P_p(p_1, p_2, p_3, …, p_m) = FuncP[(p_b1, p_b2, p_b3, …, p_bm)] --- formula twelve;
wherein:
P_p represents the returned mapping point set in the projector image;
p_m represents the mth point in the returned mapping point set of the projector image;
FuncP[.] represents a function that returns the mapping point set in the projector image corresponding to the mapping point set in the right camera image;
p_bm represents the mth point in the optimal mapping point set.
9. The method for the three-dimensional measurement of the multi-view structured light based on the optimal mapping point set matching according to claim 1, wherein:
in the step F, the method includes:
solving the absolute phase of a mapping point set in the projector image by utilizing the mathematical relation between the image pixel coordinate and the absolute phase of the projector based on the formula thirteen, determining the absolute phase of a local point set in the left camera image, and finishing the solution of the absolute phase;
Φ^p_m = 2π · FuncX(p_m) / f --- formula thirteen;
wherein:
Φ^p_m represents the absolute phase of the mth point in the mapping point set of the projector image;
FuncX(.) represents a function returning the x coordinate of a pixel point;
f represents the number of pixels contained in a single fringe period;
p_m represents the mth point in the returned mapping point set of the projector image.
10. A multi-view structured light three-dimensional measurement system based on optimal mapping point set matching, characterized in that: the system applies the multi-view structured light three-dimensional measurement method based on optimal mapping point set matching according to any one of claims 1 to 9, and the system comprises:
a first unit for generating 3 sinusoidal fringe patterns according to a three-step phase shift method;
a second unit for projecting the 3 fringe patterns onto the surface of the object to be measured with the projector, and acquiring the 3 fringe patterns deformed by the surface of the object to be measured with the left camera and the right camera respectively;
the third unit is used for respectively solving the wrapping phase of the 3 fringe patterns generated by projection and the 3 collected fringe patterns according to a three-step phase shift method;
the fourth unit is used for finding a plurality of mapping point sets corresponding to the local point set of the left camera image in the wrapping phase of the projector image by utilizing polar line constraint and phase constraint according to the wrapping phase of the left camera image in combination with the calibration parameters;
A fifth unit, configured to find, according to the local point set in the left camera image and the multiple mapping point sets in the projector image, multiple mapping point sets corresponding to the local point set of the left camera image in the right camera image in combination with the calibration parameter;
a sixth unit, configured to find an optimal mapping point set from the multiple mapping point sets of the right camera image according to a feature of the local point set in the left camera image, and determine a mapping point set corresponding to the optimal mapping point set in the projector image;
solving the absolute phase of a mapping point set in the projector image by utilizing the relation between the pixel coordinate and the absolute phase of the projector image, determining the absolute phase of a local point set in the left camera image, and completing the solution of the absolute phase;
and the seventh unit is used for reconstructing the three-dimensional point cloud according to the triangular distance measurement to build a three-dimensional model of the object to be measured.
CN202111567224.7A 2021-12-20 2021-12-20 Multi-view structured light three-dimensional measurement method and system based on optimal mapping point set matching Active CN114234852B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111567224.7A CN114234852B (en) 2021-12-20 2021-12-20 Multi-view structured light three-dimensional measurement method and system based on optimal mapping point set matching

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111567224.7A CN114234852B (en) 2021-12-20 2021-12-20 Multi-view structured light three-dimensional measurement method and system based on optimal mapping point set matching

Publications (2)

Publication Number Publication Date
CN114234852A CN114234852A (en) 2022-03-25
CN114234852B true CN114234852B (en) 2022-07-29

Family

ID=80759831

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111567224.7A Active CN114234852B (en) 2021-12-20 2021-12-20 Multi-view structured light three-dimensional measurement method and system based on optimal mapping point set matching

Country Status (1)

Country Link
CN (1) CN114234852B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114739322B (en) * 2022-06-09 2022-09-16 广东工业大学 Three-dimensional measurement method, equipment and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3199041B2 (en) * 1998-11-17 2001-08-13 日本電気株式会社 Three-dimensional shape measuring apparatus, method and recording medium
CN105547189B (en) * 2015-12-14 2018-01-23 南京航空航天大学 High-precision optical method for three-dimensional measurement based on mutative scale
CN109341536A (en) * 2018-09-25 2019-02-15 深圳市艾视铂智能技术有限公司 A kind of precision three-dimensional vision measuring method based on binocular camera and structured light projection
CN112504165A (en) * 2020-12-30 2021-03-16 南京理工大学智能计算成像研究院有限公司 Composite stereo phase unfolding method based on bilateral filtering optimization

Also Published As

Publication number Publication date
CN114234852A (en) 2022-03-25

Similar Documents

Publication Publication Date Title
CN110288642B (en) Three-dimensional object rapid reconstruction method based on camera array
CN110514143B (en) Stripe projection system calibration method based on reflector
CN111563564B (en) Speckle image pixel-by-pixel matching method based on deep learning
CN109506589B (en) Three-dimensional profile measuring method based on structural light field imaging
CN104197861B (en) Three-dimension digital imaging method based on structure light gray scale vector
CN111473744B (en) Three-dimensional shape vision measurement method and system based on speckle embedded phase shift stripe
CN107167073A (en) A kind of three-dimensional rapid measurement device of linear array structure light and its measuring method
CN104596439A (en) Speckle matching and three-dimensional measuring method based on phase information aiding
CN109307483A (en) A kind of phase developing method based on structured-light system geometrical constraint
CN113506348B (en) Gray code-assisted three-dimensional coordinate calculation method
CN113465545B (en) Three-dimensional measurement system based on high-speed LED array and measurement method thereof
WO2020199439A1 (en) Single- and dual-camera hybrid measurement-based three-dimensional point cloud computing method
CN110174079A (en) A kind of three-dimensional rebuilding method based on the code-shaped area-structure light of four-step phase-shifting
CN113763540A (en) Three-dimensional reconstruction method and equipment based on speckle fringe hybrid modulation
CN114234852B (en) Multi-view structured light three-dimensional measurement method and system based on optimal mapping point set matching
CN112945089A (en) Structured light coding method based on stripe width modulation
CN117450955B (en) Three-dimensional measurement method for thin object based on space annular feature
CN113345039B (en) Three-dimensional reconstruction quantization structure optical phase image coding method
CN107504919B (en) Wrapped phase three-dimension digital imaging method and device based on phase mapping
CN115290004B (en) Underwater parallel single-pixel imaging method based on compressed sensing and HSI
CN114252026B (en) Three-dimensional measurement method and system for modulating three-dimensional code on periodic edge
CN113551617B (en) Binocular double-frequency complementary three-dimensional surface type measuring method based on fringe projection
CN114234850B (en) Three-dimensional measurement method for modulation order phase at cycle edge
CN113450460A (en) Phase-expansion-free three-dimensional face reconstruction method and system based on face shape space distribution
CN111815697A (en) Dynamic three-dimensional measurement method for thermal deformation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant