CN110207614B - High-resolution high-precision measurement system and method based on double telecentric camera matching - Google Patents

High-resolution high-precision measurement system and method based on double telecentric camera matching

Info

Publication number: CN110207614B
Application number: CN201910448519.9A
Authority: CN (China)
Prior art keywords: phase, telecentric camera, camera, telecentric, expressed
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN110207614A
Inventors: 张玉珍, 梁一超, 胡岩, 左超, 陈钱, 冯世杰, 尹维, 张良, 钱佳铭, 顾国华
Current Assignee: Nanjing University of Science and Technology
Original Assignee: Nanjing University of Science and Technology
Application filed 2019-05-28 by Nanjing University of Science and Technology
Priority to CN201910448519.9A
Publication of CN110207614A, later granted as CN110207614B

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures

Abstract

The invention discloses a high-resolution, high-precision measurement system and method based on double telecentric camera matching. A first telecentric camera and a second telecentric camera are positioned on the two sides of a projector, symmetric about the projector's central axis; both telecentric cameras are connected to a computer through USB data lines, and the projector is connected to the two cameras through trigger lines. The projector projects multi-frequency multi-phase-shift fringes and simultaneously generates synchronous trigger signals that drive the first telecentric camera and the second telecentric camera to shoot synchronously. The two cameras capture the modulated phase-shift images of the measured object and transmit them to the computer, which unwraps the phases of the modulated images and then performs lens distortion correction and epipolar rectification on the unwrapped phases; using a stereo phase matching algorithm and a three-dimensional reconstruction algorithm, high-resolution, high-precision three-dimensional profile information of a small measured object within a 24 mm × 18 mm range can be measured.

Description

High-resolution high-precision measurement system and method based on double telecentric camera matching
Technical Field
The invention belongs to the technical field of three-dimensional measurement, and particularly relates to a high-resolution and high-precision measurement method based on double telecentric camera matching.
Background
At present, industries in China such as integrated circuits, micro sensors and precision instruments have taken initial shape, but product testing technology lags behind and lacks efficient, high-precision inspection tools for fully packaged products. Efficient, high-precision three-dimensional shape measurement technology can be widely used for quality monitoring of such products and can effectively improve production efficiency and production precision.
For three-dimensional topography measurement over a relatively small field of view, auxiliary lenses are required. The first method is to use a stereomicroscope (Y. Hu, Q. Chen, T. Tao, H. Li, and C. Zuo, "Absolute three-dimensional micro surface profile measurement based on a Greenough-type stereomicroscope," Measurement Science and Technology 28, 045004 (2017)), but because of the complexity of the microscope's optical path and the limited range of samples that can be measured, calibrating a stereomicroscope becomes very difficult. The second approach is to use a telecentric lens, which increases the depth of field of the system, the magnification of the imaging surface along the optical axis being constant. Preliminary studies of three-dimensional topography measurement using telecentric lenses are based on models of a single camera combined with a projector. However, the projector in such a single-camera-plus-projector system must be calibrated by means of an already calibrated camera (H. Liu, H. Lin, and L. Yao, "Calibration method for projector-camera-based telecentric fringe projection profilometry system," Optics Express 25, 31492-31508 (2017)), which is complicated. In addition, because of the projector's gamma nonlinearity, the deformation of the lens, and the misalignment of the additional auxiliary lens, the calibration accuracy cannot be guaranteed.
Because objects differ in surface morphology, some inevitably have high surface reflectivity, and these intensity-saturated surfaces are difficult to reconstruct (S. Feng, L. Zhang, C. Zuo, T. Tao, Q. Chen, and G. Gu, "High dynamic range 3D measurements with fringe projection profilometry: a review," Measurement Science and Technology 29, 122001 (2018)): the intensity resolution of the camera is limited, the actual intensity cannot be recorded correctly, and errors therefore appear in the three-dimensional reconstruction. For surfaces with intensity saturation, the problem can in some cases not be solved simply by reducing the exposure time or the projected light intensity. With a fixed number of phase-shift fringes, fewer than three unsaturated samples may remain at a pixel, so the saturation cannot be overcome and the calculated phase contains errors. In addition, the phase error obtained with common single-period fringes or a three-step phase shift is large, so the accuracy of the reconstructed data is low, while multi-frequency multi-step phase shifting lengthens the measurement time of the system.
Disclosure of Invention
The invention aims to provide a high-resolution, high-precision measurement system and method based on double telecentric camera matching that abandon calibration of the projector, eliminating the extra errors the projector introduces: only the positions of the system's first telecentric camera and second telecentric camera and the spatial height of the measurement range are calibrated, yielding the calibration parameters for the X-Y axes and the height parameter for the Z axis, respectively.
The technical solution realizing the purpose of the invention is as follows. The high-resolution, high-precision measurement system based on double telecentric camera matching comprises a first telecentric camera, a second telecentric camera, a projector and a lifting displacement table, wherein the first telecentric camera and the second telecentric camera are positioned on the two sides of the projector, symmetric about its central axis. The projector projects multi-frequency multi-phase-shift fringes and simultaneously generates synchronous trigger signals that drive the first telecentric camera and the second telecentric camera to shoot synchronously. The two cameras capture the modulated phase-shift images of the measured object and transmit them to the computer, which unwraps the phases of the modulated images and then performs lens distortion correction and epipolar rectification on the unwrapped phases; using a stereo phase matching algorithm and a three-dimensional reconstruction algorithm, high-resolution, high-precision three-dimensional profile information of a small measured object within a 24 mm × 18 mm range can be measured.
Compared with the prior art, the invention has the following remarkable advantages: (1) the target surface size of both the first telecentric camera and the second telecentric camera of the system is 7.1208 mm × 5.3268 mm, the shooting field of view is relatively large and the resolution is high, so the system is very suitable for small objects within 24 mm × 18 mm. (2) The combination of phase-shift profilometry and the generalized phase-shift method in the present method suppresses intensity saturation in the image and calculates high-precision three-dimensional coordinates at intensity-saturated points. (3) The calibration is simple, no projector needs to be calibrated, and the precision of the three-dimensional reconstruction reaches 0.0021 mm.
The present invention is described in further detail below with reference to the attached drawing figures.
Drawings
Fig. 1 is a schematic flow chart of a high-resolution and high-precision measurement method based on double telecentric camera matching.
Fig. 2 is a schematic diagram of a high-resolution high-precision measurement system matched with a double telecentric camera.
Fig. 3 shows the measurement results in the first complex scene: (a) and (b) show the measurement scenes of the micro calibration plate and the ceramic plate, respectively; (c) and (d) show their measured point-cloud results; and (e) and (f) show the histogram of distances between adjacent points in (c) and the histogram of fitting errors in (d), respectively.
Fig. 4 shows the measurement results in the second complex scene: (a) is the measurement scene of a watch buckle, and (b), (c) and (d) are the results obtained by the single-period three-step phase-shift method, by four groups of twelve-step phase-shift profilometry with different wavelengths, and by the system of the present invention measuring the highlight buckle, respectively.
Detailed Description
With reference to fig. 2, the high-resolution high-precision measurement system based on double telecentric camera matching comprises a left high-resolution black-and-white telecentric camera, hereinafter the first telecentric camera 1 (also called the left telecentric camera), a right high-resolution black-and-white telecentric camera, hereinafter the second telecentric camera 2 (also called the right telecentric camera), a color projector, hereinafter the projector 3, and a precision lifting displacement table 4. The first telecentric camera 1 and the second telecentric camera 2 are positioned on the two sides of the projector 3, symmetric about its central axis; both cameras are connected to a computer through USB data lines, and the projector 3 is connected to the two cameras through trigger lines. The projector 3 projects multi-frequency multi-phase-shift fringes and simultaneously generates synchronous trigger signals that drive the first telecentric camera 1 and the second telecentric camera 2 to shoot synchronously. The two cameras capture the modulated phase-shift images of the measured object and transmit them to the computer, which unwraps the phases of the modulated images and then performs lens distortion correction and epipolar rectification on the unwrapped phases; with a stereo phase matching algorithm and a three-dimensional reconstruction algorithm, high-resolution high-precision three-dimensional profile information of a small measured object within a 24 mm × 18 mm range can be measured.
The positions and heights of the precision lifting displacement table and the left and right telecentric cameras in the measurement system are optimized as follows. Because the depth of field of a telecentric lens is relatively small, the position of the left telecentric camera is adjusted first to keep the measured object in sharp focus: the left camera is brought as close to the projector as possible without its field of view being blocked by the projector; the included angle between the left camera and the projector is fine-tuned using the position of the projector's projected pattern as seen in the computer's camera software, so that the centre of the left camera's field of view coincides with the centre of the projection; and the height of the left camera is adjusted upward until the pattern it captures in the camera software is sharp everywhere. The right telecentric camera is then placed symmetrically to the left camera about the projector's central axis; its included angle with the projector is fine-tuned according to the position of the measured object as captured in the camera software, so that the centre of its field of view also coincides with the centre of the projection, and its height is adjusted upward until the pattern it captures is sharp over the full field of view. At this point the fields of the left and right telecentric cameras roughly coincide. Finally, by judging whether the projected patterns captured by the left and right cameras in the camera software are parallel, the rotation angles of the two cameras are finely adjusted so that the fields of view they capture are as consistent as possible.
The high-resolution high-precision measurement method based on double telecentric cameras of the invention first requires building the measurement system. The system comprises two high-resolution black-and-white cameras (model Basler acA2040-120um, pixel size 3.45 µm, maximum frame rate 120 fps, resolution 2048 × 1536); a color projector (model LightCrafter 4500Pro, resolution 912 × 1140, maximum speed 120 Hz); two telecentric lenses (model XF-UTL-0296X175, magnification 0.296, depth of field 16.1 mm, spatial resolution 31.2 µm); and a precision lifting displacement table. The devices are connected and placed as follows: the projector is connected to the first telecentric camera 1 and the second telecentric camera 2 through two trigger lines, and the two cameras are connected to the computer through two data lines. Because the depth of field of a telecentric lens is relatively small, to keep the measured object in sharp focus the first telecentric camera 1 and the second telecentric camera 2 are placed at matching positions on the two sides of the projector, as close to it as possible, with their viewing angles overlapping; the lens heights are adjusted to keep the depth-of-field ranges of the two lenses maximally overlapped. The precision lifting displacement table is placed directly under the measured viewing angle, with its height at the bottom of the depth of field of the two lenses. All components are arranged as shown in fig. 2.
With reference to fig. 1, the high-resolution high-precision measurement method based on double telecentric camera matching of the invention has the following steps:
Step one: finely adjust the spatial positions and angles of the first telecentric camera 1 and the second telecentric camera 2 in the measurement system, completing the position calibration between the two cameras and the height calibration of the measurement spatial range. Because the magnification of a telecentric lens does not change within a certain object-distance range, the height of the measurement space of the measured object must also be calibrated after the positions of the first telecentric camera 1 and the second telecentric camera 2 are calibrated. The two cameras are calibrated in at least three poses, each pose comprising two positions whose heights differ by 1 mm: after the calibration pattern at the reference-surface position is shot, the precision lifting displacement table is finely raised by 1 mm, the micro calibration plate is shot and calibrated at that position, and the computer computes the two calibration images to complete the calibration of the spatial range. The whole system is calibrated in a unified world coordinate system, yielding the calibration parameters of the first telecentric camera 1 and the second telecentric camera 2 in that coordinate system.
Step two: calculate the wrapped phase of the measured object with phase-shift profilometry, count the number of saturated fringe samples for each pixel, replace the phase-shift profilometry result with the generalized phase-shift result for pixels exceeding a specific count, and then unwrap the high-frequency wrapped phases in sequence using the low-frequency wrapped phases to obtain the unwrapped phase of the measured object. Concretely, the wrapped phase is first obtained by phase-shift profilometry and replaced, where the conditions are not met, by the wrapped phase solved with the generalized phase-shift method. The number of intensity-saturated samples of each pixel in each group of twelve-step phase-shift fringe images is counted, the pixel saturation threshold generally being set just below 255. For pixel positions with a saturation count less than or equal to 3, the wrapped phase is calculated by phase-shift profilometry; for pixel positions with a saturation count greater than 3 and no greater than 9, the wrapped phase is calculated by the generalized phase-shift method; for pixel positions with a saturation count greater than 9, the saturation is considered so severe that the remaining phase-shift information is insufficient to solve the wrapped phase (a forced solution would deviate from the true value), so the wrapped phase there is set to nan and the point takes no part in subsequent operations.
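As a compact illustration of the low-to-high-frequency unwrapping used at the end of this step, the following Python/NumPy sketch unwraps each shorter-wavelength wrapped phase with the already unwrapped longer-wavelength phase (a minimal sketch; the function and variable names are illustrative, not from the patent):

```python
import numpy as np

def unwrap_temporal(phi_coarse, phi_fine_wrapped, wl_coarse, wl_fine):
    """Multi-wavelength temporal unwrapping: lift the wrapped phase of the
    shorter-wavelength fringes using an already unwrapped coarser phase."""
    scale = wl_coarse / wl_fine                       # fine periods per coarse period
    k = np.round((phi_coarse * scale - phi_fine_wrapped) / (2 * np.pi))
    return phi_fine_wrapped + 2 * np.pi * k           # unwrapped fine phase

# Four wavelength groups ordered from longest to shortest wavelength;
# the single-period (longest-wavelength) phase needs no unwrapping:
# for i in range(1, 4):
#     phases[i] = unwrap_temporal(phases[i - 1], phases[i],
#                                 wavelengths[i - 1], wavelengths[i])
```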
By integrating the wrapped phases obtained by the above methods and then unwrapping them sequentially as described, the unwrapped phases are obtained. The four groups of phase-shift fringe images with different wavelengths acquired by the first telecentric camera 1 and the second telecentric camera 2 can be represented as:

$$I_n^m(u,v) = A^C(u,v) + B^C(u,v)\cos\left[\Phi^m(u,v) - \delta_n\right]$$

where m indexes the four groups of twelve-step phase-shift fringes with different wavelengths, n indexes the phase-shift fringe patterns within the twelve-step phase shift, $I_n^m(u,v)$ is the nth phase-shift fringe pattern of the mth wavelength group shot by the first telecentric camera 1 and the second telecentric camera 2, $A^C$ is the average light intensity of the phase-shift fringes obtained by the two cameras, $B^C$ is the modulation light intensity of the phase-shift fringes obtained by the two cameras, $\Phi^m(u,v)$ is the absolute phase of the mth group of wavelength phase-shift fringe patterns, and N may be selected as 12 in the method of the invention.
To judge the number of intensity-saturated samples among the twelve values of the same pixel in the twelve-step phase-shift patterns, the saturation count at pixel (u,v) of the mth group of wavelength phase-shift fringe patterns, $\mathrm{sat\_map}_m(u,v)$, is first initialized to 0; the twelve phase-shift fringe patterns of the mth group are then traversed, and whenever the condition

$$I_n^m(u,v) \geq \mathrm{sat\_thr}$$

is met:

$$\mathrm{sat\_map}_m(u,v) = \mathrm{sat\_map}_m(u,v) + 1$$

where $\mathrm{sat\_map}_m(u,v)$ is the number of intensity-saturated samples at pixel (u,v) of the twelve-step phase shift of the mth group of wavelength phase-shift fringe patterns, $\mathrm{sat\_thr}$ is the maximum value of the usable intensity range, $I_n^m(u,v)$ is the intensity value at pixel (u,v) of the nth phase-shift fringe pattern of the mth group, and $\phi^m(u,v)$ denotes the wrapped phase at pixel (u,v) of the twelve-step phase shift of the mth group of wavelength phase-shift fringe patterns.
Then, according to which condition $\mathrm{sat\_map}_m(u,v)$ meets, the wrapped phase of the mth group of wavelength phase-shift fringe patterns of the measured object is calculated:

$$\phi^m(u,v) = \begin{cases} \phi^m_{PSP}(u,v), & \mathrm{sat\_map}_m(u,v) \leq 3 \\ \phi^m_{GPS}(u,v), & 3 < \mathrm{sat\_map}_m(u,v) \leq 9 \\ \mathrm{nan}, & \mathrm{sat\_map}_m(u,v) > 9 \end{cases}$$

where $\phi^m_{PSP}(u,v)$ is the wrapped phase at pixel (u,v) of the mth group of wavelength phase-shift fringe patterns solved with phase-shift profilometry, $\phi^m_{GPS}(u,v)$ is the wrapped phase at the same pixel solved with the generalized phase-shift method, and nan (not a number) indicates that the pixel is excluded from the final phase result.
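This per-pixel selection can be sketched as follows (a minimal Python/NumPy illustration, assuming 8-bit images; `psp_wrapped_phase` and `gps_wrapped_phase` are the illustrative helpers sketched after their respective formulas below):

```python
import numpy as np

SAT_THR = 255        # intensity value treated as saturated (8-bit camera)

def wrapped_phase_with_saturation(I, delta):
    """Combine the PSP and GPS solvers according to the per-pixel
    saturation count, following the piecewise formula above.
    I: (N, H, W) stack of one wavelength group's phase-shift images.
    delta: (N,) phase-shift amounts."""
    sat_map = (I >= SAT_THR).sum(axis=0)              # saturated samples per pixel
    phi = psp_wrapped_phase(I.astype(float), delta)   # PSP everywhere
    gps = gps_wrapped_phase(I, delta, SAT_THR)        # GPS from unsaturated samples
    phi = np.where(sat_map <= 3, phi, gps)            # count <= 3: keep PSP
    phi = np.where(sat_map > 9, np.nan, phi)          # count > 9: exclude pixel
    return phi
```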
Phase-shift profilometry solves the wrapped phase of an N-step phase shift as follows:

$$\phi^m(u,v) = \arctan\frac{\sum_{n=1}^{N} I_n^m(u,v)\sin\delta_n}{\sum_{n=1}^{N} I_n^m(u,v)\cos\delta_n}$$

where $\delta_n = (n-1)\cdot 2\pi/N$ is the phase-shift amount of the nth fringe pattern; N may be 12 in the method of the invention.
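A minimal NumPy sketch of this N-step solver, assuming the fringe model $I_n = A + B\cos(\Phi - \delta_n)$ given above (names are illustrative):

```python
import numpy as np

def psp_wrapped_phase(I, delta):
    """N-step phase-shift profilometry; returns the wrapped phase in (-pi, pi].
    I: (N, H, W) float image stack, delta: (N,) shifts delta_n = 2*pi*(n-1)/N."""
    s = np.tensordot(np.sin(delta), I, axes=(0, 0))   # sum_n I_n * sin(delta_n)
    c = np.tensordot(np.cos(delta), I, axes=(0, 0))   # sum_n I_n * cos(delta_n)
    return np.arctan2(s, c)                           # atan2 keeps the full quadrant
```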
The generalized phase-shift calculation of the wrapped phase can be expressed as a least-squares fit of the unsaturated intensity samples at each pixel to $I_n = a + b\cos\delta_n + c\sin\delta_n$:

$$A = \begin{bmatrix} N' & \sum_n \cos\delta_n & \sum_n \sin\delta_n \\ \sum_n \cos\delta_n & \sum_n \cos^2\delta_n & \sum_n \cos\delta_n\sin\delta_n \\ \sum_n \sin\delta_n & \sum_n \cos\delta_n\sin\delta_n & \sum_n \sin^2\delta_n \end{bmatrix}, \qquad C = A^{-1}$$

$$\begin{bmatrix} a \\ b \\ c \end{bmatrix} = C \begin{bmatrix} \sum_n I_n^m(u,v) \\ \sum_n I_n^m(u,v)\cos\delta_n \\ \sum_n I_n^m(u,v)\sin\delta_n \end{bmatrix}, \qquad \phi^m_{GPS}(u,v) = \arctan\frac{c}{b}$$

where the sums run over the N' unsaturated samples at the pixel, the matrix C is the inverse of the matrix A, and $C_{11}$, $C_{12}$, $C_{13}$, $C_{21}$, $C_{22}$, $C_{23}$, $C_{31}$, $C_{32}$, $C_{33}$ are the values at the corresponding positions in matrix C.
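Numerically, the same least-squares solution is obtained without forming C explicitly by handing the overdetermined system to a library solver; the following sketch is an equivalent formulation under the stated fringe model (names are illustrative, and the per-pixel loop is kept for clarity rather than speed):

```python
import numpy as np

def gps_wrapped_phase(I, delta, sat_thr=255):
    """Generalized phase shift: per pixel, fit I_n = a + b*cos(d_n) + c*sin(d_n)
    over the unsaturated samples only, then phi = atan2(c, b)."""
    N, H, W = I.shape
    phi = np.full((H, W), np.nan)
    for v in range(H):
        for u in range(W):
            keep = I[:, v, u] < sat_thr               # discard saturated samples
            if keep.sum() < 3:                        # 3 unknowns need >= 3 samples
                continue
            d = delta[keep]
            M = np.stack([np.ones_like(d), np.cos(d), np.sin(d)], axis=1)
            a, b, c = np.linalg.lstsq(M, I[keep, v, u].astype(float), rcond=None)[0]
            phi[v, u] = np.arctan2(c, b)
    return phi
```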
Step three: distortion correction is performed on the unwrapped phases obtained by the first telecentric camera 1 and the second telecentric camera 2. To complete the binocular distortion correction and obtain a high-precision phase result, radial distortion correction based on the lens model is applied to the unwrapped phase; the radial distortion correction formula is as follows:
$$\begin{bmatrix} u_d \\ v_d \end{bmatrix} = \lambda \begin{bmatrix} u_u - u_0 \\ v_v - v_0 \end{bmatrix} + \begin{bmatrix} u_0 \\ v_0 \end{bmatrix}$$

$$\lambda = 1 + k_1 r^2 + k_2 r^4 + k_3 r^6$$

where $(u_d, v_d)$ is the point after radial distortion correction in the coordinates of the image shot by the telecentric camera, $(u_u, v_v)$ is the coordinate of any point before radial distortion correction, $(u_0, v_0)$ is the coordinate of the centre point before radial distortion correction, $\lambda$ is the distortion coefficient of the radial distortion, $k_1, k_2, k_3$ are the distortion coefficients of each order, and r is the distance from the point $(u_u, v_v)$ to $(u_0, v_0)$, which can be expressed as:

$$r^2 = (u_u - u_0)^2 + (v_v - v_0)^2$$
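A direct transcription of this correction (a sketch; parameter names are illustrative):

```python
def undistort_radial(uu, vv, u0, v0, k1, k2, k3):
    """Radial distortion mapping from the formula above: the point (uu, vv)
    is displaced radially about the centre (u0, v0) by the factor lambda."""
    r2 = (uu - u0) ** 2 + (vv - v0) ** 2              # squared radial distance
    lam = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3   # lambda = 1 + k1 r^2 + k2 r^4 + k3 r^6
    return u0 + lam * (uu - u0), v0 + lam * (vv - v0)
```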
and step four, using three-dimensional phase matching to find a point after the phase correction of the first telecentric camera 1 corresponds to a matching point after the phase correction of the second telecentric camera 2. Completing polar line correction and phase three-dimensional matching, and for accurately finding a matching value corresponding to the first telecentric camera 1 and the second telecentric camera 2, performing polar line correction on the unfolded phase by using a model based on the combination of a single camera and a projector, so that the unfolded phase is a common view field of the telecentric camera, wherein the following formulas are respectively expressed as a camera mapping model based on perspective transformation, a telecentric camera mapping model after polar line correction, and a formula for finding a matching point on an image shot by the first telecentric camera 1 and a formula for finding a matching point on an image shot by the second telecentric camera 2 by using a relation between the two models:
$$p_L = A_L \begin{bmatrix} R_{L2\times3} & t_{L2\times1} \\ 0_{1\times3} & 1 \end{bmatrix} P, \qquad A_L = \begin{bmatrix} m & 0 & 0 \\ 0 & m & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

The above formula is the camera mapping model based on perspective transformation, where the subscript L denotes the first telecentric camera 1 and the subscript R denotes the second telecentric camera 2; $p_L$ is the homogeneous pixel coordinate of the image point shot by the first telecentric camera 1, P is the homogeneous world coordinate of the point in space, $A_L$ is the intrinsic matrix of the first telecentric camera 1, m is the telecentric camera magnification, the bracketed matrix is the rotation-translation matrix converting world coordinates to the coordinates of the first telecentric camera 1, $R_{L2\times3}$ is the rotation matrix of the first telecentric camera 1, and $t_{L2\times1}$ is its translation matrix.
$$p'_L = A'_L \begin{bmatrix} R'_{L2\times3} & t'_{L2\times1} \\ 0_{1\times3} & 1 \end{bmatrix} P$$

The above formula is the mapping model of the first telecentric camera 1 and the second telecentric camera 2 after epipolar rectification, where $p'_L$ is the pixel coordinate of the first telecentric camera 1 after epipolar rectification, $A'_L$ is the intrinsic matrix of the first telecentric camera 1 after epipolar rectification, the bracketed matrix is its rectified rotation-translation matrix, $R'_{L2\times3}$ is the rectified rotation matrix, and $t'_{L2\times1}$ is the rectified translation matrix of the first telecentric camera 1.

According to the characteristics that the magnification of a telecentric lens image does not change and that the three coordinate axes are orthogonal, the relation between the intrinsic parameters and rotation-translation matrices of the rectified model and those before rectification can be obtained:

$$A'_L = A'_R = (A_L + A_R)/2$$

$$r'_{xL} = \frac{\tau_L}{\|\tau_L\|}, \qquad r'_{yL} = \frac{r_{zL}\times r'_{xL}}{\|r_{zL}\times r'_{xL}\|}, \qquad r'_{zL} = r'_{xL}\times r'_{yL}$$

$$R'_L = R'_R = \begin{bmatrix} r'_{xL} & r'_{yL} & r'_{zL} \end{bmatrix}^T, \qquad t'_L = \begin{bmatrix} \tau_{xL} \\ \tau_{yL} \end{bmatrix}$$

where $r'_{zL}$ is the component of the rectified rotation matrix of the first telecentric camera 1 along the Z-axis direction, $r'_{yL}$ the component along the Y-axis direction, $r'_{xL}$ the component along the X-axis direction, and $\tau_{xL}$, $\tau_{yL}$, $\tau_{zL}$ are the components of the translation-matrix intermediate variable $\tau_L$ of the first telecentric camera 1 along the X-, Y- and Z-axis directions.
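For intuition only, the sketch below shows the standard Fusiello-style construction of a common rectified rotation for a perspective stereo pair; this is an assumption used for illustration, since the patent's telecentric model works with 2×3 rotation matrices and its exact construction may differ:

```python
import numpy as np

def rectified_rotation(R_L, R_R, t_L, t_R):
    """Build one rotation shared by both rectified cameras: the new x-axis
    follows the baseline, so corresponding points end up on the same row.
    R_L, R_R: (3, 3) world-to-camera rotations; t_L, t_R: (3,) translations."""
    c_L = -R_L.T @ t_L                      # camera centres in world coordinates
    c_R = -R_R.T @ t_R
    x = c_R - c_L
    x /= np.linalg.norm(x)                  # new x-axis: unit baseline direction
    z_mean = 0.5 * (R_L[2] + R_R[2])        # average of the old optical-axis rows
    y = np.cross(z_mean, x)
    y /= np.linalg.norm(y)                  # new y-axis, orthogonal to x and z_mean
    z = np.cross(x, y)                      # new z-axis completes the triad
    return np.stack([x, y, z])              # rows are the rectified camera axes
```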
after the phase linear interpolation is carried out on the first telecentric camera 1 and the second telecentric camera 2, the phase of any point shot and calculated by the first telecentric camera 1
Figure BDA0002074376910000091
The same row of traversal phase values as the phase calculated by the second telecentric camera 2 is taken to find an AND
Figure BDA0002074376910000092
Point of closest phase value
Figure BDA0002074376910000093
Figure BDA0002074376910000094
U 'of type'RExpressed as the sub-pixel value, u ', of the first telecentric camera 1 phase found on the second telecentric camera 2 phase'R IExpressed as the integer pixel value of the matching point found by the phase of the first telecentric camera 1 on the same abscissa line of the phase of the second telecentric camera 2,
Figure BDA0002074376910000095
is expressed as the phase of the first telecentric camera 1 is in (u'L,v′L) The phase value of (a) is determined,
Figure BDA0002074376910000096
expressed as the first telecentric camera 1 phase finding the closest phase value on the same abscissa row phase as the second telecentric camera 2.
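A sketch of this row-wise search with sub-pixel refinement (Python/NumPy; illustrative names, with nan pixels from the saturation step skipped):

```python
import numpy as np

def match_subpixel(phi_L, phi_R, uL, vL):
    """For the rectified left-phase pixel (uL, vL), find the sub-pixel column
    on row vL of the rectified right phase with the closest phase value."""
    target = phi_L[vL, uL]
    row = phi_R[vL]
    if np.isnan(target) or np.all(np.isnan(row)):
        return np.nan                                  # nothing to match
    uI = int(np.nanargmin(np.abs(row - target)))       # integer matching column
    if uI + 1 < row.size and np.isfinite(row[uI + 1]) and row[uI + 1] != row[uI]:
        # linear interpolation between the two neighbouring phase samples
        return uI + (target - row[uI]) / (row[uI + 1] - row[uI])
    return float(uI)
```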
Step five: three-dimensional reconstruction is performed with the found matching points to obtain the three-dimensional profile data of the measured object. To complete the reconstruction, the phase-value matching points between the phase of the first telecentric camera 1 and the phase of the second telecentric camera 2 are found, the three-dimensional profile information of the measured object is obtained with the stereo-vision method, and depth constraints and modulation constraints are used to eliminate outlier points so that the result is more accurate. The stereo-vision method satisfies the following formulas:
$$\frac{x}{x_l} = \frac{y}{y_l} = \frac{z}{f}, \qquad z = \frac{b\,f}{d}$$

Integrating the two equations, they can be transformed into:

$$x = \frac{b\,x_l}{d}, \qquad y = \frac{b\,y_l}{d}, \qquad z = \frac{b\,f}{d}$$

In the above formulas, z is the depth coordinate of a point of the measured object in the space coordinate system, x is its horizontal coordinate, y is its vertical coordinate, f is the focal length of the first telecentric camera 1 and the second telecentric camera 2, $(x_l, y_l)$ is the coordinate of any point in the camera coordinate system of the first telecentric camera 1, $(x_r, y_r)$ is the coordinate, in the camera coordinate system of the second telecentric camera 2, of the matching point corresponding to $(x_l, y_l)$, d is the parallax, where after epipolar rectification $x_l = x_r$ so that $d = y_l - y_r$, and b is the distance from the first telecentric camera 1 to the second telecentric camera 2.
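These relations translate directly into a small triangulation routine (a sketch under the rectified-stereo notation above; names are illustrative):

```python
def triangulate(xl, yl, yr, f, b):
    """Rectified stereo triangulation: after epipolar rectification xl == xr,
    so the disparity is d = yl - yr; returns (x, y, z) for the matched pair."""
    d = yl - yr                 # disparity along the rectified direction
    return b * xl / d, b * yl / d, b * f / d
```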
To verify the feasibility of the system and method, two groups of complex measurement scenes were designed. The first complex scene tests the accuracy of the system and method of the invention; the measurement results are shown in fig. 3. The measurement scenes of the micro calibration plate and the ceramic plate are shown in fig. 3 (a) and (b), and the three-dimensional point clouds measured by the method of the invention for all points of the micro calibration plate and for the plane of the ceramic plate are shown in fig. 3 (c) and (d). The average distance between all pairs of adjacent points of the micro calibration plate is 0.6505 mm, as shown in fig. 3 (e), while the true distance between two adjacent points of the micro calibration plate is 0.6500 mm. The measured ceramic plate was plane-fitted, the fitted plane was subtracted from the measurement, and averaging the squared differences over all points of the ceramic plate gives an accuracy measurement of 0.0021 mm, as shown in fig. 3 (f). From this measurement scene, the accuracy of the system and method is 0.0021 mm.
The second complex scene is the measurement of a watch buckle under intensity saturation, shown in fig. 4 (a). Fig. 4 (b) shows the saturated-buckle result measured by the single-period three-step phase-shift method, fig. 4 (c) the result measured by four groups of twelve-step phase-shift profilometry with different wavelengths, and fig. 4 (d) the result of the highlight buckle measured by the method of the invention. Judged by the measured results, the method of the invention suppresses saturated intensity best.

Claims (5)

1. A measuring method of a high-resolution high-precision measuring system based on double telecentric camera matching is characterized by comprising the following steps:
finely adjusting the spatial positions and angles of a first telecentric camera (1) and a second telecentric camera (2) in a measurement system to complete position calibration between the first telecentric camera (1) and the second telecentric camera (2) and height calibration of a measurement spatial range;
step two, calculating the wrapping phase of the measured object by using a phase-shift profilometry, calculating the saturated number of each pixel stripe of the phase shift, replacing the calculated result of the phase-shift profilometry with a generalized phase-shift method for pixels exceeding a specific number, and then sequentially unfolding the high-frequency wrapping phase by using the low-frequency wrapping phase to obtain the unfolded phase of the measured object;
thirdly, distortion correction is carried out on the unfolding phases obtained by the first telecentric camera (1) and the second telecentric camera (2);
step four, using three-dimensional phase matching to find a point after the phase correction of the first telecentric camera (1) corresponds to a matching point after the phase correction of the second telecentric camera (2);
step five, performing three-dimensional reconstruction by using the found matching points to obtain three-dimensional profile data of the measured object;
in the second step: the projector (3) projects phase-shift fringes onto the measured object, four groups of twelve-step phase-shift fringes with different wavelengths being fully sufficient for measurement at intensity-saturated positions; the number of saturated samples in the twelve-step phase shift of each wavelength group's fringe pattern is calculated; for pixels at unsaturated positions the wrapped phase of the measured object is calculated by phase-shift profilometry; for the twelve values of the same pixel in the twelve-step phase-shift patterns, intensity saturation is counted, the wrapped phase being calculated by the phase-shift method where the saturation count is no greater than 3, by the generalized phase-shift method where the count is greater than 3 and no greater than 9, and not calculated but set to nan where the count is greater than 9; the high-frequency wrapped phases are then unwrapped in sequence using the low-frequency wrapped phases, obtaining the unwrapped phase of the measured object; the four groups of phase-shift fringe images with different wavelengths acquired by the first telecentric camera (1) and the second telecentric camera (2) can be represented as:
$$I_n^m(u,v) = A^C(u,v) + B^C(u,v)\cos\left[\Phi^m(u,v) - \delta_n\right]$$

where m indexes the four groups of twelve-step phase-shift fringes with different wavelengths, n indexes the phase-shift fringe patterns within the twelve-step phase shift, $I_n^m(u,v)$ is the nth phase-shift fringe pattern of the mth wavelength group captured by a telecentric camera, $A^C$ is the average intensity of the phase-shift fringes obtained by a telecentric camera, $B^C$ is the modulation intensity of the phase-shift fringes captured by a telecentric camera, and $\Phi^m(u,v)$ is the absolute phase of the mth group of wavelength phase-shift fringe patterns;
to judge the number of intensity-saturated samples among the twelve values of the same pixel in the twelve-step phase-shift patterns, the saturation count at pixel (u,v) of the mth group of wavelength phase-shift fringe patterns, $\mathrm{sat\_map}_m(u,v)$, is first initialized to 0; the twelve phase-shift fringe patterns of the mth group are traversed, and whenever the condition

$$I_n^m(u,v) \geq \mathrm{sat\_thr}$$

is met:

$$\mathrm{sat\_map}_m(u,v) = \mathrm{sat\_map}_m(u,v) + 1$$

where $\mathrm{sat\_map}_m(u,v)$ is the number of intensity-saturated samples at pixel (u,v) of the twelve-step phase shift of the mth group of wavelength phase-shift fringe patterns, $\mathrm{sat\_thr}$ is the maximum value of the usable intensity range, $I_n^m(u,v)$ is the intensity value at pixel (u,v) of the nth phase-shift fringe pattern of the mth group, and $\phi^m(u,v)$ denotes the wrapped phase at pixel (u,v) of the twelve-step phase shift of the mth group of wavelength phase-shift fringe patterns;
then, according to which condition $\mathrm{sat\_map}_m(u,v)$ meets, the wrapped phase of the mth group of wavelength phase-shift fringe patterns of the measured object is calculated:

$$\phi^m(u,v) = \begin{cases} \phi^m_{PSP}(u,v), & \mathrm{sat\_map}_m(u,v) \leq 3 \\ \phi^m_{GPS}(u,v), & 3 < \mathrm{sat\_map}_m(u,v) \leq 9 \\ \mathrm{nan}, & \mathrm{sat\_map}_m(u,v) > 9 \end{cases}$$

where $\phi^m_{PSP}(u,v)$ is the wrapped phase at pixel (u,v) of the mth group of wavelength phase-shift fringe patterns solved with phase-shift profilometry, $\phi^m_{GPS}(u,v)$ is the wrapped phase at the same pixel calculated with the generalized phase-shift method, and nan (not a number) indicates that the pixel is not counted in the final phase result;
phase-shift profilometry solves the wrapped phase of an N-step phase shift as follows:

$$\phi^m(u,v) = \arctan\frac{\sum_{n=1}^{N} I_n^m(u,v)\sin\delta_n}{\sum_{n=1}^{N} I_n^m(u,v)\cos\delta_n}$$

where $\delta_n = (n-1)\cdot 2\pi/N$;
the generalized phase-shift calculation of the wrapped phase can be expressed as a least-squares fit of the unsaturated intensity samples at each pixel to $I_n = a + b\cos\delta_n + c\sin\delta_n$:

$$A = \begin{bmatrix} N' & \sum_n \cos\delta_n & \sum_n \sin\delta_n \\ \sum_n \cos\delta_n & \sum_n \cos^2\delta_n & \sum_n \cos\delta_n\sin\delta_n \\ \sum_n \sin\delta_n & \sum_n \cos\delta_n\sin\delta_n & \sum_n \sin^2\delta_n \end{bmatrix}, \qquad C = A^{-1}$$

$$\begin{bmatrix} a \\ b \\ c \end{bmatrix} = C \begin{bmatrix} \sum_n I_n^m(u,v) \\ \sum_n I_n^m(u,v)\cos\delta_n \\ \sum_n I_n^m(u,v)\sin\delta_n \end{bmatrix}, \qquad \phi^m_{GPS}(u,v) = \arctan\frac{c}{b}$$

where the sums run over the N' unsaturated samples at the pixel, the matrix C is the inverse of the matrix A, and $C_{11}$, $C_{12}$, $C_{13}$, $C_{21}$, $C_{22}$, $C_{23}$, $C_{31}$, $C_{32}$, $C_{33}$ are the values at the corresponding positions in matrix C.
2. The method of claim 1, wherein in step one: after the first telecentric camera (1) and the second telecentric camera (2) are calibrated at three different positions, the height of the measurement space of the measured object is calibrated: after the calibration pattern at the reference-surface position is shot, the precision lifting displacement table is finely raised by 1 mm, the micro calibration plate at that position is shot and calibrated, and the computer computes the two calibration images to complete the calibration of the spatial range.
3. The method according to claim 1, characterized in that in step three: a lens commonly has two kinds of distortion, radial distortion and tangential distortion; since the influence of tangential distortion on the image formed by the lens is far smaller than that of radial distortion, the influence of tangential distortion on the measurement system is ignored and radial distortion correction alone is performed on the formed image; radial distortion is the positional deviation of image pixels along the radial direction about the distortion centre, which deforms the formed image; the radial distortion correction formula is as follows:
$$\begin{bmatrix} u_d \\ v_d \end{bmatrix} = \lambda \begin{bmatrix} u_u - u_0 \\ v_v - v_0 \end{bmatrix} + \begin{bmatrix} u_0 \\ v_0 \end{bmatrix}$$

$$\lambda = 1 + k_1 r^2 + k_2 r^4 + k_3 r^6$$

where $(u_d, v_d)$ is the coordinate of the point after radial distortion correction in the coordinates of the images shot by the first telecentric camera (1) and the second telecentric camera (2), $(u_u, v_v)$ is the coordinate of any point before radial distortion correction in the coordinates of the image shot by the telecentric camera, $(u_0, v_0)$ is the coordinate of the centre point before radial distortion correction, $\lambda$ is the distortion coefficient of the radial distortion, $k_1, k_2, k_3$ are the distortion coefficients of each order, and r is the distance from the point $(u_u, v_v)$ to $(u_0, v_0)$, which can be expressed as:

$$r^2 = (u_u - u_0)^2 + (v_v - v_0)^2$$
4. The method according to claim 1, characterized in that in step four: the camera mapping model based on perspective transformation, the telecentric camera mapping model after epipolar rectification, and the formula that uses the relation between the two models to find, for a point on the image shot by the first telecentric camera (1), the matching point on the image shot by the second telecentric camera (2), are respectively as follows:
$$p_L = A_L \begin{bmatrix} R_{L2\times3} & t_{L2\times1} \\ 0_{1\times3} & 1 \end{bmatrix} P$$

the above formula is the camera mapping model based on perspective transformation, where L as a subscript denotes the first telecentric camera (1) and R as a subscript denotes the second telecentric camera (2); $p_L$ is the pixel coordinate of the image point shot by the first telecentric camera (1), P is the world coordinate of the point in space, $A_L$ is the calibrated intrinsic parameter of the first telecentric camera (1), the bracketed matrix is its calibrated extrinsic parameter, $R_{L2\times3}$ is the rotation matrix of the extrinsic parameters of the first telecentric camera (1), and $t_{L2\times1}$ is the translation matrix of its extrinsic parameters;
$$p'_L = A'_L \begin{bmatrix} R'_{L2\times3} & t'_{L2\times1} \\ 0_{1\times3} & 1 \end{bmatrix} P$$

the above formula is the mapping model of the first telecentric camera (1) and the second telecentric camera (2) after epipolar rectification, where $p'_L$ is the pixel coordinate of the image point obtained after the epipolar rectification of the first telecentric camera (1), $A'_L$ is the intrinsic parameter calibrated after the epipolar rectification of the first telecentric camera (1), the bracketed matrix is the extrinsic parameter after its epipolar rectification, $R'_{L2\times3}$ is the rotation matrix of the calibrated extrinsic parameters after epipolar rectification, and $t'_{L2\times1}$ is the translation matrix of the calibrated extrinsic parameters after the epipolar rectification of the first telecentric camera (1):
according to the characteristic that the magnification of an image does not change within a certain object-distance range of a telecentric lens and the characteristic of three-coordinate orthogonality, the relation between the rectified mapping models of the first telecentric camera (1) and the second telecentric camera (2) and the perspective-transformation camera mapping models can be obtained:

$$A'_L = A'_R = (A_L + A_R)/2$$

$$r'_{xL} = \frac{\tau_L}{\|\tau_L\|}, \qquad r'_{yL} = \frac{r_{zL}\times r'_{xL}}{\|r_{zL}\times r'_{xL}\|}, \qquad r'_{zL} = r'_{xL}\times r'_{yL}$$

$$R'_L = R'_R = \begin{bmatrix} r'_{xL} & r'_{yL} & r'_{zL} \end{bmatrix}^T, \qquad t'_L = \begin{bmatrix} \tau_{xL} \\ \tau_{yL} \end{bmatrix}$$

where $r'_{zL}$ is the component of the rotation matrix along the Z-axis direction after the epipolar rectification of the first telecentric camera (1), $r'_{yL}$ the component along the Y-axis direction, $r'_{xL}$ the component along the X-axis direction, and $\tau_{xL}$, $\tau_{yL}$, $\tau_{zL}$ are the components of the intermediate variable $\tau_L$ of the translation matrix in the extrinsic parameters along the X-, Y- and Z-axis directions after the epipolar rectification of the first telecentric camera (1);
after the epipolar rectification of the first telecentric camera (1) and the second telecentric camera (2), the matching point of the phase of the first telecentric camera (1) on the phase of the second telecentric camera (2) is found: for the phase $\Phi'_L(u'_L, v'_L)$ of any point of the first telecentric camera (1), the phase values in the same traversal row of the second telecentric camera (2) are searched to find the point $\Phi'_R(u'^I_R, v'_L)$ with the closest phase value:

$$u'_R = u'^I_R + \frac{\Phi'_L(u'_L, v'_L) - \Phi'_R(u'^I_R, v'_L)}{\Phi'_R(u'^I_R + 1, v'_L) - \Phi'_R(u'^I_R, v'_L)}$$

where $u'_R$ is the sub-pixel matching value found on the right camera phase by the phase of the first telecentric camera (1) after epipolar rectification, $u'^I_R$ is the position of the integer-pixel matching point found on the right camera phase by the phase of the first telecentric camera (1) after epipolar rectification, $\Phi'_L(u'_L, v'_L)$ is the phase value of the first telecentric camera (1) at $(u'_L, v'_L)$, and $\Phi'_R(u'^I_R, v'_L)$ is the closest phase value found by the phase of the first telecentric camera (1) on the same row of the phase of the second telecentric camera (2) after epipolar rectification.
5. The method according to claim 1, characterized in that in step five: after the phase matching points are found, the three-dimensional contour information is calculated with the stereo-vision method; point pairs beyond the depth constraint and the modulation constraint are directly excluded, removing outlier points so that the result is more accurate; the stereo-vision method satisfies the following formulas:
$$\frac{x}{x_l} = \frac{y}{y_l} = \frac{z}{f}, \qquad z = \frac{b\,f}{d}$$

integrating the two equations, they can be transformed into:

$$x = \frac{b\,x_l}{d}, \qquad y = \frac{b\,y_l}{d}, \qquad z = \frac{b\,f}{d}$$

in the above formulas, z is the depth coordinate of a point of the measured object in the space coordinate system, x is its horizontal coordinate, y is its vertical coordinate, f is the focal length of the first telecentric camera (1) and the second telecentric camera (2), $(x_l, y_l)$ is the coordinate of any point in the camera coordinate system of the first telecentric camera (1), $(x_r, y_r)$ is the coordinate, in the camera coordinate system of the second telecentric camera (2), of the matching point corresponding to $(x_l, y_l)$, d is the parallax, where after epipolar rectification $x_l = x_r$ so that $d = y_l - y_r$, and b is the distance from the first telecentric camera (1) to the second telecentric camera (2).
CN201910448519.9A, filed 2019-05-28 (priority 2019-05-28): High-resolution high-precision measurement system and method based on double telecentric camera matching. Active. Granted as CN110207614B.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910448519.9A CN110207614B (en) 2019-05-28 2019-05-28 High-resolution high-precision measurement system and method based on double telecentric camera matching

Publications (2)

Publication Number Publication Date
CN110207614A 2019-09-06
CN110207614B 2020-12-04

Family

ID=67788981


Country Status (1)

CN (1) CN110207614B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110849266B (en) * 2019-11-28 2021-05-25 江西瑞普德测量设备有限公司 Telecentric lens telecentricity debugging method of image measuring instrument
CN111473745A (en) * 2020-06-23 2020-07-31 南京理工大学智能计算成像研究院有限公司 Light-emitting surface microscopic three-dimensional measurement method based on multi-frequency phase shift scheme
CN111899304B (en) * 2020-09-30 2020-12-29 南京理工大学智能计算成像研究院有限公司 Telecentric optical path distortion center positioning method
CN112700504B (en) * 2020-12-30 2024-02-20 南京理工大学智能计算成像研究院有限公司 Parallax measurement method of multi-view telecentric camera
CN112729160B (en) * 2021-01-05 2022-03-25 中国科学院上海微系统与信息技术研究所 Projection calibration method, device and system based on telecentric imaging and storage medium
CN113505626A (en) * 2021-03-15 2021-10-15 南京理工大学 Rapid three-dimensional fingerprint acquisition method and system
CN115289997B (en) * 2022-08-01 2024-02-20 合肥国际应用超导中心 Binocular camera three-dimensional contour scanner and application method thereof
CN114993207B (en) * 2022-08-03 2022-10-25 广东省智能机器人研究院 Three-dimensional reconstruction method based on binocular measurement system
CN115294233B (en) * 2022-10-09 2022-12-13 天津大学 Binocular large-view-field imaging method and system based on deep learning
CN116824069B (en) * 2023-08-31 2023-11-10 四川省产品质量监督检验检测院 Self-adaptive stripe method for detecting saturation point by using high-frequency signal


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7268889B2 (en) * 2004-09-22 2007-09-11 Corning Incorporated Phase-resolved measurement for frequency-shifting interferometry
CN103197500B (en) * 2012-01-05 2015-09-30 上海微电子装备有限公司 A kind of method measuring mirror surface shape compensation effect
CN103453835A (en) * 2013-08-06 2013-12-18 王向阳 Spine white light three-dimensional motion measuring method
CN106595528B (en) * 2016-11-10 2019-03-05 华中科技大学 A kind of micro- binocular stereo vision measurement method of telecentricity based on digital speckle
CN108036740B (en) * 2017-12-05 2020-04-10 南京理工大学 High-precision real-time three-dimensional color measurement system and method based on multiple viewing angles

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100573595C (en) * 2003-06-20 2009-12-23 日本电信电话株式会社 Virtual visual point image generating method and three-dimensional image display method and device
CN106247950A (en) * 2016-08-27 2016-12-21 中国石油大学(华东) Based on the micro-displacement measurement method that broad sense phase-shifted digital is holographic
CN108180868A (en) * 2017-12-29 2018-06-19 南京理工大学 A kind of real-time three-dimensional micro imaging system based on fringe projection

Also Published As

Publication number Publication date
CN110207614A (en) 2019-09-06


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant