CN111750806A - Multi-view three-dimensional measurement system and method - Google Patents

Publication number: CN111750806A
Authority: CN (China)
Legal status: Granted
Application number: CN202010699165.8A
Other languages: Chinese (zh)
Other versions: CN111750806B (en)
Inventors: 杨树明 (Yang Shuming), 郑逢鹤 (Zheng Fenghe), 胡鹏宇 (Hu Pengyu), 刘勇 (Liu Yong)
Current Assignee: Xi'an Jiaotong University
Original Assignee: Xi'an Jiaotong University
Application filed by Xi'an Jiaotong University
Priority: CN202010699165.8A
Publication of CN111750806A; application granted; publication of CN111750806B
Legal status: Active

Classifications

    • G01B 11/254: Projection of a pattern, viewing through a pattern, e.g. moiré (measuring contours or curvatures by optical pattern projection)
    • G01B 11/002: Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B 11/2504: Calibration devices (pattern-projection contour measurement)
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85: Stereo camera calibration
    • G06T 7/90: Determination of colour characteristics
    • G06T 2207/10028: Range image; depth image; 3D point clouds

Abstract

The invention discloses a multi-view three-dimensional measurement system and method. The system comprises a projector, peripheral cameras, a beam-splitting plate and a color camera: the peripheral cameras use a tilted-image-plane (Scheimpflug) structure to meet the depth-of-field requirement, the beam-splitting plate enables imaging by the color camera, and the color camera recovers the object's color information. The calibration and measurement method comprises: establishing a tilted-image-plane imaging model for the peripheral cameras, photographing a calibration plate with the cameras, extracting feature-point coordinates, and obtaining the cameras' intrinsic and extrinsic parameters with Zhang Zhengyou's calibration method; calibrating the projector's intrinsic and extrinsic parameters on the pseudo-camera principle; and calibrating the relation between each camera coordinate system and the projector coordinate system by the binocular stereo calibration principle to complete system calibration. Multi-view point cloud fusion is realized by establishing the world coordinate system on the projector coordinate system. The system offers high measurement accuracy and speed, is simple to operate, and can perform three-dimensional measurement of highly reflective objects.

Description

Multi-view three-dimensional measurement system and method
Technical Field
The invention belongs to the field of structured light three-dimensional measurement, and particularly relates to a multi-view three-dimensional measurement system and method.
Background
With the progress of modern precision measurement technology, optical measurement has developed rapidly. Structured-light three-dimensional measurement is widely applied in product design, quality inspection, reverse engineering, biomedicine and other fields owing to its high accuracy, high speed and non-contact nature. Grating-projection structured light in particular has been widely studied because it combines high measurement accuracy and efficiency with a simple implementation.
At present, three-dimensional measurement of objects with a high dynamic range of reflectivity is achieved by spraying a reflection-suppressing powder; the process is cumbersome, the accuracy is low, and the applicability to small objects such as PCB boards is greatly limited.
Multi-view three-dimensional measurement photographs the object from different views with several cameras, selects from each image the parts best suited for three-dimensional reconstruction, recovers the object's three-dimensional topography, and can thus conveniently realize three-dimensional measurement of objects with a high dynamic range of reflectivity.
Disclosure of Invention
The invention aims to solve the problems of low accuracy and complex operation in the three-dimensional measurement of highly reflective objects, and provides a multi-view three-dimensional measurement system and method for inspecting reflective objects that are simple to operate and fast.
To achieve this purpose, the invention adopts the following technical scheme:
A multi-view three-dimensional measurement system comprises a projector, a projector lens, a beam-splitting plate, a color camera, a color camera lens, peripheral cameras and peripheral camera lenses; wherein:
the optical axes of the peripheral cameras form included angles of 20-40 degrees with the optical axis of the projector, the plane of the beam-splitting plate is arranged at 45 degrees to the projector's optical axis, and the optical axis of the color camera and its lens is arranged at 90 degrees to the projector's optical axis;
during detection, the projector's optical axis is perpendicular to the platform carrying the measured object. The projected grating image passes through the projector lens and the beam-splitting plate below it and illuminates the measured object on the platform, producing a deformed grating pattern modulated by the height of the object's surface. The peripheral cameras with their lenses capture this deformed grating pattern, and the captured deformed fringes are analyzed to obtain the corresponding phase values of the object's surface; finally, the three-dimensional coordinates of the surface points are recovered from the previously calibrated intrinsic and extrinsic parameters of the peripheral cameras and the projector. Through the beam-splitting plate, the color camera and its lens capture an image of the measured object without projector illumination from directly above, yielding the color information of the object's surface; the image pixels are matched with the three-dimensional point cloud according to the calibration information, and the point cloud is colored.
In a further improvement of the invention, the peripheral cameras and their lenses adopt a Scheimpflug tilted-image-plane structure to overcome the overly small depth of field under small-field-of-view, high-magnification conditions; a wedge is arranged between each peripheral camera and its lens.
In a further improvement of the invention, the projector forms a monocular vision system with each peripheral camera to obtain the three-dimensional point cloud of that viewing angle; finally the point clouds of the four viewing angles are fused into a complete three-dimensional point cloud.
An integral calibration method of a multi-view three-dimensional measurement system comprises the following steps:
1) calibrating the peripheral cameras by combining Zhang Zhengyou's calibration method with the Scheimpflug principle;
2) projecting fringe images with the projector and solving the accurate phase values of the calibration points; using the phase-to-projector-pixel correspondence to give the projector the ability to "photograph" the object, and calibrating the projector with Zhang Zhengyou's calibration method;
3) calibrating the positional relation between the projector and each peripheral camera by the binocular stereo calibration method, and completing the system calibration by taking the projector coordinate system as the unified world coordinate system.
In a further improvement of the invention, in step 1), the mapping relationship between the pixel coordinate system of a peripheral camera's image and the target coordinate system is established:

$$\rho\begin{bmatrix}u\\v\\1\end{bmatrix}=\begin{bmatrix}1/du&0&u_0\\0&1/dv&v_0\\0&0&1\end{bmatrix}H\begin{bmatrix}f&0&0&0\\0&f&0&0\\0&0&1&0\end{bmatrix}\begin{bmatrix}R&T\\0^{T}&1\end{bmatrix}\begin{bmatrix}X_B\\Y_B\\Z_B\\1\end{bmatrix}$$

where ρ is a scale factor, (u, v) are the actual image pixel coordinates of the point, du and dv are the pixel sizes of the actual image plane, (u_0, v_0) are the pixel coordinates of the principal point O, f is the effective focal length of the peripheral camera, H is the homography matrix converting the virtual image physical coordinates (x, y) of the point into the actual image physical coordinates (x', y'), (X_B, Y_B, Z_B) are the coordinates of the point in the target coordinate system, R and T form the transformation from the target coordinate system to the peripheral camera coordinate system, R being a 3 × 3 orthogonal rotation matrix, T a 3 × 1 translation vector, and 0^T a 1 × 3 zero vector.
In a further improvement of the invention, the relationship between the homography matrix H of the above mapping and the rotation angles τ_x, τ_y is established:

$$H=\begin{bmatrix}f\,q_{22}&-f\,q_{21}&0\\-f\,q_{12}&f\,q_{11}&0\\q_{13}q_{22}-q_{12}q_{23}&q_{11}q_{23}-q_{13}q_{21}&f\,(q_{11}q_{22}-q_{12}q_{21})\end{bmatrix}$$

where the q_{ij} are the entries of

$$Q=\begin{bmatrix}\cos\tau_y&0&\sin\tau_y\\0&1&0\\-\sin\tau_y&0&\cos\tau_y\end{bmatrix}\begin{bmatrix}1&0&0\\0&\cos\tau_x&-\sin\tau_x\\0&\sin\tau_x&\cos\tau_x\end{bmatrix}$$

the rotation matrix from the coordinate system O-xyz to the coordinate system O-x'y'z'; the coordinate system O-xyz is generated from the virtual image physical coordinate system O-xy by the right-hand rule, and the coordinate system O-x'y'z' is generated from the actual image physical coordinate system O-x'y' by the right-hand rule.
In a further development of the invention, in step 2), for each calibration point the phase values $\varphi_n$ (n = 1, 2, 3, 4) obtained from the four viewing angles are averaged to obtain a phase value of higher accuracy:

$$\bar{\varphi}=\frac{1}{4}\sum_{n=1}^{4}\varphi_n$$
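Numerically, the averaging step above is a plain mean over the four per-view phase values. A tiny sketch with hypothetical phase readings (radians):

```python
import numpy as np

# Unwrapped phase of one calibration point as seen from the four
# peripheral-camera viewing angles (hypothetical values, radians)
phi = np.array([12.561, 12.574, 12.569, 12.560])

# Averaging over the n = 1..4 views yields a lower-noise phase estimate
phi_bar = phi.mean()
print(round(phi_bar, 4))
```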
In a further improvement of the invention, in step 3), the projector coordinate system is taken as the unified world coordinate system and the conversion between each peripheral camera coordinate system and the projector coordinate system is established:

$$\begin{bmatrix}X_{cn}\\Y_{cn}\\Z_{cn}\\1\end{bmatrix}=\begin{bmatrix}R_n&T_n\\0^{T}&1\end{bmatrix}\begin{bmatrix}X_p\\Y_p\\Z_p\\1\end{bmatrix}$$

where (X_p, Y_p, Z_p) are the coordinates of a point in the projector coordinate system, (X_{cn}, Y_{cn}, Z_{cn}) its coordinates in the n-th peripheral camera coordinate system (n = 1, 2, 3, 4), R_n and T_n form the transformation from the projector coordinate system to the n-th peripheral camera coordinate system, R_n being a 3 × 3 orthogonal rotation matrix, T_n a 3 × 1 translation vector, and 0^T a 1 × 3 zero vector.
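The projector-to-camera transformation is a rigid motion per camera. A small NumPy sketch (with made-up R_n, T_n, not the patent's calibration values) showing the round trip between the projector frame and one peripheral-camera frame:

```python
import numpy as np

def to_camera(points_p, R_n, T_n):
    """Map Nx3 points from the projector frame into the n-th camera frame."""
    return points_p @ R_n.T + T_n

def to_projector(points_c, R_n, T_n):
    """Inverse map: camera frame back into the unified projector frame."""
    return (points_c - T_n) @ R_n

# Hypothetical extrinsics: 30-degree rotation about Y plus a translation
a = np.deg2rad(30.0)
R_n = np.array([[np.cos(a), 0, np.sin(a)],
                [0, 1, 0],
                [-np.sin(a), 0, np.cos(a)]])
T_n = np.array([10.0, 0.0, 250.0])

pts_p = np.array([[1.0, 2.0, 500.0]])   # a point in the projector frame
pts_c = to_camera(pts_p, R_n, T_n)
back = to_projector(pts_c, R_n, T_n)
print(np.allclose(back, pts_p))          # round trip recovers the point
```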
A three-dimensional reconstruction method of the multi-view three-dimensional measurement system comprises the following steps:
1) placing the measured object in the common field of view of the peripheral cameras and the projector; the projector projects grating fringe patterns while the peripheral cameras shoot synchronously;
2) performing phase calculation on the captured grating fringe patterns to obtain the phase distribution over the measured object, which in turn gives the projector pixel coordinate corresponding to each point;
3) three-dimensionally reconstructing each pixel of the images captured by the peripheral cameras using the single peripheral-camera-projector constraint or the multiple peripheral-cameras-projector constraint, obtaining point cloud data;
by step 3) of the calibration method, the point clouds obtained from different viewing angles already lie in the same coordinate system, namely the projector coordinate system; to achieve higher three-dimensional reconstruction accuracy, the multiple peripheral-cameras-projector constraint is used for the parts of the field of view where several peripheral cameras overlap.
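The phase calculation in step 2) rests on four-step phase shifting with three-frequency heterodyne unwrapping (described later in step 6 of the calibration). A minimal sketch of the four-step wrapped-phase formula, checked on synthetic fringes (all values hypothetical):

```python
import numpy as np

def wrapped_phase(I1, I2, I3, I4):
    """Four-step phase shift with shifts 0, pi/2, pi, 3pi/2."""
    return np.arctan2(I4 - I2, I1 - I3)

# Synthesize the four fringe intensities for a known phase to check the formula
phi_true = 0.7            # radians, hypothetical surface phase
A, B = 128.0, 100.0       # background intensity and fringe modulation
shifts = [0, np.pi / 2, np.pi, 3 * np.pi / 2]
I1, I2, I3, I4 = [A + B * np.cos(phi_true + d) for d in shifts]

phi = wrapped_phase(I1, I2, I3, I4)
print(np.isclose(phi, phi_true))
```

Since I4 - I2 = 2B·sin(φ) and I1 - I3 = 2B·cos(φ), the arctangent recovers the wrapped phase regardless of A and B.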
In a further improvement of the invention, the projector coordinate system is used as the unified world coordinate system, as follows:

$$\rho\begin{bmatrix}u\\v\\1\end{bmatrix}=\begin{bmatrix}1/du&0&u_0\\0&1/dv&v_0\\0&0&1\end{bmatrix}H\begin{bmatrix}f&0&0&0\\0&f&0&0\\0&0&1&0\end{bmatrix}\begin{bmatrix}R_n&T_n\\0^{T}&1\end{bmatrix}\begin{bmatrix}X_p\\Y_p\\Z_p\\1\end{bmatrix}\qquad(17)$$

In the above formula, ρ is a scale factor, (u, v) are the actual image pixel coordinates of the point, du and dv are the pixel sizes of the actual image plane, (u_0, v_0) are the pixel coordinates of the principal point O of the peripheral camera's image plane, f is the effective focal length of the peripheral camera, H is the homography matrix converting the virtual image physical coordinates (x, y) of the point into the actual image physical coordinates (x', y'), (X_p, Y_p, Z_p) are the coordinates of the point in the projector coordinate system, R_n and T_n form the transformation from the projector coordinate system to the n-th peripheral camera coordinate system, R_n being a 3 × 3 orthogonal rotation matrix, T_n a 3 × 1 translation vector, and 0^T a 1 × 3 zero vector, with n = 1, 2, 3, 4; the intrinsic parameters of the peripheral camera and (u, v) differ with the camera index n.

$$\rho_p\begin{bmatrix}u_p\\v_p\\1\end{bmatrix}=\begin{bmatrix}1/du_p&0&u_{0p}\\0&1/dv_p&v_{0p}\\0&0&1\end{bmatrix}\begin{bmatrix}f_p&0&0&0\\0&f_p&0&0\\0&0&1&0\end{bmatrix}\begin{bmatrix}X_p\\Y_p\\Z_p\\1\end{bmatrix}\qquad(18)$$

In the above formula, ρ_p is a scale factor, (u_p, v_p) are the projector image pixel coordinates of the point, du_p and dv_p are the element sizes of the projector's DMD chip, (u_{0p}, v_{0p}) are the pixel coordinates of the projector's optical principal point O, f_p is the effective focal length of the pseudo camera, and (X_p, Y_p, Z_p) are the coordinates of the point in the projector coordinate system.
Solving formulas (17) and (18) simultaneously yields the target point's coordinates in the world coordinate system; for the parts where the fields of view of several peripheral cameras overlap, the instances of formulas (17) and (18) corresponding to all those cameras are solved simultaneously.
The invention has at least the following beneficial technical effects:
the invention provides a multi-view three-dimensional measurement system which comprises a projector, a projector lens, a beam splitting plate, a color camera lens, four peripheral cameras and four peripheral camera lenses, wherein the projector is arranged on the front side of the projector; wherein, the camera optical axis and the projecting apparatus optical axis of all around have 20-40 degrees contained angles, and the plane of beam splitting board and projecting apparatus optical axis are 45 degrees configuration, and the optical axis of color camera and color camera lens and the optical axis of projecting apparatus are 90 degrees configuration. Every camera and projecting apparatus all around can constitute an independent three-dimensional measurement system, and through four visual angle point cloud screening fusions, can effectively solve the problem that the high reflection of light region can't effectively be rebuild and because the point cloud that shelters from and lead to is incomplete among the traditional three-dimensional measurement. The cameras around adopt the Samm inclined imaging model, so that the problem of low contact ratio between the depth of field range of the cameras around and the platform of the measured object is effectively solved, the measured object in the range can be imaged clearly, and the measurement precision is ensured. The invention can effectively realize the inclined imaging model by arranging the wedge pieces between the peripheral cameras and the peripheral camera lenses, and greatly reduces the equipment cost. The invention does not need to move objects when in measurement, and can conveniently, efficiently and accurately realize the three-dimensional reconstruction of the high-reflection object.
The invention provides an integral calibration method of the multi-view three-dimensional measurement system, comprising peripheral camera calibration, projector calibration and system structure calibration.
For peripheral camera calibration, a virtual image physical coordinate system is established and, using the property that the line joining corresponding points on the actual and virtual image planes must pass through the optical center, a mathematical model of the peripheral camera under oblique imaging is built, realizing an accurate mapping from peripheral-camera pixel coordinates to three-dimensional points in space. For projector calibration, on top of the traditional pseudo-camera method, the phase value of each corner point is averaged over the different viewing angles, giving phase values of higher accuracy. For system structure calibration, binocular stereo calibration determines the positional relation between the projector coordinate system and each peripheral camera coordinate system; taking the projector coordinate system as the unified world coordinate system unifies the point cloud coordinates of the four viewing angles, so that point cloud fusion can be carried out effectively.
In the three-dimensional reconstruction method of the multi-view three-dimensional measurement system provided by the invention, regions within the fields of view of several peripheral cameras are reconstructed using the constraint information of the multiple peripheral cameras and the projector jointly, yielding point clouds of high accuracy and quality.
Drawings
Fig. 1 is a schematic diagram of an overall system structure according to an embodiment.
Fig. 2 is a schematic plan view of a peripheral camera and its lens according to an embodiment.
Fig. 3 is a schematic diagram of the coordinate-system establishment for a peripheral camera according to an embodiment.
FIG. 4 is a diagram illustrating the establishment of a system coordinate system according to an embodiment.
FIG. 5 is a diagram of a system measurement model according to an embodiment.
Detailed Description
To facilitate an understanding of the invention, the invention will now be described more fully with reference to the accompanying drawings. One embodiment of the present invention is shown in the drawings. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
The invention provides a multi-view three-dimensional measurement system, which has a specific implementation method as follows.
Referring to fig. 1, in one embodiment, the entire system includes a projector 10 and a projector lens 15, a beam splitter plate 20, a color camera 30 and a color camera lens 35, and peripheral cameras 40 with peripheral camera lenses 45.
The optical axis of the projector 10 is perpendicular to the object table 50. The projector projects a fringe image whose light passes through the projector lens 15 and the beam splitter plate 20 and illuminates the measured object, generating a deformed grating pattern modulated by the height of the object's surface.
The color camera 30 and the color camera lens 35 collect the color information of the measured object. Their optical axes are arranged at 90° to the optical axis of the projector 10. Through the beam splitter plate 20, the color camera 30 captures an image of the measured object without projector illumination from directly above, obtains the color information of the object's surface, matches the image pixels with the three-dimensional point cloud according to the calibration information, and colors the point cloud.
The plane of the beam splitter plate 20 is disposed at 45° to the optical axis of the projector 10 and assists the color camera 30 in capturing color images of the object.
The peripheral cameras 40 and peripheral camera lenses 45 (four in total; only three are shown in the figure) capture the deformed grating patterns, modulated by the height of the object's surface, from four different directions obliquely above the measured object; the captured deformed fringes are analyzed to obtain the corresponding phase values of the object's surface. Finally, the three-dimensional coordinates of the object's surface points are recovered from the calibrated intrinsic and extrinsic parameters of the cameras and the projector. The optical axes of the peripheral cameras 40 and their lenses 45 are at angles of about 20-40° to the optical axis of the projector 10.
Referring to fig. 2, each peripheral camera 40 and its lens 45 form a Scheimpflug structure; in this embodiment a wedge 41 is disposed between the camera 40 and the lens 45 so that the camera's depth-of-field range (the yellow range in fig. 2) fully overlaps the platform 50 carrying the measured object, effectively solving the problem of too small a depth of field under small-field-of-view, high-magnification conditions.
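As a hedged numerical aside (not from the patent), the Scheimpflug hinge relation tan τ = m·tan θ links the object-plane tilt θ seen by an oblique camera, the imaging magnification m, and the sensor tilt τ that the wedge must provide to keep the platform plane in focus:

```python
import math

def required_sensor_tilt_deg(object_tilt_deg, magnification):
    """Scheimpflug hinge relation: tan(tau) = m * tan(theta)."""
    theta = math.radians(object_tilt_deg)
    return math.degrees(math.atan(magnification * math.tan(theta)))

# A peripheral camera viewing the platform 30 degrees off the projector axis,
# at a hypothetical magnification of 0.5:
tau = required_sensor_tilt_deg(30.0, 0.5)
print(round(tau, 2))
```

The tilt is modest (well under the viewing angle itself), which is why a thin wedge between camera and lens suffices.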
The device is mounted on a precision displacement platform, and the measurement range is extended by displacement in the horizontal direction.
The above embodiments merely illustrate one or more implementations of the invention; their description is specific and detailed but is not to be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, and these fall within the scope of the invention. Therefore, the protection scope of this patent shall be subject to the appended claims.
The invention provides an integral calibration method of the multi-view three-dimensional measurement system, comprising peripheral camera calibration, projector calibration and system structure calibration.
The calibration of the peripheral cameras comprises the following specific steps.
Step 1: to describe the structure of a peripheral camera, the following coordinate systems are established. A pixel coordinate system O_0-uv is established with the top-left pixel of the image plane as origin. An image physical coordinate system O-x'y' is established with the intersection O of the peripheral camera's optical axis and the image plane as origin, the x' and y' axes parallel to the u and v axes respectively. A virtual image physical coordinate system O-xy, parallel to the lens plane, is established with the same intersection point O as origin. A peripheral camera coordinate system O_c-X_cY_cZ_c is established with the camera's optical center O_c as origin, the X_c and Y_c axes parallel to the x and y axes respectively. A target coordinate system O_B-X_BY_BZ_B is established with the top-left corner point of the calibration plate as origin (not shown in the figures). The image physical coordinate system O-x'y' is obtained from the virtual image physical coordinate system O-xy by rotations of τ_x and τ_y about the x and y axes respectively.
Step 2: the mapping relationship between the five coordinate systems in step 1 is determined.
According to the coordinate systems established in step 1, the mapping between the pixel coordinate system and the image physical coordinate system is obtained:

$$\begin{bmatrix}u\\v\\1\end{bmatrix}=\begin{bmatrix}1/du&0&u_0\\0&1/dv&v_0\\0&0&1\end{bmatrix}\begin{bmatrix}x'\\y'\\1\end{bmatrix}$$

In the above formula, (u, v) are the actual image pixel coordinates of the point, du and dv are the pixel sizes of the actual image plane, (u_0, v_0) are the pixel coordinates of the principal point O, and (x', y') are the actual image physical coordinates.
According to the conversion relation between the image physical coordinate system and the virtual image physical coordinate system, the rotation matrix Q from the coordinate system O-xyz to the coordinate system O-x'y'z' is obtained:

$$Q=\begin{bmatrix}\cos\tau_y&0&\sin\tau_y\\0&1&0\\-\sin\tau_y&0&\cos\tau_y\end{bmatrix}\begin{bmatrix}1&0&0\\0&\cos\tau_x&-\sin\tau_x\\0&\sin\tau_x&\cos\tau_x\end{bmatrix}$$

The coordinate system O-xyz is generated from the virtual image physical coordinate system O-xy by the right-hand rule, and the coordinate system O-x'y'z' is generated from the actual image physical coordinate system O-x'y' by the right-hand rule.
The above can be written as:

$$\begin{bmatrix}x'\\y'\\z'\end{bmatrix}=Q\begin{bmatrix}x\\y\\z\end{bmatrix}$$

In the above, the image physical coordinate system O-x'y' is obtained from the virtual image physical coordinate system O-xy by rotations of τ_x and τ_y about the x and y axes respectively.
Using the property that the line joining corresponding points on the actual and virtual image planes must pass through the optical center, the conversion between the actual image physical coordinates and the virtual image physical coordinates is obtained:

$$s\begin{bmatrix}x'\\y'\\1\end{bmatrix}=H\begin{bmatrix}x\\y\\1\end{bmatrix}$$

where:

$$H=\begin{bmatrix}f\,q_{22}&-f\,q_{21}&0\\-f\,q_{12}&f\,q_{11}&0\\q_{13}q_{22}-q_{12}q_{23}&q_{11}q_{23}-q_{13}q_{21}&f\,(q_{11}q_{22}-q_{12}q_{21})\end{bmatrix}$$

with q_{ij} the entries of Q. In the above, f is the effective focal length of the peripheral camera, H is the homography matrix converting the virtual image physical coordinates (x, y) of the point into the actual image physical coordinates (x', y'), and s is a scale factor.
According to the pinhole imaging model of the camera, the conversion between the virtual image physical coordinate system and the camera coordinate system is obtained:

$$\rho_1\begin{bmatrix}x\\y\\1\end{bmatrix}=\begin{bmatrix}f&0&0&0\\0&f&0&0\\0&0&1&0\end{bmatrix}\begin{bmatrix}X_c\\Y_c\\Z_c\\1\end{bmatrix}$$

In the above formula, (x, y) are the virtual image physical coordinates of the point, f is the effective focal length of the peripheral camera, (X_c, Y_c, Z_c) are the coordinates of the point in the peripheral camera coordinate system, and ρ_1 is a scale factor.
According to the Euclidean space transformation, the conversion between the peripheral camera coordinate system and the target coordinate system is obtained:

$$\begin{bmatrix}X_c\\Y_c\\Z_c\\1\end{bmatrix}=\begin{bmatrix}R&T\\0^{T}&1\end{bmatrix}\begin{bmatrix}X_B\\Y_B\\Z_B\\1\end{bmatrix}$$

In the above formula, (X_c, Y_c, Z_c) are the coordinates of the point in the peripheral camera coordinate system, (X_B, Y_B, Z_B) its coordinates in the target coordinate system, R and T form the transformation from the target coordinate system to the peripheral camera coordinate system, R being a 3 × 3 orthogonal rotation matrix, T a 3 × 1 translation vector, and 0^T a 1 × 3 zero vector.
By combining the mapping relations among the coordinate systems, the conversions between the five coordinate systems can be unified between the image pixel coordinate system and the target coordinate system, giving the mapping between them:

$$\rho\begin{bmatrix}u\\v\\1\end{bmatrix}=\begin{bmatrix}1/du&0&u_0\\0&1/dv&v_0\\0&0&1\end{bmatrix}H\begin{bmatrix}f&0&0&0\\0&f&0&0\\0&0&1&0\end{bmatrix}\begin{bmatrix}R&T\\0^{T}&1\end{bmatrix}\begin{bmatrix}X_B\\Y_B\\Z_B\\1\end{bmatrix}$$
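A numerical sketch of the full chain (pixel matrix · H · projection · extrinsics), with hypothetical parameters and zero tilt so the result can be checked against the plain pinhole model (for zero tilt, H is a scaled identity and the homogeneous scale absorbs the extra factor of f):

```python
import numpy as np

# Hypothetical parameters (not the patent's calibration values)
f = 16.0                       # effective focal length, mm
du = dv = 0.005                # pixel size, mm
u0, v0 = 640.0, 480.0          # principal point, px
K_pix = np.array([[1 / du, 0, u0], [0, 1 / dv, v0], [0, 0, 1]])
H = np.diag([f, f, f])         # homography for zero tilt (tau_x = tau_y = 0)
Proj = np.array([[f, 0, 0, 0], [0, f, 0, 0], [0, 0, 1, 0]])
Rt = np.vstack([np.hstack([np.eye(3), np.array([[0.0], [0.0], [400.0]])]),
                [0, 0, 0, 1]])

X_B = np.array([5.0, -3.0, 0.0, 1.0])   # homogeneous point on the target
uv1 = K_pix @ H @ Proj @ Rt @ X_B
u, v = uv1[:2] / uv1[2]
print(round(u, 1), round(v, 1))
```

The same chain with the pinhole model directly: x = f·X/Z = 16·5/400 = 0.2 mm, so u = 0.2/0.005 + 640 = 680, matching the matrix product.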
and step 3: a camera imaging model that accounts for lens distortion. In practical situations, due to factors such as installation and manufacturing errors of the camera, a molding model of the camera does not completely conform to a pinhole imaging model, and a certain deviation exists between an actual imaging position and an ideal imaging position of an object, so that optical distortion is generated. The optical distortion of the camera mainly includes radial distortion, tangential distortion and thin prism distortion. The radial distortion has the largest influence on the measurement accuracy, and the tangential distortion can be described by an inclined image plane model in the invention, so that only the radial distortion is considered. Let (x)u,yu) Is the physical coordinate of the ideal image point on the image plane of the camera (x)d,yd) To account for the physical coordinates of the actual image point of the radial distortion, the following relationship holds between the two:
Figure BDA0002592398830000101
in the above formula, r2=xu 2+yu 2,k1,k2Is the radial distortion coefficient.
Converted to the image pixel coordinate system, this becomes:

$$\begin{cases}u_d=(u_u-u_0)(1+k_1r^2+k_2r^4)+u_0\\v_d=(v_u-v_0)(1+k_1r^2+k_2r^4)+v_0\end{cases}$$

In the above formula, r^2 = x_u^2 + y_u^2 = [(u_u-u_0)\,du]^2 + [(v_u-v_0)\,dv]^2, (u_u, v_u) are the image pixel coordinates under the ideal pinhole imaging model, (u_d, v_d) the image pixel coordinates accounting for radial distortion, and (u_0, v_0) the pixel coordinates of the image plane's principal point.
Step 4: acquiring the calibration images. The system is calibrated by having the peripheral cameras capture images of the calibration plate at different positions within the common field of view. The main procedure is as follows:
(1) the projector projects a patternless image of a certain brightness and several sinusoidal fringe images, which the peripheral cameras shoot synchronously; the images taken by each of the four cameras may be called one group;
(2) the calibration plate is moved to a new position and operation (1) is repeated several times (typically 10-15 times).
For the calibration of the projector, the specific steps are as follows:
and 5: to describe the structure of the projector, the projector may be regarded as a pseudo camera, and the projector is described by a camera model, and first, the following coordinate systems are respectively established. Projector image pixel coordinate system O established by taking upper left corner pixel point of projector DMD chip as original point0p-upvp(ii) a With the intersection O of the optical axis of the projector and the DMD as the origin, xpAxis and ypEstablishing projector image physical coordinate system O-x with axes respectively parallel to u axis and v axispyp(ii) a With the optical center O of the projectorpIs an origin, XpAxis and YpAxes are respectively parallel to xpAxis, ypAxis establishing projector coordinate system Op-XpYpZp(ii) a Target coordinate system O established by taking angular point at upper left corner of calibration plate as originB-XBYBZB(not shown in the figure).
Step 6: calculating the projector image pixel coordinates corresponding to the calibration points. First, the corner coordinates of the first image of each group from step 4 are extracted; then, from the sinusoidal fringe images of the group, the phase is solved using the four-step phase shift and three-frequency heterodyne principle, yielding the phase values of each corner point in the horizontal and vertical directions under the different camera viewing angles:

$$ \varphi_H^{(n)}(u, v), \qquad \varphi_V^{(n)}(u, v), \qquad n = 1, 2, 3, 4. $$

To obtain phase values of higher accuracy, the phase values of each corner point under the four viewing angles are averaged to give the final corner phase values:

$$ \bar{\varphi}_H = \frac{1}{4} \sum_{n=1}^{4} \varphi_H^{(n)}, \qquad \bar{\varphi}_V = \frac{1}{4} \sum_{n=1}^{4} \varphi_V^{(n)}. $$
Let (u_p, v_p) be the projector image pixel coordinates of the point; they are obtained from the projector phase-to-pixel mapping:

$$ u_p = \frac{W_H}{2 \pi N_H}\, \bar{\varphi}_H, \qquad v_p = \frac{W_V}{2 \pi N_V}\, \bar{\varphi}_V $$

In the above formula, N_H and N_V are respectively the numbers of projected horizontal and vertical sinusoidal fringes, and W_H and W_V are respectively the projector resolutions in the horizontal and vertical directions.
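The phase averaging and phase-to-pixel mapping of step 6 reduce to a few lines of arithmetic; a minimal sketch (the helper names are hypothetical, not from the patent):

```python
import numpy as np

def average_corner_phase(phases):
    """Average the unwrapped phase of one corner point over the four
    peripheral-camera viewing angles (n = 1..4) for higher accuracy."""
    return float(np.mean(phases))

def phase_to_projector_pixel(phi_h, phi_v, n_h, n_v, w_h, w_v):
    """Map the averaged absolute phases of a point to projector pixel
    coordinates: u_p = W_H*phi_H/(2*pi*N_H), v_p = W_V*phi_V/(2*pi*N_V).

    n_h, n_v : number of projected fringe periods in each direction
    w_h, w_v : projector resolution in pixels in each direction
    """
    u_p = w_h * phi_h / (2.0 * np.pi * n_h)
    v_p = w_v * phi_v / (2.0 * np.pi * n_v)
    return u_p, v_p
```

This assumes the absolute phase grows linearly from 0 to 2*pi*N across the projector panel, which is how the mapping above is normally read.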
Step 7: determining the mapping relations among the four coordinate systems of step 5.
As in the camera imaging model, the mapping between the projector image pixel coordinate system and the target coordinate system is established:

$$
\rho_p \begin{bmatrix} u_p \\ v_p \\ 1 \end{bmatrix} =
\begin{bmatrix} 1/du_p & 0 & u_{0p} \\ 0 & 1/dv_p & v_{0p} \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} f_p & 0 & 0 & 0 \\ 0 & f_p & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} R_p & T_p \\ 0^T & 1 \end{bmatrix}
\begin{bmatrix} X_B \\ Y_B \\ Z_B \\ 1 \end{bmatrix}
\tag{13}
$$

In the above formula, ρ_p is a scale factor, (u_p, v_p) are the projector image pixel coordinates of a point, du_p and dv_p are the element (micromirror) sizes of the projector DMD chip, (u_0p, v_0p) are the pixel coordinates of the principal point O, f_p is the effective focal length of the pseudo camera, (X_B, Y_B, Z_B) are the coordinates of the point in the target coordinate system, and R_p, T_p form the transformation from the target coordinate system to the projector coordinate system, R_p being a 3 × 3 orthogonal rotation matrix, T_p a 3 × 1 translation vector, and 0^T a 1 × 3 zero vector.
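The pseudo-camera model of equation (13) is an ordinary pinhole projection; a generic sketch (not the patent's code) that applies it to a point in the target frame:

```python
import numpy as np

def project_point(K, R, T, X):
    """Pinhole projection rho * [u, v, 1]^T = K [R | T] [X, 1]^T.

    K : 3x3 intrinsic matrix, e.g. [[f_p/du_p, 0, u_0p], [0, f_p/dv_p, v_0p], [0, 0, 1]]
    R : 3x3 rotation, T : 3-vector translation (target frame -> device frame)
    X : 3-D point in the target coordinate system
    Returns the ideal (distortion-free) image pixel coordinates (u, v).
    """
    Xc = R @ np.asarray(X, float) + np.asarray(T, float).ravel()
    uvw = K @ Xc            # homogeneous image vector; the scale factor rho is uvw[2]
    return uvw[:2] / uvw[2]  # perspective division eliminates rho
```

With R the identity and T zero, a point on the optical axis maps to the principal point, which is a quick sanity check for any intrinsic matrix.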
Step 8: As with the camera, a projector imaging model with lens distortion is considered. Let (x_up, y_up) be the physical coordinates of the ideal image point on the projector image plane and (x_dp, y_dp) the physical coordinates of the actual image point accounting for radial distortion; the following relationship holds between the two:

$$
x_{dp} = x_{up} \left(1 + k_1 r^2 + k_2 r^4\right), \qquad y_{dp} = y_{up} \left(1 + k_1 r^2 + k_2 r^4\right)
$$

In the above formula, r^2 = x_up^2 + y_up^2, and k_1, k_2 are the radial distortion coefficients.
Converted to the image pixel coordinate system, this becomes:

$$
u_{dp} = u_{up} + (u_{up} - u_{0p})\left(k_1 r^2 + k_2 r^4\right), \qquad v_{dp} = v_{up} + (v_{up} - v_{0p})\left(k_1 r^2 + k_2 r^4\right)
$$

In the above formula, r^2 = x_up^2 + y_up^2 = [(u_up - u_0p)dx]^2 + [(v_up - v_0p)dy]^2, (u_up, v_up) are the image pixel coordinates under the ideal pinhole imaging model, (u_dp, v_dp) are the image pixel coordinates accounting for radial distortion, and (u_0p, v_0p) are the pixel coordinates of the principal point of the image plane.
Step 9: calibrating the intrinsic and extrinsic parameters of the peripheral cameras and the projector. Step 6 yields, for each corner point, its pixel coordinates in the peripheral cameras and the corresponding projector image pixel coordinates. From the construction of the target coordinate system, Z_B = 0 for every corner point, and the in-plane position of each corner on the calibration plate supplies its remaining coordinates, so the three-dimensional coordinates of the corner points in the target coordinate system are known. Combining this information with the Zhang Zhengyou calibration method yields the intrinsic parameters of the peripheral cameras and the projector, as well as their extrinsic parameters relative to the target coordinate system.
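Step 9 rests on Zhang Zhengyou's planar-target method, whose per-view building block is the homography from the Z_B = 0 target plane to the image. A minimal DLT estimate of that homography (a hypothetical helper, not the patent's implementation):

```python
import numpy as np

def estimate_homography(plane_pts, image_pts):
    """DLT estimate of the 3x3 homography mapping planar target points
    (X_B, Y_B), with Z_B = 0, to image pixels (u, v) -- the per-view
    building block of Zhang's planar calibration method."""
    A = []
    for (x, y), (u, v) in zip(plane_pts, image_pts):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the right null vector of A (smallest singular value)
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so that H[2, 2] = 1
```

At least four non-collinear correspondences are required; Zhang's method then decomposes the per-view homographies into intrinsic and extrinsic parameters. The same machinery applies to the projector once its "observed" pixel coordinates are recovered from phase as in step 6.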
For system structure calibration, the steps are as follows:
Step 10: calibrating the positional relationship between the projector and the peripheral cameras with the binocular stereo calibration method. Taking the projector coordinate system as the unified world coordinate system, the transformation between every peripheral camera coordinate system and the projector coordinate system is established:

$$
\begin{bmatrix} X_{cn} \\ Y_{cn} \\ Z_{cn} \\ 1 \end{bmatrix} =
\begin{bmatrix} R_n & T_n \\ 0^T & 1 \end{bmatrix}
\begin{bmatrix} X_p \\ Y_p \\ Z_p \\ 1 \end{bmatrix}
\tag{16}
$$

In the formula, (X_p, Y_p, Z_p) are the coordinates of a point in the projector coordinate system, (X_cn, Y_cn, Z_cn) are its coordinates in the n-th peripheral camera coordinate system (n = 1, 2, 3, 4), and R_n, T_n form the transformation from the projector coordinate system to that camera coordinate system, R_n being a 3 × 3 orthogonal rotation matrix, T_n a 3 × 1 translation vector, and 0^T a 1 × 3 zero vector.
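In binocular stereo calibration, the projector-to-camera extrinsics (R_n, T_n) follow from expressing one calibration-board pose in both frames; a sketch under that assumption (the helper name is hypothetical):

```python
import numpy as np

def relative_pose(R_p, T_p, R_c, T_c):
    """Recover the projector->camera extrinsics (R_n, T_n) from one
    calibration-board pose expressed in both frames:
        X_p = R_p X_B + T_p   and   X_c = R_c X_B + T_c
    imply  X_c = (R_c R_p^T) X_p + (T_c - R_c R_p^T T_p).
    """
    R_n = R_c @ R_p.T
    T_n = np.asarray(T_c, float) - R_n @ np.asarray(T_p, float)
    return R_n, T_n
```

In practice the estimate is averaged or jointly refined over all board poses rather than taken from a single view.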
The three-dimensional reconstruction method of the multi-view three-dimensional measurement system comprises the following steps:
Step 11: In three-dimensional reconstruction, the projector coordinate system is taken as the unified world coordinate system and the transformation between each peripheral camera coordinate system and the projector coordinate system is established. Combining equations (8), (13) and (16), the three-dimensional reconstruction measurement model of the system consists of the following two equations:

$$
\rho \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} =
\begin{bmatrix} 1/du & 0 & u_0 \\ 0 & 1/dv & v_0 \\ 0 & 0 & 1 \end{bmatrix}
H
\begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} R_n & T_n \\ 0^T & 1 \end{bmatrix}
\begin{bmatrix} X_p \\ Y_p \\ Z_p \\ 1 \end{bmatrix}
\tag{17}
$$

In the above formula, ρ is a scale factor, (u, v) are the actual image pixel coordinates of the point, du and dv are the pixel sizes of the actual image plane, (u_0, v_0) are the pixel coordinates of the principal point O of the peripheral camera image plane, f is the effective focal length of the peripheral camera, H is the homography matrix converting the virtual image physical coordinates (x, y) of the point into the actual image physical coordinates (x', y'), (X_p, Y_p, Z_p) are the coordinates of the point in the projector coordinate system, and R_n, T_n form the transformation from the projector coordinate system to the n-th peripheral camera coordinate system, R_n being a 3 × 3 orthogonal rotation matrix, T_n a 3 × 1 translation vector, and 0^T a 1 × 3 zero vector (n = 1, 2, 3, 4). It is worth noting that the intrinsic parameters and (u, v) in equation (17) differ from one peripheral camera n to another.

$$
\rho_p \begin{bmatrix} u_p \\ v_p \\ 1 \end{bmatrix} =
\begin{bmatrix} 1/du_p & 0 & u_{0p} \\ 0 & 1/dv_p & v_{0p} \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} f_p & 0 & 0 \\ 0 & f_p & 0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} X_p \\ Y_p \\ Z_p \end{bmatrix}
\tag{18}
$$

In the above formula, ρ_p is a scale factor, (u_p, v_p) are the projector image pixel coordinates of the point, du_p and dv_p are the element (micromirror) sizes of the projector DMD chip, (u_0p, v_0p) are the pixel coordinates of the optical principal point O of the projector, f_p is the effective focal length of the pseudo camera, and (X_p, Y_p, Z_p) are the coordinates of the point in the projector coordinate system.
The coordinates of the target point in the world coordinate system are obtained by solving equations (17) and (18) simultaneously.
For the regions where the fields of view of several peripheral cameras overlap, the equations (17) and (18) of all of those cameras participate in the solution simultaneously.
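Solving equations (17) and (18) simultaneously, and adding the equations of every peripheral camera that sees the point in overlap regions, amounts to a linear least-squares triangulation once the scale factors are eliminated. A generic sketch, assuming undistorted 3 × 4 projection matrices P_i = K_i [R_i | T_i] (not the patent's exact formulation, which also carries the homography H):

```python
import numpy as np

def triangulate(proj_mats, pixels):
    """Linear least-squares triangulation from two or more devices
    (the projector and the peripheral cameras that see the point).

    proj_mats : list of 3x4 matrices P_i = K_i [R_i | T_i]
    pixels    : matching list of observed (u, v) coordinates
    Returns (X, Y, Z) in the unified world (projector) frame.
    """
    A = []
    for P, (u, v) in zip(proj_mats, pixels):
        A.append(u * P[2] - P[0])  # eliminating the scale factor rho
        A.append(v * P[2] - P[1])  # gives two linear equations per device
    # Homogeneous solution: right singular vector of the smallest singular value
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    X = Vt[-1]
    return X[:3] / X[3]
```

Each extra overlapping camera contributes two more rows to the stacked system, which is what makes the multi-camera constraint more accurate than a single camera-projector pair.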
The embodiments of the present invention have been described with reference to the accompanying drawings, but the description is not intended to limit the invention; the scope of the invention is defined by the appended claims, and any modification within the scope of the claims falls within the scope of the invention.

Claims (10)

1. A multi-view three-dimensional measurement system, characterized by comprising a projector (10), a projector lens (15), a beam splitting plate (20), a color camera (30), a color camera lens (35), peripheral cameras (40) and peripheral camera lenses (45); wherein:
an included angle of 20-40 degrees is formed between the optical axis of each peripheral camera and the optical axis of the projector; the plane of the beam splitting plate (20) is at 45 degrees to the optical axis of the projector, and the optical axes of the color camera (30) and the color camera lens (35) are at 90 degrees to the optical axis of the projector;
during measurement, the optical axis of the projector (10) is perpendicular to the measured-object platform (50); the projected grating image passes through the projector lens (15) and the beam splitting plate (20) below and irradiates the measured object on the platform (50), producing a deformed grating pattern modulated by the surface height of the measured object; the peripheral cameras (40) and peripheral camera lenses (45) collect the deformed grating pattern, and the collected deformed fringes are analyzed to obtain the corresponding phase values of the measured object surface; finally, the three-dimensional data of the surface points of the measured object are recovered from the previously calibrated intrinsic and extrinsic parameters of the peripheral cameras (40) and the projector (10); the color camera (30) and the color camera lens (35) acquire, through the beam splitting plate (20) and from directly above the measured object, an image of the object without projection illumination, obtaining the color information of the object surface; the pixels of this image are matched to the three-dimensional point cloud according to the calibration information, and the point cloud is colored.
2. The multi-view three-dimensional measurement system according to claim 1, wherein the peripheral camera (40) and the peripheral camera lens (45) adopt a Scheimpflug tilted image plane arrangement to overcome the insufficient depth of field that arises with a small field of view and high magnification;
a wedge (41) is arranged between the peripheral camera (40) and the peripheral camera lens (45).
3. The multi-view three-dimensional measurement system according to claim 1, wherein the projector (10) forms a monocular structured-light vision system with each of the peripheral cameras (40) to obtain a three-dimensional point cloud at that viewing angle, and the point clouds of the four viewing angles are finally fused into one complete three-dimensional point cloud.
4. The overall calibration method of the multi-view three-dimensional measurement system is characterized by comprising the following steps of:
1) calibrating the peripheral cameras by combining the Zhang Zhengyou calibration method with the Scheimpflug condition;
2) projecting fringe images with the projector and solving the accurate phase values of the calibration points; using the phase-to-projector-pixel correspondence to endow the projector with the ability to "photograph" the object, and calibrating the projector in combination with the Zhang Zhengyou calibration method;
3) calibrating the positional relationship between the projector and the peripheral cameras in combination with the binocular stereo calibration method, and completing the calibration of the system with the projector coordinate system as the unified world coordinate system.
5. The overall calibration method of the multi-view three-dimensional measurement system according to claim 4, wherein in step 1), the mapping relationship between the peripheral camera image pixel coordinate system and the target coordinate system is established:

$$
\rho \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} =
\begin{bmatrix} 1/du & 0 & u_0 \\ 0 & 1/dv & v_0 \\ 0 & 0 & 1 \end{bmatrix}
H
\begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} R & T \\ 0^T & 1 \end{bmatrix}
\begin{bmatrix} X_B \\ Y_B \\ Z_B \\ 1 \end{bmatrix}
$$

where ρ is a scale factor, (u, v) are the actual image pixel coordinates of a point, du and dv are the pixel sizes of the actual image plane, (u_0, v_0) are the pixel coordinates of the principal point O, f is the effective focal length of the peripheral camera, H is the homography matrix converting the virtual image physical coordinates (x, y) of the point into the actual image physical coordinates (x', y'), (X_B, Y_B, Z_B) are the coordinates of the point in the target coordinate system, and R, T form the transformation from the target coordinate system to the peripheral camera coordinate system, R being a 3 × 3 orthogonal rotation matrix, T a 3 × 1 translation vector, and 0^T a 1 × 3 zero vector.
6. The overall calibration method of the multi-view three-dimensional measurement system according to claim 5, wherein for the mapping relationship between the peripheral camera image pixel coordinate system and the target coordinate system, the relationship between the homography matrix H and the tilt angles τ_x, τ_y is established:

$$
H = \begin{bmatrix} Q_{33} & 0 & -Q_{13} \\ 0 & Q_{33} & -Q_{23} \\ 0 & 0 & 1 \end{bmatrix} Q
$$

wherein:

$$
Q = R_y(\tau_y)\, R_x(\tau_x) =
\begin{bmatrix}
\cos\tau_y & \sin\tau_y \sin\tau_x & -\sin\tau_y \cos\tau_x \\
0 & \cos\tau_x & \sin\tau_x \\
\sin\tau_y & -\cos\tau_y \sin\tau_x & \cos\tau_y \cos\tau_x
\end{bmatrix}
$$

where Q is the rotation matrix from the coordinate system O-xyz to the coordinate system O-x'y'z'; the coordinate system O-xyz is generated from the virtual image physical coordinate system O-xy by the right-hand rule, and the coordinate system O-x'y'z' is generated from the actual image physical coordinate system O-x'y' by the right-hand rule.
7. The method for calibrating a multi-view three-dimensional measurement system as claimed in claim 4, wherein in step 2), for each calibration point the phase values from the four viewing angles, φ_H^(n) and φ_V^(n) with n = 1, 2, 3, 4, are averaged to obtain phase values of higher accuracy:

$$ \bar{\varphi}_H = \frac{1}{4} \sum_{n=1}^{4} \varphi_H^{(n)}, \qquad \bar{\varphi}_V = \frac{1}{4} \sum_{n=1}^{4} \varphi_V^{(n)}. $$
8. The overall calibration method of the multi-view three-dimensional measurement system according to claim 4, wherein in step 3), the projector coordinate system is used as the unified world coordinate system, and the transformation between every peripheral camera coordinate system and the projector coordinate system is established:

$$
\begin{bmatrix} X_{cn} \\ Y_{cn} \\ Z_{cn} \\ 1 \end{bmatrix} =
\begin{bmatrix} R_n & T_n \\ 0^T & 1 \end{bmatrix}
\begin{bmatrix} X_p \\ Y_p \\ Z_p \\ 1 \end{bmatrix}
$$

in the formula, (X_p, Y_p, Z_p) are the coordinates of a point in the projector coordinate system, (X_cn, Y_cn, Z_cn) are its coordinates in the n-th peripheral camera coordinate system (n = 1, 2, 3, 4), and R_n, T_n form the transformation from the projector coordinate system to the peripheral camera coordinate system, R_n being a 3 × 3 orthogonal rotation matrix, T_n a 3 × 1 translation vector, and 0^T a 1 × 3 zero vector.
9. A three-dimensional reconstruction method of a multi-view three-dimensional measurement system is characterized by comprising the following steps:
1) placing the measured object in the common field of view of the peripheral cameras and the projector; the projector projects grating fringe patterns, and the peripheral cameras capture them synchronously;
2) performing phase calculation on the captured grating fringe patterns to obtain the phase distribution over the measured object, and thereby the projector pixel coordinates corresponding to each point;
3) carrying out three-dimensional reconstruction on each pixel in the images shot by the peripheral cameras by utilizing the constraint of a single peripheral camera-projector or the constraint of a plurality of peripheral cameras-projectors to obtain point cloud data;
based on step 3) of the calibration method, the point clouds obtained from the different viewing angles lie in the same coordinate system, namely the projector coordinate system; to achieve higher three-dimensional reconstruction accuracy, the multi-peripheral-camera-projector constraint is adopted for three-dimensional reconstruction of the field-of-view regions where several peripheral cameras overlap.
10. The three-dimensional reconstruction method of the multi-view three-dimensional measurement system according to claim 9, wherein the projector coordinate system is used as the unified world coordinate system, comprising:

$$
\rho \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} =
\begin{bmatrix} 1/du & 0 & u_0 \\ 0 & 1/dv & v_0 \\ 0 & 0 & 1 \end{bmatrix}
H
\begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} R_n & T_n \\ 0^T & 1 \end{bmatrix}
\begin{bmatrix} X_p \\ Y_p \\ Z_p \\ 1 \end{bmatrix}
\tag{17}
$$

in the above formula, ρ is a scale factor, (u, v) are the actual image pixel coordinates of the point, du and dv are the pixel sizes of the actual image plane, (u_0, v_0) are the pixel coordinates of the principal point O of the peripheral camera image plane, f is the effective focal length of the peripheral camera, H is the homography matrix converting the virtual image physical coordinates (x, y) of the point into the actual image physical coordinates (x', y'), (X_p, Y_p, Z_p) are the coordinates of the point in the projector coordinate system, and R_n, T_n form the transformation from the projector coordinate system to the n-th peripheral camera coordinate system, R_n being a 3 × 3 orthogonal rotation matrix, T_n a 3 × 1 translation vector, 0^T a 1 × 3 zero vector, and n = 1, 2, 3, 4; the intrinsic parameters and (u, v) in the above formula differ with the number n of the peripheral camera;

$$
\rho_p \begin{bmatrix} u_p \\ v_p \\ 1 \end{bmatrix} =
\begin{bmatrix} 1/du_p & 0 & u_{0p} \\ 0 & 1/dv_p & v_{0p} \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} f_p & 0 & 0 \\ 0 & f_p & 0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} X_p \\ Y_p \\ Z_p \end{bmatrix}
\tag{18}
$$

in the above formula, ρ_p is a scale factor, (u_p, v_p) are the projector image pixel coordinates of the point, du_p and dv_p are the element (micromirror) sizes of the projector DMD chip, (u_0p, v_0p) are the pixel coordinates of the optical principal point O of the projector, f_p is the effective focal length of the pseudo camera, and (X_p, Y_p, Z_p) are the coordinates of the point in the projector coordinate system;
the coordinates of the target point in the world coordinate system are solved from simultaneous equations (17) and (18); for the regions where the fields of view of several peripheral cameras overlap, the equations (17) and (18) of all of those cameras participate in the solution simultaneously.
CN202010699165.8A 2020-07-20 2020-07-20 Multi-view three-dimensional measurement system and method Active CN111750806B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010699165.8A CN111750806B (en) 2020-07-20 2020-07-20 Multi-view three-dimensional measurement system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010699165.8A CN111750806B (en) 2020-07-20 2020-07-20 Multi-view three-dimensional measurement system and method

Publications (2)

Publication Number Publication Date
CN111750806A true CN111750806A (en) 2020-10-09
CN111750806B CN111750806B (en) 2021-10-08

Family

ID=72711234

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010699165.8A Active CN111750806B (en) 2020-07-20 2020-07-20 Multi-view three-dimensional measurement system and method

Country Status (1)

Country Link
CN (1) CN111750806B (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111227785A (en) * 2020-02-24 2020-06-05 耀视(苏州)医疗科技有限公司 Eyeball laser scanning imaging method
CN112179292A (en) * 2020-11-20 2021-01-05 苏州睿牛机器人技术有限公司 Projector-based line structured light vision sensor calibration method
CN112489109A (en) * 2020-11-19 2021-03-12 广州视源电子科技股份有限公司 Three-dimensional imaging system method and device and three-dimensional imaging system
CN112507755A (en) * 2020-12-22 2021-03-16 芜湖英视迈智能科技有限公司 Target object six-degree-of-freedom positioning method and system for minimizing two-dimensional code corner re-projection error
CN112630469A (en) * 2020-12-07 2021-04-09 清华大学深圳国际研究生院 Three-dimensional detection method based on structured light and multi-light-field camera
CN112648976A (en) * 2020-12-23 2021-04-13 北京恒达时讯科技股份有限公司 Live-action image measuring method and device, electronic equipment and storage medium
CN112767536A (en) * 2021-01-05 2021-05-07 中国科学院上海微系统与信息技术研究所 Three-dimensional reconstruction method, device and equipment of object and storage medium
CN113052898A (en) * 2021-04-08 2021-06-29 四川大学华西医院 Point cloud and strong-reflection target real-time positioning method based on active binocular camera
CN113160339A (en) * 2021-05-19 2021-07-23 中国科学院自动化研究所苏州研究院 Projector calibration method based on Samm's law
CN113143200A (en) * 2021-05-07 2021-07-23 苏州健雄职业技术学院 Laser scanning fundus camera imaging method
CN113222965A (en) * 2021-05-27 2021-08-06 西安交通大学 Three-dimensional observation method of discharge channel
CN113259642A (en) * 2021-05-12 2021-08-13 华强方特(深圳)科技有限公司 Film visual angle adjusting method and system
CN113470117A (en) * 2021-06-28 2021-10-01 上海交通大学 Unit attitude three-dimensional structured light calibration system and method based on spherical reverse perspective projection
CN113532328A (en) * 2021-07-16 2021-10-22 燕山大学 Surface profile real-time measurement system and method in medium plate straightening process
CN113706692A (en) * 2021-08-25 2021-11-26 北京百度网讯科技有限公司 Three-dimensional image reconstruction method, three-dimensional image reconstruction device, electronic device, and storage medium
CN114255286A (en) * 2022-02-28 2022-03-29 常州罗博斯特机器人有限公司 Target size measuring method based on multi-view binocular vision perception

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090086081A1 (en) * 2006-01-24 2009-04-02 Kar-Han Tan Color-Based Feature Identification
CN101504275A (en) * 2009-03-11 2009-08-12 华中科技大学 Hand-hold line laser three-dimensional measuring system based on spacing wireless location
CN102364299A (en) * 2011-08-30 2012-02-29 刘桂华 Calibration technology for multiple structured light projected three-dimensional profile measuring heads
CN104776815A (en) * 2015-03-23 2015-07-15 中国科学院上海光学精密机械研究所 Color three-dimensional profile measuring device and method based on Dammann grating
CN107014312A (en) * 2017-04-25 2017-08-04 西安交通大学 A kind of integral calibrating method of mirror-vibrating line laser structured light three-dimension measuring system
CN107869968A (en) * 2017-12-01 2018-04-03 杭州测度科技有限公司 A kind of quick three-dimensional scan method and system suitable for complex object surface
CN108536142A (en) * 2018-03-18 2018-09-14 上海交通大学 Industrial robot anti-collision early warning system based on digital fringe projection and method
CN110793464A (en) * 2019-10-17 2020-02-14 天津大学 Large-field-of-view fringe projection vision three-dimensional measurement system and method
CN110926371A (en) * 2019-11-19 2020-03-27 宁波舜宇仪器有限公司 Three-dimensional surface detection method and device

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090086081A1 (en) * 2006-01-24 2009-04-02 Kar-Han Tan Color-Based Feature Identification
CN101504275A (en) * 2009-03-11 2009-08-12 华中科技大学 Hand-hold line laser three-dimensional measuring system based on spacing wireless location
CN102364299A (en) * 2011-08-30 2012-02-29 刘桂华 Calibration technology for multiple structured light projected three-dimensional profile measuring heads
CN104776815A (en) * 2015-03-23 2015-07-15 中国科学院上海光学精密机械研究所 Color three-dimensional profile measuring device and method based on Dammann grating
CN107014312A (en) * 2017-04-25 2017-08-04 西安交通大学 A kind of integral calibrating method of mirror-vibrating line laser structured light three-dimension measuring system
CN107869968A (en) * 2017-12-01 2018-04-03 杭州测度科技有限公司 A kind of quick three-dimensional scan method and system suitable for complex object surface
CN108536142A (en) * 2018-03-18 2018-09-14 上海交通大学 Industrial robot anti-collision early warning system based on digital fringe projection and method
CN110793464A (en) * 2019-10-17 2020-02-14 天津大学 Large-field-of-view fringe projection vision three-dimensional measurement system and method
CN110926371A (en) * 2019-11-19 2020-03-27 宁波舜宇仪器有限公司 Three-dimensional surface detection method and device

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111227785A (en) * 2020-02-24 2020-06-05 耀视(苏州)医疗科技有限公司 Eyeball laser scanning imaging method
CN112489109A (en) * 2020-11-19 2021-03-12 广州视源电子科技股份有限公司 Three-dimensional imaging system method and device and three-dimensional imaging system
CN112489109B (en) * 2020-11-19 2022-10-21 广州视源电子科技股份有限公司 Three-dimensional imaging system method and device and three-dimensional imaging system
CN112179292A (en) * 2020-11-20 2021-01-05 苏州睿牛机器人技术有限公司 Projector-based line structured light vision sensor calibration method
CN112630469A (en) * 2020-12-07 2021-04-09 清华大学深圳国际研究生院 Three-dimensional detection method based on structured light and multi-light-field camera
CN112630469B (en) * 2020-12-07 2023-04-25 清华大学深圳国际研究生院 Three-dimensional detection method based on structured light and multiple light field cameras
CN112507755A (en) * 2020-12-22 2021-03-16 芜湖英视迈智能科技有限公司 Target object six-degree-of-freedom positioning method and system for minimizing two-dimensional code corner re-projection error
CN112648976A (en) * 2020-12-23 2021-04-13 北京恒达时讯科技股份有限公司 Live-action image measuring method and device, electronic equipment and storage medium
CN112767536A (en) * 2021-01-05 2021-05-07 中国科学院上海微系统与信息技术研究所 Three-dimensional reconstruction method, device and equipment of object and storage medium
CN114004880A (en) * 2021-04-08 2022-02-01 四川大学华西医院 Point cloud and strong-reflection target real-time positioning method of binocular camera
CN113052898A (en) * 2021-04-08 2021-06-29 四川大学华西医院 Point cloud and strong-reflection target real-time positioning method based on active binocular camera
CN113052898B (en) * 2021-04-08 2022-07-12 四川大学华西医院 Point cloud and strong-reflection target real-time positioning method based on active binocular camera
CN113143200A (en) * 2021-05-07 2021-07-23 苏州健雄职业技术学院 Laser scanning fundus camera imaging method
CN113259642A (en) * 2021-05-12 2021-08-13 华强方特(深圳)科技有限公司 Film visual angle adjusting method and system
CN113160339B (en) * 2021-05-19 2024-04-16 中国科学院自动化研究所苏州研究院 Projector calibration method based on Molaque law
CN113160339A (en) * 2021-05-19 2021-07-23 中国科学院自动化研究所苏州研究院 Projector calibration method based on Samm's law
CN113222965A (en) * 2021-05-27 2021-08-06 西安交通大学 Three-dimensional observation method of discharge channel
CN113222965B (en) * 2021-05-27 2023-12-29 西安交通大学 Three-dimensional observation method for discharge channel
CN113470117A (en) * 2021-06-28 2021-10-01 上海交通大学 Unit attitude three-dimensional structured light calibration system and method based on spherical reverse perspective projection
CN113532328A (en) * 2021-07-16 2021-10-22 燕山大学 Surface profile real-time measurement system and method in medium plate straightening process
CN113706692B (en) * 2021-08-25 2023-10-24 北京百度网讯科技有限公司 Three-dimensional image reconstruction method, three-dimensional image reconstruction device, electronic equipment and storage medium
CN113706692A (en) * 2021-08-25 2021-11-26 北京百度网讯科技有限公司 Three-dimensional image reconstruction method, three-dimensional image reconstruction device, electronic device, and storage medium
CN114255286A (en) * 2022-02-28 2022-03-29 常州罗博斯特机器人有限公司 Target size measuring method based on multi-view binocular vision perception

Also Published As

Publication number Publication date
CN111750806B (en) 2021-10-08

Similar Documents

Publication Publication Date Title
CN111750806B (en) Multi-view three-dimensional measurement system and method
US10690492B2 (en) Structural light parameter calibration device and method based on front-coating plane mirror
WO2021027719A1 (en) Reflector-based calibration method for fringe projection system
CN113205593B (en) High-light-reflection surface structure light field three-dimensional reconstruction method based on point cloud self-adaptive restoration
CN110793464B (en) Large-field-of-view fringe projection vision three-dimensional measurement system and method
CN108020175B (en) multi-grating projection binocular vision tongue surface three-dimensional integral imaging method
CN112489109B (en) Three-dimensional imaging system method and device and three-dimensional imaging system
CN110296667A (en) High reflection surface method for three-dimensional measurement based on line-structured light multi-angle projection
EP2751521A1 (en) Method and system for alignment of a pattern on a spatial coded slide image
CN101308012A (en) Double monocular white light three-dimensional measuring systems calibration method
CN104729428B (en) Mirror face part three dimensional shape measurement system and measurement method based on coaxial configuration light
WO2020199439A1 (en) Single- and dual-camera hybrid measurement-based three-dimensional point cloud computing method
CN114111637A (en) Stripe structured light three-dimensional reconstruction method based on virtual dual-purpose
CN113108721B (en) High-reflectivity object three-dimensional measurement method based on multi-beam self-adaptive complementary matching
Zou et al. High-accuracy calibration of line-structured light vision sensors using a plane mirror
CN113505626A (en) Rapid three-dimensional fingerprint acquisition method and system
Wang et al. Highly reflective surface measurement based on dual stereo monocular structured light system fusion
Yu et al. An improved projector calibration method for structured-light 3D measurement systems
Li et al. A virtual binocular line-structured light measurement method based on a plane mirror
CN113865514B (en) Calibration method of line structured light three-dimensional measurement system
CN115082538A (en) System and method for three-dimensional reconstruction of surface of multi-view vision balance ring part based on line structure light projection
CN114143426A (en) Three-dimensional reconstruction system and method based on panoramic structured light
Jin et al. Shadow-Based Lightsource Localization with Direct Camera-Lightsource Geometry
TWI725620B (en) Omnidirectional stereo vision camera configuration system and camera configuration method
Li et al. Vision occlusion solution for line-structured light measurement system based on a plane mirror

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant