CN114943755A - Processing method for three-dimensional reconstruction of phase image based on binocular structured light - Google Patents

Processing method for three-dimensional reconstruction of phase image based on binocular structured light

Info

Publication number
CN114943755A
CN114943755A (application CN202210875682.5A; granted publication CN114943755B)
Authority
CN
China
Prior art keywords
phase
image
slope
pixel
polar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210875682.5A
Other languages
Chinese (zh)
Other versions
CN114943755B (en)
Inventor
Zheng Xiaojun (郑晓军)
Tang Xiaohu (唐笑虎)
Hu Ziyang (胡子阳)
Current Assignee
SICHUAN INSTITUTE PRODUCT QUALITY SUPERVISION INSPECTION AND RESEARCH
Original Assignee
SICHUAN INSTITUTE PRODUCT QUALITY SUPERVISION INSPECTION AND RESEARCH
Priority date
Filing date
Publication date
Application filed by SICHUAN INSTITUTE PRODUCT QUALITY SUPERVISION INSPECTION AND RESEARCH filed Critical SICHUAN INSTITUTE PRODUCT QUALITY SUPERVISION INSPECTION AND RESEARCH
Priority to CN202210875682.5A priority Critical patent/CN114943755B/en
Publication of CN114943755A publication Critical patent/CN114943755A/en
Application granted granted Critical
Publication of CN114943755B publication Critical patent/CN114943755B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/38Registration of image sequences
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/254Projection of a pattern, viewing through a pattern, e.g. moiré
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30244Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to the technical field of three-dimensional measurement and discloses a processing method for three-dimensional reconstruction of phase images based on binocular structured light. The method comprises the following steps: acquiring and normalizing the pose transformation matrix between the left and right cameras; setting a minimum pixel change, calculating the maximum pixel change produced in the horizontal direction, and rounding it up to obtain the number of horizontal segments, which yields the segmented images; calculating the epipolar slope of the central pixel of each segmented image and using it as the non-differential epipolar slope of all pixels in that segment; projecting a plurality of grating images with specific phase shifts onto the surface of the object to be measured by phase measuring profilometry to form a sinusoidal fringe image sequence, solving for the wrapped phase, and generating a phase image that varies monotonically in the horizontal direction; and acquiring the segmented regions of the phase image and matching them continuously, with the end point of the previous point's match serving as the start point of the next point's match. The technical scheme of the invention improves both the precision and the efficiency of three-dimensional measurement.

Description

Processing method for three-dimensional reconstruction of phase image based on binocular structured light
Technical Field
The invention relates to the technical field of three-dimensional object measurement, and in particular to a processing method for three-dimensional reconstruction of phase images based on binocular structured light.
Background
The three-dimensional morphology of an object is one of its most important characteristics. Three-dimensional surface-shape measurement (i.e., measurement of the three-dimensional surface profile of an object) is an important means of acquiring morphological features, and the basis for recording, comparing, and replicating them.
Among existing methods for measuring the three-dimensional surface shape of an object, optical three-dimensional sensing has received increasing attention and research because it is non-contact and offers high measurement precision. Phase measuring profilometry based on structured light is an important, well-established three-dimensional sensing method. Using sinusoidal grating projection and phase shifting, it reconstructs the three-dimensional information of an object's surface from the spatial information of the full-field fringes and the temporal information of the phase-shifted fringes within a fringe period. It offers high precision, is insensitive to the reflectivity of the object's surface, and lends itself to computer-aided automatic measurement, so it has broad application prospects in industrial inspection, physical profiling, biomedicine, machine vision, and other fields.
Combining structured-light phase measuring profilometry with binocular stereo vision enables fast, accurate, and simple three-dimensional reconstruction. Compared with monocular reconstruction, a binocular camera pair reduces the blind area of the field of view and is therefore more robust in complex scenes. At present, to achieve real-time binocular three-dimensional reconstruction, various kinds of structured light are used for projection — surface structured light, bidirectional structured light, colored structured light, binary-coded structured light, and so on — and their characteristics are exploited mainly to optimize the phase unwrapping process, improving its speed and precision.
However, whatever structured light is adopted, it is the phase acquisition process that is optimized; no customized optimization is applied to the stereo matching performed after phase unwrapping. Conventional stereo-vision schemes such as epipolar rectification exploit epipolar geometry to optimize stereo phase matching: the images to be matched are re-projected so that pixels correspond one to one. After rectification, however, interpolation alters the phase information of the images and matching is no longer performed between original pixels, so accurate pixel localization requires fitting a surface (e.g., by least squares) to select sub-pixel matching points in order to maintain matching precision. Although methods such as epipolar rectification reduce the per-pixel matching time, the preprocessing is often greatly prolonged and the fitting introduces errors, making fast, accurate, and simple measurement difficult.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: to provide a processing method for three-dimensional reconstruction of phase images based on binocular structured light, addressing the measurement-precision and measurement-efficiency problems of existing binocular structured-light three-dimensional reconstruction.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
a processing method for three-dimensional reconstruction of a phase image based on binocular structured light comprises the following steps:
step 1, acquiring a pose transformation matrix between a left camera and a right camera and normalizing the pose transformation matrix;
step 2, setting minimum pixel change, calculating the maximum pixel change generated in the horizontal direction according to the polar line slope, rounding up to be used as the number of horizontal segments, and averagely dividing based on the number of the horizontal segments to obtain a segmented image;
step 3, calculating the polar slope of the central pixel of each segmented image, and taking the polar slope of the central pixel as the non-difference polar slope of all pixels in each segmented image;
step 4, projecting a plurality of grating images with specific phase shifts to the surface of an object to be measured by adopting a phase measurement profilometry to form a sine stripe image sequence, solving to obtain a wrapping phase, and generating a phase image which changes monotonously along the horizontal direction;
and 5, acquiring a segmented region of the phase image, and performing segmented region continuous matching by taking the front point matching end point as a rear point matching start point.
Further, in step 1, the pose transformation matrix includes a rotation matrix and a translation matrix.
Further, in step 2, the minimum pixel change is 1 pixel.
Further, in step 2, the maximum pixel change S is calculated as

S = ⌈Δk′ · W⌉

where Δk′ is the head-to-tail epipolar slope difference in the segmented image region and W is the imaging width of the camera.
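The two operations of step 2 — computing S and splitting each row evenly — can be sketched as follows. The reading S = ⌈Δk′ · W⌉ and the helper names are illustrative assumptions, not text from the patent:

```python
import math

def max_pixel_change(dk_head_tail, image_width):
    # Assumed reading of the formula: the head-to-tail epipolar slope
    # difference times the imaging width bounds the vertical drift (in pixels)
    # accumulated across one row; rounding up gives the integer segment count S.
    return math.ceil(abs(dk_head_tail) * image_width)

def segment_bounds(image_width, n_segments):
    # Divide each image row evenly into n_segments horizontal segments;
    # boundary rounding makes the pieces as equal as possible.
    edges = [round(i * image_width / n_segments) for i in range(n_segments + 1)]
    return list(zip(edges[:-1], edges[1:]))
```

For a 1920-pixel-wide image with Δk′ = 0.002, this yields S = 4 segments of 480 pixels each.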
Further, in step 3, the epipolar slope k of the central pixel is calculated directly from the left-camera pixel point m_l:

[equation image in original]

where p_i, q_i, t_i are parameterized matrices, computed as

[equation image in original]

in which C is the algebraic cofactor matrix of the left-camera re-projection matrix, and a_i, b_i, c_i are the re-projection matrix parameters:

[equation image in original]

where M′_pq denotes the element in the p-th row and q-th column of the right-camera re-projection matrix and e′ is the imaging epipole of the right camera.
Further, all pixel positions corresponding to each non-differential epipolar slope are calculated, and a mapping table of the correspondence between pixel positions and non-differential epipolar slopes is established.
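The mapping table can be sketched as a simple lookup array. Here `slope_fn` is a hypothetical stand-in for the patent's parameterised slope formula, which is not reproduced in the text:

```python
import numpy as np

def build_slope_lut(height, width, slope_fn, n_segments):
    # One-off mapping table: every pixel in a horizontal segment is assigned
    # the epipolar slope of that segment's central pixel (the segment's
    # "non-differential" slope). Later matching only needs a table lookup.
    lut = np.empty((height, width))
    edges = np.linspace(0, width, n_segments + 1).astype(int)
    for v in range(height):
        for lo, hi in zip(edges[:-1], edges[1:]):
            centre = (lo + hi) // 2
            lut[v, lo:hi] = slope_fn(centre, v)
    return lut
```

Because the table is built once per camera pair, the per-pixel cost during matching drops to a single array read.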
Compared with the prior art, the invention has the following beneficial effects:
in the technical scheme of the invention, the epipolar line slope of the central pixel is taken as the non-differential epipolar line slope of all pixels in each segmented image, and the isolated epipolar lines in the segmented images are combined into the non-differential epipolar line of the pixels, so that the continuous matching of the phase along the non-differential epipolar line of the pixels is realized, and the measurement precision is high. In addition, the epipolar slope of all pixels in each segmented image is replaced by the epipolar slope of the central pixel in the similar region, and a mapping table is established for searching all the pixel point positions and the epipolar slopes, so that the purpose of simplifying the epipolar calculation is achieved, the calculation time is greatly reduced while the precision is kept, and the measurement efficiency is improved.
Drawings
Fig. 1 is a schematic flow chart of a processing method for three-dimensionally reconstructing a phase image based on binocular structured light.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings. Evidently, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by a person of ordinary skill in the art from the embodiments herein without creative effort fall within the protection scope of the invention.
As shown in fig. 1, a processing method for three-dimensionally reconstructing a phase image based on binocular structured light includes the following steps:
Step 1, acquire the pose transformation matrix between the left and right cameras and normalize it. Step 2, set the minimum pixel change, calculate the maximum pixel change produced in the horizontal direction from the epipolar slope, round it up to obtain the number of horizontal segments, and divide evenly by this number to obtain the segmented images. Step 3, calculate the epipolar slope of the central pixel of each segmented image, take it as the non-differential epipolar slope of all pixels in that segmented image, and establish a mapping table of the correspondence between all pixel positions and the non-differential epipolar slopes. Step 4, project a plurality of grating images with specific phase shifts onto the surface of the object to be measured by phase measuring profilometry to form a sinusoidal fringe image sequence, solve for the wrapped phase, and generate a phase image that varies monotonically in the horizontal direction. Step 5, acquire the segmented regions of the phase image and perform continuous matching over them, with the end point of the previous point's match taken as the start point of the next point's match.
In this embodiment, phase measuring profilometry projects a plurality of grating images with specific phase shifts onto the surface of the object, forming a sequence of sinusoidal fringe images that carry the depth differences, and a phase depth image is obtained by unwrapping. Phase measuring profilometry is an existing technique.
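As a hedged illustration of the wrapped-phase solve, the standard N-step phase-shifting formula can be used; this is general phase measuring profilometry practice, not a formula quoted from the patent, and it assumes fringe intensities of the form I_n = A + B·cos(φ + 2πn/N):

```python
import numpy as np

def wrapped_phase(frames):
    # N-step phase-shifting solution for the wrapped phase, assuming
    # I_n = A + B * cos(phi + 2*pi*n/N) for n = 0..N-1:
    #   phi = atan2(-sum I_n sin(2*pi*n/N), sum I_n cos(2*pi*n/N))
    n = len(frames)
    shifts = 2 * np.pi * np.arange(n) / n
    num = sum(f * np.sin(s) for f, s in zip(frames, shifts))
    den = sum(f * np.cos(s) for f, s in zip(frames, shifts))
    return np.arctan2(-num, den)  # wrapped phase in (-pi, pi]
```

The arctangent wraps the result into one fringe period; unwrapping across periods is a separate step.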
In a first aspect, this embodiment combines the monotonicity of the epipolar-line variation over a global step in epipolar geometry with the pixel-level variation of the phase image to merge regions that are similar along the epipolar (horizontal) direction, which is also the direction of increasing phase; that is, the data of each row in the horizontal direction are merged to obtain the segmented images. Pixels whose corresponding positions on the global-step epipolar line differ by less than the minimum pixel change are merged into one similar region, so the laborious computation of an epipolar line for every pixel reduces to computing the epipolar line of the central pixel of each segmented image region, thereby simplifying the epipolar computation.
In a second aspect, based on the segmented-image strategy of the first aspect, this embodiment provides a phase continuous-search algorithm. Because the phase varies monotonically within a similar region (usually a horizontal line segment), a search along the same epipolar line need not restart from zero and proceed up to the image width for every pixel, as the original search does. Instead, the search position reached for the previous pixel is carried over as the starting position for the current pixel, achieving a continuous search and eliminating redundant computation between different pixels. To resist occasional sudden-change interference, a buffer interval can be set before each match.
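The continuous search with a buffer interval might be sketched as follows; the buffer width of 2 and the function name are illustrative assumptions, since the text only calls for "a certain buffer interval":

```python
def match_row(phase_left, phase_right, buffer=2):
    # Continuous search along one epipolar line: the end point of the previous
    # pixel's match becomes the start point of the next pixel's search,
    # rewound by a small buffer to absorb sudden-change interference.
    width = len(phase_right)
    matches, start = [], 0
    for target in phase_left:
        matches.append(-1)                      # -1: no match on this line
        for cand in range(max(0, start - buffer), width):
            if phase_right[cand] >= target:     # phase rises monotonically
                matches[-1] = cand
                start = cand
                break
    return matches
```

Each left pixel's search starts near where the previous one ended, so a full row is matched in roughly one pass instead of one pass per pixel.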
In some embodiments, in step 1, the re-projection matrix of the cameras is the pose transformation matrix between the left and right cameras, which comprises a rotation matrix and a translation matrix. The pose transformation matrix is normalized: the imaging planes of the left and right cameras are treated as two-dimensional coordinate systems, and when these two-dimensional coordinates are lifted into three dimensions for computation, the imaging plane is placed on the plane z = 1 of the three-dimensional frame; the constraint z = 1 chains the pixel coordinate system to the camera coordinate system.
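The z = 1 normalization can be illustrated with a small sketch; the intrinsic matrix K and the function name are assumptions for illustration:

```python
import numpy as np

def to_normalized_plane(pixels, K):
    # Lift pixel coordinates onto the z = 1 plane of the camera frame:
    # x = K^-1 [u, v, 1]^T, then scale so the third component is exactly 1.
    # This realises the z = 1 constraint that links the pixel coordinate
    # system to the camera coordinate system.
    uv1 = np.column_stack([np.asarray(pixels, float), np.ones(len(pixels))])
    xyz = uv1 @ np.linalg.inv(K).T
    return xyz / xyz[:, 2:3]
```

With the points expressed on the z = 1 plane, the pose transformation between the two cameras can be applied directly in the three-dimensional frame.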
In some embodiments, in step 2, the minimum pixel change is 1 pixel. Pixels whose corresponding positions on the global-step epipolar line differ by less than 1 pixel are merged into one similar region; each similar region is a segmented image, so the laborious full-pixel epipolar computation reduces to computing the epipolar line of the central pixel of each segmented image region, thereby simplifying the epipolar computation. Furthermore, a phase continuous-search algorithm built on this simplification reduces redundant computation between different pixels, realizing fast, accurate, and simple three-dimensional reconstruction.
In some embodiments, in epipolar geometry all epipolar lines pass through a fixed point, the epipole of the imaging plane, so an epipolar line can be written in point-slope form:

v′ − v′_e = k (u′ − u′_e)

where k is the slope of the epipolar line determined from the left-image point, (u′, v′) is the imaging point of the right camera, and (u′_e, v′_e) is the imaging epipole of the right camera. If the depth of the reconstructed object is taken to infinity, its position in the left-camera image does not change, so its image in the right camera still lies on the same epipolar line, and its projection onto the right image plane can be located by the homography of the plane at infinity. Taking that point as the phase pole, the epipolar slope can be expressed as

k = (v′_∞ − v′_e) / (u′_∞ − u′_e)

where k is the slope of the epipolar line determined from the left-image point, (u′_∞, v′_∞) is the phase imaging point of the right camera, and (u′_e, v′_e) is the imaging epipole of the right camera. The normal vector of the right-camera plane is then computed with the Hartley algorithm, and the epipolar slope is expressed in terms of this normal vector and the parameters of the re-projection matrix (the pose transformation matrix between the left and right cameras):

[equation image in original]

where a_i, b_i, c_i are parameters obtained from the re-projection matrix and the left-camera pixel point:

[equation image in original]

in which M′_pq denotes the element in the p-th row and q-th column of the right-camera re-projection matrix.
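Under the two-point (point-slope) description above, the epipolar slope follows directly from the phase pole and the epipole; variable names here are illustrative:

```python
def epipolar_slope(phase_point, epipole):
    # Slope of the epipolar line through the right-camera epipole (u_e, v_e)
    # and the phase pole (u_inf, v_inf), i.e. the projection of the left
    # pixel's point at infinity: k = (v_inf - v_e) / (u_inf - u_e).
    (u_inf, v_inf), (u_e, v_e) = phase_point, epipole
    return (v_inf - v_e) / (u_inf - u_e)
```

The resulting k, together with the epipole, fixes the whole epipolar line in point-slope form.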
The imaging of a pixel at every position in the world coordinate system satisfies the homography principle of epipolar geometry, i.e., it obeys the epipolar constraint

m′ᵀ F m = 0

where m and m′ are the homogeneous coordinates of corresponding pixel points and F is the fundamental matrix. Parameterizing the intermediate quantities then yields an equation for the epipolar slope directly in terms of the left-camera pixel point:

[equation image in original]

where p_i, q_i, t_i are the parameterized matrices, computed as

[equation image in original]

in which C is the algebraic cofactor matrix of the left-camera re-projection matrix M_L. Because the values of p_i, q_i, t_i and the other parameters are fixed for a given pair of binocular cameras, the mapping table of non-differential epipolar slope against left-camera pixel position need only be computed once when matching the whole image; thereafter the non-differential slope of any pixel is obtained by a table lookup on its position, which greatly reduces the matching time.
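A minimal check of the epipolar constraint m′ᵀFm = 0 can be written as follows; the fundamental matrix used in the example is the idealized one for a rectified pair translated purely along x, chosen here as an assumption for illustration:

```python
import numpy as np

def satisfies_epipolar(m_left, m_right, F, tol=1e-9):
    # Epipolar constraint m'^T F m = 0 for corresponding pixels, with m and m'
    # in homogeneous coordinates and F the fundamental matrix.
    m = np.append(np.asarray(m_left, float), 1.0)
    m_prime = np.append(np.asarray(m_right, float), 1.0)
    return abs(m_prime @ F @ m) < tol

# For a rectified pair translated purely along x, F reduces to the
# skew-symmetric matrix of the epipole e' = (1, 0, 0), so corresponding
# points must share a row.
F_rect = np.array([[0.0, 0, 0], [0, 0, -1], [0, 1, 0]])
```

For this F, the constraint evaluates to v − v′, so it holds exactly when the two pixels lie on the same row.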
In some embodiments, in binocular stereo vision the epipolar lines usually appear as straight lines nearly parallel to the imaging plane, with small slope values. For a higher-resolution image, therefore, when the difference between the epipolar slopes of adjacent pixels in the left camera produces no difference in the pixel position searched in the right image over the same step length, the pixels can be regarded as lying on the same epipolar line, called here the pixel non-differential epipolar line; that is, the following condition holds:

|Δk| · W_s < 1

where W_s is the synchronous search length, i.e., the common step length traversed when searching along the different epipolar lines. Under a given step length, the epipolar lines of different pixels in the region can then be treated as the same straight line, with no difference in the pixel-level search result. Meanwhile, by the monotonicity of the phase image, the phases in the same row within such a region can be searched continuously: the search start point of a later pixel continues from the search end point of the preceding pixel, which greatly reduces the search time and improves the matching efficiency. To realize continuous search over the whole image region, the search step length W_s is set to the imaging width W of the camera. From the epipolar slope formula it can be seen that, for the pixels of one row, the vertical coordinate is fixed, so the epipolar slope varies as a monotonic inverse-proportion function; the maximum pixel change can therefore be obtained directly from the head-to-tail slope difference, which is the maximum difference in the horizontal direction:

S = ⌈Δk′ · W⌉

where Δk′ is the head-to-tail epipolar slope difference in the segmented image region, S is the maximum pixel change (rounded up, since pixel changes occur at the pixel level), and W is the imaging width of the camera. The pose of the binocular cameras can also be adjusted so that their baseline is parallel to the horizontal plane; combining the camera intrinsics with the inter-camera pose then gives:

[equation image in original]

Accordingly, from the epipolar slope formula, the slope computation within one row degenerates to a linear function, so the slope change in the horizontal (epipolar) direction can be regarded as linear. The left-camera pixels can then be divided equally in the horizontal direction according to S, simplifying the original Width epipolar lines of Width steps each into S pixel non-differential epipolar lines of Width steps, on which the phase is matched continuously; the non-differential slope of each segment is the epipolar slope of the segment's central pixel.
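The claim that an even split into S segments keeps the per-segment drift below one pixel can be sanity-checked numerically; the linear slope profile below is synthetic illustration data, not measurements from the patent:

```python
import numpy as np

def per_segment_drift(slopes, n_segments, width):
    # With the slope varying linearly along a row, check how far the
    # head-to-tail slope difference inside each of n_segments even segments
    # drifts over a search of length `width` (in pixels).
    edges = np.linspace(0, width, n_segments + 1).astype(int)
    return [abs(slopes[hi - 1] - slopes[lo]) * width
            for lo, hi in zip(edges[:-1], edges[1:])]
```

If the whole-row drift is Δk′·W, splitting into S = ⌈Δk′·W⌉ segments bounds each segment's drift by about one pixel, which is the minimum pixel change of step 2.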
Finally, it should be noted that the above embodiments are only preferred embodiments used to illustrate the technical solutions of the invention, not to limit them or the patent scope of the invention. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the described technical solutions may still be modified, or some or all of their technical features equivalently replaced, without departing in essence from the scope of the technical solutions of the embodiments of the invention. Such modifications and substitutions, which do not substantially alter the spirit and concept of the invention, remain within its protection scope; likewise, the direct or indirect application of the technical solution of the invention in other related technical fields is included in the patent protection scope of the invention.

Claims (6)

1. A processing method for three-dimensional reconstruction of phase images based on binocular structured light, characterized by comprising the following steps:
step 1, acquiring the pose transformation matrix between the left and right cameras and normalizing it;
step 2, setting the minimum pixel change, calculating the maximum pixel change produced in the horizontal direction from the epipolar slope, rounding it up to obtain the number of horizontal segments, and dividing evenly by this number to obtain the segmented images;
step 3, calculating the epipolar slope of the central pixel of each segmented image and taking it as the non-differential epipolar slope of all pixels in that segmented image;
step 4, projecting a plurality of grating images with specific phase shifts onto the surface of the object to be measured by phase measuring profilometry to form a sinusoidal fringe image sequence, solving for the wrapped phase, and generating a phase image that varies monotonically in the horizontal direction;
and step 5, acquiring the segmented regions of the phase image and performing continuous matching over the segmented regions, with the end point of the previous point's match taken as the start point of the next point's match.
2. The processing method for three-dimensional reconstruction of phase images based on binocular structured light of claim 1, wherein in step 1, the pose transformation matrix comprises a rotation matrix and a translation matrix.
3. The processing method for three-dimensional reconstruction of phase images based on binocular structured light of claim 1, wherein in step 2, the minimum pixel change is 1 pixel.
4. The processing method for three-dimensional reconstruction of phase images based on binocular structured light of claim 1, wherein in step 2, the maximum pixel change S is calculated as

S = ⌈Δk′ · W⌉

where Δk′ is the head-to-tail epipolar slope difference in the segmented image region and W is the imaging width of the camera.
5. The processing method for three-dimensional reconstruction of phase images based on binocular structured light of claim 1, wherein in step 3, the epipolar slope k of the central pixel is calculated directly from the left-camera pixel point m_l:

[equation image in original]

where p_i, q_i, t_i are parameterized matrices, computed as

[equation image in original]

in which C is the algebraic cofactor matrix of the left-camera re-projection matrix, and a_i, b_i, c_i are the re-projection matrix parameters:

[equation image in original]

where M′_pq denotes the element in the p-th row and q-th column of the right-camera re-projection matrix and e′ is the imaging epipole of the right camera.
6. The processing method for three-dimensional reconstruction of phase images based on binocular structured light of claim 1, wherein all pixel positions corresponding to each non-differential epipolar slope are calculated, and a mapping table of the correspondence between pixel positions and non-differential epipolar slopes is established.
CN202210875682.5A 2022-07-25 2022-07-25 Processing method for three-dimensional reconstruction of phase image based on binocular structured light Active CN114943755B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210875682.5A CN114943755B (en) 2022-07-25 2022-07-25 Processing method for three-dimensional reconstruction of phase image based on binocular structured light

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210875682.5A CN114943755B (en) 2022-07-25 2022-07-25 Processing method for three-dimensional reconstruction of phase image based on binocular structured light

Publications (2)

Publication Number Publication Date
CN114943755A true CN114943755A (en) 2022-08-26
CN114943755B CN114943755B (en) 2022-10-04

Family

ID=82910194

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210875682.5A Active CN114943755B (en) 2022-07-25 2022-07-25 Processing method for three-dimensional reconstruction of phase image based on binocular structured light

Country Status (1)

Country Link
CN (1) CN114943755B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115880448A * 2022-12-06 2023-03-31 Wenzhou Lucheng Jiahan Network Technology Service Studio Three-dimensional measurement method, device and equipment based on binocular imaging and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101936761A * 2009-06-30 2011-01-05 Baoshan Iron & Steel Co., Ltd. Visual measuring method of stockpile in large-scale stock ground
CN109166149A (en) * 2018-08-13 2019-01-08 武汉大学 A kind of positioning and three-dimensional wire-frame method for reconstructing and system of fusion binocular camera and IMU
WO2020187719A1 (en) * 2019-03-15 2020-09-24 Trinamix Gmbh Detector for identifying at least one material property
CN112102491A (en) * 2020-08-12 2020-12-18 西安交通大学 Skin damage surface three-dimensional reconstruction method based on surface structured light
CN114152217A (en) * 2022-02-10 2022-03-08 南京南暄励和信息技术研发有限公司 Binocular phase expansion method based on supervised learning
CN114332349A (en) * 2021-11-17 2022-04-12 浙江智慧视频安防创新中心有限公司 Binocular structured light edge reconstruction method and system and storage medium
CN114723828A (en) * 2022-06-07 2022-07-08 杭州灵西机器人智能科技有限公司 Binocular vision-based multi-line laser scanning method and system

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
JIANYANG LIU et al.: "Performance analysis of 3-D shape measurement projector-camera system with short baseline arrangement", 2013 IEEE International Conference on Robotics and Biomimetics (ROBIO) *
ZHENZHOU WANG et al.: "Three-Dimensional Hand Reconstruction by Single-Shot Structured Light Line Pattern", IEEE Access *
ZHANG Haiyang et al.: "Three-dimensional measurement method for high-speed train body-in-white based on binocular structured light", Control and Information Technology *
WANG Yaping: "Research on measurement technology for wind turbine blades based on binocular structured light" *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115880448A (en) * 2022-12-06 2023-03-31 Wenzhou Lucheng Jiahan Network Technology Service Studio Three-dimensional measurement method, apparatus and device based on binocular imaging, and storage medium
CN115880448B (en) * 2022-12-06 2024-05-14 Xi'an Gongda Tiancheng Technology Co., Ltd. Three-dimensional measurement method and apparatus based on binocular imaging

Also Published As

Publication number Publication date
CN114943755B (en) 2022-10-04

Similar Documents

Publication Publication Date Title
CN108734776B (en) Speckle-based three-dimensional face reconstruction method and equipment
Wang et al. Single-shot three-dimensional reconstruction based on structured light line pattern
Young et al. Coded structured light
WO2012096747A1 (en) Forming range maps using periodic illumination patterns
CN104596439A (en) Speckle matching and three-dimensional measuring method based on phase information aiding
CN114943755B (en) Processing method for three-dimensional reconstruction of phase image based on binocular structured light
CN104236479A (en) Line structured optical three-dimensional measurement system and three-dimensional texture image construction algorithm
CN113506348B (en) Gray code-assisted three-dimensional coordinate calculation method
CN103826032A (en) Depth map post-processing method
CN112833818B (en) Single-frame fringe projection three-dimensional surface type measuring method
CN110940295A (en) High-reflection object measurement method and system based on laser speckle limit constraint projection
WO2013012054A1 (en) Image processing method and apparatus
CN111524173A (en) Rapid large-range phase unwrapping method based on double reference planes
Hu et al. High-speed and accurate 3D shape measurement using DIC-assisted phase matching and triple-scanning
CN112945089A (en) Structured light coding method based on stripe width modulation
CN111815697A (en) Dynamic three-dimensional measurement method for thermal deformation
CN115546255A (en) SIFT stream-based single-frame fringe projection high dynamic range error compensation method
Song et al. Super-resolution phase retrieval network for single-pattern structured light 3D imaging
WO2022177705A1 (en) Texture based fusion for images with cameras having differing modalities
CN111023994B (en) Grating three-dimensional scanning method and system based on multiple measurement
Tyle_ek et al. Refinement of surface mesh for accurate multi-view reconstruction
Xi et al. Research on the algorithm of noisy laser stripe center extraction
CN113551617B (en) Binocular double-frequency complementary three-dimensional surface type measuring method based on fringe projection
CN114943761A FPGA-based method and device for extracting the center of a line-structured-light stripe
Yang et al. Center extraction algorithm of linear structured light stripe based on improved gray barycenter method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant