CN115479556A - Binary defocus three-dimensional measurement method and device for reducing phase error mean value - Google Patents

Binary defocus three-dimensional measurement method and device for reducing phase error mean value

Info

Publication number
CN115479556A
Authority
CN
China
Prior art keywords
phase
mean value
binary
phase error
defocus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110800600.6A
Other languages
Chinese (zh)
Inventor
游迪
游志胜
朱江平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan University
Original Assignee
Sichuan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan University
Priority to CN202110800600.6A
Publication of CN115479556A
Legal status: Pending

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2504Calibration devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to the field of optical three-dimensional measurement, and in particular to a binary defocus three-dimensional measurement method and device for reducing the phase error mean value. First, a sinusoidal fringe image is binary-coded with an error diffusion algorithm to obtain a coding fringe group; a phase error mean value is then calculated. The coding fringe group is projected out of focus onto the object to be measured, the fringe images returned by the object are collected, and a measured phase is obtained with the multi-step phase shift method. The phase error mean value is subtracted from the measured phase to obtain the actual phase, and the three-dimensional data of the object to be measured are obtained by reconstruction from the phase map. By subtracting the phase error mean value from the original phase result, the invention reduces the root mean square error of the final phase result, thereby effectively improving the measurement precision and the accuracy of the detection result.

Description

Binary defocus three-dimensional measurement method and device for reducing phase error mean value
Technical Field
The invention relates to the field of optical three-dimensional measurement, in particular to a binary defocusing three-dimensional measurement method and device for reducing a phase error mean value.
Background
Three-dimensional measurement based on structured light is a non-contact measurement method with many advantages, such as high precision and high speed. It is widely applied in fields such as automated processing, high-speed online inspection, aerospace, and physical profiling.
Commercial digital projectors exhibit a certain nonlinearity, which introduces large errors into the final three-dimensional measurement result. Many methods have been proposed in the industry to address this problem, among which the binary defocus technique is commonly used to overcome projector nonlinearity. This type of method usually first binary-encodes the standard sinusoidal fringe image (i.e., each pixel value is represented by 0 or 1) and then, by defocused projection, projects a fringe image close to a standard sinusoid for the subsequent fringe structured-light measurement. The binary defocus technique not only solves the projector nonlinearity problem but also exploits the characteristics of digital projectors to increase the fringe projection rate and hence the speed of three-dimensional measurement.
However, in the binary defocus technique there is a phase error between the fringes after defocused projection and the standard sinusoidal fringes, which leads to a large error in the three-dimensional measurement result after phase reconstruction. A defocused-projection three-dimensional measurement method is therefore needed that reduces the phase error between the defocus-projected fringes and the standard sinusoidal fringes and thereby reduces the error of the three-dimensional measurement.
Disclosure of Invention
The invention aims to solve the problem in the prior art that the large phase error between the fringes after defocused projection and the standard sinusoidal fringes causes a large error in the three-dimensional measurement result, and provides a binary defocus three-dimensional measurement method and device for reducing the phase error mean value.
In order to achieve the above purpose, the invention provides the following technical scheme:
A binary defocus three-dimensional measurement method for subtracting a phase error mean value comprises the following steps:
S1: carrying out binary coding on the sinusoidal fringe image by adopting an error diffusion algorithm to obtain a coding fringe group;
S2: carrying out Gaussian blur on the coding fringe group, obtaining a predicted phase by adopting a multi-step phase shift method, taking the difference between the predicted phase and an ideal phase, and averaging the phase errors of all pixels to obtain a phase error mean value; the ideal phase is calculated from the sinusoidal fringe image by the same multi-step phase shift method;
S3: projecting the coding fringe group out of focus onto an object to be measured, collecting the fringe images returned by the object to be measured, and obtaining a measured phase by adopting the multi-step phase shift method;
S4: subtracting the phase error mean value from the measured phase to obtain an actual phase, then obtaining an unwrapped phase through spatial phase unwrapping, and finally obtaining the three-dimensional data of the object to be measured by reconstruction from the unwrapped phase map. By subtracting the phase error mean value from the original phase result, the invention reduces the root mean square error of the final phase result, thereby effectively improving the measurement precision and the accuracy of the detection result.
As a preferred embodiment of the present invention, step S2 and step S3 can be executed simultaneously or in either order.
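To make the flow of steps S1 to S4 concrete, the following is a minimal Python/NumPy sketch of how the pipeline could be organized. The helper names passed in (binary_encode_error_diffusion, gaussian_blur, phase_shift, spatial_unwrap, reconstruct_3d) are placeholders standing for the operations described above, not functions defined by this patent.

```python
import numpy as np

def measure_with_mean_correction(sinusoid_set, captured_set,
                                 binary_encode_error_diffusion,
                                 gaussian_blur, phase_shift,
                                 spatial_unwrap, reconstruct_3d):
    """Sketch of S1-S4: correct the measured phase by the simulated
    phase error mean value before unwrapping and reconstruction."""
    # S1: binary-code the sinusoidal fringe set with error diffusion
    coded_set = [binary_encode_error_diffusion(f) for f in sinusoid_set]

    # S2: simulate defocus by Gaussian blur, predict the phase,
    #     and average the phase error over all pixels
    blurred_set = [gaussian_blur(b) for b in coded_set]
    phi_bi = phase_shift(blurred_set)          # predicted phase
    phi_ideal = phase_shift(sinusoid_set)      # ideal phase
    error_mean = np.mean(phi_bi - phi_ideal)   # phase error mean value

    # S3: the coded set is projected out of focus and captured;
    #     the measured phase comes from the captured fringe images
    phi_measured = phase_shift(captured_set)

    # S4: subtract the mean, unwrap, and reconstruct
    phi_actual = phi_measured - error_mean
    phi_unwrapped = spatial_unwrap(phi_actual)
    return reconstruct_3d(phi_unwrapped)
```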
As a preferred scheme of the invention, the phase is calculated by the multi-step phase shift method according to the following formula:

\phi(x,y)=\arctan\left(\frac{\sum_{n=0}^{N-1} I_n(x,y)\sin(2\pi n/N)}{\sum_{n=0}^{N-1} I_n(x,y)\cos(2\pi n/N)}\right)

where N is the number of steps of the multi-step phase shift, (x, y) is the pixel coordinate, I_n(x, y) is the pixel value of the n-th fringe image at (x, y), and φ(x, y) is the phase value at (x, y).
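A minimal NumPy sketch of this N-step phase calculation is given below; it assumes the fringe images are supplied as a list of equal-sized arrays and uses np.arctan2, so the wrapped phase lies in [-π, π].

```python
import numpy as np

def n_step_phase(images):
    """Wrapped phase from N equally shifted fringe images I_n, n = 0..N-1."""
    N = len(images)
    n = np.arange(N)
    stack = np.stack(images, axis=0).astype(float)      # shape (N, H, W)
    num = np.tensordot(np.sin(2 * np.pi * n / N), stack, axes=1)
    den = np.tensordot(np.cos(2 * np.pi * n / N), stack, axes=1)
    return np.arctan2(num, den)                          # phi(x, y)
```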
As a preferred embodiment of the present invention, the phase error mean value is calculated as:

\mathrm{Mean}=\frac{1}{M}\sum_{(x,y)}\left[\phi_{bi}(x,y)-\phi_{ideal}(x,y)\right]

where M is the number of pixels, (x, y) is the pixel coordinate, φ_bi(x, y) is the predicted phase calculated from the coding fringe group after Gaussian blur, φ_ideal(x, y) is the ideal phase calculated from the sinusoidal fringe image, and Mean is the phase error mean value.
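A short sketch of this averaging step, assuming the predicted and ideal wrapped phases come from the same N-step formula; wrapping the difference back into [-π, π] before averaging is an implementation safeguard added here, not a step stated in the patent.

```python
import numpy as np

def phase_error_mean(phi_bi, phi_ideal):
    """Mean over all pixels of the phase error between the blurred-binary
    prediction and the ideal sinusoidal phase."""
    diff = phi_bi - phi_ideal
    diff = np.arctan2(np.sin(diff), np.cos(diff))  # wrap into [-pi, pi]
    return float(diff.mean())                      # Mean subtracted in S4
```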
As a preferred embodiment of the present invention, in step S2 the Gaussian blur used to simulate the defocus effect is performed by computer simulation.
As a preferable embodiment of the present invention, the step S3 uses a digital projector to perform the defocused projection.
A three-dimensional measuring device comprises at least one processor, at least one projection device for performing defocused projection, at least one acquisition camera for acquiring the images returned by the object to be measured, and a memory communicatively connected to the at least one processor; the projection device and the acquisition camera are each communicatively connected to the processor; the memory stores instructions executable by the at least one processor to cause the at least one processor to perform any one of the methods described above.
Compared with the prior art, the invention has the beneficial effects that:
the invention reduces the root mean square error of the final phase result by subtracting the phase error mean value from the original phase result, thereby effectively improving the final measurement precision and effectively improving the accuracy of the detection result.
Drawings
Fig. 1 is a schematic flowchart of the binary defocus three-dimensional measurement method for reducing the phase error mean value according to embodiment 1 of the present invention;
fig. 2 is a schematic diagram of an encoding stripe group generated in step S1 in the binary defocus three-dimensional measurement method for reducing the mean value of the phase error according to embodiment 2 of the present invention;
fig. 3 is a fringe diagram of the simulated defocus projection generated in step S2 in the binary defocus three-dimensional measurement method for reducing the mean value of the phase error in embodiment 2 of the present invention;
fig. 4 is a fringe pattern actually captured in step S3 of the binary defocus three-dimensional measurement method for reducing the phase error mean value according to embodiment 2 of the present invention;
fig. 5 is a comparison diagram of phase results obtained by the implementation method of the present invention and the conventional method under the conditions of small defocus amount and different fringe periods in the binary defocus three-dimensional measurement method for reducing the mean value of the phase error in embodiment 2 of the present invention;
fig. 6 is a schematic structural diagram of a three-dimensional measurement apparatus according to embodiment 3 of the present invention, which uses the binary defocus three-dimensional measurement method with reduced mean phase error value according to embodiment 1.
Detailed Description
The present invention will be described in further detail with reference to test examples and specific embodiments. It should be understood that the scope of the above-described subject matter is not limited to the following examples, and any techniques implemented based on the disclosure of the present invention are within the scope of the present invention.
Among the sinusoidal fringe binary coding methods proposed by many scholars, the error diffusion algorithm is a commonly used method with excellent performance. Its basic idea is to spread the quantization error of already-processed pixels to the unprocessed pixels with fixed weights, thereby reducing the quantization error over the entire coded image.
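As an illustration of this weighted error spreading, here is a minimal sketch of Floyd-Steinberg error diffusion, the specific kernel used in embodiment 2; the 7/16, 3/16, 5/16, 1/16 weights are the standard Floyd-Steinberg coefficients, and the simple left-to-right scan order is an assumption made for brevity.

```python
import numpy as np

def floyd_steinberg_binarize(fringe):
    """Binarize a sinusoidal fringe (values in [0, 1]) by error diffusion."""
    img = fringe.astype(float).copy()
    out = np.zeros_like(img)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = 1.0 if old >= 0.5 else 0.0
            out[y, x] = new
            err = old - new                    # quantization error
            # spread the error to not-yet-processed neighbours
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return out
```

Applying such a binarizer to each image of the sinusoidal fringe set yields the coding fringe group of step S1.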
Careful study shows that, after a given sinusoidal fringe pattern is binary-coded with an error diffusion algorithm using a given diffusion kernel, defocused projection under different defocus amounts, followed by phase calculation with any multi-step phase shift method, yields phase errors with the same mean value.
In the industry, the phase error is generally evaluated by the root mean square error (RMS), expressed as:

\mathrm{RMS}=\sqrt{\frac{1}{M}\sum_{(x,y)}\Delta\phi(x,y)^{2}}

where Δφ(x, y) is the phase error at pixel (x, y) and M is the number of pixels.
further, it has been found that the root mean square error can be represented by the variance of the error as well as the mean of the error, as follows:
Figure BDA0003164577010000052
where V is the variance of the error and E is the mean of the error.
Clearly, if the contribution of the error mean value to the root mean square error (RMS) is removed, the final RMS value can be reduced.
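A quick numerical check of this point, using a synthetic error array chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
err = rng.normal(loc=0.02, scale=0.03, size=100000)    # synthetic phase error

rms_before = np.sqrt(np.mean(err ** 2))                # equals sqrt(V + E^2)
rms_after = np.sqrt(np.mean((err - err.mean()) ** 2))  # equals sqrt(V) only

# removing the mean can never increase the RMS, and helps whenever E != 0
print(rms_before, rms_after)
```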
Therefore, combining this with our research findings, the final phase RMS can be reduced by computing the corresponding phase error mean value in advance through computer simulation and then subtracting this mean value from the actually measured phase result.
An important precondition of this scheme, however, is that a change of the defocus amount does not affect the finally obtained phase error mean value. The invention therefore uses mathematical analysis to prove that the change of the defocus amount does not affect the mean value of the finally obtained phase error.
For the four-step phase shift method, the phase error Δφ of the fringes and the intensity errors ΔA_n of the fringes satisfy the following relationship:

\Delta\phi \approx \frac{(\Delta A_1-\Delta A_3)\cos\phi-(\Delta A_0-\Delta A_2)\sin\phi}{2B}

where ΔA_n is the intensity error of the n-th frame, n ∈ [0, 3], φ is the phase, and B is the modulation degree of the fringes in the fringe image.
Assuming that a Gaussian kernel g representing a certain defocus amount is applied to the projected fringe pattern, both ΔA_n and B change:

\Delta A_n' = g \otimes \Delta A_n, \qquad B' = t_1 B

where ΔA_n' is the intensity error of the n-th frame after defocusing, ⊗ denotes the convolution operator, and t_1 is the fixed coefficient by which the modulation degree of the standard sine function changes under the action of the Gaussian kernel g.
Therefore, the phase error of the changed fringes can be expressed as:

\Delta\phi' \approx \frac{(g\otimes\Delta A_1-g\otimes\Delta A_3)\cos\phi-(g\otimes\Delta A_0-g\otimes\Delta A_2)\sin\phi}{2\,t_1 B}

Owing to the symmetric nature of the Gaussian kernel, taking the mean of this expression over all pixels leads to:

E(\Delta\phi')=E(\Delta\phi)
therefore, the change in defocus amount does not affect the average value of the phase error obtained last. As shown in table 1, the mean value of the phase errors obtained by different defocus amounts is substantially constant (i.e. 0.0228 ± 0.0003, error less than 0.0003, negligible) when the fringe period is 60 pixels.
TABLE 1

                           Small defocus   Middle defocus   Large defocus
Mean value of phase error     0.0227           0.0231           0.0225
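The invariance reported in Table 1 can be checked numerically along the following lines. The fringe parameters (a period of 60 pixels, four equal phase shifts) follow the description; the particular blur sigmas standing in for small, middle and large defocus are arbitrary choices for the experiment, and floyd_steinberg_binarize refers to the error diffusion sketch given earlier.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def four_step_phase(images):
    """Wrapped phase from four equally shifted fringe images."""
    n = np.arange(4)
    stack = np.stack(images).astype(float)
    num = np.tensordot(np.sin(2 * np.pi * n / 4), stack, axes=1)
    den = np.tensordot(np.cos(2 * np.pi * n / 4), stack, axes=1)
    return np.arctan2(num, den)

h, w, period = 256, 512, 60
x = np.arange(w)
ideal = [np.tile(0.5 + 0.5 * np.cos(2 * np.pi * x / period - 2 * np.pi * k / 4), (h, 1))
         for k in range(4)]
coded = [floyd_steinberg_binarize(f) for f in ideal]   # sketch given earlier

phi_ideal = four_step_phase(ideal)
for sigma in (1.5, 3.0, 5.0):          # stand-ins for small / middle / large defocus
    blurred = [gaussian_filter(c, sigma) for c in coded]
    diff = four_step_phase(blurred) - phi_ideal
    diff = np.arctan2(np.sin(diff), np.cos(diff))
    # Table 1 reports that these means stay essentially constant across defocus amounts
    print(sigma, diff.mean())
```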
Example 1
As shown in fig. 1, a binary defocus three-dimensional measurement method for subtracting a phase error mean value includes the following steps:
S1: carrying out binary coding on the sinusoidal fringe image by adopting an error diffusion algorithm to obtain a coding fringe group;
S2: carrying out Gaussian blur on the coding fringe group, obtaining a predicted phase by adopting a multi-step phase shift method, and taking the difference between the predicted phase and the ideal phase to obtain the phase error mean value; the ideal phase is calculated from the sinusoidal fringe image by the same multi-step phase shift method;
S3: projecting the coding fringe group out of focus onto an object to be measured, collecting the fringe images returned by the object to be measured, and obtaining a measured phase by adopting the multi-step phase shift method;
S4: subtracting the phase error mean value from the measured phase to obtain the actual phase, then performing spatial phase unwrapping to obtain the unwrapped phase, and finally obtaining the three-dimensional data of the object to be measured by reconstruction from the unwrapped phase map.
Step S2 and step S3 can be executed in either order or at the same time.
The phase is calculated by the multi-step phase shift method according to:

\phi(x,y)=\arctan\left(\frac{\sum_{n=0}^{N-1} I_n(x,y)\sin(2\pi n/N)}{\sum_{n=0}^{N-1} I_n(x,y)\cos(2\pi n/N)}\right)

where N is the number of steps of the multi-step phase shift, (x, y) is the pixel coordinate, I_n(x, y) is the pixel value of the n-th fringe image at (x, y), and φ(x, y) is the phase value at (x, y).
The phase error mean value is calculated as:

\mathrm{Mean}=\frac{1}{M}\sum_{(x,y)}\left[\phi_{bi}(x,y)-\phi_{ideal}(x,y)\right]

where M is the number of pixels, φ_bi(x, y) is the predicted phase calculated from the coding fringe group after Gaussian blur, φ_ideal(x, y) is the ideal phase calculated from the sinusoidal fringe image, and Mean is the phase error mean value.
Example 2
S1: the Floyd-Steinberg error diffusion algorithm is used to encode the sinusoidal fringe pattern with the resolution of 912 x 1140 and the period of 60 pixels in binary, resulting in the encoded fringe group, as shown in FIG. 2.
S2: and (3) performing Gaussian blur on the binarized fringe pattern generated in the step S1 by using a Gaussian kernel with a window size of 9 × 9 and a standard deviation of 3 by using computer simulation to simulate out-of-focus projection, calculating a predicted phase by using a four-step phase shift method, and comparing the predicted phase with an ideal phase to calculate a phase error mean value, wherein the phase error mean value is 0.0207. Fig. 3 shows a fringe pattern of the simulated out-of-focus projection.
S3: and (3) performing out-of-focus projection on the coding stripe group generated in the step (S1) to an object to be detected by using a digital projector, collecting a stripe image returned by the object to be detected, determining a corresponding phase result by using a four-step phase shift method, and subtracting the phase error mean value calculated in the step (S2) from the phase result to obtain a final more accurate phase result. The phase result error RMS drops from 0.035 to 0.027. Fig. 4 shows a striped pattern actually photographed.
S4: and subtracting the phase error mean value from the measured phase to obtain an actual phase, then performing spatial phase expansion to obtain an expanded phase, using the system parameters calibrated in advance, and finally performing expanded phase diagram reconstruction to obtain three-dimensional data of the object to be measured.
The pre-calibrated system parameters are obtained through the calibration procedure of the monocular structured light system. They include the intrinsic parameters Kc of the camera and Kp of the projector, the extrinsic parameters between the camera and the projector (a rotation matrix R and a translation matrix T), and the distortion coefficients dc and dp of the camera and the projector.
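For completeness, here is one common way to turn the unwrapped phase into three-dimensional points with such calibration parameters: the unwrapped phase is converted to a projector column coordinate and each camera pixel is triangulated against that column by solving a small linear system. The mapping u_p = Φ · P / (2π) (with P the fringe period in projector pixels) and the neglect of lens distortion are simplifying assumptions of this sketch, not steps prescribed by the patent.

```python
import numpy as np

def reconstruct_points(phi_unwrapped, period_px, Kc, Kp, R, T, mask=None):
    """Triangulate 3D points from unwrapped phase in a camera-projector system.
    Kc, Kp: 3x3 intrinsics; R (3x3), T (3x1): projector pose in the camera frame."""
    h, w = phi_unwrapped.shape
    Pc = Kc @ np.hstack([np.eye(3), np.zeros((3, 1))])   # camera projection matrix
    Pp = Kp @ np.hstack([R, T.reshape(3, 1)])            # projector projection matrix
    u_p = phi_unwrapped * period_px / (2 * np.pi)        # projector column coordinate

    pts = np.full((h, w, 3), np.nan)
    ys, xs = np.nonzero(mask) if mask is not None else np.nonzero(np.ones((h, w), bool))
    for v, u in zip(ys, xs):
        # two equations from the camera pixel, one from the projector column
        A = np.stack([
            u * Pc[2] - Pc[0],
            v * Pc[2] - Pc[1],
            u_p[v, u] * Pp[2] - Pp[0],
        ])
        X = np.linalg.solve(A[:, :3], -A[:, 3])          # solve A [X; 1] = 0
        pts[v, u] = X
    return pts
```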
Fig. 5 compares the phase results obtained with the method of this patent and with the conventional method (with and without subtracting the mean value) under a small defocus amount and different fringe periods. The results show that, compared with the conventional method, the method of the invention effectively reduces the root mean square error (RMS) of the phase error.
Example 3
As shown in fig. 6, a three-dimensional measuring device includes at least one processor, at least one projecting device for performing defocused projection, at least one collecting camera for collecting the images returned by the object to be measured, and a memory communicatively connected to the at least one processor; the projecting device and the collecting camera are each communicatively connected to the processor; the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the binary defocus three-dimensional measurement method for subtracting a phase error mean value described in the previous embodiments. The device may further comprise an input and output interface, which can include a display, a keyboard, a mouse and a USB interface for inputting and outputting data, and a power supply for providing electric energy to the three-dimensional measuring device.
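A minimal sketch of how such a device could be orchestrated in software; the projector and camera driver objects named here are placeholders (the patent does not specify any particular API), and the processing reuses the routines sketched in the earlier embodiments.

```python
from dataclasses import dataclass

@dataclass
class MeasurementDevice:
    """Processor-side orchestration of projector, camera and stored program."""
    projector: object        # drives the defocused projection of the coded set
    camera: object           # acquires the fringe images returned by the object
    error_mean: float        # phase error mean value computed offline (S2)

    def measure(self, coded_set, phase_shift, unwrap, reconstruct):
        captured = []
        for pattern in coded_set:
            self.projector.project(pattern)          # placeholder driver call
            captured.append(self.camera.capture())   # placeholder driver call
        phi = phase_shift(captured) - self.error_mean
        return reconstruct(unwrap(phi))
```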
Those skilled in the art will understand that: all or part of the steps for realizing the method embodiments can be completed by hardware related to program instructions, the program can be stored in a computer readable storage medium, and the program executes the steps comprising the method embodiments when executed; and the aforementioned storage medium includes: various media that can store program codes, such as a removable Memory device, a Read Only Memory (ROM), a magnetic disk, or an optical disk.
When the integrated unit of the present invention is implemented in the form of a software functional unit and sold or used as a separate product, it may also be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present invention may be essentially implemented or a part contributing to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: a removable storage device, a ROM, a magnetic or optical disk, or other various media that can store program code.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (7)

1. A binary out-of-focus three-dimensional measurement method for subtracting a phase error mean value is characterized by comprising the following steps:
S1: carrying out binary coding on the sinusoidal fringe image by adopting an error diffusion algorithm to obtain a coding fringe group;
S2: carrying out Gaussian blur on the coding fringe group, obtaining a predicted phase by adopting a multi-step phase shift method, taking the difference between the predicted phase and an ideal phase, and averaging the phase errors of all pixels to obtain a phase error mean value; the ideal phase is calculated from the sinusoidal fringe image by the same multi-step phase shift method;
S3: projecting the coding fringe group out of focus onto the object to be measured, collecting the fringe images returned by the object to be measured, and obtaining a measured phase by adopting the multi-step phase shift method;
S4: subtracting the phase error mean value from the measured phase to obtain an actual phase, then obtaining an unwrapped phase through spatial phase unwrapping, and finally obtaining the three-dimensional data of the object to be measured by reconstruction from the unwrapped phase map.
2. The binary out-of-focus three-dimensional measurement method for subtracting a phase error mean value according to claim 1, wherein the steps S2 and S3 can be executed simultaneously or in either order.
3. The binary out-of-focus three-dimensional measurement method for subtracting a phase error mean value according to claim 1, wherein the phase is calculated by the multi-step phase shift method according to the following formula:

\phi(x,y)=\arctan\left(\frac{\sum_{n=0}^{N-1} I_n(x,y)\sin(2\pi n/N)}{\sum_{n=0}^{N-1} I_n(x,y)\cos(2\pi n/N)}\right)

where N is the number of steps of the multi-step phase shift, (x, y) is the pixel coordinate, I_n(x, y) is the pixel value of the n-th fringe image at (x, y), and φ(x, y) is the phase value at (x, y).
4. The binary out-of-focus three-dimensional measurement method for subtracting a phase error mean value according to claim 1, wherein the phase error mean value is calculated by the following formula:

\mathrm{Mean}=\frac{1}{M}\sum_{(x,y)}\left[\phi_{bi}(x,y)-\phi_{ideal}(x,y)\right]

where M is the number of pixels, (x, y) is the pixel coordinate, φ_bi(x, y) is the predicted phase calculated from the coding fringe group after Gaussian blur, φ_ideal(x, y) is the ideal phase calculated from the sinusoidal fringe image, and Mean is the phase error mean value.
5. The binary out-of-focus three-dimensional measurement method for subtracting a phase error mean value according to claim 1, wherein in step S2 the Gaussian blur used to simulate the defocus effect is performed by computer simulation.
6. The phase error mean subtraction binary defocus three-dimensional measurement method according to claim 1, wherein the step S3 adopts a digital projector to perform defocus projection.
7. A three-dimensional measuring device, characterized by comprising at least one processor, at least one projection device for performing defocused projection, at least one acquisition camera for acquiring the images returned by the object to be measured, and a memory communicatively connected to the at least one processor; the projection device and the acquisition camera are each communicatively connected to the processor; the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1 to 6.
CN202110800600.6A 2021-07-15 2021-07-15 Binary defocus three-dimensional measurement method and device for reducing phase error mean value Pending CN115479556A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110800600.6A CN115479556A (en) 2021-07-15 2021-07-15 Binary defocus three-dimensional measurement method and device for reducing phase error mean value

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110800600.6A CN115479556A (en) 2021-07-15 2021-07-15 Binary defocus three-dimensional measurement method and device for reducing phase error mean value

Publications (1)

Publication Number Publication Date
CN115479556A true CN115479556A (en) 2022-12-16

Family

ID=84420070

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110800600.6A Pending CN115479556A (en) 2021-07-15 2021-07-15 Binary defocus three-dimensional measurement method and device for reducing phase error mean value

Country Status (1)

Country Link
CN (1) CN115479556A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110080471A1 (en) * 2009-10-06 2011-04-07 Iowa State University Research Foundation, Inc. Hybrid method for 3D shape measurement
CN104567730A (en) * 2015-01-15 2015-04-29 四川大学 Method for generating light field with sine structure by means of space-time binary encoding
US20190271540A1 (en) * 2016-12-15 2019-09-05 Southeast University Error correction method for fringe projection profilometry system
CN108168464A (en) * 2018-02-09 2018-06-15 东南大学 For the phase error correction approach of fringe projection three-dimension measuring system defocus phenomenon
CN109579738A (en) * 2019-01-04 2019-04-05 北京理工大学 A kind of two-value striped defocus optical projection system low-frequency filter characteristics measurement method
CN110084127A (en) * 2019-03-29 2019-08-02 南京航空航天大学 A kind of magnetic suspension rotor vibration measurement method of view-based access control model
CN110375673A (en) * 2019-07-01 2019-10-25 武汉斌果科技有限公司 A kind of big depth of field two-value defocus method for three-dimensional measurement based on multifocal optical projection system
CN111156928A (en) * 2020-02-07 2020-05-15 武汉玄景科技有限公司 Grating three-dimensional scanner moire fringe eliminating method based on DLP projection
CN111598929A (en) * 2020-04-26 2020-08-28 云南电网有限责任公司电力科学研究院 Two-dimensional unwrapping method based on time sequence differential interference synthetic aperture radar data

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
赵洋 et al.: "Defocused projection three-dimensional measurement with accurate phase acquisition by deep learning", Infrared and Laser Engineering, no. 7, 25 July 2020 (2020-07-25), pages 169-176 *
郑东亮 et al.: "Gamma correction technique for improving the accuracy of digital grating projection measurement systems", Acta Optica Sinica, no. 5, 10 May 2011 (2011-05-10), pages 124-129 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118243000A (en) * 2024-02-02 2024-06-25 北京控制工程研究所 Method and device for detecting deformation of component

Similar Documents

Publication Publication Date Title
Zabatani et al. Intel® realsense™ sr300 coded light depth camera
Lin et al. Three-dimensional shape measurement technique for shiny surfaces by adaptive pixel-wise projection intensity adjustment
Sun et al. A robust method to extract a laser stripe centre based on grey level moment
US20200057831A1 (en) Real-time generation of synthetic data from multi-shot structured light sensors for three-dimensional object pose estimation
CN113607085B (en) Binary defocus three-dimensional measurement method and device based on half-broken sine stripes
US20100034429A1 (en) Deconvolution-based structured light system with geometrically plausible regularization
KR20130000356A (en) Measuring method of 3d image depth and a system for measuring 3d image depth using boundary inheritance based hierarchical orthogonal coding
Dufour et al. Integrated digital image correlation for the evaluation and correction of optical distortions
CN113358065B (en) Three-dimensional measurement method based on binary coding and electronic equipment
CN104657999B (en) A kind of Digital Image Correlation Method based on kernel function
CN111928799A (en) Three-dimensional measurement method for realizing stripe image contrast enhancement based on deep learning
Tran et al. A Structured Light RGB‐D Camera System for Accurate Depth Measurement
Yang et al. High dynamic range fringe pattern acquisition based on deep neural network
CN109000587A (en) The method for obtaining accurate high density point cloud
CN110108230B (en) Binary grating projection defocus degree evaluation method based on image difference and LM iteration
CN118196308A (en) Structured light three-dimensional measurement method, medium and device
CN112802084B (en) Three-dimensional morphology measurement method, system and storage medium based on deep learning
Li et al. Global phase accuracy enhancement of structured light system calibration and 3D reconstruction by overcoming inevitable unsatisfactory intensity modulation
CN115479556A (en) Binary defocus three-dimensional measurement method and device for reducing phase error mean value
CN116608794A (en) Anti-texture 3D structured light imaging method, system, device and storage medium
Lee et al. Three-dimensional shape recovery from image focus using polynomial regression analysis in optical microscopy
CN113532330B (en) Three-dimensional measurement method for phase gray code
CN116109772A (en) Point cloud reconstruction method, device, equipment and readable storage medium
CN114279356A (en) Gray scale stripe pattern design method for three-dimensional measurement
Peng et al. Fringe pattern inpainting based on dual-exposure fused fringe guiding CNN denoiser prior

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination