CN116310101A - High-dynamic three-dimensional measurement method based on self-adaptive distribution of intensity of overexposure connected domain - Google Patents


Info

Publication number: CN116310101A
Application number: CN202310201805.1A
Authority: CN (China)
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 冯世杰, 牟双, 胡岩, 陈钱, 左超, 高建坡
Current assignee: Nanjing University of Science and Technology (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Nanjing University of Science and Technology
Application filed by Nanjing University of Science and Technology
Prior art keywords: maximum input, gray level, input gray, coordinate system, image

Classifications

    • G01B 11/2513: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern on the object, with several lines being projected in more than one direction, e.g. grids, patterns (hierarchy: G Physics; G01 Measuring, testing; G01B Measuring length, thickness, angles, areas or contours; G01B 11/00 optical techniques; G01B 11/24 contours or curvatures; G01B 11/25 by projecting a pattern, e.g. moiré fringes)
    • G06T 17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects (hierarchy: G Physics; G06 Computing; G06T Image data processing or generation)
    • Y02A 90/30: Assessment of water resources (hierarchy: Y02A Technologies for adaptation to climate change, having an indirect contribution to adaptation)

Abstract

The invention discloses a high-dynamic three-dimensional measurement method based on self-adaptive distribution of the projection intensity of the overexposed connected domain. A maximum input gray level map and sinusoidal fringe patterns are generated, projected onto the surface of the object to be measured, and synchronously acquired. An overexposed region is identified in the acquired maximum input gray level map according to a threshold, and its boundary is extracted and tracked. A preliminary three-dimensional reconstruction of the object is performed by combining the pre-calibrated system parameters with the acquired sinusoidal fringes. The boundary coordinates are mapped to the projector pixel coordinate system, tracked, and automatically connected to form a closed boundary. A binary mask is generated from the closed boundary, the maximum input gray level inside the boundary is reduced, and the maximum input gray level map and the sinusoidal fringes are regenerated. These are projected onto the object again and acquired, until the acquired maximum input gray level map contains no overexposed region, whereupon the complete three-dimensional shape of the object is reconstructed from the corresponding sinusoidal fringes. The invention enables three-dimensional measurement of objects with a high dynamic range.

Description

High-dynamic three-dimensional measurement method based on self-adaptive distribution of intensity of overexposure connected domain
Technical Field
The invention relates to the technical field of optical measurement, in particular to a high-dynamic three-dimensional measurement method based on self-adaptive distribution of the intensity of an overexposed connected domain.
Background
With the development of digital technology, three-dimensional measurement technology has increasingly wide application requirements in various fields. Optical three-dimensional measurement methods are diverse and can be roughly classified into multi-view (binocular) stereo vision, time-of-flight methods, structured-light projection methods and the like; they can flexibly and efficiently perceive the shape of an object surface without physical contact. Among them, structured-light fringe projection is considered one of the most promising technologies at present. In this method, a pre-designed sinusoidal fringe pattern is projected onto the object by a computer-controlled projector, the corresponding deformed fringe pattern is captured by a camera and transmitted to the computer, phase information is extracted from the captured fringe pattern, and real three-dimensional measurement of the object is finally achieved by combining the parameters obtained by pre-calibrating the system.
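As a concrete illustration of the phase extraction step just described, the following is a minimal sketch of four-step phase-shifting phase recovery. The synthetic fringe values and the helper name `wrapped_phase` are assumptions for illustration, not part of the patent:

```python
import numpy as np

def wrapped_phase(I0, I1, I2, I3):
    """Recover the wrapped phase from four fringe images shifted by pi/2.

    For I_n = A + B*cos(phi - pi*n/2):
        I1 - I3 = 2B*sin(phi),  I0 - I2 = 2B*cos(phi)
    so phi = atan2(I1 - I3, I0 - I2).
    """
    return np.arctan2(I1 - I3, I0 - I2)

# Synthetic 1-D check: build four shifted fringes and recover the phase.
phi_true = np.linspace(0, np.pi / 2, 100)  # stays inside (-pi, pi], so no unwrapping needed
I = [128 + 100 * np.cos(phi_true - np.pi * n / 2) for n in range(4)]
phi_est = wrapped_phase(*I)
```

In a real system the four `I` images would be the camera captures of the projected fringes; unwrapping the phase into the continuous phase map is a separate step.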
Fringe projection techniques have been applied in many fields, but overexposure easily occurs when measuring highly reflective areas of an object. Although image saturation can be avoided by changing the aperture or the exposure time, this reduces the intensity modulation of the low-reflectivity areas and degrades the measurement accuracy. Among existing high-dynamic three-dimensional measurement methods, the multiple-exposure technique acquires deformed fringe images of the same scene several times with different exposure times and fuses them into a composite fringe image with a high dynamic range for three-dimensional reconstruction. Three-dimensional measurement of highly reflective areas nevertheless remains a challenging problem.
Disclosure of Invention
The invention aims to provide a high-dynamic three-dimensional measurement method based on self-adaptive distribution of overexposure connected domain intensity, which can realize three-dimensional measurement of an object with a high dynamic range under the condition of not changing camera aperture and exposure time.
In order to achieve the above object, the present invention provides the following technical solutions: a high dynamic three-dimensional measurement method based on the self-adaptive distribution of the intensity of overexposure connected domain comprises the following specific steps:
step 1: generating a maximum input gray level image and a four-step phase-shift sinusoidal fringe image by using a computer, projecting the maximum input gray level image and the four-step phase-shift sinusoidal fringe image onto the surface of a measured object by using a monocular fringe projection system, and synchronously collecting the sinusoidal fringe image and the maximum input gray level image;
step 2: identifying a connected overexposed region from the acquired maximum input gray level map, extracting the boundary of the overexposed region under the camera pixel coordinate system, and tracking it;
step 3: carrying out preliminary three-dimensional reconstruction on the object by combining parameters obtained by calibrating the system and continuous phases obtained by the acquired sine stripes;
step 4: mapping the closed boundary coordinates under the camera pixel coordinate system to the projector pixel coordinate system, tracking the coordinates of the closed boundary under the projector pixel coordinate system, and automatically connecting to form the closed boundary;
step 5: generating a binary mask by the closed boundary, reducing the maximum input gray level in the closed boundary, and regenerating the maximum input gray level map and the sine stripes with the adjusted intensity;
step 6: projecting the regenerated maximum input gray level map and sinusoidal fringes onto the object again and acquiring them, and repeating steps 2-5 until the acquired maximum input gray level map contains no overexposed region, then reconstructing the complete three-dimensional shape of the object using the continuous phase obtained from the sinusoidal fringes.
Preferably, the computer-generated four-step phase-shifted sinusoidal grating fringes of step 1, denoted I_n^p(u^p, v^p), have the optical expression:

I_n^p(u^p, v^p) = M_migl(u^p, v^p) · [a^p + b^p · cos(2π f u^p / U − πn/2)]

where n = 0, 1, 2, 3 is the phase shift index, (u^p, v^p) are projector pixel coordinates, M_migl is the maximum input gray level map, a^p represents the average intensity, b^p represents the amplitude, and f is the frequency of the projected sinusoidal fringes.
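A minimal sketch of how a maximum input gray level map and the four phase-shifted patterns of this expression could be generated. The resolution 912×570, the period T = 16, and the function name are assumed illustrative values; a^p = b^p = 0.5 and the 8-bit depth follow the embodiment described later:

```python
import numpy as np

def make_patterns(U=912, V=570, T=16, M_migl=None):
    """Generate M_migl and four fringes I_n = M_migl*(a + b*cos(2*pi*f*u/U - pi*n/2))."""
    if M_migl is None:
        M_migl = np.full((V, U), 255.0)   # initially the full 8-bit input gray level
    a = b = 0.5                            # average intensity and amplitude
    f = U / T                              # fringe frequency: periods across the width
    u = np.arange(U)[None, :]              # projector u coordinate, broadcast over rows
    fringes = [M_migl * (a + b * np.cos(2 * np.pi * f * u / U - np.pi * n / 2))
               for n in range(4)]
    return M_migl, [np.clip(I, 0, 255).astype(np.uint8) for I in fringes]

M, fr = make_patterns()
```

With M_migl at 255 everywhere, each fringe spans the full 0..255 range; lowering M_migl locally (the core of this method) scales the projected intensity down only there.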
Preferably, the maximum input gray level map acquired in step 2 is denoted I^c(u^c, v^c), where (u^c, v^c) are camera pixel coordinates. According to a given threshold thr, a pixel whose value reaches the threshold is overexposed, and the overexposed region is marked by the binary image B(u^c, v^c):

B(u^c, v^c) = 1 if I^c(u^c, v^c) ≥ thr, and 0 otherwise

Connected-domain analysis is performed on B(u^c, v^c): each connected domain in the image is found and labeled, smaller connected domains are removed, and the boundary of the remaining saturated pixel region is extracted with a Canny operator, giving the boundary mask E(u^c, v^c).
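The thresholding, connected-domain filtering, and boundary extraction of this step can be sketched as below. For self-containment the Canny operator is replaced here by a simple erosion-based boundary, and the threshold, minimum area, and test image are assumed values:

```python
import numpy as np
from scipy import ndimage

def overexposed_boundary(img, thr=250, min_area=20):
    """Mark saturated pixels, keep large connected domains, return their boundary mask."""
    B = img >= thr                                  # binary overexposure map
    labels, n = ndimage.label(B)                    # connected-domain analysis
    areas = ndimage.sum(B, labels, index=range(1, n + 1))
    keep = np.isin(labels, [i + 1 for i, a in enumerate(areas) if a >= min_area])
    # boundary = kept region minus its erosion (a stand-in for the Canny operator)
    return keep & ~ndimage.binary_erosion(keep)

img = np.zeros((60, 60), np.uint8)
img[10:30, 10:30] = 255          # a large saturated blob
img[45:47, 45:47] = 255          # a small blob that should be discarded
edge = overexposed_boundary(img)
```

The small blob is removed by the area filter, so only the outline of the large saturated region survives as the boundary mask.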
Preferably, the continuous phase map obtained in step 3 is Φ(u^c, v^c). According to the phase distribution rule, the horizontal position under the projector pixel coordinate system is calculated as:

u^p = Φ(u^c, v^c) · T / (2π)

where U is the resolution in the u direction of projection space and T is the period of the projection-space fringes (equivalently u^p = Φ · U / (2π f) with f = U/T).

Combining this horizontal position with the conversion from camera-frame coordinates (X^c, Y^c, Z^c) to projector pixel coordinates (u^p, v^p) and the conversion from (X^c, Y^c, Z^c) to camera pixel coordinates (u^c, v^c), the three-dimensional coordinates (X^c, Y^c, Z^c) of the object are preliminarily calculated.
Preferably, the parameters obtained by system calibration in step 3 include the camera intrinsic matrix A^c with extrinsic parameters R^c and t^c, and the projector intrinsic matrix A^p with extrinsic parameters R^p and t^p.

The conversion from camera-frame coordinates (X^c, Y^c, Z^c) to projector pixel coordinates (u^p, v^p) is:

s^p · [u^p, v^p, 1]^T = A^p · (R^p · [X^c, Y^c, Z^c]^T + t^p)

where s^p is a scale factor. In this process, the 3×4 projector projection matrix M^p = A^p · [R^p | t^p] with entries m^p_jk is introduced, and the simplified conversion is:

u^p = (m^p_11 X^c + m^p_12 Y^c + m^p_13 Z^c + m^p_14) / (m^p_31 X^c + m^p_32 Y^c + m^p_33 Z^c + m^p_34)
v^p = (m^p_21 X^c + m^p_22 Y^c + m^p_23 Z^c + m^p_24) / (m^p_31 X^c + m^p_32 Y^c + m^p_33 Z^c + m^p_34)

Rearranging gives:

(m^p_11 − u^p m^p_31) X^c + (m^p_12 − u^p m^p_32) Y^c + (m^p_13 − u^p m^p_33) Z^c = u^p m^p_34 − m^p_14

The conversion from camera-frame coordinates (X^c, Y^c, Z^c) to camera pixel coordinates (u^c, v^c) is, analogously with M^c = A^c · [R^c | t^c] and entries m^c_jk:

s^c · [u^c, v^c, 1]^T = A^c · (R^c · [X^c, Y^c, Z^c]^T + t^c)

which yields:

(m^c_11 − u^c m^c_31) X^c + (m^c_12 − u^c m^c_32) Y^c + (m^c_13 − u^c m^c_33) Z^c = u^c m^c_34 − m^c_14
(m^c_21 − v^c m^c_31) X^c + (m^c_22 − v^c m^c_32) Y^c + (m^c_23 − v^c m^c_33) Z^c = v^c m^c_34 − m^c_24
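Equations of this kind (two camera-ray constraints plus the projector-column constraint) form three linear equations in (X^c, Y^c, Z^c). The following sketch solves that system; the projection matrices and the test point are made-up values used only for a round-trip check, not calibration results from the patent:

```python
import numpy as np

def triangulate(Mc, Mp, uc, vc, up):
    """Solve for (X, Y, Z) from camera pixel (uc, vc) and projector column up.

    Each observation q = (m_r . h) / (m_3 . h), with h = (X, Y, Z, 1), rearranges
    into the linear equation (m_r - q*m_3) . (X, Y, Z) = q*m_3[3] - m_r[3].
    """
    rows, rhs = [], []
    for M, q, r in ((Mc, uc, 0), (Mc, vc, 1), (Mp, up, 0)):
        rows.append(M[r, :3] - q * M[2, :3])
        rhs.append(q * M[2, 3] - M[r, 3])
    return np.linalg.solve(np.array(rows), np.array(rhs))

# Round-trip check with synthetic 3x4 projection matrices (illustrative values).
Xw = np.array([0.1, -0.2, 5.0])
Mc = np.hstack([np.diag([800.0, 800.0, 1.0]), np.array([[320.0], [240.0], [1.0]])])
Mp = np.hstack([np.diag([700.0, 700.0, 1.0]), np.array([[140.0], [0.0], [0.05]])])
h = np.append(Xw, 1.0)
uc, vc = (Mc @ h)[:2] / (Mc @ h)[2]
up = (Mp @ h)[0] / (Mp @ h)[2]
X = triangulate(Mc, Mp, uc, vc, up)   # should recover Xw
```

Only the projector column u^p is needed because it comes from the (vertical-fringe) phase; the third equation completes the 3×3 system.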
Preferably, the coordinates (u^c, v^c) of any point on the closed boundary under the camera pixel coordinate system are mapped to the projector pixel coordinate u^p through the continuous phase, and, using the preliminarily calculated three-dimensional coordinates (X^c, Y^c, Z^c), the mapping to the projector pixel coordinate v^p is obtained as:

v^p = (m^p_21 X^c + m^p_22 Y^c + m^p_23 Z^c + m^p_24) / (m^p_31 X^c + m^p_32 Y^c + m^p_33 Z^c + m^p_34)
Preferably, in step 5 a binary mask Mask(u^p, v^p) is generated by taking 1 inside the closed boundary under the projector pixel coordinate system and 0 outside, and the maximum input gray level map is regenerated as:

M^{i+1}_migl(u^p, v^p) = M^i_migl(u^p, v^p) − r · Mask(u^p, v^p)

where i is the iteration index and r is the gray level reduction applied to the overexposed region.
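The mask-based reduction of the maximum input gray level can be sketched as follows. The map size and the rectangular mask are illustrative assumptions, r = 60 comes from the embodiment below, and the clamp at 0 is an added safeguard not stated in the patent:

```python
import numpy as np

def update_migl(M_prev, mask, r=60):
    """M_{i+1} = M_i - r * Mask, clamped so it stays a valid 8-bit input level."""
    return np.clip(M_prev.astype(int) - r * mask.astype(int), 0, 255).astype(np.uint8)

M0 = np.full((570, 912), 255, np.uint8)      # initial maximum input gray level map
mask = np.zeros_like(M0)
mask[100:200, 300:400] = 1                   # stand-in for the closed-boundary mask
M1 = update_migl(M0, mask)                   # 255 -> 195 inside the mask
M2 = update_migl(M1, mask)                   # 195 -> 135 after a second iteration
```

Each pass darkens only the masked (overexposed) region by r gray levels and leaves the rest of the projection at full intensity.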
Compared with the prior art, the invention has the remarkable advantages that:
the invention recognizes the over-exposed area through the acquired maximum input gray level image, locally reduces the maximum input gray level for multiple times, so that the finally acquired maximum input gray level image is subjected to three-dimensional reconstruction after the over-exposed area is not left, and the highlight problem in the prior art is well solved. The images are collected for fusion without blindly adjusting the exposure time for multiple times, so that the number of collected pictures is reduced, and the operation flow is simplified.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required by the embodiments are briefly described below. The drawings are merely examples, and other drawings may be derived from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of a high dynamic three-dimensional measurement method based on the self-adaptive distribution of the projection intensity of the overexposed connected domain.
FIG. 2 is an image model of a monocular fringe projection system in accordance with an embodiment of the present invention.
FIG. 3 is a circular dot planar panel used in an embodiment of the present invention.
Fig. 4 shows a comparison of the projection intensity before and after the adaptation, in which (a) is a high-frequency fringe pattern of the initial projection and (b) is a high-frequency fringe pattern after 3 iterations.
Fig. 5 is a comparison of the reconstruction effects before and after the projection intensity adaptation in the embodiment of the present invention, where (a) is an initial reconstruction result and (b) is a reconstruction result after 3 iterations.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application. Other embodiments, which may be made by those of ordinary skill in the art without the benefit of the teachings of this invention, are within the scope of the invention.
A high dynamic three-dimensional measurement method based on overexposure connected domain intensity self-adaptive distribution is shown in fig. 1, and comprises the following steps:
step 1: a monocular imaging system is built, and an imaging model of the monocular imaging system is shown in fig. 2.
In some embodiments, a white circular-dot planar panel with a large reflectance variation is used as the object to be measured, as shown in fig. 3. A sinusoidal grating fringe pattern and the maximum input gray level map are generated by computer and projected onto the surface of the measured object by the monocular fringe projection system, and the corresponding images are acquired synchronously. The computer-generated four-step phase-shifted sinusoidal grating fringes I_n^p(u^p, v^p) have the optical expression:

I_n^p(u^p, v^p) = M_migl(u^p, v^p) · [a^p + b^p · cos(2π f u^p / U − πn/2)]   (1)

The image pixel depth is 8 bits. Here n = 0, 1, 2, 3 is the phase shift index, (u^p, v^p) are projector pixel coordinates, M_migl is the maximum input gray level map, a^p represents the average intensity and b^p represents the amplitude, with a^p = b^p = 0.5. f is the frequency of the projected sinusoidal fringes, expressed as f = U/T, where T represents the period of the projection-space fringes and U represents the resolution (in pixels) in the u direction of projection space. The initial high-frequency fringe pattern is shown in fig. 4(a).
Step 2: The connected overexposed region is identified from the acquired maximum input gray level map according to the threshold, and the boundary of the overexposed region under the camera pixel coordinate system is extracted and tracked.
The acquired maximum input gray level map is denoted I^c(u^c, v^c), where (u^c, v^c) are camera pixel coordinates. According to the threshold thr = 250, a pixel whose value is equal to or greater than the threshold belongs to the overexposed region, which is marked by the binary image B(u^c, v^c):

B(u^c, v^c) = 1 if I^c(u^c, v^c) ≥ thr, and 0 otherwise   (2)

Connected-domain analysis is performed on B(u^c, v^c): each connected domain in the image is found and labeled, and smaller connected domains are removed. The boundary of the remaining saturated pixel region is extracted with a Canny operator, giving the boundary mask E(u^c, v^c).
Step 3: Three-dimensional reconstruction of the object is performed by combining the parameters obtained by pre-calibrating the system with the sinusoidal grating fringes acquired in step 1. The calibration parameters include the camera intrinsic matrix A^c with extrinsic parameters R^c and t^c, and the projector intrinsic matrix A^p with extrinsic parameters R^p and t^p. The conversion from camera-frame coordinates (X^c, Y^c, Z^c) to projector pixel coordinates (u^p, v^p) is:

s^p · [u^p, v^p, 1]^T = A^p · (R^p · [X^c, Y^c, Z^c]^T + t^p)   (3)

where s^p is a scale factor. In this process, the 3×4 projector projection matrix M^p = A^p · [R^p | t^p] with entries m^p_jk is introduced, and equation (3) simplifies to:

u^p = (m^p_11 X^c + m^p_12 Y^c + m^p_13 Z^c + m^p_14) / (m^p_31 X^c + m^p_32 Y^c + m^p_33 Z^c + m^p_34)
v^p = (m^p_21 X^c + m^p_22 Y^c + m^p_23 Z^c + m^p_24) / (m^p_31 X^c + m^p_32 Y^c + m^p_33 Z^c + m^p_34)   (4)

Further rearranging equation (4), it can be obtained that:

(m^p_11 − u^p m^p_31) X^c + (m^p_12 − u^p m^p_32) Y^c + (m^p_13 − u^p m^p_33) Z^c = u^p m^p_34 − m^p_14   (5)

The conversion from camera-frame coordinates (X^c, Y^c, Z^c) to camera pixel coordinates (u^c, v^c) is:

s^c · [u^c, v^c, 1]^T = A^c · (R^c · [X^c, Y^c, Z^c]^T + t^c)   (6)

from which, with M^c = A^c · [R^c | t^c] and entries m^c_jk:

(m^c_11 − u^c m^c_31) X^c + (m^c_12 − u^c m^c_32) Y^c + (m^c_13 − u^c m^c_33) Z^c = u^c m^c_34 − m^c_14
(m^c_21 − v^c m^c_31) X^c + (m^c_22 − v^c m^c_32) Y^c + (m^c_23 − v^c m^c_33) Z^c = v^c m^c_34 − m^c_24   (7)

The continuous phase map obtained is denoted Φ(u^c, v^c); according to the phase distribution rule, the horizontal position under the projector pixel coordinate system is calculated as:

u^p = Φ(u^c, v^c) · T / (2π)   (8)

Combining equations (5), (7) and (8), the three-dimensional coordinates (X^c, Y^c, Z^c) of the object are preliminarily calculated. The initial three-dimensional shape of the white circular-dot planar panel thus obtained is shown in fig. 5(a).
Step 4: mapping the closed boundary coordinates under the camera pixel coordinate system to the projector pixel coordinate system, tracking the coordinates of the closed boundary under the projector pixel coordinate system, and automatically connecting to form the closed boundary;
coordinates (u) of a point of the closed boundary in the camera pixel coordinate system obtained in step 2 c ,v c ) Mapping to pixel coordinates u under projector p Equation (8) is employed, while since the three-dimensional (X c ,Y c ,Z c ) According to equation (4),to obtain its mapping to the pixel coordinate v under the projector p In order to achieve this, the first and second,
Figure BDA0004109269420000075
so that the boundary in step 2 can be made
Figure BDA0004109269420000076
Mapping to projector pixel coordinate system one by one, and automatically connecting to obtain closed boundary +.>
Figure BDA0004109269420000077
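The two-part mapping of a boundary point (u^p from the continuous phase, v^p from the projector projection of the reconstructed 3-D point) can be sketched as below. The projector matrix, the period T, and the test point are assumed illustrative values:

```python
import numpy as np

def map_boundary_point(phase, Xc, Mp, T=16):
    """Map a boundary point to projector pixel coordinates.

    u_p follows from the continuous phase (u_p = Phi * T / (2*pi));
    v_p follows from the projector projection of the reconstructed 3-D point.
    """
    up = phase * T / (2 * np.pi)
    h = np.append(Xc, 1.0)                 # homogeneous camera-frame coordinates
    vp = (Mp[1] @ h) / (Mp[2] @ h)         # second row over third row of M^p
    return up, vp

# Illustrative 3x4 projector matrix and reconstructed point (assumed values).
Mp = np.hstack([np.diag([700.0, 700.0, 1.0]), np.array([[140.0], [80.0], [0.05]])])
up, vp = map_boundary_point(phase=2 * np.pi * 10.5,
                            Xc=np.array([0.1, 0.2, 5.0]), Mp=Mp)
```

A phase of 10.5 full fringe periods lands at projector column u^p = 10.5 · T; v^p then fixes the row, so the boundary can be redrawn in the projector's own pixel grid.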
Step 5: A binary mask Mask(u^p, v^p) is generated from the closed boundary E^p(u^p, v^p) by taking 1 inside the closed boundary under the projector pixel coordinate system and 0 outside. The regenerated maximum input gray level map is represented as:

M^{i+1}_migl(u^p, v^p) = M^i_migl(u^p, v^p) − r · Mask(u^p, v^p)   (10)

where i represents the number of iterations and r represents the gray level reduction applied to the overexposed region; here r = 60. The maximum input gray level inside the closed boundary is thereby reduced, and the maximum input gray level map and the sinusoidal grating fringes are regenerated.
Step 6: The regenerated maximum input gray level map and sinusoidal grating fringes are projected onto the object again and acquired, and steps 2 to 5 are repeated; after 3 iterations the regenerated high-frequency fringes are as shown in fig. 4(b). Once the acquired maximum input gray level map contains no overexposed region, the continuous phase obtained from the final sinusoidal grating fringes is used with equations (5), (7) and (8) to calculate the complete three-dimensional shape of the measured object, as shown in fig. 5(b).
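The overall iteration of steps 2 to 6 can be summarized with a toy simulation. The camera model here (capture = clip(reflectance × M_migl, 0, 255)) is a deliberate simplification standing in for projection and acquisition, not the patent's imaging model, and the reflectance values are invented:

```python
import numpy as np

def measure_until_unsaturated(reflectance, thr=250, r=60, max_iter=10):
    """Toy loop for steps 2-6: lower the projected input gray level inside the
    overexposed region until the captured map has no saturated pixels."""
    M = np.full(reflectance.shape, 255.0)      # initial maximum input gray level map
    for i in range(max_iter):
        captured = np.clip(reflectance * M, 0, 255)   # mocked camera capture
        mask = captured >= thr                 # overexposed region (step 2)
        if not mask.any():
            return M, i                        # ready for the final reconstruction
        M = np.maximum(M - r * mask, 0)        # step 5: reduce gray level locally
    return M, max_iter

# A highly reflective patch inside a matte surface (illustrative values).
refl = np.full((40, 40), 0.6)
refl[10:20, 10:20] = 1.8
M_final, iters = measure_until_unsaturated(refl)
```

In this toy setup the highlight needs two reductions (255 to 195 to 135) before the capture falls below the threshold, while the matte background keeps the full projection intensity, mirroring fig. 4's before/after fringes.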

Claims (7)

1. A high-dynamic three-dimensional measurement method based on self-adaptive distribution of overexposure connected domain intensity is characterized by comprising the following specific steps:
step 1: generating a maximum input gray level image and a four-step phase-shift sinusoidal fringe image by using a computer, projecting the maximum input gray level image and the four-step phase-shift sinusoidal fringe image onto the surface of a measured object by using a monocular fringe projection system, and synchronously collecting the sinusoidal fringe image and the maximum input gray level image;
step 2: identifying a connected overexposed region from the acquired maximum input gray level map, extracting the boundary of the overexposed region under the camera pixel coordinate system, and tracking it;
step 3: carrying out preliminary three-dimensional reconstruction on the object by combining parameters obtained by calibrating the system and continuous phases obtained by the acquired sine stripes;
step 4: mapping the closed boundary coordinates under the camera pixel coordinate system to the projector pixel coordinate system, tracking the coordinates of the closed boundary under the projector pixel coordinate system, and automatically connecting to form the closed boundary;
step 5: generating a binary mask by the closed boundary, reducing the maximum input gray level in the closed boundary, and regenerating the maximum input gray level map and the sine stripes with the adjusted intensity;
step 6: projecting the regenerated maximum input gray level map and sinusoidal fringes onto the object again and acquiring them, and repeating steps 2-5 until the acquired maximum input gray level map contains no overexposed region, then reconstructing the complete three-dimensional shape of the object using the continuous phase obtained from the sinusoidal fringes.
2. The high-dynamic three-dimensional measurement method based on self-adaptive distribution of overexposed connected domain intensity according to claim 1, wherein the computer-generated four-step phase-shifted sinusoidal grating fringes of step 1, denoted I_n^p(u^p, v^p), have the optical expression:

I_n^p(u^p, v^p) = M_migl(u^p, v^p) · [a^p + b^p · cos(2π f u^p / U − πn/2)]

where n = 0, 1, 2, 3 is the phase shift index, (u^p, v^p) are projector pixel coordinates, M_migl is the maximum input gray level map, a^p represents the average intensity, b^p represents the amplitude, and f is the frequency of the projected sinusoidal fringes.
3. The high-dynamic three-dimensional measurement method based on self-adaptive distribution of overexposed connected domain intensity according to claim 1, wherein the maximum input gray level map acquired in step 2 is denoted I^c(u^c, v^c), where (u^c, v^c) are camera pixel coordinates; according to a given threshold thr, a pixel whose value reaches the threshold is overexposed, and the overexposed region is marked by the binary image B(u^c, v^c):

B(u^c, v^c) = 1 if I^c(u^c, v^c) ≥ thr, and 0 otherwise

connected-domain analysis is performed on B(u^c, v^c): each connected domain in the image is found and labeled, smaller connected domains are removed, and the boundary of the remaining saturated pixel region is extracted with a Canny operator, giving the boundary mask E(u^c, v^c).
4. The high-dynamic three-dimensional measurement method based on self-adaptive distribution of overexposed connected domain intensity according to claim 1, wherein the continuous phase map obtained in step 3 is Φ(u^c, v^c), and according to the phase distribution rule the horizontal position under the projector pixel coordinate system is calculated as:

u^p = Φ(u^c, v^c) · T / (2π)

where U is the resolution in the u direction of projection space and T is the period of the projection-space fringes; combining this horizontal position with the conversion from camera-frame coordinates (X^c, Y^c, Z^c) to projector pixel coordinates (u^p, v^p) and the conversion from (X^c, Y^c, Z^c) to camera pixel coordinates (u^c, v^c), the three-dimensional coordinates (X^c, Y^c, Z^c) of the object are preliminarily calculated.
5. The high-dynamic three-dimensional measurement method based on self-adaptive distribution of overexposed connected domain intensity according to claim 1, wherein the parameters obtained by system calibration in step 3 include the camera intrinsic matrix A^c with extrinsic parameters R^c and t^c, and the projector intrinsic matrix A^p with extrinsic parameters R^p and t^p.

The conversion from camera-frame coordinates (X^c, Y^c, Z^c) to projector pixel coordinates (u^p, v^p) is:

s^p · [u^p, v^p, 1]^T = A^p · (R^p · [X^c, Y^c, Z^c]^T + t^p)

where s^p is a scale factor; in this process, the 3×4 projector projection matrix M^p = A^p · [R^p | t^p] with entries m^p_jk is introduced, and the simplified conversion is:

u^p = (m^p_11 X^c + m^p_12 Y^c + m^p_13 Z^c + m^p_14) / (m^p_31 X^c + m^p_32 Y^c + m^p_33 Z^c + m^p_34)
v^p = (m^p_21 X^c + m^p_22 Y^c + m^p_23 Z^c + m^p_24) / (m^p_31 X^c + m^p_32 Y^c + m^p_33 Z^c + m^p_34)

rearranging gives:

(m^p_11 − u^p m^p_31) X^c + (m^p_12 − u^p m^p_32) Y^c + (m^p_13 − u^p m^p_33) Z^c = u^p m^p_34 − m^p_14

The conversion from camera-frame coordinates (X^c, Y^c, Z^c) to camera pixel coordinates (u^c, v^c) is, analogously with M^c = A^c · [R^c | t^c] and entries m^c_jk:

s^c · [u^c, v^c, 1]^T = A^c · (R^c · [X^c, Y^c, Z^c]^T + t^c)

which yields:

(m^c_11 − u^c m^c_31) X^c + (m^c_12 − u^c m^c_32) Y^c + (m^c_13 − u^c m^c_33) Z^c = u^c m^c_34 − m^c_14
(m^c_21 − v^c m^c_31) X^c + (m^c_22 − v^c m^c_32) Y^c + (m^c_23 − v^c m^c_33) Z^c = v^c m^c_34 − m^c_24
6. The high-dynamic three-dimensional measurement method based on self-adaptive distribution of overexposed connected domain intensity according to claim 5, wherein the coordinates (u^c, v^c) of any point on the closed boundary under the camera pixel coordinate system are mapped to the projector pixel coordinate u^p through the continuous phase, and, using the preliminarily calculated three-dimensional coordinates (X^c, Y^c, Z^c), the mapping to the projector pixel coordinate v^p is obtained as:

v^p = (m^p_21 X^c + m^p_22 Y^c + m^p_23 Z^c + m^p_24) / (m^p_31 X^c + m^p_32 Y^c + m^p_33 Z^c + m^p_34)
7. The high-dynamic three-dimensional measurement method based on self-adaptive distribution of overexposed connected domain intensity according to claim 1, wherein in step 5 a binary mask Mask(u^p, v^p) is generated by taking 1 inside the closed boundary under the projector pixel coordinate system and 0 outside, and the maximum input gray level map is regenerated as:

M^{i+1}_migl(u^p, v^p) = M^i_migl(u^p, v^p) − r · Mask(u^p, v^p)

where i represents the number of iterations and r represents the gray level reduction applied to the overexposed region.
CN202310201805.1A 2022-12-12 2023-03-06 High-dynamic three-dimensional measurement method based on self-adaptive distribution of intensity of overexposure connected domain Pending CN116310101A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211591319 2022-12-12
CN2022115913197 2022-12-12

Publications (1)

Publication Number Publication Date
CN116310101A true CN116310101A (en) 2023-06-23

Family

ID=86795474

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310201805.1A Pending CN116310101A (en) 2022-12-12 2023-03-06 High-dynamic three-dimensional measurement method based on self-adaptive distribution of intensity of overexposure connected domain

Country Status (1)

Country Link
CN (1) CN116310101A (en)

Similar Documents

Publication Publication Date Title
CN110288642B (en) Three-dimensional object rapid reconstruction method based on camera array
CN112106105B (en) Method and system for generating three-dimensional image of object
CN106705855B (en) A kind of high dynamic performance method for three-dimensional measurement based on adaptive optical grating projection
CN107607060A (en) A kind of phase error compensation method in the measurement applied to grating tripleplane
CN107894215B (en) High dynamic range grating projection three-dimensional measurement method based on full-automatic exposure
US20120176478A1 (en) Forming range maps using periodic illumination patterns
US20120176380A1 (en) Forming 3d models using periodic illumination patterns
CN111288925B (en) Three-dimensional reconstruction method and device based on digital focusing structure illumination light field
JP5633058B1 (en) 3D measuring apparatus and 3D measuring method
CN104897083A (en) Three-dimensional rapid measurement method for raster projection based on defocusing phase-unwrapping of projector
CN111971525B (en) Method and system for measuring an object with a stereoscope
CN113358063A (en) Surface structured light three-dimensional measurement method and system based on phase weighted fusion
CN110692084A (en) Deriving topology information for a scene
JP5822463B2 (en) Three-dimensional measuring apparatus, three-dimensional measuring method, and program
JP2006058091A (en) Three-dimensional image measuring device and method
Sui et al. Accurate 3D Reconstruction of Dynamic Objects by Spatial-Temporal Multiplexing and Motion-Induced Error Elimination
KR20140021765A (en) A hologram generating method using virtual view-point depth image synthesis
CN113624159A (en) Micro laser three-dimensional model reconstruction system and method
CN110926369B (en) High-precision structured light three-dimensional measurement system and method
CN116310101A (en) High-dynamic three-dimensional measurement method based on self-adaptive distribution of intensity of overexposure connected domain
JP2538435B2 (en) Fringe phase distribution analysis method and fringe phase distribution analyzer
JP2006023133A (en) Instrument and method for measuring three-dimensional shape
Zhang et al. Accurate measurement of high-reflective surface based on adaptive fringe projection technique
KR101186103B1 (en) Method and apparatus for measuring 3d height information of measured object
CN112325799A (en) High-precision three-dimensional face measurement method based on near-infrared light projection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination