CN116310101A - High-dynamic three-dimensional measurement method based on self-adaptive distribution of intensity of overexposure connected domain - Google Patents
High-dynamic three-dimensional measurement method based on self-adaptive distribution of intensity of overexposure connected domain
- Publication number
- CN116310101A (application CN202310201805.1A / CN202310201805A)
- Authority
- CN
- China
- Prior art keywords
- maximum input
- gray level
- input gray
- coordinate system
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2513—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/30—Assessment of water resources
Abstract
The invention discloses a high-dynamic three-dimensional measurement method based on self-adaptive distribution of projection intensity over overexposed connected domains. A maximum input gray-level map and sinusoidal fringe patterns are generated, projected onto the surface of the object to be measured, and synchronously captured. Overexposed regions are identified in the captured maximum input gray-level map according to a threshold, and their boundaries are extracted and tracked. A preliminary three-dimensional reconstruction of the object is performed by combining the pre-calibrated system parameters with the captured sinusoidal fringes. The boundary coordinates are mapped to the projector pixel coordinate system, tracked, and automatically connected into a closed boundary. A binary mask is generated from the closed boundary, the maximum input gray level inside the boundary is reduced, and the maximum input gray-level map and sinusoidal fringes are regenerated. These are projected onto the object and captured again until the captured maximum input gray-level map contains no overexposed region, after which the complete three-dimensional shape of the object is reconstructed from the corresponding sinusoidal fringes. The invention realizes three-dimensional measurement of objects with a high dynamic range.
Description
Technical Field
The invention relates to the technical field of optical measurement, in particular to a high-dynamic three-dimensional measurement method based on self-adaptive distribution of the intensity of an overexposed connected domain.
Background
With the development of digital technology, three-dimensional measurement technology has increasingly wide application requirements in various fields. Optical three-dimensional measurement methods are diverse and can be roughly classified into multi-view (binocular) stereo vision, time-of-flight methods, structured-light projection methods and the like; they can flexibly and efficiently perceive the shape of an object surface without physical contact. Among them, structured-light fringe projection is considered one of the most promising technologies at present. In this method, a pre-designed sinusoidal fringe pattern is projected onto the object by a computer-controlled projector, the corresponding deformed fringes are captured by a camera and transmitted to the computer, phase information is extracted from the captured fringe patterns, and finally, combined with the parameters of the pre-calibrated system, true three-dimensional measurement of the object is achieved.
Fringe projection techniques have found many applications, but they are prone to overexposure when measuring highly reflective areas of an object. Although image saturation can be avoided by changing the aperture or exposure time, doing so reduces the intensity modulation in low-reflectivity areas and degrades measurement accuracy. Among existing high-dynamic three-dimensional measurement methods, multiple-exposure techniques acquire deformed fringe images of the same scene repeatedly under different exposure times and fuse them into a composite high-dynamic-range fringe image for three-dimensional reconstruction. Three-dimensional measurement of highly reflective areas remains a challenging problem.
Disclosure of Invention
The invention aims to provide a high-dynamic three-dimensional measurement method based on self-adaptive distribution of overexposure connected domain intensity, which can realize three-dimensional measurement of an object with a high dynamic range under the condition of not changing camera aperture and exposure time.
In order to achieve the above object, the present invention provides the following technical solutions: a high dynamic three-dimensional measurement method based on the self-adaptive distribution of the intensity of overexposure connected domain comprises the following specific steps:
step 1: generating a maximum input gray level image and a four-step phase-shift sinusoidal fringe image by using a computer, projecting the maximum input gray level image and the four-step phase-shift sinusoidal fringe image onto the surface of a measured object by using a monocular fringe projection system, and synchronously collecting the sinusoidal fringe image and the maximum input gray level image;
step 2: identifying connected overexposed regions from the acquired maximum input gray-level map, extracting the boundary of the overexposed regions under the camera pixel coordinate system, and tracking it;
step 3: carrying out preliminary three-dimensional reconstruction on the object by combining parameters obtained by calibrating the system and continuous phases obtained by the acquired sine stripes;
step 4: mapping the closed boundary coordinates under the camera pixel coordinate system to the projector pixel coordinate system, tracking the coordinates of the closed boundary under the projector pixel coordinate system, and automatically connecting to form the closed boundary;
step 5: generating a binary mask by the closed boundary, reducing the maximum input gray level in the closed boundary, and regenerating the maximum input gray level map and the sine stripes with the adjusted intensity;
step 6: projecting the regenerated maximum input gray-level map and sinusoidal fringes onto the object again and capturing them, and repeating steps 2-5 until the image captured under the projected maximum input gray-level map contains no overexposed region; the complete three-dimensional shape of the object is then reconstructed using the continuous phase obtained from the sinusoidal fringes.
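The adaptive loop of steps 1-6 can be sketched as a short Python routine. This is an illustrative sketch only: `project_and_capture` and `reconstruct` are hypothetical placeholders for the projector/camera hardware and the phase-based reconstruction pipeline, and the camera-to-projector mapping is taken as identity for simplicity.

```python
import numpy as np

def adaptive_measurement(project_and_capture, reconstruct,
                         width, height, thr=250, r=60, max_iter=10):
    """Iteratively dim the projected intensity inside overexposed regions
    until the captured maximum-input-gray image saturates nowhere.
    `project_and_capture(m_migl)` and `reconstruct(m_migl)` are
    hypothetical stand-ins for the hardware and phase pipeline."""
    m_migl = np.full((height, width), 255, dtype=np.int32)  # max input gray map
    for i in range(max_iter):
        captured = project_and_capture(m_migl)   # camera-side image
        overexposed = captured >= thr            # binary saturation mask
        if not overexposed.any():                # step 6 exit condition
            break
        # In the patent the camera-side mask is mapped to projector pixels
        # through the preliminary 3-D reconstruction; here the mapping is
        # assumed to be identity for illustration only.
        m_migl = np.clip(m_migl - r * overexposed.astype(np.int32), 0, 255)
    return reconstruct(m_migl)
```

With an identity "camera" the loop needs a single reduction step: starting from 255 everywhere, one subtraction of r = 60 brings every pixel below the threshold of 250.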
Preferably, the optical expression of the computer-generated four-step phase-shift sinusoidal grating fringes generated in step 1 is:

I_n^p(u_p, v_p) = M_migl(u_p, v_p) · [a_p + b_p · cos(2πf·u_p + nπ/2)]

where n = 0, 1, 2, 3 is the phase-shift index, (u_p, v_p) are the projector pixel coordinates, M_migl is the maximum input gray-level map, a_p is the average intensity, b_p is the amplitude, and f is the frequency of the projected sinusoidal fringes.
Preferably, the maximum input gray-level map acquired in step 2 is denoted I_c(u_c, v_c), where (u_c, v_c) are the camera pixel coordinates. Given a threshold thr, a pixel whose value reaches the threshold is overexposed, and the overexposed regions are marked by the binary image I_b:

I_b(u_c, v_c) = 1 if I_c(u_c, v_c) ≥ thr, otherwise 0.

Connected-domain analysis is performed on the binary image I_b: each connected domain in the image is found and labeled, and smaller connected domains are removed. The boundary of the remaining saturated pixel regions is extracted with a Canny operator, yielding the boundary mask of the overexposed regions.
Preferably, the continuous phase map obtained in step 3 is Φ(u_c, v_c). According to the law of phase distribution, the horizontal position of each point under the projector pixel coordinate system is calculated as:

u_p = T · Φ(u_c, v_c) / (2π)

where U is the resolution in the u direction of projection space and T is the period of the projection-space fringes.

This horizontal position in the projector pixel coordinate system, together with the conversion from camera-frame coordinates (X_c, Y_c, Z_c) to projector pixel coordinates (u_p, v_p) and the conversion from camera-frame coordinates (X_c, Y_c, Z_c) to camera pixel coordinates (u_c, v_c), is used to preliminarily calculate the three-dimensional shape (X_c, Y_c, Z_c) of the object.
Preferably, the parameters obtained by system calibration in step 3 include the camera intrinsic matrix A_c and extrinsic parameters R_c and t_c, and the projector intrinsic matrix A_p and extrinsic parameters R_p and t_p.

The conversion from camera-frame coordinates (X_c, Y_c, Z_c) to projector pixel coordinates (u_p, v_p) follows the pinhole model:

s_p · [u_p, v_p, 1]^T = A_p · (R_p · [X_c, Y_c, Z_c]^T + t_p)

where s_p is a scale factor. Writing M_p = A_p · [R_p | t_p] with entries m_jk, eliminating s_p simplifies the conversion to:

u_p = (m_11·X_c + m_12·Y_c + m_13·Z_c + m_14) / (m_31·X_c + m_32·Y_c + m_33·Z_c + m_34)
v_p = (m_21·X_c + m_22·Y_c + m_23·Z_c + m_24) / (m_31·X_c + m_32·Y_c + m_33·Z_c + m_34)

Rearranging the u_p relation yields one linear equation in (X_c, Y_c, Z_c):

(m_11 − u_p·m_31)·X_c + (m_12 − u_p·m_32)·Y_c + (m_13 − u_p·m_33)·Z_c = u_p·m_34 − m_14

The conversion from camera-frame coordinates (X_c, Y_c, Z_c) to camera pixel coordinates (u_c, v_c) is:

s_c · [u_c, v_c, 1]^T = A_c · [X_c, Y_c, Z_c]^T

from which, with s_c = Z_c,

u_c = f_x · X_c / Z_c + u_0,  v_c = f_y · Y_c / Z_c + v_0

where f_x, f_y, u_0 and v_0 are the focal lengths and principal-point coordinates in A_c (zero skew assumed).
Preferably, the coordinates (u_c, v_c) of any point on the boundary under the camera pixel coordinate system are mapped to the projector pixel coordinate u_p through the continuous phase, and the preliminarily calculated three-dimensional coordinates (X_c, Y_c, Z_c) are substituted into the projector conversion relation to obtain the corresponding projector pixel coordinate v_p.
Preferably, in step 5 a binary mask M_b is generated that takes the value 1 inside the closed boundary under the projector pixel coordinate system and 0 outside, and the maximum input gray-level map is regenerated as:

M_migl^(i+1)(u_p, v_p) = M_migl^(i)(u_p, v_p) − r · M_b(u_p, v_p)

where i is the iteration number and r is the gray level by which the overexposed region is reduced.
Compared with the prior art, the invention has the remarkable advantages that:
the invention recognizes the over-exposed area through the acquired maximum input gray level image, locally reduces the maximum input gray level for multiple times, so that the finally acquired maximum input gray level image is subjected to three-dimensional reconstruction after the over-exposed area is not left, and the highlight problem in the prior art is well solved. The images are collected for fusion without blindly adjusting the exposure time for multiple times, so that the number of collected pictures is reduced, and the operation flow is simplified.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the drawings required in the embodiments are briefly described below. The drawings are merely examples, and other drawings may be derived from them by a person of ordinary skill in the art without inventive effort.
FIG. 1 is a flow chart of a high dynamic three-dimensional measurement method based on the self-adaptive distribution of the projection intensity of the overexposed connected domain.
FIG. 2 is an image model of a monocular fringe projection system in accordance with an embodiment of the present invention.
FIG. 3 is a circular dot planar panel used in an embodiment of the present invention.
Fig. 4 shows a comparison of the projection intensity before and after the adaptation, in which (a) is a high-frequency fringe pattern of the initial projection and (b) is a high-frequency fringe pattern after 3 iterations.
Fig. 5 is a comparison of the reconstruction effects before and after the projection intensity adaptation in the embodiment of the present invention, where (a) is an initial reconstruction result and (b) is a reconstruction result after 3 iterations.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application. Other embodiments obtained by those of ordinary skill in the art on the basis of the present invention without creative effort fall within the scope of the present invention.
A high dynamic three-dimensional measurement method based on overexposure connected domain intensity self-adaptive distribution is shown in fig. 1, and comprises the following steps:
step 1: a monocular imaging system is built, and an imaging model of the monocular imaging system is shown in fig. 2.
In some embodiments, a white circular-dot planar panel with a large reflectance variation is used as the object to be measured, as shown in fig. 3. A sinusoidal grating fringe pattern and the maximum input gray-level map are generated by a computer, projected onto the surface of the measured object by the monocular fringe projection system, and the corresponding images are synchronously acquired. The optical expression of the computer-generated four-step phase-shift sinusoidal grating fringes is:

I_n^p(u_p, v_p) = M_migl(u_p, v_p) · [a_p + b_p · cos(2πf·u_p + nπ/2)]   (1)

The image pixel depth is 8 bits, where n = 0, 1, 2, 3 is the phase-shift index, (u_p, v_p) are the projector pixel coordinates, M_migl is the maximum input gray-level map, a_p is the average intensity, b_p is the amplitude, and a_p = b_p = 0.5. f is the frequency of the projected sinusoidal fringes, expressed as

f = 1/T   (2)

where T is the period of the projection-space fringes and U is the resolution in the u direction of projection space, in pixels. The initial high-frequency fringe pattern is shown in fig. 4 (a).
Step 2: and identifying the connected overexposed region from the acquired maximum input gray level diagram according to the threshold value, extracting the boundary of the overexposed region under the camera pixel coordinate system, and tracking.
The acquired maximum input gray-level map can be denoted I_c(u_c, v_c), where (u_c, v_c) are the camera pixel coordinates. According to the threshold thr = 250, a pixel whose value is greater than or equal to the threshold belongs to an overexposed region, and the overexposed regions are marked by the binary image I_b:

I_b(u_c, v_c) = 1 if I_c(u_c, v_c) ≥ thr, otherwise 0.

Connected-domain analysis is performed on the binary image I_b: each connected domain in the image is found and labeled, and smaller connected domains are removed. The boundary of the remaining saturated pixel regions is extracted with a Canny operator, yielding the boundary mask of the overexposed regions.
Step 3: three-dimensional reconstruction of the object is performed by combining the parameters obtained by calibrating the system in advance with the sinusoidal grating fringes acquired in step 1. The parameters obtained by system calibration include the camera intrinsic matrix A_c and extrinsic parameters R_c and t_c, and the projector intrinsic matrix A_p and extrinsic parameters R_p and t_p. The conversion from camera-frame coordinates (X_c, Y_c, Z_c) to projector pixel coordinates (u_p, v_p) follows the pinhole model:

s_p · [u_p, v_p, 1]^T = A_p · (R_p · [X_c, Y_c, Z_c]^T + t_p)   (3)

where s_p is a scale factor. Writing M_p = A_p · [R_p | t_p] with entries m_jk, eliminating s_p simplifies equation (3) to:

u_p = (m_11·X_c + m_12·Y_c + m_13·Z_c + m_14) / (m_31·X_c + m_32·Y_c + m_33·Z_c + m_34)
v_p = (m_21·X_c + m_22·Y_c + m_23·Z_c + m_24) / (m_31·X_c + m_32·Y_c + m_33·Z_c + m_34)   (4)

Rearranging the u_p relation in equation (4) yields a linear equation in (X_c, Y_c, Z_c):

(m_11 − u_p·m_31)·X_c + (m_12 − u_p·m_32)·Y_c + (m_13 − u_p·m_33)·Z_c = u_p·m_34 − m_14   (5)

The conversion from camera-frame coordinates (X_c, Y_c, Z_c) to camera pixel coordinates (u_c, v_c) is:

s_c · [u_c, v_c, 1]^T = A_c · [X_c, Y_c, Z_c]^T   (6)

from which, with s_c = Z_c,

u_c = f_x · X_c / Z_c + u_0,  v_c = f_y · Y_c / Z_c + v_0   (7)

where f_x, f_y, u_0 and v_0 are the focal lengths and principal-point coordinates in A_c (zero skew assumed).
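The two pinhole conversions can be illustrated numerically with NumPy. The intrinsic values below are invented for the example and are not calibration results from the patent:

```python
import numpy as np

def project_point(Xc, A, R=None, t=None):
    """Project a camera-frame 3-D point Xc to pixel coordinates with a
    pinhole model: s·[u, v, 1]^T = A·(R·Xc + t). For the camera itself,
    R and t default to identity/zero since Xc is already in its frame."""
    R = np.eye(3) if R is None else R
    t = np.zeros(3) if t is None else t
    p = A @ (R @ Xc + t)
    return p[:2] / p[2]            # divide out the scale factor s

# Illustrative camera intrinsics (focal lengths / principal point invented)
A_c = np.array([[1000.0, 0, 320], [0, 1000.0, 240], [0, 0, 1]])
uv = project_point(np.array([0.1, 0.0, 1.0]), A_c)  # -> pixel (420, 240)
```

The same function projects into the projector by passing its intrinsic matrix together with the camera-to-projector rotation and translation.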
the continuous phase map obtained can be recorded as Φ (u c ,v c ) According to the law of phase distribution, the horizontal position of the projector under the pixel coordinate system can be calculated as,
preliminary calculation of three-dimensional (X) of object by combining equations (5), (7) and (8) c ,Y c ,Z c ). The initial three-dimensional of the white dot planar panel is thus obtained as shown in fig. 5 (a).
Step 4: mapping the closed boundary coordinates under the camera pixel coordinate system to the projector pixel coordinate system, tracking the coordinates of the closed boundary under the projector pixel coordinate system, and automatically connecting to form the closed boundary;
coordinates (u) of a point of the closed boundary in the camera pixel coordinate system obtained in step 2 c ,v c ) Mapping to pixel coordinates u under projector p Equation (8) is employed, while since the three-dimensional (X c ,Y c ,Z c ) According to equation (4),to obtain its mapping to the pixel coordinate v under the projector p In order to achieve this, the first and second,
so that the boundary in step 2 can be madeMapping to projector pixel coordinate system one by one, and automatically connecting to obtain closed boundary +.>
Step 5: a binary mask M_b is generated from the closed boundary, taking the value 1 inside the closed boundary under the projector pixel coordinate system and 0 outside. The regenerated maximum input gray-level map is represented as:

M_migl^(i+1)(u_p, v_p) = M_migl^(i)(u_p, v_p) − r · M_b(u_p, v_p)

where i is the iteration number and r is the gray level by which the overexposed region is reduced, here taken as r = 60. The maximum input gray level inside the closed boundary is thus reduced, and the maximum input gray-level map and the sinusoidal grating fringes are regenerated.
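Step 5's mask generation and intensity reduction can be sketched as follows, filling the region enclosed by the closed boundary by flood-filling from the image border. The function name and fill strategy are illustrative, not the patent's implementation:

```python
import numpy as np
from collections import deque

def reduce_inside_boundary(m_migl, boundary, r=60):
    """Build a binary mask that is 1 inside the closed boundary and 0
    outside (flood fill from the image border), then lower the maximum
    input gray level by r inside it."""
    h, w = boundary.shape
    outside = np.zeros((h, w), dtype=bool)
    q = deque((y, x) for y in range(h) for x in range(w)
              if (y in (0, h - 1) or x in (0, w - 1)) and not boundary[y, x])
    for y, x in q:                        # seed the flood fill at the border
        outside[y, x] = True
    while q:
        y, x = q.popleft()
        for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
            if 0 <= ny < h and 0 <= nx < w and not outside[ny, nx] and not boundary[ny, nx]:
                outside[ny, nx] = True
                q.append((ny, nx))
    mask = ~outside                       # boundary pixels and interior: mask = 1
    return np.clip(m_migl - r * mask.astype(int), 0, 255)
```

Pixels enclosed by (or on) the boundary drop from 255 to 195 with the embodiment's r = 60, while pixels outside stay at 255.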
Step 6: the regenerated maximum input gray-level map and sinusoidal grating fringes are projected onto the object again and captured, and steps 2 to 5 are repeated; after 3 iterations the regenerated high-frequency fringes are as shown in fig. 4 (b). Once the image captured under the projected maximum input gray-level map contains no overexposed region, the continuous phase obtained from the sinusoidal grating fringes is used with equations (5), (7) and (8) to calculate the complete three-dimensional shape of the measured object, as shown in fig. 5 (b).
Claims (7)
1. A high-dynamic three-dimensional measurement method based on self-adaptive distribution of overexposure connected domain intensity is characterized by comprising the following specific steps:
step 1: generating a maximum input gray level image and a four-step phase-shift sinusoidal fringe image by using a computer, projecting the maximum input gray level image and the four-step phase-shift sinusoidal fringe image onto the surface of a measured object by using a monocular fringe projection system, and synchronously collecting the sinusoidal fringe image and the maximum input gray level image;
step 2: identifying connected overexposed regions from the acquired maximum input gray-level map, extracting the boundary of the overexposed regions under the camera pixel coordinate system, and tracking it;
step 3: carrying out preliminary three-dimensional reconstruction on the object by combining parameters obtained by calibrating the system and continuous phases obtained by the acquired sine stripes;
step 4: mapping the closed boundary coordinates under the camera pixel coordinate system to the projector pixel coordinate system, tracking the coordinates of the closed boundary under the projector pixel coordinate system, and automatically connecting to form the closed boundary;
step 5: generating a binary mask by the closed boundary, reducing the maximum input gray level in the closed boundary, and regenerating the maximum input gray level map and the sine stripes with the adjusted intensity;
step 6: projecting the regenerated maximum input gray-level map and sinusoidal fringes onto the object again and capturing them, and repeating steps 2-5 until the image captured under the projected maximum input gray-level map contains no overexposed region; the complete three-dimensional shape of the object is then reconstructed using the continuous phase obtained from the sinusoidal fringes.
2. The high-dynamic three-dimensional measurement method based on self-adaptive distribution of overexposed-connected-domain intensity according to claim 1, wherein the optical expression of the computer-generated four-step phase-shift sinusoidal grating fringes generated in step 1 is:

I_n^p(u_p, v_p) = M_migl(u_p, v_p) · [a_p + b_p · cos(2πf·u_p + nπ/2)]

where n = 0, 1, 2, 3 is the phase-shift index, (u_p, v_p) are the projector pixel coordinates, M_migl is the maximum input gray-level map, a_p is the average intensity, b_p is the amplitude, and f is the frequency of the projected sinusoidal fringes.
3. The high-dynamic three-dimensional measurement method based on self-adaptive distribution of overexposed-connected-domain intensity according to claim 1, wherein the maximum input gray-level map acquired in step 2 is denoted I_c(u_c, v_c), (u_c, v_c) being the camera pixel coordinates; according to a given threshold thr, a pixel whose value reaches the threshold is overexposed, and the overexposed regions are marked by the binary image I_b:

I_b(u_c, v_c) = 1 if I_c(u_c, v_c) ≥ thr, otherwise 0.
4. The high-dynamic three-dimensional measurement method based on self-adaptive distribution of overexposed-connected-domain intensity according to claim 1, wherein the continuous phase map obtained in step 3 is Φ(u_c, v_c); according to the law of phase distribution, the horizontal position under the projector pixel coordinate system is calculated as:

u_p = T · Φ(u_c, v_c) / (2π)

where U is the resolution in the u direction of projection space and T is the period of the projection-space fringes;

this horizontal position in the projector pixel coordinate system, together with the conversion from camera-frame coordinates (X_c, Y_c, Z_c) to projector pixel coordinates (u_p, v_p) and the conversion from camera-frame coordinates (X_c, Y_c, Z_c) to camera pixel coordinates (u_c, v_c), is used to preliminarily calculate the three-dimensional shape (X_c, Y_c, Z_c) of the object.
5. The high-dynamic three-dimensional measurement method based on self-adaptive distribution of overexposed-connected-domain intensity according to claim 1, wherein the parameters obtained by system calibration in step 3 include the camera intrinsic matrix A_c and extrinsic parameters R_c and t_c, and the projector intrinsic matrix A_p and extrinsic parameters R_p and t_p;

the conversion from camera-frame coordinates (X_c, Y_c, Z_c) to projector pixel coordinates (u_p, v_p) follows the pinhole model:

s_p · [u_p, v_p, 1]^T = A_p · (R_p · [X_c, Y_c, Z_c]^T + t_p)

where s_p is a scale factor; writing M_p = A_p · [R_p | t_p] with entries m_jk and eliminating s_p, the conversion simplifies to:

u_p = (m_11·X_c + m_12·Y_c + m_13·Z_c + m_14) / (m_31·X_c + m_32·Y_c + m_33·Z_c + m_34)
v_p = (m_21·X_c + m_22·Y_c + m_23·Z_c + m_24) / (m_31·X_c + m_32·Y_c + m_33·Z_c + m_34)

and rearranging the u_p relation yields:

(m_11 − u_p·m_31)·X_c + (m_12 − u_p·m_32)·Y_c + (m_13 − u_p·m_33)·Z_c = u_p·m_34 − m_14;

the conversion from camera-frame coordinates (X_c, Y_c, Z_c) to camera pixel coordinates (u_c, v_c) is:

s_c · [u_c, v_c, 1]^T = A_c · [X_c, Y_c, Z_c]^T

from which, with s_c = Z_c,

u_c = f_x · X_c / Z_c + u_0,  v_c = f_y · Y_c / Z_c + v_0

where f_x, f_y, u_0 and v_0 are the focal lengths and principal-point coordinates in A_c.
6. The high-dynamic three-dimensional measurement method based on self-adaptive distribution of overexposed-connected-domain intensity according to claim 5, wherein the coordinates (u_c, v_c) of any point on the closed boundary under the camera pixel coordinate system are mapped to the projector pixel coordinate u_p through the continuous phase, and the preliminarily calculated three-dimensional coordinates (X_c, Y_c, Z_c) are substituted into the projector conversion relation to obtain the corresponding projector pixel coordinate v_p.
7. The high-dynamic three-dimensional measurement method based on self-adaptive distribution of overexposed-connected-domain intensity according to claim 1, wherein in step 5 a binary mask M_b is generated that takes the value 1 inside the closed boundary under the projector pixel coordinate system and 0 outside, and the maximum input gray-level map is regenerated as:

M_migl^(i+1)(u_p, v_p) = M_migl^(i)(u_p, v_p) − r · M_b(u_p, v_p)

where i is the iteration number and r is the gray level by which the overexposed region is reduced.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211591319 | 2022-12-12 | ||
CN2022115913197 | 2022-12-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116310101A true CN116310101A (en) | 2023-06-23 |
Family
ID=86795474
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310201805.1A Pending CN116310101A (en) | 2022-12-12 | 2023-03-06 | High-dynamic three-dimensional measurement method based on self-adaptive distribution of intensity of overexposure connected domain |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116310101A (en) |
- 2023-03-06: Chinese application CN202310201805.1A filed, published as CN116310101A (status: Pending)
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||