CN115829860A - High-reflectivity object object restoration method

Publication number: CN115829860A
Application number: CN202211363259.3A
Authority: CN (China)
Original language: Chinese (zh)
Legal status: Pending
Filing / priority date: 2022-11-02
Publication date: 2023-03-21
Prior art keywords: image, gray, full, gray scale, pixel
Inventors: 吕盛坪, 熊伟, 李鑫, 金鸿, 丁克, 信德全, 赵贺杰, 欧阳斌, 张胡成
Assignees (current and original): Foshan Xianyang Technology Co., Ltd.; South China Agricultural University
Classification (Landscapes): Length Measuring Devices By Optical Means


Abstract

The invention relates to the technical field of three-dimensional visual reconstruction, and in particular to a high-reflectivity object image restoration method. The method obtains the intrinsic and extrinsic parameter matrices of a three-dimensional sensor using the Zhang Zhengyou stereo calibration method, and constructs a light source model based on the phase-shift Gray code together with a light intensity calibration; the light intensity calibration requires no additional coding patterns, so the operation is simple and fast. When the overexposed part is divided into stripe regions of alternating brightness and darkness, the stripe grating has a moderate width and strong anti-interference capability. When decoding with the Gray code, phase unwrapping errors are eliminated well, the time consumed by point cloud reconstruction is reduced, decoding efficiency is high, errors do not accumulate easily, and the completeness of the reconstructed point cloud is improved.

Description

High-reflectivity object image restoration method
Technical Field
The invention relates to the technical field of three-dimensional visual reconstruction, in particular to a high-reflectivity object image restoration method.
Background
Structured light three-dimensional reconstruction is efficient, accurate and non-contact, and is widely used in fields such as machine manufacturing, cultural relic restoration and electronic component inspection. The Gray code offers high decoding efficiency, convenient computation and little error accumulation; it is a common coding method for structured light and is widely used in structured-light three-dimensional point cloud reconstruction. However, when a reflective object is imaged, the structured light strikes a non-Lambertian surface and the reflected light intensity exceeds the imaging dynamic range of the sensor, causing local pixel distortion, increasing decoding errors and degrading the reconstruction. For this reason, researchers have proposed methods such as modifying the lighting conditions, fusing multi-view images and extracting highlight features.
The main idea of modifying the illumination conditions is to change the encoding rules of the projection patterns so that more feature information is contained within the same pixel. Multi-view image fusion stitches the point clouds acquired by several sets of equipment under different views by exploiting parallax. Both methods handle local pixel distortion well in specific reconstruction scenes, but they cannot remove the interference caused by multiple reflections of the light source, that is, light diffused after surface reflection and lens refraction, where the light emitted by a single micromirror element in the projector affects surrounding areas when it is imaged onto the target pixels of the sensor. The highlight-feature extraction methods mainly combine near-field photometry with binary fringe structured light and introduce information such as reflectivity into the binary fringe structured light for positioning, in order to eliminate the specular reflection components in the fringes and improve the reconstruction accuracy of reflective objects. However, such methods require precision components to obtain the projection light source or the object reflection characteristics, and are difficult to deploy rapidly in conventional industrial scenarios.
The prior art discloses a three-dimensional vision measurement system and method for highly reflective surfaces, which works as follows: the relation between the projection intensity on the highly reflective surface and the imaging gray level is established, and the low gray level is computed rapidly from two exposures, avoiding a complicated experimental process of repeated tests; a camera-projector sub-pixel coordinate mapping method is used to determine the absolute phase of the sub-pixel corresponding to a camera pixel, within that pixel or within its four-neighborhood, and to compute the one or more camera pixels corresponding to each projector pixel, so that quantization errors are avoided and every pixel of the region to be adjusted is adjusted accurately; model parameters are then fitted from the projector-camera gray-level sample pairs in the low-gray sinusoidal image, and the optimal projection intensity of each pixel to be adjusted is computed, making the sample pairs more accurate. Finally, the highly reflective surface is reconstructed in three dimensions with a multi-frequency heterodyne phase-unwrapping method and a three-dimensional measurement model based on monocular projector fringe projection profilometry. The experimental process for obtaining the low gray value is simplified, the accuracy of the model is improved, the projection gray level can be adjusted accurately according to the different reflection conditions of the surface, and the complete three-dimensional shape information of the surface is finally obtained. However, because this scheme reconstructs the highly reflective surface through the multi-frequency heterodyne phase-unwrapping method and the monocular fringe projection profilometry measurement model, decoding produces many erroneous pixels and the reconstruction result is unsatisfactory.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide a high-reflectivity object image restoration method which eliminates phase unwrapping errors, reduces the time consumed by point cloud reconstruction, improves decoding efficiency, and improves the completeness of the reconstructed point cloud of a reflective object.
In order to solve the technical problems, the invention adopts the technical scheme that:
Provided is a high-reflectivity object image restoration method, comprising the following steps:
S1: obtaining the intrinsic and extrinsic parameter matrices of the three-dimensional sensor by using the Zhang Zhengyou stereo calibration method;
S2: calibrating the light intensity of the three-dimensional sensor: placing a gray scale card within the imaging range of the camera, and projecting a full-bright pattern, a full-dark pattern and the N-th complementary Gray code pattern with the projector to obtain a full-bright image, a full-dark image and a Gray code complementary-code image;
S3: matching the pixel values at the corresponding positions of the full-bright image to the discretely varying true gray levels of the gray scale card, extracting the overexposed pixel positions in the full-bright image, the full-dark image and the Gray code complementary-code image with a maximum-threshold method, and recording the corresponding true gray levels of the gray scale card;
S4: solving the relation among the illumination brightness, the object gray level and the image gray level using the full-bright image, the full-dark image and the Gray code complementary-code image, and fitting the parameters of the light source model;
S5: replacing the gray scale card with the object to be measured; the light source sequentially projects 1 full-bright pattern, 2N complementary Gray code patterns and 4 phase-shift patterns, and the three-dimensional sensor captures images synchronously, which are denoted in order as G_0, G_1, G_2, …, G_n;
S6: taking the full-bright image G_0 and the 4 phase-shift images G_i (i = n-3, …, n) as the images to be repaired, and recording the gray level g_1 of a non-overexposed part of the object under the projected white light and its image coordinates (u_1, v_1);
S7: extracting the jagged boundary pixel coordinates of the overexposed part of each image in step S6 with an all-ones convolution kernel, and dividing the overexposed part into stripe regions of alternating brightness and darkness (an illustrative sketch of this extraction follows the step list);
S8: calculating a correction value at the overexposed positions of the image according to the light source model parameters of step S4, and normalizing the corrected image to obtain the repaired image S_i(x, y);
S9: obtaining the wrapped phase φ(u, v) at each pixel from the four phase-shift images, then solving the absolute phase Φ(u, v) using the phase-shift order P obtained by Gray code decoding, and converting the absolute phase into a three-dimensional point cloud using the intrinsic parameter matrix from step S1.
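For illustration only (this sketch is not part of the original patent text; the function names, the 3x3 kernel size and the use of 255 as the saturation threshold are assumptions), the overexposed-pixel extraction of step S3 and the boundary extraction of step S7 can be sketched in Python as follows:

```python
import numpy as np
from scipy.ndimage import convolve

def overexposure_mask(image: np.ndarray, threshold: int = 255) -> np.ndarray:
    """Maximum-threshold test: pixels at the saturation level are treated as overexposed."""
    return image >= threshold

def jagged_boundary(mask: np.ndarray, kernel_size: int = 3) -> np.ndarray:
    """Mark boundary pixels of the overexposed region with an all-ones convolution kernel.

    A pixel lies on the jagged boundary if it is overexposed but at least one
    neighbour under the kernel is not, i.e. the kernel response is below kernel_size**2.
    """
    kernel = np.ones((kernel_size, kernel_size), dtype=int)
    response = convolve(mask.astype(int), kernel, mode="constant", cval=0)
    return mask & (response < kernel_size ** 2)

# Usage with an assumed 8-bit grayscale capture `img` (an H x W uint8 array):
# mask = overexposure_mask(img)
# boundary = jagged_boundary(mask)
```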
The high-reflectivity object image restoration method of the invention obtains the intrinsic and extrinsic parameter matrices of the three-dimensional sensor with the Zhang Zhengyou stereo calibration method and constructs a light source model based on the phase-shift Gray code; the light intensity calibration requires no additional coding patterns, so the operation is simple and fast. When the overexposed part is divided into stripe regions of alternating brightness and darkness, the stripe grating has a moderate width and strong anti-interference capability. When decoding with the Gray code, phase unwrapping errors are eliminated well, the time consumed by point cloud reconstruction is reduced, decoding efficiency is high, errors do not accumulate easily, and the completeness of the reconstructed point cloud is improved.
Further, in step S2, the order N of the required Gray code is calculated as:

N = ⌈log₂(W_p / T)⌉

where W_p is the number of pixels in the horizontal direction of the projection; T is the number of projector pixels contained in a single period of the phase-shift grating; ⌈·⌉ is the ceiling operator, which returns the smallest integer not smaller than its argument.
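As a minimal numerical illustration of this formula (not part of the patent text; the projector width and fringe period below are assumed example values):

```python
import math

def gray_code_order(w_p: int, t: int) -> int:
    """Order of the required Gray code: N = ceil(log2(W_p / T))."""
    return math.ceil(math.log2(w_p / t))

# Assumed example: a projector 1920 pixels wide and a phase-shift grating period of 16 pixels.
print(gray_code_order(1920, 16))  # -> 7, so 2N = 14 complementary Gray code patterns in step S5
```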
Further, in step S4, the process of fitting the light source model includes:
S41: establishing the light source model;
S42: solving the light source model.
Further, the specific process of establishing the light source model includes:
S411: averaging the gray values of the projected pattern to obtain the overall equivalent projection gray level B_w:

B_w = (1 / (W_p · H_p)) · Σ_{u=1..W_p} Σ_{v=1..H_p} B(u, v)

where W_p is the number of pixels in the horizontal direction of the projection; H_p is the number of pixels in the vertical direction of the projection; u is the horizontal coordinate in the image coordinate system; v is the vertical coordinate in the image coordinate system; B(u, v) is the gray value of the pixel of the projected pattern at (u, v);
S412: calculating the theoretical light intensity gain I_w(x, y) at each image pixel (x, y):

[formula given as an image in the original]

where k is the overall photosensitive coefficient of the projection light source; σ is the standard deviation of the gray values of the whole image;
S413: considering the real color of the object and the influence of external noise during imaging, combining the full-dark image, the true gray level g of the object and the theoretical light intensity gain gives a gray image G(x, y) containing the highlight feature:

G(x, y) = γ(g) · [B(x, y) + I_w(x, y)] + G_0(x, y)

where γ(g) is the influence coefficient of the standard gray level of the object on the image gray level; B(x, y) is the continuous counterpart of B(u, v); G_0(x, y) is the full-dark image.
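A minimal numerical sketch of evaluating this model follows (not part of the patent text; it assumes that the light intensity gain I_w(x, y) is a two-dimensional Gaussian centered on the projector optical center (u_0, v_0) with amplitude k·B_w, and assumes a simple linear γ(g), both purely for illustration):

```python
import numpy as np

def light_intensity_gain(shape, center, k, sigma, b_w):
    """Assumed form of I_w(x, y): a 2D Gaussian centered on the projector optical center (u_0, v_0)."""
    ys, xs = np.indices(shape)
    r2 = (xs - center[0]) ** 2 + (ys - center[1]) ** 2
    return k * b_w * np.exp(-r2 / (2.0 * sigma ** 2))

def gray_model(b_xy, g_true, full_dark, center, k, sigma, b_w, gamma):
    """G(x, y) = gamma(g) * [B(x, y) + I_w(x, y)] + G_0(x, y)."""
    i_w = light_intensity_gain(b_xy.shape, center, k, sigma, b_w)
    return gamma(g_true) * (b_xy + i_w) + full_dark

# Assumed example: a 480 x 640 full-bright projection, optical center at (320, 240),
# and a linear influence coefficient gamma(g) = g / 255.
b_xy = np.full((480, 640), 200.0)      # B(x, y): projected pattern gray values
full_dark = np.full((480, 640), 5.0)   # G_0(x, y): full-dark image
g_pred = gray_model(b_xy, g_true=180.0, full_dark=full_dark,
                    center=(320, 240), k=0.3, sigma=150.0,
                    b_w=b_xy.mean(), gamma=lambda g: g / 255.0)
```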
Further, the process of solving the light source model includes:
S421: applying a contour extraction function to the overexposed area of the full-bright image in step S3, keeping the contour information of the parts with the same true gray level, and solving the image coordinates (u_0, v_0) corresponding to the optical center of the projector by circle-center fitting, which are taken as the origin of the Gaussian model;
S422: in the unsaturated region, for the same image under the same illumination, subtracting the full-dark image from the full-bright image to eliminate additive noise, and comparing the gray values at different gray levels to obtain the mapping relation between the image pixel value and the object gray level:

ΔG = γ(Δg) · B(x, y)

where ΔG denotes the difference between the image pixel and the reference pixel; γ(Δg) denotes the difference between the influence coefficient of the gray level on the full-bright image and that on the full-dark image;
the influence coefficient γ(g) of the gray level on the image is solved by polynomial fitting;
in the unsaturated region, for images under different illumination, the gray values of the image at the same pixel with gray scale card value g_0 are compared, and the two-dimensional Gaussian distribution of the whole light source acting on the image in space is calculated:

[formula given as an image in the original]

where g_0 denotes the gray level of any gray scale card patch in the image; Δx denotes the difference between pixel abscissas; Δy denotes the difference between pixel ordinates;
σ and k are solved by the least squares method:

[formulas given as images in the original]

where r denotes the straight-line distance from the pixel to the optical center.
Further, in step S8, the correction value at the overexposed positions of the image is calculated as:

[formula given as an image in the original]

where ΔG(x, y) denotes the correction value at the overexposed position of the image.
Further, in step S8, the repaired image S_i(x, y) is obtained as:

[formula given as an image in the original]

where ΔG_max is the maximum value of the image overexposure correction values; G_i(x, y) is the image to be repaired from step S6.
Further, in step S9, the images of the four-step phase-shift grating can be expressed as:

S_n(u, v) = A + B · cos[φ(u, v) + (n - 1) · π/2],  n = 1, 2, 3, 4

where A denotes the background gray level of the object; B denotes the amplitude of the light intensity modulated by the projector; S_n(u, v) denotes the corrected gray value at pixel coordinates (u, v) in the n-th (n = 1, 2, 3, 4) image acquired by the three-dimensional sensor.
Further, in step S9, the absolute phase is converted into a three-dimensional point cloud using the intrinsic and extrinsic parameter matrices from step S1:

[formulas given as images in the original]

where P(u, v) denotes the Gray code phase-shift order at the pixel coordinates.
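For reference, the standard four-step phase-shift decoding implied by the description (wrapped phase from the four repaired images, absolute phase from the Gray code order P(u, v)) can be sketched as follows; this is not the patent's own code and the array names are assumptions:

```python
import numpy as np

def wrapped_phase(s1, s2, s3, s4):
    """Wrapped phase of a four-step phase shift: phi = arctan2(S4 - S2, S1 - S3)."""
    return np.arctan2(s4 - s2, s1 - s3)

def absolute_phase(phi, p):
    """Absolute phase Phi = phi + 2*pi*P, where P is the Gray-code phase-shift order per pixel.

    In practice the half-period offset between the Gray code boundaries and the
    phase-shift fringes must also be handled, which is what the complementary
    Gray code patterns are used for.
    """
    return phi + 2.0 * np.pi * p

# Usage with assumed H x W arrays S1..S4 (repaired phase-shift images) and P (decoded orders):
# phi = wrapped_phase(S1, S2, S3, S4)
# Phi = absolute_phase(phi, P)
```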
Further, in step S9, after the object is replaced, steps S5 to S9 are repeated, and continuous reconstruction can be achieved.
Compared with the background art, the high-reflectivity object image restoration method of the invention has the following beneficial effects:
phase unwrapping errors are eliminated, the time consumed by point cloud reconstruction is reduced, decoding efficiency is improved, and the completeness of the point cloud reconstructed from the reflective object is improved.
Drawings
FIG. 1 is a flowchart of image inpainting of a highly reflective object according to an embodiment of the present invention;
FIG. 2 is a diagram of a gray card according to an embodiment of the present invention;
FIG. 3 is a fitted curve of the influence of the gray level on the image in an embodiment of the present invention;
FIG. 4 is a diagram of a metal wrench before repairing an image according to a third embodiment of the present invention;
fig. 5 shows a repaired metal wrench image according to a third embodiment of the present invention.
Detailed Description
The present invention will be further described with reference to the following embodiments. The drawings are for illustration only; they are schematic rather than drawn to actual scale and are not to be construed as limiting the present patent. For a better explanation of the embodiments of the present invention, some parts of the drawings may be omitted, enlarged or reduced, and they do not represent the size of the actual product. It will be understood by those skilled in the art that certain well-known structures in the drawings, and their descriptions, may be omitted.
The same or similar reference numerals in the drawings of the embodiments of the present invention correspond to the same or similar components; in the description of the present invention, it should be understood that if there is an orientation or positional relationship indicated by the terms "upper", "lower", "left", "right", etc., based on the orientation or positional relationship shown in the drawings, it is only for convenience of description and simplification of the description, but it is not intended to indicate or imply that the device or element referred to must have a specific orientation, be constructed in a specific orientation and operate, and therefore the terms describing the positional relationship in the drawings are only used for illustrative purposes and are not to be construed as limiting the present patent, and it is possible for one of ordinary skill in the art to understand the specific meaning of the above terms according to the specific situation.
Example one
A method for restoring an image of a highly reflective object, as shown in fig. 1, includes the following steps:
S1: obtaining the intrinsic and extrinsic parameter matrices of the three-dimensional sensor by using the Zhang Zhengyou stereo calibration method;
S2: calibrating the light intensity of the three-dimensional sensor: placing a gray scale card within the imaging range of the camera, and projecting a full-bright pattern, a full-dark pattern and the N-th complementary Gray code pattern with the projector to obtain a full-bright image, a full-dark image and a Gray code complementary-code image;
S3: matching the pixel values at the corresponding positions of the full-bright image to the discretely varying true gray levels of the gray scale card, then extracting the overexposed pixel positions in the full-bright image, the full-dark image and the Gray code complementary-code image with a maximum-threshold method, and recording the corresponding true gray levels of the gray scale card; the gray scale card is shown in FIG. 2;
S4: solving the relation among the illumination brightness, the object gray level and the image gray level using the full-bright image, the full-dark image and the Gray code complementary-code image, and fitting the parameters of the light source model;
S5: replacing the gray scale card with the object to be measured; the light source sequentially projects 1 full-bright pattern, 2N complementary Gray code patterns and 4 phase-shift patterns, and the three-dimensional sensor captures images synchronously, which are denoted in order as G_0, G_1, G_2, …, G_n;
S6: taking the full-bright image G_0 and the 4 phase-shift images G_i (i = n-3, …, n) as the images to be repaired, and recording the gray level g_1 of a non-overexposed part of the object under the projected white light and its image coordinates (u_1, v_1);
S7: extracting the jagged boundary pixel coordinates of the overexposed part of each image in step S6 with an all-ones convolution kernel, and dividing the overexposed part into stripe regions of alternating brightness and darkness;
S8: calculating a correction value at the overexposed positions of the image according to the light source model parameters of step S4, and normalizing the corrected image to obtain the repaired image S_i(x, y);
S9: obtaining the wrapped phase φ(u, v) at each pixel from the four phase-shift images, then solving the absolute phase Φ(u, v) using the phase-shift order P obtained by Gray code decoding, and converting the absolute phase into a three-dimensional point cloud using the intrinsic parameter matrix from step S1.
The high-reflectivity object image restoration method obtains the intrinsic and extrinsic parameter matrices of the three-dimensional sensor with the Zhang Zhengyou stereo calibration method and constructs a light source model based on the phase-shift Gray code; the light intensity calibration requires no additional coding patterns, so the operation is simple and fast. When the overexposed part is divided into stripe regions of alternating brightness and darkness, the stripe grating has a moderate width and strong anti-interference capability. When decoding with the Gray code, phase unwrapping errors are eliminated well, the time consumed by point cloud reconstruction is reduced, decoding efficiency is high, errors do not accumulate easily, and the completeness of the reconstructed point cloud is improved.
In step S2, the order N of the required Gray code is calculated as:

N = ⌈log₂(W_p / T)⌉

where W_p is the number of pixels in the horizontal direction of the projection; T is the number of projector pixels contained in a single period of the phase-shift grating; ⌈·⌉ is the ceiling operator, which returns the smallest integer not smaller than its argument.
In step S4, the process of fitting the light source model includes:
S41: establishing the light source model: according to the imaging characteristics of the sensor pixels and without considering external illumination, the product of a two-dimensional Gaussian distribution and a photosensitive coefficient is used to fit the theoretical gray level of the light-source portion of the image obtained by the sensor; this is then combined with information such as the color and texture of the object surface to construct the light source model:
S411: averaging the gray values of the projected pattern to obtain the overall equivalent projection gray level B_w:

B_w = (1 / (W_p · H_p)) · Σ_{u=1..W_p} Σ_{v=1..H_p} B(u, v)

where W_p is the number of pixels in the horizontal direction of the projection; H_p is the number of pixels in the vertical direction of the projection; u is the horizontal coordinate in the image coordinate system; v is the vertical coordinate in the image coordinate system; B(u, v) is the gray value of the pixel of the projected pattern at (u, v);
S412: calculating the theoretical light intensity gain I_w(x, y) at each image pixel (x, y):

[formula given as an image in the original]

where k is the overall photosensitive coefficient of the projection light source; σ is the standard deviation of the gray values of the whole image;
S413: considering the real color of the object and the influence of external noise during imaging, combining the full-dark image, the true gray level g of the object and the theoretical light intensity gain gives a gray image G(x, y) containing the highlight feature:

G(x, y) = γ(g) · [B(x, y) + I_w(x, y)] + G_0(x, y)

where γ(g) is the influence coefficient of the standard gray level of the object on the image gray level; B(x, y) is the continuous counterpart of B(u, v); G_0(x, y) is the full-dark image;
S42: solving the light source model:
S421: applying a contour extraction function to the overexposed area of the full-bright image in step S3, keeping the contour information of the parts with the same true gray level, and solving the image coordinates (u_0, v_0) corresponding to the optical center of the projector by circle-center fitting, which are taken as the origin of the Gaussian model;
S422: in the unsaturated region, for the same image under the same illumination, subtracting the full-dark image from the full-bright image to eliminate additive noise, and comparing the gray values at different gray levels to obtain the mapping relation between the image pixel value and the object gray level:

ΔG = γ(Δg) · B(x, y)

where ΔG denotes the difference between the image pixel and the reference pixel; γ(Δg) denotes the difference between the influence coefficient of the gray level on the full-bright image and that on the full-dark image;
the influence coefficient γ(g) of the gray level on the image is solved by polynomial fitting, as shown in FIG. 3;
in the unsaturated region, for images under different illumination, the gray values of the image at the same pixel with gray scale card value g_0 are compared, and the two-dimensional Gaussian distribution of the whole light source acting on the image in space is calculated:

[formula given as an image in the original]

where g_0 denotes the gray level of any gray scale card patch in the image; Δx denotes the difference between pixel abscissas; Δy denotes the difference between pixel ordinates;
σ and k are solved by the least squares method:

[formulas given as images in the original]

where r denotes the straight-line distance from the pixel to the optical center.
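The least-squares fit of σ and k can be sketched as follows (not part of the patent text; the radial model I(r) = k · B_w · exp(-r² / (2σ²)) is an assumption consistent with the Gaussian description in step S41, and scipy's curve_fit is used as one possible least-squares solver):

```python
import numpy as np
from scipy.optimize import curve_fit

def radial_gain(r, k, sigma, b_w):
    """Assumed radial model of the light intensity gain: I(r) = k * B_w * exp(-r^2 / (2 sigma^2))."""
    return k * b_w * np.exp(-r ** 2 / (2.0 * sigma ** 2))

def fit_k_sigma(r_samples, gain_samples, b_w):
    """Least-squares fit of k and sigma from (distance to optical center, measured gain) samples."""
    popt, _ = curve_fit(lambda r, k, sigma: radial_gain(r, k, sigma, b_w),
                        r_samples, gain_samples, p0=(1.0, 100.0))
    return popt[0], popt[1]  # k, sigma

# Assumed synthetic example: B_w = 180, true k = 0.4, true sigma = 120.
r = np.linspace(0.0, 400.0, 50)
measured = radial_gain(r, 0.4, 120.0, 180.0) + np.random.normal(0.0, 0.5, r.shape)
k_hat, sigma_hat = fit_k_sigma(r, measured, b_w=180.0)
```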
Example two
This embodiment is similar to Example one, except that in step S8 the correction value at the overexposed positions of the image is calculated as follows:

[formula given as an image in the original]

where ΔG(x, y) denotes the correction value at the overexposed position of the image.
The repaired image S_i(x, y) is obtained as:

[formula given as an image in the original]

where ΔG_max is the maximum value of the image overexposure correction values; G_i(x, y) is the image to be repaired from step S6.
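Because the correction and normalization formulas appear only as images in the original publication, the following Python sketch is an assumption rather than the patent's exact computation: it subtracts the model-predicted correction ΔG(x, y) inside the overexposed region and then min-max normalizes the corrected image back to the 8-bit range.

```python
import numpy as np

def repair_image(g_i, delta_g, mask):
    """Assumed repair step for one image G_i: subtract the model correction inside the
    overexposed mask, then min-max normalize the result back to [0, 255]."""
    corrected = g_i.astype(float).copy()
    corrected[mask] -= delta_g[mask]
    lo, hi = corrected.min(), corrected.max()
    return 255.0 * (corrected - lo) / (hi - lo + 1e-9)

# Usage with assumed H x W arrays: g_i (captured image), delta_g (model correction), mask (overexposure).
# s_i = repair_image(g_i, delta_g, mask)
```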
In step S9, the images of the four-step phase-shift grating can be expressed as:

S_n(u, v) = A + B · cos[φ(u, v) + (n - 1) · π/2],  n = 1, 2, 3, 4

where A denotes the background gray level of the object; B denotes the amplitude of the light intensity modulated by the projector; S_n(u, v) denotes the corrected gray value at pixel coordinates (u, v) in the n-th (n = 1, 2, 3, 4) image acquired by the three-dimensional sensor.
The absolute phase is then converted into a three-dimensional point cloud using the intrinsic and extrinsic parameter matrices from step S1:

[formulas given as images in the original]

where P(u, v) denotes the Gray code phase-shift order at the pixel coordinates.
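The conversion from absolute phase to a point cloud is likewise given only as formula images in the original; one common calibrated camera-projector formulation (an assumption, not the patent's exact equations) maps the absolute phase to a projector column and intersects each camera ray with the corresponding projector plane:

```python
import numpy as np

def phase_to_point_cloud(phi_abs, K_c, K_p, R, t, period_px):
    """Triangulate one 3D point per camera pixel from the absolute phase (assumed formulation).

    phi_abs   : H x W absolute phase map (radians)
    K_c, K_p  : 3 x 3 camera / projector intrinsic matrices
    R, t      : projector pose in the camera frame (X_p = R @ X_c + t)
    period_px : projector pixels per fringe period T
    """
    h, w = phi_abs.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    # Camera ray directions d = K_c^-1 [u, v, 1]^T for every pixel.
    pix = np.stack([us, vs, np.ones_like(us)], axis=-1).reshape(-1, 3).T
    d = np.linalg.inv(K_c) @ pix                                  # 3 x N
    # Absolute phase -> projector column u_p; each column defines a plane through the projector center.
    u_p = (phi_abs / (2.0 * np.pi) * period_px).reshape(-1)
    n_p = np.stack([np.full_like(u_p, K_p[0, 0]),
                    np.zeros_like(u_p),
                    K_p[0, 2] - u_p])                             # 3 x N plane normals (projector frame)
    # Ray-plane intersection: n_p . (R (s d) + t) = 0  =>  s = -(n_p . t) / (n_p . (R d)).
    s = -(n_p.T @ t) / np.einsum('ij,ij->j', n_p, R @ d)
    return (d * s).T.reshape(h, w, 3)                             # X_c = s * d for every pixel
```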
In step S9, after the object is replaced, as shown in fig. 1, steps S5 to S9 are repeated, so that continuous reconstruction can be realized, and the requirement of continuous reconstruction in industry can be met.
Example three
In this embodiment, an image of a metal wrench is repaired with the high-reflectivity object image restoration method of Example one or Example two. The metal wrench image before repair is shown in FIG. 4 and the repaired image in FIG. 5; the point cloud data of the metal wrench image before and after repair are shown in Table 1:
Table 1. Point cloud data before and after repairing the metal wrench image

Point cloud                         Total pixels    Effective pixels    Effective pixel ratio (%)
Image point cloud before repair     66341           58328               87.92
Repaired image point cloud          66341           64979               97.95
As can be seen from Table 1, with the high-reflectivity object image restoration method of Example one or Example two, the proportion of effective pixel points in the point cloud is raised above 97%, and the completeness of the point cloud reconstructed from the reflective object is improved.
In the detailed description of the embodiments, various technical features may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
It should be understood that the above-described embodiments of the present invention are merely examples for clearly illustrating the present invention and are not intended to limit its embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to enumerate all embodiments here. Any modification, equivalent replacement or improvement made within the spirit and principle of the present invention shall be included in the protection scope of the claims of the present invention.

Claims (10)

1. A high-reflectivity object image restoration method, characterized by comprising the following steps:
S1: obtaining the intrinsic and extrinsic parameter matrices of the three-dimensional sensor by using the Zhang Zhengyou stereo calibration method;
S2: calibrating the light intensity of the three-dimensional sensor: placing a gray scale card within the imaging range of the camera, and projecting a full-bright pattern, a full-dark pattern and the N-th complementary Gray code pattern with the projector to obtain a full-bright image, a full-dark image and a Gray code complementary-code image;
S3: matching the pixel values at the corresponding positions of the full-bright image to the discretely varying true gray levels of the gray scale card, extracting the overexposed pixel positions in the full-bright image, the full-dark image and the Gray code complementary-code image with a maximum-threshold method, and recording the corresponding true gray levels of the gray scale card;
S4: solving the relation among the illumination brightness, the object gray level and the image gray level using the full-bright image, the full-dark image and the Gray code complementary-code image, and fitting the parameters of the light source model;
S5: replacing the gray scale card with the object to be measured; the light source sequentially projects 1 full-bright pattern, 2N complementary Gray code patterns and 4 phase-shift patterns, and the three-dimensional sensor captures images synchronously, which are denoted in order as G_0, G_1, G_2, …, G_n;
S6: taking the full-bright image G_0 and the 4 phase-shift images G_i (i = n-3, …, n) as the images to be repaired, and recording the gray level g_1 of a non-overexposed part of the object under the projected white light and its image coordinates (u_1, v_1);
S7: extracting the jagged boundary pixel coordinates of the overexposed part of each image in step S6 with an all-ones convolution kernel, and dividing the overexposed part into stripe regions of alternating brightness and darkness;
S8: calculating a correction value at the overexposed positions of the image according to the light source model parameters of step S4, and normalizing the corrected image to obtain the repaired image S_i(x, y);
S9: obtaining the wrapped phase φ(u, v) at each pixel from the four phase-shift images, then solving the absolute phase Φ(u, v) using the phase-shift order P obtained by Gray code decoding, and converting the absolute phase into a three-dimensional point cloud using the intrinsic parameter matrix from step S1.
2. The high-reflectivity object image restoration method according to claim 1, wherein in step S2 the order N of the required Gray code is calculated as:

N = ⌈log₂(W_p / T)⌉

where W_p is the number of pixels in the horizontal direction of the projection; T is the number of projector pixels contained in a single period of the phase-shift grating; ⌈·⌉ is the ceiling operator, which returns the smallest integer not smaller than its argument.
3. The high-reflectivity object image restoration method according to claim 1, wherein the process of fitting the light source model in step S4 comprises:
S41: establishing the light source model;
S42: solving the light source model.
4. The high-reflectivity object image restoration method according to claim 3, wherein the specific process of establishing the light source model comprises:
S411: averaging the gray values of the projected pattern to obtain the overall equivalent projection gray level B_w:

B_w = (1 / (W_p · H_p)) · Σ_{u=1..W_p} Σ_{v=1..H_p} B(u, v)

where W_p is the number of pixels in the horizontal direction of the projection; H_p is the number of pixels in the vertical direction of the projection; u is the horizontal coordinate in the image coordinate system; v is the vertical coordinate in the image coordinate system; B(u, v) is the gray value of the pixel of the projected pattern at (u, v);
S412: calculating the theoretical light intensity gain I_w(x, y) at each image pixel (x, y):

[formula given as an image in the original]

where k is the overall photosensitive coefficient of the projection light source; σ is the standard deviation of the gray values of the whole image;
S413: considering the real color of the object and the influence of external noise during imaging, combining the full-dark image, the true gray level g of the object and the theoretical light intensity gain gives a gray image G(x, y) containing the highlight feature:

G(x, y) = γ(g) · [B(x, y) + I_w(x, y)] + G_0(x, y)

where γ(g) is the influence coefficient of the standard gray level of the object on the image gray level; B(x, y) is the continuous counterpart of B(u, v); G_0(x, y) is the full-dark image.
5. The high-reflectivity object image restoration method according to claim 4, wherein the process of solving the light source model comprises:
S421: applying a contour extraction function to the overexposed area of the full-bright image in step S3, keeping the contour information of the parts with the same true gray level, and solving the image coordinates (u_0, v_0) corresponding to the optical center of the projector by circle-center fitting, which are taken as the origin of the Gaussian model;
S422: in the unsaturated region, for the same image under the same illumination, subtracting the full-dark image from the full-bright image to eliminate additive noise, and comparing the gray values at different gray levels to obtain the mapping relation between the image pixel value and the object gray level:

ΔG = γ(Δg) · B(x, y)

where ΔG denotes the difference between the image pixel and the reference pixel; γ(Δg) denotes the difference between the influence coefficient of the gray level on the full-bright image and that on the full-dark image;
the influence coefficient γ(g) of the gray level on the image is solved by polynomial fitting;
in the unsaturated region, for images under different illumination, the gray values of the image at the same pixel with gray scale card value g_0 are compared, and the two-dimensional Gaussian distribution of the whole light source acting on the image in space is calculated:

[formula given as an image in the original]

where g_0 denotes the gray level of any gray scale card patch in the image; Δx denotes the difference between pixel abscissas; Δy denotes the difference between pixel ordinates;
σ and k are solved by the least squares method:

[formulas given as images in the original]

where r denotes the straight-line distance from the pixel to the optical center.
6. The high-reflectivity object image restoration method according to claim 5, wherein in step S8 the correction value at the overexposed positions of the image is calculated as:

[formula given as an image in the original]

where ΔG(x, y) denotes the correction value at the overexposed position of the image.
7. The high-reflectivity object image restoration method according to claim 6, wherein in step S8 the repaired image S_i(x, y) is obtained as:

[formula given as an image in the original]

where ΔG_max is the maximum value of the image overexposure correction values; G_i(x, y) is the image to be repaired from step S6.
8. The high-reflectivity object image restoration method according to claim 7, wherein in step S9 the images of the four-step phase-shift grating are expressed as:

S_n(u, v) = A + B · cos[φ(u, v) + (n - 1) · π/2],  n = 1, 2, 3, 4

where A denotes the background gray level of the object; B denotes the amplitude of the light intensity modulated by the projector; S_n(u, v) denotes the corrected gray value at pixel coordinates (u, v) in the n-th (n = 1, 2, 3, 4) image acquired by the three-dimensional sensor.
9. The high-reflectivity object image restoration method according to claim 8, wherein in step S9 the absolute phase is converted into a three-dimensional point cloud using the intrinsic and extrinsic parameter matrices from step S1:

[formulas given as images in the original]

where P(u, v) denotes the Gray code phase-shift order at the pixel coordinates.
10. The high-reflectivity object image restoration method according to any one of claims 1 to 8, wherein in step S9, after the object is replaced, steps S5 to S9 are repeated to realize continuous reconstruction.
Application CN202211363259.3A (filed 2022-11-02, priority date 2022-11-02) was published as CN115829860A on 2023-03-21; legal status: pending. Family ID: 85526263.


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination