CN113160393A - High-precision three-dimensional reconstruction method and device based on large field depth and related components thereof
- Publication number: CN113160393A (application CN202110529545.1A)
- Authority: CN (China)
- Prior art keywords: dimensional; imaging device; absolute phase; mapping coefficient; target
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T17/00 — Three dimensional [3D] modelling, e.g. data description of 3D objects (G: Physics; G06: Computing; G06T: Image data processing or generation, in general)
- G06T7/85 — Stereo camera calibration (under G06T7/00, Image analysis; G06T7/80, Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration)
- G06T2207/10012 — Stereo images (under G06T2207/00, Indexing scheme for image analysis or image enhancement; G06T2207/10, Image acquisition modality; G06T2207/10004, Still image / photographic image)
- Y02A90/30 — Assessment of water resources (under Y02A, Technologies for adaptation to climate change; Y02A90/00, Technologies having an indirect contribution to adaptation to climate change)
Abstract
The invention discloses a high-precision three-dimensional reconstruction method and device based on a large depth of field, together with related components. The method comprises the following steps: dividing a large depth-of-field measurement scene into regions; performing binocular-vision stereo calibration on each divided region with a three-dimensional measurement system to obtain calibration data; calculating absolute phase distribution maps and three-dimensional information of a planar plate at different positions; establishing a three-dimensional mapping coefficient table for the corresponding region; acquiring a target image of the measured object and calculating its absolute phase distribution map; and, for each pixel in the absolute phase distribution map of the target image, acquiring its absolute phase, looking up the three-dimensional mapping coefficient in the table of the corresponding region, and calculating the spatial three-dimensional point coordinates from that coefficient. By dividing the large depth-of-field scene into several regions for calculation, the invention makes the whole calculation process more rigorous and accurate, and, once the three-dimensional mapping coefficient table is established, the spatial three-dimensional point coordinates of the measured object are obtained more quickly and accurately.
Description
Technical Field
The invention relates to the technical field of three-dimensional imaging, and in particular to a high-precision three-dimensional reconstruction method and device based on a large depth of field, and related components thereof.
Background
Fringe projection three-dimensional measurement in the prior art is one of the structured illumination methods. It offers high speed, high precision, low cost and ease of operation, and is widely applied in industrial measurement, intelligent manufacturing, cultural relic protection and other fields. The principle of the fringe projection three-dimensional measurement technique is as follows: a standard sinusoidal fringe pattern is projected onto the measured object, the height of the object modulates the projected fringes, a camera acquires the modulated, deformed fringe pattern, and the three-dimensional information of the object is then reconstructed by combining fringe phase demodulation and system calibration with the phase-to-height mapping principle.
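As a concrete illustration of the fringe phase demodulation mentioned above, a minimal four-step phase-shift sketch follows; the function name and the synthetic pixel intensities are illustrative, not taken from the patent:

```python
import math

def wrapped_phase(i1, i2, i3, i4):
    # Four-step phase-shift demodulation: the four fringe images are shifted
    # by pi/2 each; the wrapped phase at a pixel follows from the arctangent.
    return math.atan2(i4 - i2, i1 - i3)

# Synthetic pixel intensities I_k = A + B*cos(phi + k*pi/2)
A, B, phi = 100.0, 50.0, 0.7
I = [A + B * math.cos(phi + k * math.pi / 2) for k in range(4)]
print(wrapped_phase(*I))  # recovers phi = 0.7 (up to numerical error)
```

The result is wrapped to (-pi, pi]; the Gray-code patterns described later remove this ambiguity to give the absolute phase.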
Generally, fringe projection uses a digital micromirror device (DMD), based on the optical imaging principle, to generate the fringe pattern, and an optical projection lens with a fixed focal length has a limited depth of field. In particular, commercial projectors adopt a large-aperture design to increase image brightness, which further reduces the depth of field. In some large measurement scenes, the projection system cannot meet the requirements because the measured object is large, or because several objects span a large spatial range. A MEMS (micro-electro-mechanical system) galvanometer laser-scanning projector, by contrast, has an imaging range with a large depth of field thanks to its laser light source and galvanometer-scanning projection mode. Camera lenses likewise suffer from a limited depth-of-field range, whereas electronically tunable zoom lenses can vary the focal length of the lens continuously.
In recent years, the development of electrically tunable lenses (ETL) has provided more options for designing compact optical systems. Their precision, speed, convenience and repeatability have led to wide use in display, microscopy, auto-focus imaging, laser processing and other fields. For three-dimensional measurement, the use of electrically tunable lenses to perform certain functions has also been investigated: one technique rapidly acquires multiple discrete focused images with an electrically tunable lens and combines the data from the different focus settings to obtain three-dimensional information of the entire scene. However, existing three-dimensional measurement systems still have low precision when measuring a scene with a large depth of field, and the problem of poor reconstruction quality remains unsolved.
Disclosure of Invention
The embodiments of the invention provide a high-precision three-dimensional reconstruction method and device based on a large depth of field, and related components thereof, aiming to solve the problems of low three-dimensional measurement precision and poor three-dimensional reconstruction quality in large depth-of-field scenes in the prior art.
In a first aspect, an embodiment of the present invention provides a high-precision three-dimensional reconstruction method based on a large depth of field, including:
carrying out region division on a large depth-of-field measurement scene;
performing binocular vision three-dimensional calibration on each divided area by using a three-dimensional measurement system to obtain calibration data; wherein the three-dimensional measurement system comprises an imaging device and a projection device;
projecting specified patterns to the plane flat plates at different positions by using a projection device in different areas, acquiring flat plate images of the plane flat plates at different positions by using an imaging device, calculating to obtain absolute phase distribution maps of the plane flat plates at different positions, and calculating to obtain three-dimensional information of the plane flat plates at different positions according to the calibration data;
in each region, acquiring the absolute phase of each pixel point of the imaging device in the absolute phase distribution map of the plane panel, and establishing a three-dimensional mapping coefficient table of the corresponding region according to the mapping relation between the three-dimensional information of each plane panel and the absolute phase of the corresponding pixel point;
projecting a target pattern to a measured object by using a projection device, acquiring a target focal scanning image of the measured object by using an imaging device, and performing deblurring processing on the target focal scanning image to obtain a target image and calculate an absolute phase distribution diagram of the target image;
and acquiring the absolute phase of each pixel point of the target image in the absolute phase distribution map of the target image, searching a corresponding three-dimensional mapping coefficient in a three-dimensional mapping coefficient table of a corresponding region according to the region to which the absolute phase belongs, and calculating by using the three-dimensional mapping coefficient to obtain a corresponding space three-dimensional point coordinate.
In a second aspect, an embodiment of the present invention provides a high-precision three-dimensional reconstruction apparatus based on a large depth of field, including:
the area division unit is used for carrying out area division on the large depth-of-field measurement scene;
the calibration data acquisition unit is used for carrying out binocular vision three-dimensional calibration on each divided area by using the three-dimensional measurement system to obtain calibration data; wherein the three-dimensional measurement system comprises an imaging device and a projection device;
the three-dimensional information acquisition unit is used for projecting specified patterns to the plane flat plates at different positions by using the projection device in different areas, acquiring flat plate images of the plane flat plates at different positions by using the imaging device, calculating to obtain absolute phase distribution maps of the plane flat plates at different positions, and calculating to obtain three-dimensional information of the plane flat plates at different positions according to the calibration data;
the three-dimensional mapping coefficient table acquisition unit is used for acquiring the absolute phase of each pixel point of the imaging device in the absolute phase distribution map of the plane panel in each area and establishing a three-dimensional mapping coefficient table of the corresponding area according to the mapping relation between the three-dimensional information of each plane panel and the absolute phase of the corresponding pixel point;
the target focal-scanning image acquisition unit is used for projecting a target pattern to a measured object by using the projection device, acquiring a target focal-scanning image of the measured object by using the imaging device, and performing deblurring processing on the target focal-scanning image to obtain a target image and calculate an absolute phase distribution map of the target image;
and the space three-dimensional point coordinate acquisition unit is used for acquiring the absolute phase of each pixel point of the target image in the absolute phase distribution diagram of the target image, searching a corresponding three-dimensional mapping coefficient in a three-dimensional mapping coefficient table of a corresponding area according to the area to which the absolute phase belongs, and calculating by using the three-dimensional mapping coefficient to obtain the corresponding space three-dimensional point coordinate.
In a third aspect, an embodiment of the present invention further provides a computer device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and when the processor executes the computer program, the processor implements the large depth-of-field based high-precision three-dimensional reconstruction method according to the first aspect.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program, and the computer program, when executed by a processor, causes the processor to execute the method for high-precision three-dimensional reconstruction based on a large depth of field according to the first aspect.
The embodiments of the invention provide a high-precision three-dimensional reconstruction method and device based on a large depth of field, and related components thereof. The method comprises: dividing a large depth-of-field measurement scene into regions; performing binocular-vision stereo calibration on each divided region with a three-dimensional measurement system, comprising an imaging device and a projection device, to obtain calibration data; projecting, in each region, specified patterns onto a planar plate at different positions with the projection device, acquiring plate images at these positions with the imaging device, calculating the absolute phase distribution maps of the planar plate at the different positions, and calculating its three-dimensional information from the calibration data; in each region, acquiring the absolute phase of each pixel of the imaging device in the absolute phase distribution map of the planar plate, and establishing the three-dimensional mapping coefficient table of that region from the mapping relation between the three-dimensional information of each planar plate and the absolute phase of the corresponding pixel; projecting a target pattern onto the measured object with the projection device, acquiring a target focal-scan image of the measured object with the imaging device, deblurring it to obtain the target image, and calculating the absolute phase distribution map of the target image; and, for each pixel of the target image, acquiring its absolute phase from the absolute phase distribution map, looking up the corresponding three-dimensional mapping coefficient in the table of the region to which the absolute phase belongs, and calculating the corresponding spatial three-dimensional point coordinates from that coefficient. By dividing the large depth-of-field scene into regions, establishing a three-dimensional mapping coefficient table for each region, and directly retrieving the coefficient of the region of the measured object during reconstruction, the whole calculation process becomes more rigorous and accurate, and the spatial three-dimensional point coordinates of the measured object are obtained more quickly and accurately once the table is established.
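The lookup pipeline described above can be sketched in a few lines. This is an illustrative assumption: the patent does not specify the functional form of the mapping coefficients, so a per-region quadratic phase-to-depth model with hypothetical names (`region_tables`, `region_of`) is used here:

```python
def reconstruct_depth(phase, region_tables, region_of):
    # Find the region the absolute phase belongs to, look up its
    # three-dimensional mapping coefficients, and evaluate the mapping.
    a0, a1, a2 = region_tables[region_of(phase)]
    return a0 + a1 * phase + a2 * phase ** 2

# Two hypothetical regions with made-up coefficients (depth in mm)
tables = {0: (400.0, 10.0, 0.0), 1: (700.0, 12.0, 0.0)}
region_of = lambda p: 0 if p < 15.0 else 1
print(reconstruct_depth(6.0, tables, region_of))  # 460.0
```

The point of the table is that, at reconstruction time, only a region lookup and a polynomial evaluation per pixel remain; no per-pixel triangulation is needed.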
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic flow chart of a high-precision three-dimensional reconstruction method based on a large depth of field according to an embodiment of the present invention;
fig. 2 is a simulation diagram of a high-precision three-dimensional reconstruction method based on a large depth of field according to an embodiment of the present invention;
fig. 3 is a schematic block diagram of a large depth-of-field-based high-precision three-dimensional reconstruction apparatus according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Referring to fig. 1 and fig. 2, fig. 1 is a schematic flow chart of a high-precision three-dimensional reconstruction method based on a large depth of field according to an embodiment of the present invention, where the method includes steps S101 to S106.
S101, carrying out region division on a large depth-of-field measurement scene;
In this step, because a large depth-of-field scene is large, directly measuring it during three-dimensional reconstruction would reduce accuracy. The measurement space of the large depth-of-field scene is therefore divided along the depth direction into several small measurement regions.
In one embodiment, after step S101 the method includes:
Calibrating the positions of the imaging device and the projection device using a calibration algorithm;
and acquiring the control current values of the zoom lens of the imaging device: the overall maximum and minimum values for measuring the large depth-of-field scene, and the region control current value at which the zoom lens focuses on the centre of each region; the overall maximum and minimum values and the control current value corresponding to each region are recorded.
In this step, the positions of the imaging device and the projection device are first calibrated. A suitable depth range of the measured scene is then determined from the in-focus measurement depth range of the imaging device and the diopter adjustment range of its zoom lens. The diopter of the zoom lens is changed so that it focuses at the centre of each region in turn, and the corresponding control current values are recorded. Finally, the diopter is adjusted so that the imaging device focuses at the maximum and the minimum depth of the large depth-of-field measurement scene, and the control current values at those depths are recorded. In this embodiment, the minimum depth of the large depth-of-field measurement scene is 400 mm and the maximum depth is 1000 mm.
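The region division and current recording of this step can be sketched as follows. The linear current-versus-depth interpolation is an assumption made for illustration; the patent only requires recording the measured values:

```python
def divide_regions(z_min, z_max, n):
    # Split the depth range of the scene into n equal regions along the
    # depth direction; return (near, far, centre) for each region.
    step = (z_max - z_min) / n
    return [(z_min + i * step, z_min + (i + 1) * step, z_min + (i + 0.5) * step)
            for i in range(n)]

def control_current(z, z_min, z_max, i_min, i_max):
    # Hypothetical linear model of lens control current versus focus depth,
    # anchored at the recorded minimum/maximum current values.
    return i_min + (z - z_min) * (i_max - i_min) / (z_max - z_min)

regions = divide_regions(400.0, 1000.0, 6)  # 400 mm to 1000 mm, six regions
print(regions[0])  # (400.0, 500.0, 450.0)
```

In practice the per-region currents would simply be measured and stored, as the step describes; the interpolation only shows how intermediate depths could be handled.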
S102, performing binocular vision three-dimensional calibration on each divided area by using a three-dimensional measurement system to obtain calibration data; wherein the three-dimensional measurement system comprises an imaging device and a projection device;
In this step, a three-dimensional measurement system is formed from an imaging device with a zoom lens and a projection device with a MEMS galvanometer, and binocular-vision stereo calibration is performed on each region with this system to obtain calibration data. The imaging device may be a zoom camera with a zoom lens, and the projection device may be a projector with a MEMS galvanometer.
In an embodiment, the performing binocular vision stereo calibration on each divided region by using a three-dimensional measurement system to obtain calibration data includes:
establishing an imaging device coordinate system by taking the optical center of the imaging device as an origin and taking the optical axis of the imaging device as a Z axis; establishing a projection device coordinate system by taking the optical center of the projection device as an origin and taking the optical axis of the projection device as a Z axis;
and acquiring, with a binocular-vision stereo calibration algorithm, the conversion relation between the intrinsic parameters of the imaging device and the imaging-device coordinate system and the conversion relation between the intrinsic parameters of the projection device and the projection-device coordinate system, and calculating the conversion relation between the imaging-device coordinate system and the projection-device coordinate system to obtain the calibration data.
In this embodiment, the intrinsic parameters of the imaging device and the projection device are obtained, i.e. the intrinsic parameter matrices containing the focal length, the optical centre position and the number of pixels per unit distance. The intrinsic parameters of the imaging device have a corresponding conversion relation with the imaging-device coordinate system, and those of the projection device with the projection-device coordinate system; from these, the conversion relation between the imaging-device coordinate system and the projection-device coordinate system is calculated.
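The conversion relation between the two coordinate systems follows from their extrinsics with respect to a common world frame: if X_c = R_c X_w + T_c and X_p = R_p X_w + T_p, then X_p = R X_c + T with R = R_p R_c^T and T = T_p − R T_c. A small sketch (assuming numpy; the matrices below are made-up test values, not calibration results from the patent):

```python
import numpy as np

def camera_to_projector(Rc, Tc, Rp, Tp):
    # Rigid transform taking imaging-device coordinates to projection-device
    # coordinates, derived from the two world-frame extrinsics.
    R = Rp @ Rc.T
    T = Tp - R @ Tc
    return R, T

# Made-up extrinsics: camera translated along x, projector rotated 90 deg about z
Rc, Tc = np.eye(3), np.array([1.0, 0.0, 0.0])
Rp = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
Tp = np.array([0.0, 0.0, 1.0])
R, T = camera_to_projector(Rc, Tc, Rp, Tp)

# Sanity check: transform a world point through both routes
Xw = np.array([2.0, 3.0, 4.0])
Xc = Rc @ Xw + Tc
Xp = Rp @ Xw + Tp
print(np.allclose(R @ Xc + T, Xp))  # True
```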
The calibration process of the imaging device is as follows:
Assume a point P in a given region, with coordinates (X_W, Y_W, Z_W) in the world coordinate system, (X_c, Y_c, Z_c) in the imaging-device coordinate system, and projection coordinates (u, v) on the imaging plane of the imaging device. The perspective projection imaging process is:
s · [u, v, 1]^T = K [R T] [X_W, Y_W, Z_W, 1]^T = M [X_W, Y_W, Z_W, 1]^T
where s_x, s_y are the numbers of pixels per unit distance (pixel/mm) of the image plane along the corresponding image coordinate axes; (u_0, v_0), the projection on the image plane of the intersection of the optical axis of the imaging device with the image plane, i.e. of the optical centre, is called the principal point; f_x, f_y are the equivalent focal lengths along the corresponding image coordinate axes; R is a 3 × 3 orthogonal matrix and T a 3 × 1 vector, representing respectively the rotation and translation from the world coordinate system to the imaging-device coordinate system. Here s is a scale factor; [R T] is the extrinsic parameter matrix; (X_W, Y_W, Z_W, 1) and (u, v, 1) are the homogeneous coordinates of the spatial three-dimensional point P and of its image point; M is the projection matrix; and K is the intrinsic parameter matrix:
K = [ f_x  0    u_0 ;  0  f_y  v_0 ;  0  0  1 ]
The conversion from the world coordinate system to the imaging-device coordinate system is:
[X_c, Y_c, Z_c]^T = R [X_W, Y_W, Z_W]^T + T
Since the imaging device deviates from this ideal model during imaging, the radial and tangential distortion must be calculated; they are expressed respectively as:
δ_Rx = x(k_1 r² + k_2 r⁴ + k_3 r⁶),  δ_Ry = y(k_1 r² + k_2 r⁴ + k_3 r⁶)
δ_Tx = 2 p_1 x y + p_2 (r² + 2x²),  δ_Ty = p_1 (r² + 2y²) + 2 p_2 x y
Here δ_Rx and δ_Ry are the radial distortions in the x- and y-directions; δ_Tx and δ_Ty are the tangential distortions in the x- and y-directions; (x, y) are the ideal image coordinates; r = √(x² + y²) is the distance from the ideal image point to the principal point; k_1, k_2 and k_3 are the radial distortion parameters; p_1 and p_2 are the tangential distortion parameters. Taking both distortion errors into account, the conversion of the ideal image point (x, y) into the distorted image point (x', y') can be expressed as: x' = x + δ_Rx + δ_Tx, y' = y + δ_Ry + δ_Ty.
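The distortion model above can be applied directly to an ideal image point; a minimal sketch (the function name is illustrative):

```python
def distort(x, y, k1, k2, k3, p1, p2):
    # Apply the radial (delta_R) and tangential (delta_T) distortion terms
    # of the model above to an ideal image point (x, y).
    r2 = x * x + y * y
    radial = k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    dx = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    dy = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x + dx, y + dy

print(distort(0.1, 0.0, 0.5, 0.0, 0.0, 0.0, 0.0))  # ≈ (0.1005, 0.0)
```

With all five coefficients zero the point is returned unchanged, which is a convenient sanity check on the implementation.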
For each region, the three-dimensional measurement system acquires target images of the planar target at different positions, and the position of the imaging device is calculated from these images with Zhang's calibration algorithm. To collect the target images, the planar target is first placed in the middle of the region and an image is acquired with the imaging device; the target is then moved to another position and imaged again, and repeating this position change several times yields several groups of target images. The overall flow of Zhang's calibration algorithm is as follows: capture several images of the planar target at different positions with the imaging device; detect the feature points of each target image; solve the intrinsic and extrinsic parameters of the imaging device under the ideal, distortion-free assumption; refine this estimate by maximum-likelihood estimation; solve the actual radial distortion coefficients of the imaging device by least squares; and finally, jointly optimize the intrinsic parameters, extrinsic parameters and radial distortion coefficients of the imaging device by the maximum-likelihood method to improve the estimation accuracy.
The calibration process of the projection device is as follows:
During calibration of the projection device, the correspondence between the imaging-device image and the projection-device image is obtained with the phase-shift technique. The projection device projects a set of horizontal-stripe phase-shift images and Gray-code images onto the planar target, and the imaging device collects target images. The phase shift plus Gray code method then yields the absolute phase value φ_h in the horizontal direction at the target circle centre (u_C, v_C); this absolute phase identifies the corresponding horizontal line of the projector image, whose coordinate value v_P is:
v_P = φ_h · H / (2π · N_h)
where N_h is the total number of stripes of the horizontal phase-shift pattern and H is the vertical resolution of the projection-device image. Similarly, the projection device projects a set of vertical-stripe phase-shift and Gray-code patterns to obtain the absolute phase value φ_v in the vertical direction at the same circle centre (u_C, v_C), and the corresponding coordinate value u_P on the projection-device image:
u_P = φ_v · W / (2π · N_v)
where N_v is the total number of stripes of the vertical phase-shift pattern and W is the horizontal resolution of the projection-device image. The parameters of the projection device are calibrated by the same method as the imaging device, yielding the intrinsic parameter matrix of the projection device.
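The two phase-to-projector-coordinate formulas above can be sketched directly:

```python
import math

def projector_row(phi_h, H, Nh):
    # N_h fringes span the vertical resolution H, so one fringe covers H/Nh
    # rows and 2*pi of absolute phase: v_P = phi_h * H / (2*pi*Nh).
    return phi_h * H / (2 * math.pi * Nh)

def projector_col(phi_v, W, Nv):
    # Same mapping for the vertical-stripe set: u_P = phi_v * W / (2*pi*Nv).
    return phi_v * W / (2 * math.pi * Nv)

# One full period of absolute phase maps to one fringe period of rows:
print(projector_row(2 * math.pi, 1080, 18))  # 60.0
```

The 1080-row resolution and 18-fringe count are example values, not parameters stated in the patent.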
For each region, the projection device is calibrated after the imaging device. The projection device projects patterns onto the planar target and the imaging device collects target images of the patterned target; the position of the planar target is then changed and the projection and acquisition are repeated, giving several groups of target images after several position changes. From the acquired target images, the orthogonal absolute phase distribution map of each target image is calculated. Specifically, the method may perform phase decoding with Gray code combined with the phase-shift method. The Gray code plus phase shift method reduces the number of Gray-code encoding bits, speeds up decoding, and remedies the difficulty that a pure phase-shift method or a pure Gray-code method has in reconstructing discontinuous positions. The specific coding scheme is as follows: first, a series of Gray-code black-and-white stripe patterns is projected onto the measured object, each area with the same code serving as one code period; then the phase-shift patterns are projected in sequence by the phase-shift method, so that each code area is further subdivided continuously. Pairs of homonymous points on the image planes of the imaging device and the projection device are then found through the orthogonal absolute phase distribution map of the target image: taking an image point on the imaging-device image plane as the point to be matched, the first sub-pixel point with the same absolute phase is sought on the projection-device image plane.
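A minimal sketch of combining the Gray-coded fringe order with the wrapped phase, using the standard Gray-to-binary decoding (the function names are illustrative):

```python
import math

def gray_to_binary(g):
    # Standard Gray-code decode: XOR-fold the successively shifted bits.
    b = g
    while g:
        g >>= 1
        b ^= g
    return b

def absolute_phase(wrapped, gray_code):
    # The Gray code identifies the fringe period index k; the phase-shift
    # result supplies the wrapped phase within it: Phi = wrapped + 2*pi*k.
    k = gray_to_binary(gray_code)
    return wrapped + 2 * math.pi * k

print(gray_to_binary(0b110))  # 4 (Gray 110 encodes binary 100)
```

Because adjacent Gray codes differ in a single bit, a decoding error at a period boundary shifts the result by only one period, which is why the scheme is robust at fringe discontinuities.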
Finally, the position of the projection device is calculated from the target images using Zhang Zhengyou's calibration algorithm.
S103, projecting specified patterns to the plane flat plates at different positions by using a projection device in different areas, acquiring flat plate images of the plane flat plates at different positions by using an imaging device, calculating to obtain absolute phase distribution maps of the plane flat plates at different positions, and calculating to obtain three-dimensional information of the plane flat plates at different positions according to the calibration data;
in this step, in each region, the projection device with the MEMS galvanometer projects a specified pattern onto the planar flat plate at different positions, the imaging device acquires flat plate images, the absolute phase distribution map of the planar flat plate at each position is calculated from the flat plate images, and the three-dimensional information of the planar flat plate is calculated using calibration data acquired in advance. The calibration data includes: the intrinsic parameters of the imaging device and the projection device, the distortion coefficients of the zoom lens of the imaging device, and the conversion relationship between the two coordinate systems.
The specific acquisition process of the flat image of the flat panel is as follows: the plane flat plate is placed in a measuring area, the projection device projects an orthogonal sine phase shift stripe pattern and a Gray code coding pattern on the plane flat plate, the imaging device shoots and collects flat plate images of the plane flat plate at different positions, then the position of the plane flat plate is changed, and the projection and collection processes are repeated to obtain a plurality of groups of flat plate image data.
In an embodiment, the calculating the three-dimensional information of the flat panel at different positions according to the calibration data includes:
the three-dimensional information is calculated according to the following formula:
s_C [u_C, v_C, 1]^T = K_C M_C [X_W, Y_W, Z_W, 1]^T

s_P [u_P, v_P, 1]^T = K_P I [X_W, Y_W, Z_W, 1]^T

where s_C and s_P are the scale factors of the imaging device and the projection device respectively, K_C and K_P are their internal parameter matrices, M_C is the external parameter matrix of the imaging device, I is an identity matrix, (u_C, v_C) and (u_P, v_P) are the image coordinates of the imaging device and the projection device after correction of the distortion parameters of the three-dimensional measurement system, and T denotes matrix transposition.
In this embodiment, the three-dimensional reconstruction of point P by the three-dimensional measurement system can be expressed as:

s_C [u_C, v_C, 1]^T = K_C M_C [X_W, Y_W, Z_W, 1]^T

s_P [u_P, v_P, 1]^T = K_P M_P [X_W, Y_W, Z_W, 1]^T

where s_C and s_P are the scale factors of the imaging device and the projection device respectively, K_C and K_P are their internal parameter matrices, and M_C and M_P are their external parameter matrices. The structural parameters of the three-dimensional measurement system are r, the rotation vector from the projection device coordinate system to the imaging device coordinate system, and t, the translation vector of the same conversion. The world coordinate system is established in the projection device coordinate system, so that R_P is an identity matrix, T_P is a zero vector, R_C is the rotation from the world coordinate system to the imaging device coordinate system, T_C is the corresponding translation, and M_P = I; the three-dimensional reconstruction then transforms to:

s_C [u_C, v_C, 1]^T = K_C M_C [X_W, Y_W, Z_W, 1]^T

s_P [u_P, v_P, 1]^T = K_P I [X_W, Y_W, Z_W, 1]^T

where I is an identity matrix, and (u_C, v_C) and (u_P, v_P) are the image coordinates of the imaging device and the projection device after correction of the system distortion parameters. After the three-dimensional coordinates (X_W, Y_W, Z_W) of point P are solved, they are combined with the absolute phase value Φ_1 of point P to give one sample of the phase-to-three-dimensional mapping coefficient table for one imaging device pixel in one region.
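The pair of projection equations above can be solved for (X_W, Y_W, Z_W) by linear (DLT) triangulation. A minimal sketch follows; the function name and sample camera parameters are hypothetical, and this standard least-squares formulation stands in for whatever solver the patent actually uses:

```python
import numpy as np

def triangulate(uv_c, uv_p, P_c, P_p):
    """Solve s_C*[u_C,v_C,1]^T = P_c @ X and s_P*[u_P,v_P,1]^T = P_p @ X
    for the homogeneous world point X via SVD.

    P_c = K_C @ M_C and P_p = K_P @ [I | 0] are 3x4 projection matrices;
    uv_c, uv_p are distortion-corrected image coordinates."""
    (u_c, v_c), (u_p, v_p) = uv_c, uv_p
    A = np.array([
        u_c * P_c[2] - P_c[0],
        v_c * P_c[2] - P_c[1],
        u_p * P_p[2] - P_p[0],
        v_p * P_p[2] - P_p[1],
    ])
    X = np.linalg.svd(A)[2][-1]   # null-space direction of A
    return X[:3] / X[3]           # back to inhomogeneous (X_W, Y_W, Z_W)

# example: projector at the world origin, camera shifted 0.1 m along X
K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
P_p = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
M_c = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])
P_c = K @ M_c
point = triangulate((400.0, 320.0), (480.0, 320.0), P_c, P_p)
```

With these synthetic matrices the two image observations triangulate back to the world point (0.2, 0.1, 1.0).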
S104, in each area, acquiring the absolute phase of each pixel point of the imaging device in the absolute phase distribution map of the plane panel, and establishing a three-dimensional mapping coefficient table of the corresponding area according to the mapping relation between the three-dimensional information of each plane panel and the absolute phase of the corresponding pixel point;
in this step, according to the pre-calculated absolute phase distribution maps of the planar flat plate, the absolute phase corresponding to each pixel point of the imaging device in the different areas is obtained; the mapping relationship between each pixel point and the corresponding three-dimensional information is then obtained, and the three-dimensional mapping coefficient table of the corresponding area is established. In each region, the flat plate at one position provides one set of sampling data; that is, a pixel point corresponds to one three-dimensional space point and one phase value. The planar flat plates at multiple positions provide multiple groups of sampling data, and the mapping coefficient table for each pixel in one area is obtained by fitting the mapping coefficients.
In one embodiment, the step S104 includes:
according to the three-dimensional information corresponding to the pixel points, calculating the three-dimensional mapping coefficients according to the following formulas, and establishing the three-dimensional mapping coefficient table:

X = Σ_{n=0}^{N} a_n · Φ^n,  Y = Σ_{n=0}^{N} b_n · Φ^n,  Z = Σ_{n=0}^{N} c_n · Φ^n

where a_n, b_n and c_n are the three-dimensional mapping coefficients for the X, Y and Z dimensions respectively, N is the polynomial order, and Φ is the absolute phase of the corresponding pixel point.
In this embodiment, suppose a certain pixel m_C has absolute phase Φ_C and corresponds to the three-dimensional space point (X, Y, Z). From the three-dimensional information of the pixel point the following can be derived:

X = Σ_{n=0}^{N} a_n · Φ^n,  Y = Σ_{n=0}^{N} b_n · Φ^n,  Z = Σ_{n=0}^{N} c_n · Φ^n

where a_n, b_n and c_n are the mapping coefficients corresponding to the three spatial dimensions. The mapping coefficients {a_n, b_n, c_n} corresponding to each pixel point are calculated from these formulas, and a three-dimensional mapping coefficient table relating the absolute phase of each pixel point in each area is established.
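Fitting the per-pixel mapping coefficients {a_n, b_n, c_n} from the sampled (phase, 3D point) pairs is an ordinary least-squares problem in the powers of Φ. A sketch, with hypothetical function name and illustrative polynomial order and data:

```python
import numpy as np

def fit_pixel_mapping(phases, points, order=3):
    """Fit X(Φ)=Σa_nΦ^n, Y(Φ)=Σb_nΦ^n, Z(Φ)=Σc_nΦ^n for one pixel.

    phases : (M,) absolute phases of this pixel over M plate positions
    points : (M, 3) the corresponding reconstructed (X, Y, Z)
    Returns an (order+1, 3) array whose columns are {a_n}, {b_n}, {c_n}."""
    V = np.vander(phases, order + 1, increasing=True)  # columns Φ^0 .. Φ^N
    coeffs, *_ = np.linalg.lstsq(V, points, rcond=None)
    return coeffs

# example: recover known cubic coefficients from 8 plate positions
phases = np.linspace(1.0, 8.0, 8)
true = np.array([[1, 2, 3], [0.5, 0, 1], [0, 0.1, 0], [0.01, 0, 0.02]])
points = np.vander(phases, 4, increasing=True) @ true
coeffs = fit_pixel_mapping(phases, points, order=3)
```

More plate positions than coefficients (M > N + 1) makes the fit overdetermined, which is why the method benefits from sampling the flat plate at multiple positions.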
S105, projecting a target pattern to a measured object by using a projection device, acquiring a target focal scanning image of the measured object by using an imaging device, and performing deblurring processing on the target focal scanning image to obtain a target image and calculate an absolute phase distribution map of the target image;
in this step, a projection device with a MEMS galvanometer projects a pattern onto the measured object, the imaging device acquires a target focal-scanning image of the measured object under a single-frame exposure condition, the target focal-scanning image is deblurred to obtain a target image, and the absolute phase distribution map of the target image is then calculated. The imaging device continuously scans the focal plane within a single-frame exposure to obtain the target focal-scanning image, the single-frame exposure time being one period of the control current of the imaging device's zoom lens. The control current varies with time as a triangular wave whose maximum and minimum values are the maximum and minimum of the current range over which the imaging device can focus across the entire large-depth-of-field measurement scene. The single-frame exposure time of the acquired target focal-scanning image is thus controlled by controlling the current value of the imaging device.
When the target focal-scanning pattern of each measured object is collected, the control current of the zoom lens has period T, maximum value I_H and minimum value I_L. The current as a function of time t is the triangular wave:

I(t) = I_L + 2(I_H − I_L)(t − nT)/T,        for nT ≤ t < nT + T/2
I(t) = I_H − 2(I_H − I_L)(t − nT − T/2)/T,  for nT + T/2 ≤ t < (n + 1)T

where n is a natural number.
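The triangular control-current waveform can be written directly in code; a minimal sketch with hypothetical names and values:

```python
def control_current(t, period, i_low, i_high):
    """Triangular control current: rises from i_low to i_high over the
    first half period, falls back to i_low over the second half."""
    tau = t % period          # position within the current period
    if tau < period / 2:
        return i_low + 2 * (i_high - i_low) * tau / period
    return i_high - 2 * (i_high - i_low) * (tau - period / 2) / period

# hypothetical lens: period 2 ms, current swinging between 10 and 50 mA
samples = [control_current(t, 2.0, 10.0, 50.0) for t in (0.0, 0.5, 1.0, 1.5, 3.0)]
```

Sampling at t = 0, T/4, T/2, 3T/4 and 3T/2 gives 10, 30, 50, 30 and 50 mA, i.e. the lens sweeps its full focus range once per half period.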
In one embodiment, the step S105 includes:
inputting the target focal-scanning image into a deconvolution operation with an integral point spread function to obtain the deblurred target image;
the calculation formula of the integral point spread function is as follows:
where r is the distance to the center of the circle of confusion of the object point imaged on the sensor plane of the imaging device; b_0 is the circle-of-confusion diameter of the object point imaged on the sensor plane at the moment the control current of the electronic zoom lens is 0; b_1 is the spot diameter of the object point focused on the imaging device sensor plane; b_2 is the circle-of-confusion diameter of the object point imaged on the sensor plane at the half-period moment T/2 of the control current of the electronic zoom lens; C_1 and C_2 are two constants.
In this embodiment, an integral point spread function is constructed based on the focal-sweep model of the imaging device, and a deconvolution operation is performed on the target focal-scanning image using the integral point spread function to obtain the deblurred target image. The calculation formula of the integral point spread function is as described above, and the deconvolution is implemented with a Wiener filter.
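The Wiener-filtered deconvolution step can be sketched as a frequency-domain division regularised by a noise constant k. This generic formulation stands in for the patent's integral-PSF deconvolution (whose exact formula is given only in the specification figures); the function name and the value of k are assumptions:

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=0.01):
    """Wiener filter W = H* / (|H|^2 + k): divides out the PSF while
    suppressing the noise amplification a plain inverse filter 1/H causes."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(G * W))

# example: blur an impulse with a 3x3 box PSF, then restore it
img = np.zeros((16, 16))
img[5, 7] = 1.0
psf = np.ones((3, 3)) / 9.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf, s=img.shape)))
restored = wiener_deconvolve(blurred, psf, k=1e-8)
```

Larger k trades restoration sharpness for robustness to sensor noise; in practice it is tuned to the noise level of the focal-scanning images.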
S106, obtaining the absolute phase of each pixel point of the target image in the absolute phase distribution map of the target image, searching a corresponding three-dimensional mapping coefficient in a three-dimensional mapping coefficient table of a corresponding region according to the region to which the absolute phase belongs, and calculating by using the three-dimensional mapping coefficient to obtain a corresponding space three-dimensional point coordinate.
In this step, the area to which each pixel point corresponding to the target focal-scanning image belongs is determined according to the absolute phase distribution map of the target focal-scanning image, then the corresponding three-dimensional mapping coefficient is found in the three-dimensional mapping coefficient table of the corresponding area, and the three-dimensional mapping coefficient is used to calculate the three-dimensional point coordinates of the corresponding space.
In one embodiment, the step S106 includes:
calculating the spatial three-dimensional point coordinates by the following formulas:

X = Σ_{n=0}^{N} a_n · Φ^n,  Y = Σ_{n=0}^{N} b_n · Φ^n,  Z = Σ_{n=0}^{N} c_n · Φ^n

where {a_n, b_n, c_n} are the three-dimensional mapping coefficients of the corresponding area, Φ is the absolute phase of the pixel point, and N is the polynomial order.
In this embodiment, the absolute phase of each pixel point of the imaging device in the target focal-scanning image is obtained as the target phase, the region to which the target phase belongs is determined, the corresponding three-dimensional mapping coefficients {a_n, b_n, c_n} are found in the three-dimensional mapping coefficient table of that region, and the spatial three-dimensional point coordinates are calculated from the absolute phase and the three-dimensional mapping coefficients.
As shown in fig. 2, suppose the absolute phase of a certain pixel is Φ, and the phase value range of the pixel is Φ_1^1 to Φ_1^n in region 1, Φ_2^1 to Φ_2^n in region 2, Φ_3^1 to Φ_3^n in region 3, and Φ_n^1 to Φ_n^n in region n. The region of the absolute phase is determined: if Φ_2^1 ≤ Φ ≤ Φ_2^n, the pixel point belongs to region 2; the three-dimensional mapping coefficients corresponding to the pixel point are then found in the three-dimensional mapping coefficient table of region 2, and the corresponding spatial three-dimensional point coordinates are calculated as X = Σ a_n Φ^n, Y = Σ b_n Φ^n, Z = Σ c_n Φ^n, where {a_n, b_n, c_n} are the three-dimensional mapping coefficients corresponding to region 2.
If both Φ_2^1 ≤ Φ ≤ Φ_2^n and Φ_1^1 ≤ Φ ≤ Φ_1^n hold, i.e. the phase ranges of the two regions overlap, the region to which Φ belongs is decided by a discriminant L: when L ≥ 0, the phase value of the image point belongs to region 1; when L < 0, it belongs to region 2.
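Putting the region lookup and the polynomial evaluation together, a minimal sketch follows. The data layout and function name are hypothetical, and the overlap discriminant L is omitted because its exact form appears only in the patent figure:

```python
import numpy as np

def reconstruct_point(phi, region_ranges, region_tables):
    """Find the region whose phase range contains phi, then evaluate
    X = Σ a_n·phi^n (and likewise Y, Z) with that region's coefficients.

    region_ranges : list of (lo, hi) absolute-phase bounds per region
    region_tables : list of (a, b, c) coefficient arrays, low power first."""
    for (lo, hi), (a, b, c) in zip(region_ranges, region_tables):
        if lo <= phi <= hi:
            powers = phi ** np.arange(len(a))   # [1, phi, phi^2, ...]
            return float(a @ powers), float(b @ powers), float(c @ powers)
    raise ValueError("absolute phase outside all calibrated regions")

# example with two regions and first-order (N = 1) mappings
ranges = [(0.0, 10.0), (10.0, 20.0)]
tables = [(np.array([0.0, 1.0]), np.array([1.0, 0.0]), np.array([0.0, 2.0])),
          (np.array([5.0, 1.0]), np.array([2.0, 0.5]), np.array([0.0, 1.0]))]
x, y, z = reconstruct_point(12.0, ranges, tables)
```

A phase of 12.0 falls in the second region, so the second region's coefficients yield (X, Y, Z) = (17.0, 8.0, 12.0).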
Referring to fig. 3, fig. 3 is a schematic block diagram of a large depth-of-field based high-precision three-dimensional reconstruction apparatus 200 according to an embodiment of the present invention, including:
the region dividing unit 201 is configured to perform region division on a large depth-of-field measurement scene;
a calibration data obtaining unit 202, configured to perform binocular vision three-dimensional calibration on each divided region by using a three-dimensional measurement system, so as to obtain calibration data; wherein the three-dimensional measurement system comprises an imaging device and a projection device;
the three-dimensional information acquisition unit 203 is used for projecting specified patterns to the plane flat plates at different positions by using a projection device in different areas, acquiring flat plate images of the plane flat plates at different positions by using an imaging device, calculating to obtain absolute phase distribution maps of the plane flat plates at different positions, and calculating to obtain three-dimensional information of the plane flat plates at different positions according to the calibration data;
a three-dimensional mapping coefficient table obtaining unit 204, configured to obtain, in each region, an absolute phase of each pixel of the imaging device in the absolute phase distribution map of the planar plate, and establish a three-dimensional mapping coefficient table of a corresponding region according to a mapping relationship between three-dimensional information of each planar plate and the absolute phase of the corresponding pixel;
a target focal-scanning image obtaining unit 205, configured to project a target pattern to a measured object by using a projection device, acquire a target focal-scanning image of the measured object by using an imaging device, perform deblurring processing on the target focal-scanning image, obtain a target image, and calculate an absolute phase distribution map of the target image;
and the spatial three-dimensional point coordinate obtaining unit 206 is configured to obtain an absolute phase of each pixel point of the target image in the absolute phase distribution map of the target image, search a corresponding three-dimensional mapping coefficient in a three-dimensional mapping coefficient table of a corresponding region according to a region to which the absolute phase belongs, and calculate a corresponding spatial three-dimensional point coordinate by using the three-dimensional mapping coefficient.
In an embodiment, the area dividing unit 201 includes:
the position calibration unit is used for calibrating the positions of the imaging device and the projection device by using a calibration algorithm;
and the control current value recording unit is used for acquiring the maximum value and the minimum value of the total control current of the large-depth-of-field measurement scene measured by the zoom lens of the imaging device, the corresponding area control current value when the zoom lens of the imaging device focuses on the central position of each area, and recording the maximum value and the minimum value of the total control current and the corresponding area control current value of each area.
In an embodiment, the calibration data obtaining unit 202 includes:
a coordinate system establishing unit, configured to establish a coordinate system of the imaging device with an optical center of the imaging device as an origin and an optical axis of the imaging device as a Z-axis; establishing a projection device coordinate system by taking the optical center of the projection device as an origin and taking the optical axis of the projection device as a Z axis;
and the calibration data calculation unit is used for acquiring the conversion relation between the internal parameters of the imaging device and the coordinate system of the imaging device and the conversion relation between the internal parameters of the projection device and the projection coordinate system by using a binocular vision stereo calibration algorithm, and calculating the conversion relation between the coordinate system of the imaging device and the projection coordinate system to obtain calibration data.
In one embodiment, the three-dimensional information obtaining unit 203 includes:
a three-dimensional information formula calculation unit for calculating three-dimensional information according to the following formula:
s_C [u_C, v_C, 1]^T = K_C M_C [X_W, Y_W, Z_W, 1]^T

s_P [u_P, v_P, 1]^T = K_P I [X_W, Y_W, Z_W, 1]^T

where s_C and s_P are the scale factors of the imaging device and the projection device respectively, K_C and K_P are their internal parameter matrices, M_C is the external parameter matrix of the imaging device, I is an identity matrix, (u_C, v_C) and (u_P, v_P) are the image coordinates of the imaging device and the projection device after correction of the distortion parameters of the three-dimensional measurement system, and T denotes matrix transposition.
In one embodiment, the three-dimensional mapping coefficient table obtaining unit 204 includes:
the three-dimensional mapping coefficient calculating unit is used for calculating a three-dimensional mapping coefficient according to the three-dimensional information corresponding to the pixel points and the following formula, and establishing a three-dimensional mapping coefficient table:
X = Σ_{n=0}^{N} a_n · Φ^n,  Y = Σ_{n=0}^{N} b_n · Φ^n,  Z = Σ_{n=0}^{N} c_n · Φ^n

where a_n, b_n and c_n are the three-dimensional mapping coefficients for the X, Y and Z dimensions respectively, N is the polynomial order, and Φ is the absolute phase of the corresponding pixel point.
In one embodiment, the target focal scan image acquisition unit 205 includes:
the de-blurring processing unit is used for inputting the target focal-scanning image into a deconvolution operation with an integral point spread function to obtain the deblurred target image;
an integral point spread function calculation unit, configured to calculate an integral point spread function according to the following formula:
where r is the distance to the center of the circle of confusion of the object point imaged on the sensor plane of the imaging device; b_0 is the circle-of-confusion diameter of the object point imaged on the sensor plane at the moment the control current of the electronic zoom lens is 0; b_1 is the spot diameter of the object point focused on the imaging device sensor plane; b_2 is the circle-of-confusion diameter of the object point imaged on the sensor plane at the half-period moment T/2 of the control current of the electronic zoom lens; C_1 and C_2 are two constants.
In one embodiment, the spatial three-dimensional point coordinate obtaining unit 206 includes:
a spatial three-dimensional point coordinate calculation unit for calculating the spatial three-dimensional point coordinates by the following formula:
X = Σ_{n=0}^{N} a_n · Φ^n,  Y = Σ_{n=0}^{N} b_n · Φ^n,  Z = Σ_{n=0}^{N} c_n · Φ^n

where {a_n, b_n, c_n} are the three-dimensional mapping coefficients of the corresponding area, Φ is the absolute phase of the pixel point, and N is the polynomial order.
The embodiment of the present invention further provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the processor implements the high-precision three-dimensional reconstruction method based on the large depth of field as described above.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the method for high-precision three-dimensional reconstruction based on large depth of field as described above is implemented.
The embodiments are described in a progressive manner in the specification, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. For the system disclosed by the embodiment, the description is relatively simple because the system corresponds to the method disclosed by the embodiment, and the relevant points can be referred to the method part for description. It should be noted that, for those skilled in the art, it is possible to make various improvements and modifications to the present invention without departing from the principle of the present invention, and those improvements and modifications also fall within the scope of the claims of the present invention.
It is further noted that, in the present specification, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
Claims (10)
1. A high-precision three-dimensional reconstruction method based on large depth of field is characterized by comprising the following steps:
carrying out region division on a large depth-of-field measurement scene;
performing binocular vision three-dimensional calibration on each divided area by using a three-dimensional measurement system to obtain calibration data; wherein the three-dimensional measurement system comprises an imaging device and a projection device;
projecting specified patterns to the plane flat plates at different positions by using a projection device in different areas, acquiring flat plate images of the plane flat plates at different positions by using an imaging device, calculating to obtain absolute phase distribution maps of the plane flat plates at different positions, and calculating to obtain three-dimensional information of the plane flat plates at different positions according to the calibration data;
in each region, acquiring the absolute phase of each pixel point of the imaging device in the absolute phase distribution map of the plane panel, and establishing a three-dimensional mapping coefficient table of the corresponding region according to the mapping relation between the three-dimensional information of each plane panel and the absolute phase of the corresponding pixel point;
projecting a target pattern to a measured object by using a projection device, acquiring a target focal scanning image of the measured object by using an imaging device, and performing deblurring processing on the target focal scanning image to obtain a target image and calculate an absolute phase distribution diagram of the target image;
and acquiring the absolute phase of each pixel point of the target image in the absolute phase distribution map of the target image, searching a corresponding three-dimensional mapping coefficient in a three-dimensional mapping coefficient table of a corresponding region according to the region to which the absolute phase belongs, and calculating by using the three-dimensional mapping coefficient to obtain a corresponding space three-dimensional point coordinate.
2. The large depth of field based high-precision three-dimensional reconstruction method according to claim 1, wherein after the area division of the large depth of field measurement scene, the method comprises:
calibrating the positions of the imaging device and the projection device by using a calibration algorithm;
and acquiring the maximum value and the minimum value of the total control current of the large-depth-of-field measurement scene measured by the zoom lens of the imaging device, and the corresponding area control current value when the zoom lens of the imaging device is focused on the central position of each area, and recording the maximum value and the minimum value of the total control current and the area control current value corresponding to each area.
3. The large depth-of-field based high-precision three-dimensional reconstruction method according to claim 1, wherein the performing binocular vision three-dimensional calibration on each divided region by using the three-dimensional measurement system to obtain calibration data comprises:
establishing an imaging device coordinate system by taking the optical center of the imaging device as an origin and taking the optical axis of the imaging device as a Z axis; establishing a projection device coordinate system by taking the optical center of the projection device as an origin and taking the optical axis of the projection device as a Z axis;
and acquiring a conversion relation between the intrinsic parameters of the imaging device and a coordinate system of the imaging device and a conversion relation between the intrinsic parameters of the projection device and the projection coordinate system by using a binocular vision stereo calibration algorithm, and calculating the conversion relation between the coordinate system of the imaging device and the projection coordinate system to obtain calibration data.
4. The large depth-of-field based high-precision three-dimensional reconstruction method according to claim 1, wherein the calculating three-dimensional information of the planar flat plate at different positions according to the calibration data includes:
the three-dimensional information is calculated according to the following formula:
s_C [u_C, v_C, 1]^T = K_C M_C [X_W, Y_W, Z_W, 1]^T

s_P [u_P, v_P, 1]^T = K_P I [X_W, Y_W, Z_W, 1]^T

where s_C and s_P are the scale factors of the imaging device and the projection device respectively, K_C and K_P are their internal parameter matrices, M_C is the external parameter matrix of the imaging device, I is an identity matrix, (u_C, v_C) and (u_P, v_P) are the image coordinates of the imaging device and the projection device after correction of the distortion parameters of the three-dimensional measurement system, and T denotes matrix transposition.
5. The method according to claim 1, wherein in each region, acquiring an absolute phase of each pixel of the imaging device in the absolute phase distribution map of the flat panel, and establishing a three-dimensional mapping coefficient table of the corresponding region according to a mapping relationship between three-dimensional information of each flat panel and the absolute phase of the corresponding pixel, comprises:
according to the three-dimensional information corresponding to the pixel points, calculating a three-dimensional mapping coefficient according to the following formula, and establishing a three-dimensional mapping coefficient table:
X = Σ_{n=0}^{N} a_n · Φ^n,  Y = Σ_{n=0}^{N} b_n · Φ^n,  Z = Σ_{n=0}^{N} c_n · Φ^n

where a_n, b_n and c_n are the three-dimensional mapping coefficients for the X, Y and Z dimensions respectively, N is the polynomial order, and Φ is the absolute phase of the corresponding pixel point.
6. The large-depth-of-field based high-precision three-dimensional reconstruction method according to claim 1, wherein the projecting device projects a target pattern to a measured object, an imaging device acquires a target focal-scanning image of the measured object, and the target focal-scanning image is deblurred to obtain a target image and calculate an absolute phase distribution map of the target image, and the method comprises:
inputting the target focal-scanning image into a deconvolution operation with an integral point spread function to obtain the deblurred target image;
the calculation formula of the integral point spread function is as follows:
where r is the distance to the center of the circle of confusion of the object point imaged on the sensor plane of the imaging device; b_0 is the circle-of-confusion diameter of the object point imaged on the sensor plane at the moment the control current of the electronic zoom lens is 0; b_1 is the spot diameter of the object point focused on the imaging device sensor plane; b_2 is the circle-of-confusion diameter of the object point imaged on the sensor plane at the half-period moment T/2 of the control current of the electronic zoom lens; C_1 and C_2 are two constants.
7. The large-depth-of-field based high-precision three-dimensional reconstruction method according to claim 1, wherein the obtaining of the absolute phase of each pixel point of the target image in the absolute phase distribution map of the target image, looking up a corresponding three-dimensional mapping coefficient in a three-dimensional mapping coefficient table of a corresponding region according to a region to which the absolute phase belongs, and calculating by using the three-dimensional mapping coefficient to obtain a corresponding spatial three-dimensional point coordinate comprises:
calculating the spatial three-dimensional point coordinates by the following formula:
X = Σ_{n=0}^{N} a_n · Φ^n,  Y = Σ_{n=0}^{N} b_n · Φ^n,  Z = Σ_{n=0}^{N} c_n · Φ^n

where {a_n, b_n, c_n} are the three-dimensional mapping coefficients of the corresponding area, Φ is the absolute phase of the pixel point, and N is the polynomial order.
8. A high-precision three-dimensional reconstruction device based on a large depth of field is characterized by comprising:
the area division unit is used for carrying out area division on the large depth-of-field measurement scene;
the calibration data acquisition unit is used for carrying out binocular vision three-dimensional calibration on each divided area by using the three-dimensional measurement system to obtain calibration data; wherein the three-dimensional measurement system comprises an imaging device and a projection device;
the three-dimensional information acquisition unit is used for projecting specified patterns to the plane flat plates at different positions by using the projection device in different areas, acquiring flat plate images of the plane flat plates at different positions by using the imaging device, calculating to obtain absolute phase distribution maps of the plane flat plates at different positions, and calculating to obtain three-dimensional information of the plane flat plates at different positions according to the calibration data;
the three-dimensional mapping coefficient table acquisition unit is used for acquiring the absolute phase of each pixel point of the imaging device in the absolute phase distribution map of the plane panel in each area and establishing a three-dimensional mapping coefficient table of the corresponding area according to the mapping relation between the three-dimensional information of each plane panel and the absolute phase of the corresponding pixel point;
the target focal-scanning image acquisition unit is used for projecting a target pattern to a measured object by using the projection device, acquiring a target focal-scanning image of the measured object by using the imaging device, and performing deblurring processing on the target focal-scanning image to obtain a target image and calculate an absolute phase distribution map of the target image;
and the space three-dimensional point coordinate acquisition unit is used for acquiring the absolute phase of each pixel point of the target image in the absolute phase distribution diagram of the target image, searching a corresponding three-dimensional mapping coefficient in a three-dimensional mapping coefficient table of a corresponding area according to the area to which the absolute phase belongs, and calculating by using the three-dimensional mapping coefficient to obtain the corresponding space three-dimensional point coordinate.
9. A computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the high-precision three-dimensional reconstruction method based on a large depth of field according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the high-precision three-dimensional reconstruction method based on a large depth of field according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110529545.1A CN113160393B (en) | 2021-05-14 | 2021-05-14 | High-precision three-dimensional reconstruction method and device based on large depth of field and related components thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113160393A true CN113160393A (en) | 2021-07-23 |
CN113160393B CN113160393B (en) | 2023-08-04 |
Family
ID=76875231
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110529545.1A Active CN113160393B (en) | 2021-05-14 | 2021-05-14 | High-precision three-dimensional reconstruction method and device based on large depth of field and related components thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113160393B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113989517A (en) * | 2021-11-01 | 2022-01-28 | 西北工业大学 | Image processing method and device and electronic equipment |
CN115546285A (en) * | 2022-11-25 | 2022-12-30 | 南京理工大学 | Large-field-depth fringe projection three-dimensional measurement method based on point spread function calculation |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106767405A (en) * | 2016-12-15 | 2017-05-31 | 深圳大学 | The method and device of the quick corresponding point matching of phase mapping assist three-dimensional imaging system |
CN107358631A (en) * | 2017-06-27 | 2017-11-17 | 大连理工大学 | A kind of binocular vision method for reconstructing for taking into account three-dimensional distortion |
CN110477936A (en) * | 2019-08-20 | 2019-11-22 | 新里程医用加速器(无锡)有限公司 | Beam-defining clipper scaling method, device, equipment and the medium of radiation imaging system |
CN111649691A (en) * | 2020-03-06 | 2020-09-11 | 福州大学 | Digital fringe projection three-dimensional imaging system and method based on single-pixel detector |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110447220B (en) | Calibration device, calibration method, optical device, imaging device, and projection device | |
CN104729429B (en) | A kind of three dimensional shape measurement system scaling method of telecentric imaging | |
RU2677562C2 (en) | System and method for modeling and calibrating imaging device | |
CN113205593B (en) | High-light-reflection surface structure light field three-dimensional reconstruction method based on point cloud self-adaptive restoration | |
Douxchamps et al. | High-accuracy and robust localization of large control markers for geometric camera calibration | |
CN112967342B (en) | High-precision three-dimensional reconstruction method and system, computer equipment and storage medium | |
CN104061879A (en) | Continuous-scanning structured light three-dimensional surface shape perpendicular measuring method | |
CN115775303B (en) | Three-dimensional reconstruction method for high-reflection object based on deep learning and illumination model | |
CN106504290A (en) | A kind of high-precision video camera dynamic calibrating method | |
CN113160393B (en) | High-precision three-dimensional reconstruction method and device based on large depth of field and related components thereof | |
CN111238403A (en) | Three-dimensional reconstruction method and device based on light field sub-aperture stripe image | |
CN108020175A (en) | A kind of more optical grating projection binocular vision tongue body surface three dimension entirety imaging methods | |
US11512946B2 (en) | Method and system for automatic focusing for high-resolution structured light 3D imaging | |
CN115546285B (en) | Large-depth-of-field stripe projection three-dimensional measurement method based on point spread function calculation | |
CN111080705A (en) | Calibration method and device for automatic focusing binocular camera | |
CN114792345B (en) | Calibration method based on monocular structured light system | |
CN115578296B (en) | Stereo video processing method | |
CN110108230B (en) | Binary grating projection defocus degree evaluation method based on image difference and LM iteration | |
CN113251953B (en) | Mirror included angle measuring device and method based on stereo deflection technology | |
Chen et al. | Finding optimal focusing distance and edge blur distribution for weakly calibrated 3-D vision | |
CN112767536B (en) | Three-dimensional reconstruction method, device and equipment for object and storage medium | |
CN111754587B (en) | Zoom lens rapid calibration method based on single-focus focusing shooting image | |
CN111998834A (en) | Crack monitoring method and system | |
CN109859313B (en) | 3D point cloud data acquisition method and device, and 3D data generation method and system | |
Ueno et al. | Compound-Eye Camera Module as Small as 8.5×8.5×6.0 mm for 26k-Resolution Depth Map and 2-Mpix 2D Imaging
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||