CN111288925A - Three-dimensional reconstruction method and device based on digital focusing structure illumination light field - Google Patents

Three-dimensional reconstruction method and device based on digital focusing structure illumination light field

Info

Publication number
CN111288925A
CN111288925A
Authority
CN
China
Prior art keywords
light field
image
plane
dimensional
fringe
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010055378.7A
Other languages
Chinese (zh)
Other versions
CN111288925B (en)
Inventor
刘甜 (Liu Tian)
胡祺 (Hu Qi)
桂宾 (Gui Bin)
丁毅 (Ding Yi)
王栋云 (Wang Dongyun)
胡国亮 (Hu Guoliang)
Current Assignee
Wuhan Fenghuo Kaizhuo Technology Co ltd
Original Assignee
Wuhan Fenghuo Kaizhuo Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Wuhan Fenghuo Kaizhuo Technology Co ltd
Priority to CN202010055378.7A
Publication of CN111288925A
Application granted
Publication of CN111288925B
Legal status: Active (granted)

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01B — MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 — Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B 11/254 — Projection of a pattern, viewing through a pattern, e.g. moiré
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 — Three dimensional [3D] modelling, e.g. data description of 3D objects

Abstract

An embodiment of the invention provides a three-dimensional reconstruction method and device based on a structured-illumination light field with digital focusing. The method comprises the following steps: acquiring four-dimensional light field information of the object to be measured; performing digital refocusing on the four-dimensional light field information with a spatial-domain digital focusing method to obtain a focusing plane sequence image; performing fringe analysis by Fourier transform profilometry to obtain a wrapped phase map; unwrapping the wrapped phase map to obtain the absolute phase difference between the reference fringe and the deformed fringe; and obtaining the height information of the surface of the object from the absolute phase difference, thereby recovering its three-dimensional shape information. The method obtains projection images at different depths of field through digital focusing and, after fringe analysis by Fourier transform profilometry, obtains the three-dimensional reconstruction result with a single-frequency projected-fringe spatial phase unwrapping technique.

Description

Three-dimensional reconstruction method and device based on digital focusing structure illumination light field
Technical Field
The embodiments of the invention relate to the field of stereoscopic vision, and in particular to a three-dimensional reconstruction method and device based on a structured-illumination light field with digital focusing.
Background
With the rapid development of information technology and robotics, the demand for accurately acquiring three-dimensional data of object surfaces for diverse applications keeps growing. High-precision three-dimensional measurement is widely used in the measurement and machining of precision industrial components, medical image reconstruction, aerial surveying and mapping, reverse engineering of models and instruments, and other fields. Beyond these traditional industrial and medical applications, emerging fields driven by the wave of artificial intelligence, such as autonomous driving, intelligent robots, and smart cities, also call for three-dimensional measurement technology. Over the past decades, research achievements in three-dimensional measurement have changed production and daily life in many respects, and new application scenarios and technical requirements keep emerging, demonstrating the development potential of this research field; fast, high-precision three-dimensional shape measurement can be expected to play an ever greater role across many fields.
Structured-light three-dimensional measurement is an active three-dimensional vision method that projects a coded template image onto the scene to be measured and recovers the scene's three-dimensional data from the modulated, deformed coded template. It can acquire high-precision three-dimensional shape data of an object in real time without contact. For three-dimensional topography measurement, Fourier transform profilometry (FTP), proposed by M. Takeda and K. Mutoh in 1983, is one of the most effective non-contact, fast methods. It is notably simple, convenient, and rapid: only a single frame of the fringe pattern needs to be selected (at most two frames), and Fourier transform, filtering, and inverse Fourier transform reproduce the three-dimensional figure of the object. It is therefore applicable to three-dimensional reconstruction of object surfaces in dynamic scenes.
Existing three-dimensional reconstruction methods photograph the object with a conventional camera: the user must carefully select camera parameters before shooting, mechanical refocusing is required many times during shooting, and a picture must be taken for each depth of field of the object, which is cumbersome and inefficient.
Disclosure of Invention
The embodiment of the invention provides a three-dimensional reconstruction method and a three-dimensional reconstruction device based on a structured-illumination light field with digital focusing.
In a first aspect, an embodiment of the present invention provides a three-dimensional reconstruction method based on a structured-illumination light field with digital focusing, including:
S1, acquiring four-dimensional light field information of the object to be measured; the four-dimensional light field information is obtained by photographing, with a light field camera, a deformed fringe image projected on the surface of the object to be measured;
s2, carrying out digital refocusing processing on the four-dimensional light field information by using a space domain digital focusing method to obtain a focusing plane sequence image;
s3, performing fringe analysis through Fourier transform profilometry based on the reference fringe image and the deformation fringe image to obtain a wrapped phase diagram;
s4, performing phase unwrapping on the wrapped phase diagram to obtain an absolute phase difference between the reference fringe and the deformed fringe;
and S5, acquiring the height information of the surface of the object to be detected according to the absolute phase difference, and recovering the three-dimensional shape information of the object to be detected according to the height information and the focusing plane sequence image.
Further, before the step S1 of acquiring the four-dimensional light field information of the object to be measured, the method further includes:
projecting a sine stripe image with a preset frequency onto a reference surface to obtain a reference stripe image; and projecting the reference stripe image to the surface of an object to be measured to obtain a deformed stripe image, and shooting the deformed stripe image by using a light field camera.
Further, the four-dimensional light field information includes light intensity, two-dimensional position information, and two-dimensional direction information recorded by the light field camera.
Further, in step S2, performing digital refocusing on the four-dimensional light field information by the spatial-domain digital focusing method specifically includes:
defining the lens plane of the light field camera as the u-v plane and the imaging plane where the image sensor lies as the x-y plane; after a ray enters the light field camera, its intersection with the u-v plane is denoted (u, v) and its intersection with the x-y plane of the image sensor is denoted (x, y); the ray is denoted L_F(u, v, x, y), where the subscript F is the distance of the lens plane from the imaging plane;
assuming the image obtained by the light field camera at image distance F is not sharp while the image on the plane at image distance αF is sharp, the four-dimensional light field information on the new focal plane F' = αF is:

L_{F'}(u, v, x', y') = L_F\left(u, v, u + \frac{x' - u}{\alpha}, v + \frac{y' - v}{\alpha}\right)

and the pixel value E_{(\alpha \cdot F)}(x', y') on the new focal plane is computed as:

E_{(\alpha \cdot F)}(x', y') = \frac{1}{\alpha^2 F^2} \iint L_F\left(u, v, u + \frac{x' - u}{\alpha}, v + \frac{y' - v}{\alpha}\right) \mathrm{d}u\, \mathrm{d}v

where α is the depth scale factor of the virtual imaging plane and F is the distance between the x-y plane and the u-v plane.
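As a numerical illustration of the refocusing computation described above, the sketch below implements its discrete shift-and-add form over a sampled light field. The array layout L[u, v, x, y] and the use of integer-pixel shifts instead of sub-pixel interpolation are simplifying assumptions for illustration, not part of the patent:

```python
import numpy as np

def refocus(light_field, alpha):
    """Shift-and-add refocusing sketch for a sampled 4-D light field.

    light_field: array indexed as L[u, v, x, y], one sub-aperture image
    per lens-plane sample (u, v).  alpha rescales the focal plane to
    F' = alpha * F: each sub-aperture image is shifted by (1 - 1/alpha)
    times its (u, v) offset from the lens-plane centre, then all views
    are averaged, approximating the double integral over u and v.
    """
    nu, nv, nx, ny = light_field.shape
    cu, cv = (nu - 1) / 2.0, (nv - 1) / 2.0
    out = np.zeros((nx, ny))
    for u in range(nu):
        for v in range(nv):
            du = int(round((1.0 - 1.0 / alpha) * (u - cu)))
            dv = int(round((1.0 - 1.0 / alpha) * (v - cv)))
            # integer-pixel shift for brevity; real code would interpolate
            out += np.roll(light_field[u, v], (du, dv), axis=(0, 1))
    return out / (nu * nv)
```

Sweeping alpha over a range of values would produce the focusing plane sequence image of step S2; with alpha = 1 the result reduces to the plain average of all sub-aperture views.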
Further, in step S3, performing fringe analysis by Fourier transform profilometry based on the reference fringe image and the deformed fringe image to obtain the wrapped phase map specifically includes:
S31, selecting a cross-section of the object to be measured at a fixed y coordinate, so that the deformed fringe intensity and the object height distribution can be expressed as functions of the single variable x, where:
the reference fringe image intensity can be expressed as:

r(x) = \sum_{k=0}^{\infty} b_k \cos(2\pi k f_0 x + \Psi_k)

the deformed fringe image intensity can be expressed as:

d(x) = \sum_{k=0}^{\infty} b_k \cos\big(2\pi k f_0 x + \varphi_k(x) + \Psi_k\big)

where f_0 represents the spatial frequency of the fundamental component of the projected fringe, b_k is the amplitude of the k-th harmonic component of the projected fringe, \Psi_k represents the initial phase of the k-th harmonic component, and \varphi_k(x) is the phase change of the k-th harmonic caused by the fringe deformation;
S32, extracting fundamental-frequency information of the reference fringe image and the deformed fringe image by Fourier transform profilometry, namely the fundamental components of r(x) and d(x);
S33, applying the inverse Fourier transform to the positive-frequency fundamental components to obtain r_f(x) and d_f(x), where:

r_f(x) = \frac{b_1}{2} e^{j(2\pi f_0 x + \Psi_1)}

d_f(x) = \frac{b_1}{2} e^{j(2\pi f_0 x + \varphi_1(x) + \Psi_1)}

S34, computing the arctangent of the imaginary part over the real part of d_f(x)\, r_f^*(x) to obtain the wrapped phase map.
In a second aspect, an embodiment of the present invention provides a three-dimensional reconstruction apparatus based on a structured-illumination light field with digital focusing, including:
the light field information acquisition module is used for acquiring four-dimensional light field information of the object to be detected; the four-dimensional light field information is obtained by shooting a deformed stripe image projected on the surface of the object to be detected through a light field camera;
the digital refocusing module is used for carrying out digital refocusing processing on the four-dimensional light field information by utilizing a space domain digital focusing method to obtain a focusing plane sequence image;
the fringe analysis module is used for carrying out fringe analysis through a Fourier transform profilometry based on the reference fringe image and the deformed fringe image to obtain a wrapped phase diagram;
the phase unwrapping module is used for unwrapping the wrapped phase map to obtain the absolute phase difference between the reference fringe and the deformed fringe;
and the three-dimensional reconstruction module is used for obtaining the height information of the surface of the object to be measured according to the absolute phase difference and recovering the three-dimensional shape information of the object according to the height information and the focusing plane sequence image.
Further, the apparatus further comprises:
the projection module is used for projecting the sine stripe image with the preset frequency onto a reference surface to obtain a reference stripe image; and projecting the reference stripe image to the surface of an object to be measured to obtain a deformed stripe image, and shooting the deformed stripe image by using a light field camera.
Further, the digital refocusing module is specifically configured to:
define the lens plane of the light field camera as the u-v plane and the imaging plane where the image sensor lies as the x-y plane; after a ray enters the light field camera, denote its intersection with the u-v plane as (u, v) and its intersection with the x-y plane of the image sensor as (x, y); denote the ray as L_F(u, v, x, y), where the subscript F is the distance of the lens plane from the imaging plane;
assuming the image obtained by the light field camera at image distance F is not sharp while the image on the plane at image distance αF is sharp, the four-dimensional light field information on the new focal plane F' = αF is:

L_{F'}(u, v, x', y') = L_F\left(u, v, u + \frac{x' - u}{\alpha}, v + \frac{y' - v}{\alpha}\right)

and the pixel value E_{(\alpha \cdot F)}(x', y') on the new focal plane is computed as:

E_{(\alpha \cdot F)}(x', y') = \frac{1}{\alpha^2 F^2} \iint L_F\left(u, v, u + \frac{x' - u}{\alpha}, v + \frac{y' - v}{\alpha}\right) \mathrm{d}u\, \mathrm{d}v

where α is the depth scale factor of the virtual imaging plane and F is the distance between the x-y plane and the u-v plane.
In a third aspect, an embodiment of the present invention provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the three-dimensional reconstruction method based on a structured-illumination light field with digital focusing according to the embodiments of the first aspect of the present invention.
In a fourth aspect, an embodiment of the present invention provides a non-transitory computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the three-dimensional reconstruction method based on a structured-illumination light field with digital focusing according to the embodiments of the first aspect of the present invention.
Compared with the prior art, the three-dimensional reconstruction method and device based on a structured-illumination light field with digital focusing have the following beneficial effects:
1) The invention obtains projection images at different depths of field through digital focusing, performs fringe analysis by Fourier transform profilometry, and then obtains the absolute phase by the single-frequency projected-fringe spatial phase unwrapping technique, thereby obtaining the three-dimensional reconstruction result. This overcomes the limited depth of field of traditional imaging and requires no mechanical focusing; even if the photographed target is out of focus, digital focusing can refocus on it.
2) Compared with the prior art, which requires multiple mechanical focusings and a picture for each depth of field, the method needs only one shot: digital focusing brings different planes into focus, and objects at different depths of field in the picture are reconstructed in three dimensions, improving efficiency.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a schematic flow chart of a three-dimensional reconstruction method based on a structured-illumination light field with digital focusing according to an embodiment of the present invention;
FIG. 2 is a projection of stripes provided by an embodiment of the present invention;
FIG. 3 is a schematic diagram of a projection fringe topography measurement technique according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a digital focusing method according to an embodiment of the present invention;
FIG. 5 is a block diagram of a three-dimensional reconstruction apparatus based on a structured-illumination light field with digital focusing according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Existing three-dimensional reconstruction methods photograph the object with a conventional camera: the user must carefully select camera parameters before shooting, mechanical refocusing is required many times during shooting, and a picture must be taken for each depth of field of the object, which is cumbersome and inefficient.
Therefore, an embodiment of the invention provides a three-dimensional reconstruction method based on a structured-illumination light field with digital focusing: projection images at different depths of field are obtained by digital focusing, fringe analysis is performed by Fourier transform profilometry, and the absolute phase is then obtained by the single-frequency projected-fringe spatial phase unwrapping technique, yielding the three-dimensional reconstruction result. The invention is described below with reference to several embodiments.
Fig. 1 shows a three-dimensional reconstruction method based on a structured-illumination light field with digital focusing according to an embodiment of the present invention, including:
s1, acquiring four-dimensional light field information of the object to be detected; the four-dimensional light field information is obtained by shooting a deformed stripe image projected on the surface of the object to be measured through a light field camera.
Specifically, before step S1 is executed, a sinusoidal fringe image with a preset frequency is first projected onto a reference surface to obtain the reference fringe image; the pattern is then projected onto the surface of the object to be measured to obtain the deformed fringe image. Fig. 2 shows the deformed fringe image according to an embodiment of the present invention. The deformed fringe image is then photographed with a light field camera. Unlike the two-dimensional image acquisition of a traditional camera, a light field camera records both spatial and angular information of a three-dimensional scene, i.e. four-dimensional light field information, in a single exposure.
After the light field camera shoots the deformed stripe image, step S1 is executed to obtain the four-dimensional light field information of the object to be measured. In this embodiment, the four-dimensional light field information includes light intensity, two-dimensional position information, and two-dimensional direction information recorded by the light field camera.
And S2, carrying out digital refocusing processing on the four-dimensional light field information by using a space domain digital focusing method to obtain a focusing plane sequence image.
Light field digital refocusing is an important application of light field imaging: from a picture obtained in a single exposure, digital image processing inverts the blurred, defocused image and reconstructs a sharp, accurately focused image of the target.
It can be understood that in the prior art, when a conventional camera photographs an object for three-dimensional reconstruction, the user must carefully select camera parameters before shooting and refocus mechanically many times during shooting, taking a picture for each depth of field, which is inefficient. The invention adopts digital focusing: shoot first, focus later, with flexible control of the depth of field.
And S3, performing fringe analysis by Fourier transform profilometry based on the reference fringe image and the deformation fringe image to obtain a wrapped phase map.
And S4, performing phase unwrapping on the wrapped phase diagram to obtain the absolute phase difference between the reference fringe and the deformed fringe.
And S5, acquiring the height information of the surface of the object to be detected according to the absolute phase difference, and recovering the three-dimensional shape information of the object to be detected according to the height information and the focusing plane sequence image.
In S3, Fourier transform profilometry (FTP) is one of the three-dimensional topography measurement methods. It is notably simple, convenient, and fast: only a single frame of the fringe pattern needs to be selected (at most two frames), and Fourier transform, filtering, and inverse Fourier transform reproduce the three-dimensional figure of the object. It is therefore applicable to three-dimensional reconstruction of object surfaces in dynamic scenes.
In the embodiment, fundamental frequency information of the reference fringe image and the deformed fringe image is extracted by utilizing Fourier transform profilometry, and inverse Fourier transform is carried out to obtain the wrapped phase diagram. The method specifically comprises the following steps:
s31, selecting a cross section of the object to be measured with a fixed y coordinate, wherein the distribution function of the intensity of the deformed stripes and the height of the object can be expressed as a function of a univariate x, wherein:
the reference fringe image intensity can be expressed as:
Figure BDA0002372582570000091
the intensity of the deformed fringe image can be expressed as:
Figure BDA0002372582570000092
in the formula, wherein f0Representing the spatial frequency of the fundamental component of the projected fringes, bkIs the amplitude of the k-th harmonic component of the projected fringe, ΨkRepresents the initial phase of the k-th harmonic component, wherein
Figure BDA0002372582570000093
Is the phase change of the k-th harmonic that causes the distortion of the fringes. k represents the harmonic order. At the same time handle
Figure BDA0002372582570000099
The absolute phase of the kth harmonic component, referred to as the reference fringe r, which represents the absolute phase value at each point of the reference fringe,
Figure BDA0002372582570000094
is a monotonically increasing function of x, i.e. there is only one per x
Figure BDA0002372582570000095
Same handle
Figure BDA0002372582570000096
Referred to as the absolute phase of the kth harmonic component of the deformed fringe d, it represents the absolute phase value at each point of the deformed fringe, which is also a monotonically increasing function of x. Wherein the content of the first and second substances,
Figure BDA0002372582570000097
r in (1) corresponds to the reference stripe r,
Figure BDA0002372582570000098
d in (1) corresponds to the deformed streak d.
Fig. 3 is an optical diagram of the projected-fringe topography measurement technique according to an embodiment of the present invention, where Ec in Fig. 3 is the camera exit pupil position and Ep is the projector exit pupil position. The basic optical principle of the projected-fringe measurement technique is as follows: when no object is present, digital light generated by the projector is reflected from the reference plane to the camera at point C. When the reference plane is removed, the same ray strikes the object at point H, on the line toward point D, and is reflected from there to the camera. That is, for the same projected ray, the presence of the object changes the spatial position imaged in the camera: the reference fringe intensity r(x) comes directly from point C, while the deformed fringe intensity d(x) comes from point H. The absolute phase difference between the reference and deformed fringes captured in the two shots is therefore equivalent to the absolute phase difference between points C and D. Thus, for point D, the absolute phase difference \Delta\Phi_k(D) of the k-th harmonic between the reference and deformed fringes can be expressed as:

\Delta\Phi_k(D) = \Phi_k^d(D) - \Phi_k^r(D) = 2\pi k f_0 \overline{CD}

where \overline{CD} represents the spatial displacement between points C and D. Since point D is arbitrarily selected, for the fundamental component (k = 1) we obtain:

\Delta\Phi_1(x) = \Phi_1^d(x) - \Phi_1^r(x) = 2\pi f_0 \overline{CD}

where \Delta\Phi_1(x) represents the absolute phase difference produced by the fundamental component of the projected fringe after passing over the object surface, \Phi_1^d(x) represents the absolute phase of the fundamental component of the deformed fringe, and \Phi_1^r(x) represents the absolute phase of the fundamental component of the reference fringe. As can be seen from Fig. 3, d_0 denotes the distance from the camera exit pupil to the projector exit pupil and l_0 denotes the distance from the reference plane to the camera exit pupil; both are fixed once the camera and projector are fixed. The distance between points C and D therefore depends only on the height h(x) of the object point H above the reference plane. From the similarity of triangles E_p H E_c and D H C, one can obtain:

\frac{\overline{CD}}{d_0} = \frac{h(x)}{l_0 - h(x)}

Substituting \overline{CD} = \frac{d_0\, h(x)}{l_0 - h(x)} into \Delta\Phi_1(x) = 2\pi f_0 \overline{CD}, there is:

\Delta\Phi_1(x) = \frac{2\pi f_0 d_0\, h(x)}{l_0 - h(x)}
To facilitate the Fourier transform, r(x) and d(x) in S31 are rewritten in exponential form:

r(x) = \sum_{k=0}^{\infty} \frac{b_k}{2}\left[e^{j(2\pi k f_0 x + \Psi_k)} + e^{-j(2\pi k f_0 x + \Psi_k)}\right]

d(x) = \sum_{k=0}^{\infty} \frac{b_k}{2}\left[e^{j(2\pi k f_0 x + \varphi_k(x) + \Psi_k)} + e^{-j(2\pi k f_0 x + \varphi_k(x) + \Psi_k)}\right]

Applying the Fourier transform, filtering out everything except the positive fundamental-frequency component, and applying the inverse Fourier transform yields:

r_f(x) = \frac{b_1}{2} e^{j(2\pi f_0 x + \Psi_1)}

d_f(x) = \frac{b_1}{2} e^{j(2\pi f_0 x + \varphi_1(x) + \Psi_1)}

where r_f(x) represents the fundamental component of the reference fringe, d_f(x) represents the fundamental component of the deformed fringe, and e^{j(\cdot)} denotes the exponential representation of the fundamental.
The wrapped phase map is then obtained by the following calculation:

wp(x) = \arctan\frac{\mathrm{Im}\left[d_f(x)\, r_f^*(x)\right]}{\mathrm{Re}\left[d_f(x)\, r_f^*(x)\right]}

where r_f^*(x) denotes the complex conjugate of r_f(x), \mathrm{Im}[\cdot] takes the imaginary part, and \mathrm{Re}[\cdot] takes the real part.
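The filtering chain above (Fourier transform, isolate the positive fundamental lobe, inverse transform, arctangent) can be sketched in one dimension as follows. The band half-width `n` is an assumed tuning parameter for the pass band, not something the patent specifies:

```python
import numpy as np

def ftp_wrapped_phase(reference, deformed, f0, n):
    """1-D FTP sketch: keep only the +f0 lobe of each spectrum, inverse
    transform to get the complex fundamentals r_f and d_f, then take the
    wrapped phase of d_f * conj(r_f).

    f0 is expressed in cycles per record length; n is the half-width of
    the rectangular pass band around +f0 (assumed tuning parameter).
    """
    size = reference.size
    freqs = np.fft.fftfreq(size, d=1.0 / size)      # integer cycle counts
    window = (np.abs(freqs - f0) <= n).astype(float)  # +f0 lobe only
    r_f = np.fft.ifft(np.fft.fft(reference) * window)
    d_f = np.fft.ifft(np.fft.fft(deformed) * window)
    return np.angle(d_f * np.conj(r_f))
```

Keeping only the positive lobe makes r_f and d_f complex analytic signals, so the DC term and the conjugate lobe at −f0 drop out before the arctangent, exactly as in the derivation.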
The phase calculated this way has a discontinuous, truncated distribution: the phase difference varies within (−π, π], and it is therefore called the wrapped phase. The phase unwrapping technique eliminates the truncation jumps in the phase distribution and restores a continuous phase, namely the absolute phase difference at the fundamental frequency, from which the height information can be recovered.
Specifically, applying the phase unwrapping operation to the wrapped phase map yields the quantity needed for the surface topography of the object to be measured:

\Delta\Phi_1(x) = \mathrm{unwrap}(wp(x))

where unwrap(·) denotes phase unwrapping, which is in essence an integration of the rewrapped differences of the wrapped phase. Denote the unwrapped phase by up; it is related to the wrapped phase by:

up(x) = wp(x) - 2 n_k \pi, \quad n_k \in \mathbb{Z}

where up(x) represents the phase value after unwrapping, wp(x) is the wrapped phase before unwrapping, n_k denotes the number of truncation transitions accumulated up to the k-th sampling point, and n_0 = 0; the unwrapped phase is the sought absolute phase difference, \Delta\Phi_1(x) = up(x). Here x_k denotes the k-th sampling point, wp(x_k) its wrapped phase, and up(x_k) its unwrapped phase.
Since the continuous phase change between any two adjacent sampling points is less than π:

-\pi < up(x_k) - up(x_{k-1}) < \pi

Combining the two formulas:

-\pi < wp(x_k) - wp(x_{k-1}) - 2 n_k \pi + 2 n_{k-1} \pi < \pi

Adding π to all sides and dividing by 2π:

0 < \frac{wp(x_k) - wp(x_{k-1}) + \pi}{2\pi} - n_k + n_{k-1} < 1

Taking the floor of the terms of the inequality simultaneously:

n_k = n_{k-1} + \mathrm{floor}\left(\frac{wp(x_k) - wp(x_{k-1}) + \pi}{2\pi}\right)

where floor represents rounding down. From this, up(x_k) can be determined and hence \Delta\Phi_1(x); the height information of the object surface is then obtained from this absolute phase difference, and the three-dimensional shape information of the object is recovered from the height information and the focusing plane sequence image.
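The recursion for n_k above translates directly into a one-dimensional unwrapping routine. This is a sketch with illustrative names, not the patent's own code:

```python
import numpy as np

def unwrap_1d(wp):
    """Itoh-style 1-D phase unwrapping following the derivation above:
    n_k = n_{k-1} + floor((wp_k - wp_{k-1} + pi) / (2*pi)), with n_0 = 0,
    and up_k = wp_k - 2*pi*n_k."""
    n = 0
    up = [wp[0]]
    for k in range(1, len(wp)):
        # count the truncation transition (if any) between adjacent samples
        n += int(np.floor((wp[k] - wp[k - 1] + np.pi) / (2.0 * np.pi)))
        up.append(wp[k] - 2.0 * np.pi * n)
    return np.array(up)
```

As in the derivation, this assumes the true phase changes by less than π between adjacent sampling points; otherwise the jump count n_k is ambiguous.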
The phase difference obtained in S4 is substituted into the following equation:
h(x) = l_0 · ΔΦ_1(x) / (ΔΦ_1(x) - 2π f_0 d_0)
In this way the height information of the surface of the object to be detected can be recovered, where h(x) represents the distance from point C in Fig. 3 to the reference plane; ΔΦ_1(x) represents the absolute phase difference generated after the fundamental frequency component of the projected fringes passes over the object surface; f_0 represents the spatial frequency of the fundamental frequency component of the projected fringes; d_0 denotes the distance from the camera exit pupil to the projector exit pupil; and l_0 denotes the distance from the reference plane to the camera exit pupil.
The three-dimensional topography information of the object to be detected is then recovered from the height information and the focusing plane sequence images.
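As a numerical illustration, the height conversion reduces to a one-line function. This is a sketch assuming the classic Fourier-transform-profilometry relation h(x) = l_0·ΔΦ_1(x)/(ΔΦ_1(x) - 2π f_0 d_0); the parameter values are invented for illustration only:

```python
import numpy as np

def height_from_phase(dphi, f0, d0, l0):
    """Map absolute phase difference dphi (radians) to surface height.
    f0: spatial frequency of the fringe fundamental component,
    d0: camera exit pupil to projector exit pupil distance,
    l0: reference plane to camera exit pupil distance."""
    return l0 * dphi / (dphi - 2.0 * np.pi * f0 * d0)

# illustrative numbers: f0 in lines/mm, distances in mm
dphi = np.array([0.0, 0.1, 0.2])
h = height_from_phase(dphi, f0=0.1, d0=300.0, l0=1000.0)
# dphi = 0 gives h = 0 (a point lying on the reference plane)
```

Note that for small positive ΔΦ_1 the denominator is negative, so the sign of h follows the sign convention of the chosen measurement geometry.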
According to the embodiment of the invention, projection images at different depths of field are obtained through digital focusing; after fringe analysis by Fourier transform profilometry, the absolute phase is obtained using a single-frequency projected-fringe spatial phase unwrapping technique, yielding the three-dimensional reconstruction result. This overcomes the small depth of field of traditional imaging and requires no mechanical focusing. Compared with the prior art, which requires repeated mechanical focusing and a separate picture for objects at each depth of field, the method needs only one exposure: digital focusing brings different planes into focus, and objects at different depths of field in the picture are reconstructed in three dimensions, improving efficiency.
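The fringe-analysis stage of this pipeline (Fourier transform, fundamental-lobe filtering, inverse transform, arctangent) can be sketched on a single scan line as follows; the filter indices, synthetic fringes, and function names are illustrative assumptions rather than the patent's exact implementation:

```python
import numpy as np

def wrapped_phase_ftp(reference, deformed, f0_bin, half_width):
    """Fourier-transform profilometry on one scan line:
    keep only the +f0 lobe of each fringe spectrum, inverse-transform,
    and take the phase of the complex ratio (an arctangent operation)."""
    def fundamental(signal):
        spec = np.fft.fft(signal)
        mask = np.zeros_like(spec)
        mask[f0_bin - half_width : f0_bin + half_width + 1] = \
            spec[f0_bin - half_width : f0_bin + half_width + 1]
        return np.fft.ifft(mask)               # complex fundamental component
    rf, df = fundamental(reference), fundamental(deformed)
    return np.angle(df * np.conj(rf))          # wrapped phase in (-pi, pi]

# synthetic fringes: 8 periods over 256 samples, deformed by a smooth phase bump
x = np.arange(256)
bump = 1.5 * np.exp(-((x - 128) / 40.0) ** 2)  # ground-truth phase change, < pi
ref = 0.5 + 0.5 * np.cos(2 * np.pi * 8 * x / 256)
dfm = 0.5 + 0.5 * np.cos(2 * np.pi * 8 * x / 256 + bump)
wp = wrapped_phase_ftp(ref, dfm, f0_bin=8, half_width=4)
```

Because the simulated phase change stays below π, wp approximates it directly; a larger deformation would wrap and require the phase unwrapping of step S4.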
On the basis of the foregoing embodiment, in step S2, the performing digital refocusing processing on the four-dimensional light field information by using a spatial domain digital focusing method specifically includes:
Fig. 4 is a schematic diagram of digital focusing according to an embodiment of the present invention. Referring to Fig. 4, this embodiment defines the lens plane of the light field camera as the u-v plane and the imaging plane where the image sensor is located as the x-y plane; after a light ray enters the light field camera, its intersection with the u-v plane is (u, v) and its intersection with the x-y plane of the image sensor is (x, y). The ray is denoted L_F(u, v, x, y), where the subscript F is the distance of the lens plane from the imaging plane;
assuming that the image obtained by the light field camera at image distance F is unclear while the image on the image plane at image distance αF is clear, the four-dimensional light field information on the new focal plane F' = αF is:
L_{F'}(u, v, x', y') = L_F(u, v, u(1 - 1/α) + x'/α, v(1 - 1/α) + y'/α)
The pixel value E_{αF}(x', y') at the new focal plane is calculated as:
E_{αF}(x', y') = (1 / (α²F²)) ∫∫ L_F(u, v, u(1 - 1/α) + x'/α, v(1 - 1/α) + y'/α) du dv
where α is the virtual imaging plane depth scale factor and F is the distance of the x-y plane from the u-v plane.
Specifically, the virtual imaging plane depth scale factor α is selected with F' = α·F. When 0 < α < 1 the new focal plane lies in front of the actually photographed imaging plane; when α > 1 it lies behind. Selecting different values of α refocuses on different planes, yielding the images of the light field camera on image planes at different image distances αF, i.e., the focal plane sequence images.
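In discrete form, the refocusing computation above becomes a shift-and-add over the sub-aperture images. A minimal sketch, assuming the light field is stored as an array L[u, v, y, x] and using integer-pixel shifts for brevity (a practical implementation would interpolate):

```python
import numpy as np

def refocus(lightfield, alpha):
    """Spatial-domain digital refocusing: shift each sub-aperture image
    by (1 - 1/alpha) times its (u, v) offset from the aperture centre,
    then average -- a discrete form of the E_{alpha F} integral."""
    nu, nv, h, w = lightfield.shape
    cu, cv = (nu - 1) / 2.0, (nv - 1) / 2.0
    out = np.zeros((h, w))
    for u in range(nu):
        for v in range(nv):
            dy = int(round((u - cu) * (1.0 - 1.0 / alpha)))
            dx = int(round((v - cv) * (1.0 - 1.0 / alpha)))
            out += np.roll(lightfield[u, v], shift=(dy, dx), axis=(0, 1))
    return out / (nu * nv)

# alpha = 1 leaves every sub-aperture image unshifted: plain averaging
lf = np.random.rand(5, 5, 32, 32)
focused = refocus(lf, 1.0)
```

Sweeping α over a range of values produces the focal plane sequence images used in step S2.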
By adopting the digital focusing algorithm, this embodiment obtains a depth of field larger than that of a traditional camera; that is, digital focusing can also bring into focus depths at which a traditional camera cannot focus.
This embodiment may adopt a microlens-array-based light field camera to subdivide and sample the main-lens aperture into 1/N² sub-apertures, so that a camera with original f-number F_N is equivalent to the combination of N² virtual cameras of f-number N·F_N; the combined digital-focusing depth range is equivalent to the depth of field of a single small-aperture camera. Conversely, if each camera is itself treated as one sub-aperture sample, the combination of multiple cameras is equivalent to a large-aperture camera with a smaller f-number. Performing digital focusing on such a virtual large-aperture camera yields a synthesized image with an extremely shallow depth of field.
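The aperture bookkeeping described above is simple arithmetic: splitting an f-number F_N aperture into N×N sub-apertures multiplies the working f-number of each virtual camera by N, while combining N×N cameras into one synthetic aperture divides it by N. A sketch with invented example numbers:

```python
def sub_aperture_f_number(f_number, n):
    """Each of the n x n sub-apertures sees 1/n of the lens diameter,
    so its working f-number is n times larger (deeper depth of field)."""
    return f_number * n

def synthetic_f_number(f_number, n):
    """Conversely, n x n cameras combined as one synthetic aperture
    behave like a lens with 1/n of the f-number (shallower depth of field)."""
    return f_number / n

# illustrative: an f/2.8 main lens sampled by a 4 x 4 microlens grid
per_view = sub_aperture_f_number(2.8, 4)    # ~f/11.2 per virtual camera
combined = synthetic_f_number(2.8, 4)       # ~f/0.7 synthetic aperture
```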
Fig. 5 is a block diagram of a three-dimensional reconstruction apparatus for illuminating a light field based on a digital focusing structure according to an embodiment of the present invention, and with reference to fig. 5, the apparatus includes:
a light field information obtaining module 501, configured to obtain four-dimensional light field information of an object to be detected; the four-dimensional light field information is obtained by shooting a deformed stripe image projected on the surface of the object to be detected through a light field camera;
a digital refocusing module 502, configured to perform digital refocusing processing on the four-dimensional light field information by using a spatial domain digital focusing method, so as to obtain a focusing plane sequence image;
a fringe analysis module 503, configured to perform fringe analysis by Fourier transform profilometry based on the reference fringe image and the deformed fringe image, to obtain a wrapped phase map;
a phase unwrapping module 504 configured to perform phase unwrapping on the wrapped phase map to obtain an absolute phase;
and the three-dimensional reconstruction module 505 is configured to obtain height information of the surface of the object to be detected according to the absolute phase, and recover three-dimensional morphology information of the object to be detected according to the height information and the focal plane sequence image.
On the basis of the above embodiments, the three-dimensional reconstruction device based on the digital focusing structure illumination light field further includes a projection module, configured to project a sinusoidal fringe image with a preset frequency onto a reference surface to obtain a reference fringe image; and projecting the reference stripe image to the surface of an object to be measured to obtain a deformed stripe image, and shooting the deformed stripe image by using a light field camera.
On the basis of the foregoing embodiments, the digital refocusing module is specifically configured to:
defining the lens plane of the light field camera as the u-v plane and the imaging plane where the image sensor is located as the x-y plane; after a light ray enters the light field camera, its intersection with the u-v plane is (u, v) and its intersection with the x-y plane of the image sensor is (x, y). The ray is denoted L_F(u, v, x, y), where the subscript F is the distance of the lens plane from the imaging plane;
assuming that the image obtained by the light field camera at image distance F is unclear while the image on the image plane at image distance αF is clear, the four-dimensional light field information on the new focal plane F' = αF is:
L_{F'}(u, v, x', y') = L_F(u, v, u(1 - 1/α) + x'/α, v(1 - 1/α) + y'/α)
The pixel value E_{αF}(x', y') at the new focal plane is calculated as:
E_{αF}(x', y') = (1 / (α²F²)) ∫∫ L_F(u, v, u(1 - 1/α) + x'/α, v(1 - 1/α) + y'/α) du dv
where α is the virtual imaging plane depth scale factor and F is the distance of the x-y plane from the u-v plane.
An embodiment of the present invention provides an electronic device. As shown in Fig. 6, the electronic device may include: a processor 601, a communication interface 602, a memory 603 and a communication bus 604, wherein the processor 601, the communication interface 602 and the memory 603 communicate with each other through the communication bus 604. The processor 601 may call logic instructions in the memory 603 to perform the three-dimensional reconstruction method based on the digital focusing structure illumination light field provided by the above embodiments, for example, including: S1, acquiring four-dimensional light field information of the object to be detected; the four-dimensional light field information is obtained by shooting a deformed stripe image projected on the surface of the object to be detected through a light field camera; S2, carrying out digital refocusing processing on the four-dimensional light field information by using a space domain digital focusing method to obtain a focusing plane sequence image; S3, performing fringe analysis through Fourier transform profilometry based on the reference fringe image and the deformation fringe image to obtain a wrapped phase diagram; S4, performing phase unwrapping on the wrapped phase diagram to obtain an absolute phase difference between the reference fringe and the deformed fringe; and S5, acquiring the height information of the surface of the object to be detected according to the absolute phase difference, and recovering the three-dimensional shape information of the object to be detected according to the height information and the focusing plane sequence image.
Embodiments of the present invention further provide a non-transitory computer-readable storage medium, on which a computer program is stored; when executed by a processor, the computer program performs the three-dimensional reconstruction method based on the digital focusing structure illumination light field provided in the foregoing embodiments, for example, including: S1, acquiring four-dimensional light field information of the object to be detected; the four-dimensional light field information is obtained by shooting a deformed stripe image projected on the surface of the object to be detected through a light field camera; S2, carrying out digital refocusing processing on the four-dimensional light field information by using a space domain digital focusing method to obtain a focusing plane sequence image; S3, performing fringe analysis through Fourier transform profilometry based on the reference fringe image and the deformation fringe image to obtain a wrapped phase diagram; S4, performing phase unwrapping on the wrapped phase diagram to obtain an absolute phase difference between the reference fringe and the deformed fringe; and S5, acquiring the height information of the surface of the object to be detected according to the absolute phase difference, and recovering the three-dimensional shape information of the object to be detected according to the height information and the focusing plane sequence image.
In summary, embodiments of the present invention provide a three-dimensional reconstruction method and apparatus based on a digital focusing structure illumination light field, which obtain projection pictures at different depths of field through digital focusing, perform fringe analysis by Fourier transform profilometry, and then obtain the absolute phase using a single-frequency projected-fringe spatial phase unwrapping technique, thereby obtaining the three-dimensional reconstruction result.
The terms "comprising" and "having" and any variations thereof in the embodiments of the present application are intended to cover non-exclusive inclusions. For example, a system, product or apparatus that comprises a list of elements or components is not limited to only those elements or components but may alternatively include other elements or components not expressly listed or inherent to such product or apparatus. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless explicitly specifically limited otherwise.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A three-dimensional reconstruction method based on an illumination light field of a digital focusing structure is characterized by comprising the following steps:
s1, acquiring four-dimensional light field information of the object to be detected; the four-dimensional light field information is obtained by shooting a deformed stripe image projected on the surface of the object to be detected through a light field camera;
s2, carrying out digital refocusing processing on the four-dimensional light field information by using a space domain digital focusing method to obtain a focusing plane sequence image;
s3, performing fringe analysis through Fourier transform profilometry based on the reference fringe image and the deformation fringe image to obtain a wrapped phase diagram;
s4, performing phase unwrapping on the wrapped phase diagram to obtain an absolute phase difference between the reference fringe and the deformed fringe;
and S5, acquiring the height information of the surface of the object to be detected according to the absolute phase difference, and recovering the three-dimensional shape information of the object to be detected according to the height information and the focusing plane sequence image.
2. The method for three-dimensional reconstruction based on digital focusing structure illumination light field according to claim 1, wherein before the step S1 obtaining the four-dimensional light field information of the object to be measured, the method further comprises:
projecting a sine stripe image with a preset frequency onto a reference surface to obtain a reference stripe image; and projecting the reference stripe image to the surface of an object to be measured to obtain a deformed stripe image, and shooting the deformed stripe image by using a light field camera.
3. The method for three-dimensional reconstruction of an illuminated light field based on a digital focusing structure as claimed in claim 1, wherein said four-dimensional light field information comprises light intensity, two-dimensional position information and two-dimensional direction information recorded by a light field camera.
4. The method for reconstructing a three-dimensional illuminated light field based on a digital focusing structure according to claim 3, wherein in step S2, the digital refocusing processing is performed on the four-dimensional light field information by using a spatial domain digital focusing method, which specifically includes:
defining a lens plane of the light field camera as a u-v plane, defining an imaging plane where the image sensor is located as an x-y plane; then, after a light ray enters the light field camera, the intersection point of the ray and the u-v plane is (u, v) and the intersection point of the ray and the x-y plane of the image sensor is (x, y); the ray is denoted L_F(u, v, x, y), wherein the subscript F is the distance of the lens plane relative to the imaging plane;
assuming that the image obtained by the light field camera at image distance F is unclear and the image obtained on the image plane at image distance αF is clear, the four-dimensional light field information on the new focal plane F' = αF is:
L_{F'}(u, v, x', y') = L_F(u, v, u(1 - 1/α) + x'/α, v(1 - 1/α) + y'/α)
the pixel value E_{αF}(x', y') at the new focal plane is calculated as:
E_{αF}(x', y') = (1 / (α²F²)) ∫∫ L_F(u, v, u(1 - 1/α) + x'/α, v(1 - 1/α) + y'/α) du dv
where α is the virtual imaging plane depth scale factor and F is the distance of the x-y plane from the u-v plane.
5. The method according to claim 2, wherein in step S3, performing fringe analysis through Fourier transform profilometry based on the reference fringe image and the deformed fringe image to obtain the wrapped phase map comprises:
S31, selecting a cross section of the object to be measured at a fixed y coordinate, so that the deformed stripe intensity and the object height distribution functions can be expressed as functions of the single variable x, wherein:
the reference fringe image intensity can be expressed as:
r(x) = Σ_k b_k cos(2πk f_0 x + Ψ_k)
the intensity of the deformed fringe image can be expressed as:
d(x) = Σ_k b_k cos(2πk f_0 x + φ_k(x) + Ψ_k)
wherein f_0 represents the spatial frequency of the fundamental frequency component of the projected fringes, b_k is the amplitude of the k-th harmonic component of the projected fringes, Ψ_k represents the initial phase of each harmonic component, and φ_k(x) is the phase change of the fringe deformation caused by the k-th harmonic;
S32, extracting fundamental frequency information of the reference stripe image and the deformed stripe image by Fourier transform profilometry, wherein the fundamental frequency information is the fundamental frequency components of r(x) and d(x);
S33, performing inverse Fourier transform on the fundamental frequency components to obtain r_f(x) and d_f(x), wherein:
r_f(x) = b_1 cos(2πf_0x + Ψ_1)
d_f(x) = b_1 cos(2πf_0x + φ_1(x) + Ψ_1)
S34, performing arctangent calculation on r_f(x) and d_f(x) to obtain the wrapped phase map.
6. A three-dimensional reconstruction apparatus for illuminating a light field based on a digital focus structure, comprising:
the light field information acquisition module is used for acquiring four-dimensional light field information of the object to be detected; the four-dimensional light field information is obtained by shooting a deformed stripe image projected on the surface of the object to be detected through a light field camera;
the digital refocusing module is used for carrying out digital refocusing processing on the four-dimensional light field information by utilizing a space domain digital focusing method to obtain a focusing plane sequence image;
the fringe analysis module is used for carrying out fringe analysis through Fourier transform profilometry based on the reference fringe image and the deformed fringe image to obtain a wrapped phase map;
the phase unwrapping module is used for unwrapping the wrapped phase diagram to obtain an absolute phase;
and the three-dimensional reconstruction module is used for obtaining the height information of the surface of the object to be detected according to the absolute phase and recovering the three-dimensional appearance information of the object to be detected according to the height information and the focusing plane sequence image.
7. The apparatus for three-dimensional reconstruction based on an illuminated light field with a digital focusing structure according to claim 6, further comprising:
the projection module is used for projecting the sine stripe image with the preset frequency onto a reference surface to obtain a reference stripe image; and projecting the reference stripe image to the surface of an object to be measured to obtain a deformed stripe image, and shooting the deformed stripe image by using a light field camera.
8. The digital focusing structure based three-dimensional reconstruction apparatus for illuminating a light field according to claim 6, wherein the digital refocusing module is specifically configured to:
defining a lens plane of the light field camera as a u-v plane, defining an imaging plane where the image sensor is located as an x-y plane; then, after a light ray enters the light field camera, the intersection point of the ray and the u-v plane is (u, v) and the intersection point of the ray and the x-y plane of the image sensor is (x, y); the ray is denoted L_F(u, v, x, y), wherein the subscript F is the distance of the lens plane relative to the imaging plane;
assuming that the image obtained by the light field camera at image distance F is unclear and the image obtained on the image plane at image distance αF is clear, the four-dimensional light field information on the new focal plane F' = αF is:
L_{F'}(u, v, x', y') = L_F(u, v, u(1 - 1/α) + x'/α, v(1 - 1/α) + y'/α)
the pixel value E_{αF}(x', y') at the new focal plane is calculated as:
E_{αF}(x', y') = (1 / (α²F²)) ∫∫ L_F(u, v, u(1 - 1/α) + x'/α, v(1 - 1/α) + y'/α) du dv
where α is the virtual imaging plane depth scale factor and F is the distance of the x-y plane from the u-v plane.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program performs the steps of the method for three-dimensional reconstruction of an illuminated light field based on a digital focusing structure as claimed in any one of claims 1 to 5.
10. A non-transitory computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method for three-dimensional reconstruction of an illuminated light field based on a digital focus structure according to any one of claims 1 to 5.
CN202010055378.7A 2020-01-18 2020-01-18 Three-dimensional reconstruction method and device based on digital focusing structure illumination light field Active CN111288925B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010055378.7A CN111288925B (en) 2020-01-18 2020-01-18 Three-dimensional reconstruction method and device based on digital focusing structure illumination light field


Publications (2)

Publication Number Publication Date
CN111288925A true CN111288925A (en) 2020-06-16
CN111288925B CN111288925B (en) 2022-05-06

Family

ID=71023211

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010055378.7A Active CN111288925B (en) 2020-01-18 2020-01-18 Three-dimensional reconstruction method and device based on digital focusing structure illumination light field

Country Status (1)

Country Link
CN (1) CN111288925B (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111721237A (en) * 2020-06-30 2020-09-29 苏州东方克洛托光电技术有限公司 Full-automatic multi-frequency defocusing projection three-dimensional imaging measurement system and measurement method thereof
CN112488998A (en) * 2020-11-19 2021-03-12 安徽农业大学 Apple stem and calyx detection method based on stripe projection
CN112489196A (en) * 2020-11-30 2021-03-12 太原理工大学 Particle three-dimensional shape reconstruction method based on multi-scale three-dimensional frequency domain transformation
CN112556602A (en) * 2020-12-02 2021-03-26 深圳大学 Method and system for rapidly expanding phase
CN112630469A (en) * 2020-12-07 2021-04-09 清华大学深圳国际研究生院 Three-dimensional detection method based on structured light and multi-light-field camera
CN112945141A (en) * 2021-01-29 2021-06-11 中北大学 Structured light rapid imaging method and system based on micro-lens array
CN113205592A (en) * 2021-05-14 2021-08-03 湖北工业大学 Light field three-dimensional reconstruction method and system based on phase similarity
CN113670218A (en) * 2021-08-12 2021-11-19 北京航空航天大学 Method and device for measuring three-dimensional deformation of wing

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101451826A (en) * 2008-12-17 2009-06-10 中国科学院上海光学精密机械研究所 Object three-dimensional contour outline measuring set and measuring method
US7595894B2 (en) * 2006-06-02 2009-09-29 General Electric Company Profilometry apparatus and method of operation
CN104463949A (en) * 2014-10-24 2015-03-25 郑州大学 Rapid three-dimensional reconstruction method and system based on light field digit refocusing
CN104729429A (en) * 2015-03-05 2015-06-24 深圳大学 Calibration method of telecentric imaging three-dimension topography measuring system
CN104729404A (en) * 2015-03-27 2015-06-24 苏州汉基视测控设备有限公司 High-speed 3D industry digital microscope
CN105547190A (en) * 2015-12-14 2016-05-04 深圳先进技术研究院 Three-dimensional shape measuring method and device based on dual-angle single-frequency fringe projection
CN105953747A (en) * 2016-06-07 2016-09-21 杭州电子科技大学 Structured light projection full view three-dimensional imaging system and method
CN106447762A (en) * 2015-08-07 2017-02-22 中国科学院深圳先进技术研究院 Three-dimensional reconstruction method based on light field information and system
CN109186496A (en) * 2018-10-18 2019-01-11 淮阴师范学院 A kind of three dimension profile measurement method based on Moving Least
CN110567398A (en) * 2019-09-02 2019-12-13 武汉光发科技有限公司 Binocular stereo vision three-dimensional measurement method and system, server and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Mao Jiaqi: "Research on three-dimensional imaging technology combining fringe projection and light field", China Master's Theses Full-text Database (Information Science and Technology) *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111721237B (en) * 2020-06-30 2021-07-09 苏州东方克洛托光电技术有限公司 Full-automatic multi-frequency defocusing projection three-dimensional imaging measurement system and measurement method thereof
CN111721237A (en) * 2020-06-30 2020-09-29 苏州东方克洛托光电技术有限公司 Full-automatic multi-frequency defocusing projection three-dimensional imaging measurement system and measurement method thereof
CN112488998A (en) * 2020-11-19 2021-03-12 安徽农业大学 Apple stem and calyx detection method based on stripe projection
CN112488998B (en) * 2020-11-19 2022-10-14 安徽农业大学 Apple stem and calyx detection method based on stripe projection
CN112489196A (en) * 2020-11-30 2021-03-12 太原理工大学 Particle three-dimensional shape reconstruction method based on multi-scale three-dimensional frequency domain transformation
CN112489196B (en) * 2020-11-30 2022-08-02 太原理工大学 Particle three-dimensional shape reconstruction method based on multi-scale three-dimensional frequency domain transformation
CN112556602A (en) * 2020-12-02 2021-03-26 深圳大学 Method and system for rapidly expanding phase
CN112556602B (en) * 2020-12-02 2022-03-22 深圳大学 Method and system for rapidly expanding phase
CN112630469A (en) * 2020-12-07 2021-04-09 清华大学深圳国际研究生院 Three-dimensional detection method based on structured light and multi-light-field camera
CN112630469B (en) * 2020-12-07 2023-04-25 清华大学深圳国际研究生院 Three-dimensional detection method based on structured light and multiple light field cameras
CN112945141A (en) * 2021-01-29 2021-06-11 中北大学 Structured light rapid imaging method and system based on micro-lens array
CN112945141B (en) * 2021-01-29 2023-03-14 中北大学 Structured light rapid imaging method and system based on micro-lens array
CN113205592A (en) * 2021-05-14 2021-08-03 湖北工业大学 Light field three-dimensional reconstruction method and system based on phase similarity
CN113670218A (en) * 2021-08-12 2021-11-19 北京航空航天大学 Method and device for measuring three-dimensional deformation of wing



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant