CN115327684B - Super-structured lens, three-dimensional imaging system, method, electronic device, and storage medium - Google Patents


Info

Publication number
CN115327684B
Authority
CN
China
Prior art keywords
polarization component
handed polarization
super
lens
dimensional imaging
Prior art date
Legal status
Active
Application number
CN202211254201.5A
Other languages
Chinese (zh)
Other versions
CN115327684A (en)
Inventor
傅翼斐
潘美妍
郑梦洁
陈皓
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN202211254201.5A priority Critical patent/CN115327684B/en
Publication of CN115327684A publication Critical patent/CN115327684A/en
Application granted granted Critical
Publication of CN115327684B publication Critical patent/CN115327684B/en

Classifications

    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00 - Simple or compound lenses
    • G02B3/0006 - Arrays
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B21/00 - Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01B21/20 - Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant, for measuring contours or curvatures, e.g. determining profile
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00 - Optical elements other than lenses
    • G02B5/30 - Polarising elements

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application belongs to the technical field of micro-nano optics and discloses a super-structured lens, a three-dimensional imaging system, a three-dimensional imaging method, an electronic device, and a storage medium. The super-structured lens comprises a micro-nano array group that decomposes an incident light wave of arbitrary polarization into a left-handed polarization component and a right-handed polarization component carrying depth information of the measured object, the two components being focused at different focal planes; after simple subsequent processing combined with a three-dimensional imaging algorithm, high-resolution three-dimensional information (i.e., along the X, Y, and Z axes) of the measured object can be obtained. The invention has a simple structure and a good imaging effect.

Description

Super-structured lens, three-dimensional imaging system, method, electronic device, and storage medium
Technical Field
The application relates to the technical field of micro-nano optics, in particular to a super-structured lens, a three-dimensional imaging system, a three-dimensional imaging method, electronic equipment and a storage medium.
Background
With the rapid development of information technology, the demands placed on imaging and inspection of measured objects in engineering manufacturing, biomedicine, consumer electronics, and other fields are growing steadily. An imaging system that acquires only two-dimensional image information of the measured object no longer suffices, and a variety of three-dimensional imaging methods have therefore been developed. Among them, the defocus measurement method determines the three-dimensional information of the measured object from the relationship between the point spread function spot and the depth position of the object, and it is widely applied in microscopic imaging, materials science, machine vision inspection, and other fields.
Existing imaging systems realize three-dimensional imaging by acquiring three-dimensional imaging auxiliary information (including a left-handed polarization component image, a right-handed polarization component image, and the axial depth of the measured object) and then analyzing and processing it.
However, in the conventional imaging system used for defocus measurement in the field of microscopic imaging, referring to fig. 4, the system comprises a first Fourier lens 1, a modulator 2, a second Fourier lens 3, and a first image sensor 4. In practical use, the measured object is placed on the left side of the first Fourier lens 1 (the arrow on the left of the first Fourier lens 1 in fig. 4). When the illuminating light wave strikes the measured object, the resulting incident light wave passes in turn through the first Fourier lens 1, the modulator 2, and the second Fourier lens 3, and finally forms a left-handed polarization component image and a right-handed polarization component image on the first image sensor 4; these images are then processed with an existing image processing algorithm to obtain the axial depth of the measured object. Such a system, however, has a complicated structure, which hinders the integration and miniaturization of the imaging device.
Therefore, it is necessary to design an integrated, high-resolution three-dimensional imaging optical device to replace the conventional bulky defocus-based three-dimensional imaging system.
Disclosure of Invention
An object of the present application is to provide a super-structured lens, a three-dimensional imaging system, a three-dimensional imaging method, an electronic device, and a storage medium, which can obtain three-dimensional imaging auxiliary information, have a simple structure, and are advantageous for integration and miniaturization of an imaging device.
In a first aspect, the application provides a super-structured lens comprising a substrate and a plurality of micro-nano structures, the micro-nano structures being arranged on one side of the substrate, wherein the plurality of micro-nano structures form a micro-nano array group; when an incident light wave of arbitrary polarization passes through the super-structured lens, the micro-nano array group decomposes the incident light wave into a left-handed polarization component and a right-handed polarization component that carry depth information of the measured object, and the left-handed polarization component and the right-handed polarization component are respectively focused at different focal planes.
By providing the micro-nano array group, the super-structured lens provided by the application can decompose the incident light wave into a left-handed polarization component and a right-handed polarization component that carry depth information of the measured object; compared with a conventional imaging system, its structure is therefore simple, which is favorable to the integration and miniaturization of the imaging device.
Preferably, the total phase distribution of the super-structured lens is defined by a set of formulas that are rendered as equation images in the original publication and are not reproduced here. The quantities appearing in these formulas are: the total phase distributions of the super-structured lens, with L marking the left-handed polarization component and R marking the right-handed polarization component; the position coordinates of the focusing point of the left-handed polarization component; the position coordinates of the focusing point of the right-handed polarization component; a Fresnel transform function; a phase extraction function; the first phase distributions; the spectral modulation functions, for which the phase distribution of a defocus measurement function such as a single-helix point spread function, a double-helix point spread function, or a Tetrapod vortex rotation function (among others) may be chosen; the second phase distributions; the wavelength of the incident light wave in vacuum; the ambient refractive index n; the first focal length; the second focal length; and the coordinates (x, y) of the micro-nano structure (110) relative to the center of the super-structured lens.
In practical application, based on the principles of geometric phase and propagation phase, the length, width, and azimuthal orientation angle of each micro-nano structure on the super-structured lens are precisely designed to regulate the corresponding effective refractive index. This controls the phase difference between the electric-field components of the incident light wave and realizes the required phase conversion, so that the light-field distributions of the left-handed and right-handed polarization components are satisfied and the defocus-based three-dimensional imaging function of a conventional imaging system can be realized. After the light emitted by the measured object passes through the super-structured lens, a left-handed polarization component image and a right-handed polarization component image are obtained on an image sensor behind the lens, and a three-dimensional image is then obtained by image processing of these two images. In this way, the polarization information of the measured object can be acquired quickly, so that imaging with richer detail and higher contrast can subsequently be achieved in combination with an image processing algorithm.
Preferably, the micro-nano structure is made of TiO2, Si, GaN, Si3N4, Ge, PbTe, ZnSe, or CaF.
In practical application, the material of the micro-nano structure is selected from a dielectric material with high refractive index and low loss at a target wavelength, and the focusing efficiency of the super-structure lens is favorably improved.
In a second aspect, the application provides a three-dimensional imaging system comprising the super-structured lens of the first aspect and an image sensor, wherein the image sensor is arranged parallel to the super-structured lens on the side on which the micro-nano structures are disposed.
In a third aspect, the present application provides a three-dimensional imaging method, based on the three-dimensional imaging system in the second aspect, including the following steps:
s1, irradiating a measured object by adopting modulated irradiation light waves to obtain a left-handed polarization component image and a right-handed polarization component image;
s2, acquiring a first Gaussian spot and a second Gaussian spot according to the left-handed polarization component image and the right-handed polarization component image;
and S3, acquiring the axial depth from the measured object to the super-structured lens according to the first Gaussian spot and the second Gaussian spot.
Preferably, the spectral modulation function that modulates the illuminating light wave is a double helix point spread function.
Preferably, step S2 comprises:
s201, preprocessing the left-handed polarization component image and the right-handed polarization component image by using an image optimization processing algorithm to obtain a first clear image and a second clear image;
s202, respectively obtaining a first Gaussian spot and a second Gaussian spot according to the first clear image and the second clear image.
The image optimization processing algorithm may include, but is not limited to, image digitization, image logic operations, image segmentation and enhancement, and the like; in this way, the first clear image and the second clear image are obtained.
Preferably, step S3 comprises:
s301, acquiring first position information of a first geometric center point of a first Gaussian light spot and second position information of a second geometric center point of a second Gaussian light spot;
s302, acquiring a rotation angle between the first Gaussian spot and the second Gaussian spot according to the first position information and the second position information;
and S303, inquiring corresponding defocusing amount in a pre-calibrated defocusing amount-rotation angle relation coordinate graph according to the rotation angle, and taking the corresponding defocusing amount as the axial depth from the measured object to the super-structure lens.
From the above, the three-dimensional imaging method provided by the application irradiates the measured object with a modulated illuminating light wave to obtain a left-handed polarization component image and a right-handed polarization component image; acquires a first Gaussian spot and a second Gaussian spot from the left-handed and right-handed polarization component images; and acquires the axial depth from the measured object to the super-structured lens from the first Gaussian spot and the second Gaussian spot. Compared with using a conventional imaging system to obtain the left-handed polarization component image, the right-handed polarization component image, and the axial depth of the measured object, the super-structured lens adopted in this application is more convenient and more accurate, which facilitates subsequent high-precision three-dimensional imaging of the object from the three-dimensional imaging auxiliary information.
In a fourth aspect, the present application provides an electronic device comprising a processor and a memory, wherein the memory stores computer readable instructions, and the computer readable instructions, when executed by the processor, perform the steps of the method as provided in the third aspect.
In a fifth aspect, the present application provides a storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method as provided in the third aspect above.
In summary, the super-structured lens, the three-dimensional imaging system, the three-dimensional imaging method, the electronic device and the storage medium provided by the present application have the following advantages:
the super-structure lens integrates the functions of optical elements such as a phase plate, a polarizer, a Fourier lens and the like in the traditional three-dimensional imaging system, retains the functions of the traditional three-dimensional imaging system and is beneficial to the integration and the miniaturization of imaging equipment;
the depth information of the object to be measured can be converted into plane information so as to realize real-time recording;
after being integrated with an image sensor, the super-structured lens not only provides rich details such as the surface texture of the measured object, but also improves the depth resolution.
Drawings
Fig. 1 is a front view of a super-structured lens provided in an embodiment of the present application.
Fig. 2 is a schematic structural diagram of a three-dimensional imaging system according to an embodiment of the present application.
Fig. 3 is a front view of a micro-nano structure provided in an embodiment of the present application.
Fig. 4 is a schematic structural diagram of a conventional imaging system provided in an embodiment of the present application.
Fig. 5 is a diagram of an imaging effect provided by an embodiment of the present application.
Fig. 6 is a defocus amount-rotation angle relationship diagram provided in the embodiment of the present application.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Description of the reference symbols: 100. a substrate; 110. a micro-nano structure; 200. an image sensor; 1. a first Fourier lens; 2. a modulator; 3. a second Fourier lens; 4. a first image sensor; 301. a processor; 302. a memory; 303. a communication bus.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, and not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not construed as indicating or implying relative importance.
Referring to fig. 1 and 2, the super-structure lens provided by the present application includes a substrate 100 and a plurality of micro-nano structures 110, the micro-nano structures 110 are disposed on one side of the substrate 100, wherein the micro-nano structures 110 form a micro-nano array group, when an incident light wave with any polarization passes through the super-structure lens, the micro-nano array group decomposes the incident light wave into a left-handed polarization component and a right-handed polarization component carrying depth information of an object to be measured, and the left-handed polarization component and the right-handed polarization component are respectively focused on different focal planes.
In practical applications, when the incident light wave contains non-circularly polarized light, that light can be regarded as the superposition of left-circularly polarized light with a coefficient of 1 and right-circularly polarized light with a coefficient of 1; when the incident light wave contains left-circularly polarized light, it can be regarded as the superposition of left-circularly polarized light with a coefficient of 1 and right-circularly polarized light with a coefficient of 0; and when the incident light wave contains right-circularly polarized light, it can be regarded as the superposition of left-circularly polarized light with a coefficient of 0 and right-circularly polarized light with a coefficient of 1. In other words, the incident light wave is a weighted sum of left-circularly polarized light and right-circularly polarized light.
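For illustration only (this code is not part of the patent), the short NumPy sketch below expresses the same weighted-sum view by projecting an arbitrary Jones vector onto the circular-polarization basis; the 45-degree input and the variable names are hypothetical, and the handedness sign convention may differ from the one used in the patent.

    import numpy as np

    # Circular-polarization basis vectors (Jones formalism, x/y field components).
    LCP = np.array([1, 1j]) / np.sqrt(2)   # left-circular (sign convention assumed)
    RCP = np.array([1, -1j]) / np.sqrt(2)  # right-circular

    def circular_components(jones):
        """Project an arbitrary Jones vector onto the LCP/RCP basis."""
        a_l = np.vdot(LCP, jones)  # coefficient of the left-handed component
        a_r = np.vdot(RCP, jones)  # coefficient of the right-handed component
        return a_l, a_r

    # Example: 45-degree linearly polarized light (hypothetical input).
    jones_45 = np.array([1, 1]) / np.sqrt(2)
    a_l, a_r = circular_components(jones_45)
    print(abs(a_l) ** 2, abs(a_r) ** 2)  # equal power in both circular components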
By providing the micro-nano array group, the super-structured lens provided by the application can decompose the incident light wave into a left-handed polarization component and a right-handed polarization component carrying depth information of the measured object, which makes it convenient to subsequently obtain the left-handed and right-handed polarization component images; compared with a conventional imaging system, its structure is therefore simple, which is favorable to the integration and miniaturization of the imaging device.
In a further embodiment, the total phase distribution of the super-structured lens is defined by the formulas referred to above, which are rendered as equation images in the original publication and are not reproduced here. The quantities appearing in these formulas are: the total phase distributions of the super-structured lens, with L marking the left-handed polarization component and R marking the right-handed polarization component; the position coordinates of the focusing point of the left-handed polarization component; the position coordinates of the focusing point of the right-handed polarization component; a Fresnel transform function; a phase extraction function; the first phase distributions; the spectral modulation functions, for which the phase distribution of a defocus measurement function such as a single-helix point spread function, a double-helix point spread function, or a Tetrapod vortex rotation function (among others) may be chosen; the second phase distributions; the wavelength of the incident light wave in vacuum; the ambient refractive index n; the first focal length; the second focal length; and the coordinates (x, y) of the micro-nano structure (110) relative to the center of the super-structured lens.
The first phase distribution and the second phase distribution are respectively the same as the phase distribution of the first Fourier lens 1 and that of the second Fourier lens 3 in the conventional imaging system shown in fig. 4, the first focal length and the second focal length are respectively equal to the focal lengths of the first Fourier lens 1 and of the second Fourier lens 3, and both are obtained by prior-art methods.
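As a rough illustration of such a lens phase distribution (a sketch under assumptions, not the patent's actual formula), the snippet below evaluates the standard hyperbolic phase profile of an ideal focusing lens; the aperture, wavelength, and focal length are hypothetical values.

    import numpy as np

    def lens_phase(x, y, wavelength, focal_length, n_ambient=1.0):
        """Hyperbolic lens phase profile, wrapped to [0, 2*pi)."""
        phi = -(2 * np.pi * n_ambient / wavelength) * (
            np.sqrt(x**2 + y**2 + focal_length**2) - focal_length
        )
        return np.mod(phi, 2 * np.pi)

    # Hypothetical 200 x 200 sample grid over a 100-micron aperture.
    coords = np.linspace(-50e-6, 50e-6, 200)
    X, Y = np.meshgrid(coords, coords)
    phase_map = lens_phase(X, Y, wavelength=532e-9, focal_length=500e-6)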
The spectral modulation function may be a defocus measurement function such as an existing single-helix point spread function or a Tetrapod vortex rotation function, but is not limited thereto.
In practical application, based on the principles of geometric phase and propagation phase, the length, width, and azimuthal orientation angle of each micro-nano structure 110 on the super-structured lens are precisely designed to regulate the corresponding effective refractive index, thereby controlling the phase difference between the electric-field components of the incident light wave and realizing the required phase conversion, as shown in fig. 3. In this way the light-field distributions of the left-handed and right-handed polarization components are satisfied, and the defocus-based three-dimensional imaging function of a conventional imaging system can be realized: after the light emitted by the measured object passes through the super-structured lens, a left-handed polarization component image and a right-handed polarization component image are obtained on the image sensor 200 behind the lens. In this way, the polarization information of the measured object can be acquired quickly, so that imaging with richer detail and higher contrast can subsequently be achieved in combination with an image processing algorithm.
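The patent does not spell out how the two target phase profiles are split between the pillar orientation (geometric phase) and the pillar cross-section (propagation phase); the sketch below shows the textbook mapping often used for such designs, assuming each micro-nano structure behaves as a half-wave plate and that the left-handed component acquires +2θ while the right-handed component acquires -2θ (the sign convention may be reversed).

    import numpy as np

    def pillar_parameters(phi_left, phi_right):
        """Map target LCP/RCP phases to per-pillar design parameters.

        Assumes a half-wave-plate-like pillar: rotating it by theta adds +2*theta
        to the left-handed component and -2*theta to the right-handed one, on top
        of a polarization-independent propagation phase delta.
        """
        delta = 0.5 * (phi_left + phi_right)   # propagation phase (sets pillar size)
        theta = 0.25 * (phi_left - phi_right)  # in-plane orientation angle (radians)
        return np.mod(delta, 2 * np.pi), np.mod(theta, np.pi)

    # Hypothetical target phases at one lattice site.
    delta, theta = pillar_parameters(phi_left=1.3, phi_right=4.1)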
In some embodiments, the micro-nano structure 110 is made of TiO2, Si, GaN, Si3N4, Ge, PbTe, ZnSe, or CaF, but is not limited thereto. In practical application, the micro-nano structure 110 is made of a dielectric material with a high refractive index and low loss at the target wavelength (the target wavelength being the nominal wavelength of the incident light wave in use), which improves the focusing efficiency of the super-structured lens. When the target wavelength is in the visible band, the micro-nano structure 110 may be made of TiO2, Si, GaN, Si3N4, or the like; when the target wavelength is in the infrared band, the micro-nano structure 110 may be made of Si, Ge, PbTe, ZnSe, CaF, or the like.
With reference to fig. 2, the present application provides a three-dimensional imaging system comprising the above super-structured lens and an image sensor 200, the image sensor 200 being arranged parallel to the super-structured lens on the side on which the micro-nano structures 110 are disposed. The arrow on the left side of the super-structured lens represents the measured object, and d denotes the axial depth from the measured object to the super-structured lens.
In practical application, the measured object is placed on the side of the super-structured lens facing away from the image sensor. When an incident light wave passes through the super-structured lens from the object side, the micro-nano array group decomposes it into a left-handed polarization component and a right-handed polarization component that illuminate the image sensor 200, and the image sensor 200 then records a left-handed polarization component image and a right-handed polarization component image.
The application also provides a three-dimensional imaging method based on the above three-dimensional imaging system, comprising the following steps:
s1, irradiating a measured object by adopting modulated irradiation light waves to obtain a left-handed polarization component image and a right-handed polarization component image;
s2, acquiring a first Gaussian spot and a second Gaussian spot according to the left-handed polarization component image and the right-handed polarization component image;
and S3, acquiring the axial depth from the measured object to the super-structure lens according to the first Gaussian spot and the second Gaussian spot.
In some preferred embodiments:
the spectral modulation function that modulates the illuminating light wave is a double helix point spread function.
The double-helix point spread function forms two discrete Gaussian spots of equal brightness at the focal plane by modulating the phase of the illuminating light wave. Unlike a conventional point spread function, the size and shape of its spots do not change as the measured object defocuses; instead, the angle formed by the two Gaussian spots varies linearly with the defocus of the measured object. The double-helix point spread function can be generated by linearly superposing several specific Laguerre-Gaussian (LG) beam modes; the light-field distribution of an LG beam mode is given by a formula rendered as equation images in the original publication.
The quantities appearing in that formula are: the light-field distribution of the LG beam mode; the position coordinates (x, y) of a pixel point on the focal plane; the axial distance z from the pixel point on the focal plane to the super-structured lens; the Rayleigh radius; a generalized Laguerre polynomial; the mode indices n and m of the LG beam; the azimuthal angle; the waist radius; the beam radius; and the imaginary unit.
In practical application, the values of n and m should satisfy a relation given as an equation image in the original publication, in which k is a natural number. The linear slope of the spot rotation angle of the double-helix point spread function formed by superposing LG beam modes is governed by a quantity (also given as an image) involving the difference in n between any two of the superposed LG beam modes and the difference in m between any two of the superposed LG beam modes.
Preferably, the present application superposes several LG beam modes chosen according to this relation (the specific choice is given as an image in the original publication) to form the double-helix point spread function, and thereby obtains a left-handed polarization component image and a right-handed polarization component image containing the depth information, surface texture, and other detailed features of the target object (i.e., the measured object). An illustrative mode superposition is sketched below.
Referring to fig. 5, in practical application, because the complex ambient light is insensitive to polarization, both the left-handed polarization component image and the right-handed polarization component image carry the intensity information of the complex ambient light, while the right-handed circularly polarized component modulated by the super-structured lens produces the first Gaussian spot and the second Gaussian spot in the left-handed polarization component image. Here, LCP denotes the left-handed polarization component image and RCP the right-handed polarization component image.
In some embodiments, step S2 comprises:
s201, preprocessing a left-handed polarization component image and a right-handed polarization component image by using an image optimization processing algorithm to obtain a first clear image and a second clear image;
s202, respectively obtaining a first Gaussian spot and a second Gaussian spot according to the first clear image and the second clear image.
The image optimization processing algorithm may include, but is not limited to, image digitization, image logic operations, image segmentation and enhancement, and the like, and thereby yields the first clear image and the second clear image. The first clear image still contains the first Gaussian spot and the second Gaussian spot; subtracting the second clear image from the first clear image therefore filters out the intensity contribution of the complex ambient light and leaves an image containing only the first Gaussian spot and the second Gaussian spot, as sketched below.
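A minimal sketch of this subtraction step, assuming the two component images are available as equally sized arrays, is shown here; the normalization and the threshold value are illustrative choices rather than values from the patent.

    import numpy as np

    def isolate_spots(lcp_image, rcp_image, threshold=0.1):
        """Subtract the RCP image from the LCP image to suppress ambient light,
        keeping only the two Gaussian spots produced by the super-structured lens."""
        lcp = lcp_image.astype(float) / lcp_image.max()
        rcp = rcp_image.astype(float) / rcp_image.max()
        diff = np.clip(lcp - rcp, 0.0, None)   # ambient light cancels, spots remain
        return np.where(diff > threshold, diff, 0.0)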
In a further embodiment, step S3 comprises:
s301, acquiring first position information of a first geometric center point of a first Gaussian spot and second position information of a second geometric center point of a second Gaussian spot;
s302, acquiring a rotation angle between the first Gaussian spot and the second Gaussian spot according to the first position information and the second position information;
and S303, inquiring corresponding defocusing amount in a pre-calibrated defocusing amount-rotation angle relation coordinate graph according to the rotation angle, and taking the corresponding defocusing amount as the axial depth from the measured object to the super-structure lens.
In step S301, the first position information is acquired as follows: the coordinates (A, B) of the leftmost pixel of the first Gaussian spot, the coordinates (C, D) of its rightmost pixel, the coordinates (E, F) of its uppermost pixel, and the coordinates (G, H) of its lowermost pixel are obtained; (C + A)/2 is taken as the abscissa of the first geometric center point and (F + H)/2 as its ordinate. The second position information is obtained in the same way.
In step S302, the rotation angle is the angle between the vector pointing from the first geometric center point to the second geometric center point and the positive direction of the x-axis, counted positive in the counterclockwise direction; as shown in fig. 5, β is the rotation angle.
In step 303, a pre-calibrated defocus-rotation angle relationship coordinate graph is shown in fig. 6.
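The sketch below strings steps S301 to S303 together: spot centers from the extreme pixels, the rotation angle of the line joining them, and a lookup in a calibration curve; the calibration arrays are hypothetical stand-ins for the pre-calibrated defocus-rotation relationship of fig. 6.

    import numpy as np

    def spot_center(mask):
        """Geometric center from the leftmost/rightmost/top/bottom pixels of a spot mask."""
        ys, xs = np.nonzero(mask)
        return (xs.min() + xs.max()) / 2.0, (ys.min() + ys.max()) / 2.0

    def rotation_angle(center1, center2):
        """Angle (degrees) of the vector from center1 to center2 against the +x axis."""
        dx, dy = center2[0] - center1[0], center2[1] - center1[1]
        return np.degrees(np.arctan2(dy, dx))

    # Hypothetical calibration curve: rotation angle (deg) vs. defocus (um), cf. fig. 6.
    calib_angle_deg = np.array([-60.0, -30.0, 0.0, 30.0, 60.0])
    calib_defocus_um = np.array([-20.0, -10.0, 0.0, 10.0, 20.0])

    def axial_depth(mask1, mask2):
        """Look up the defocus amount corresponding to the measured rotation angle."""
        beta = rotation_angle(spot_center(mask1), spot_center(mask2))
        return np.interp(beta, calib_angle_deg, calib_defocus_um)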
From the above, the three-dimensional imaging method provided by the application irradiates the measured object with a modulated illuminating light wave to obtain a left-handed polarization component image and a right-handed polarization component image; acquires a first Gaussian spot and a second Gaussian spot from the left-handed and right-handed polarization component images; and acquires the axial depth from the measured object to the super-structured lens from the first Gaussian spot and the second Gaussian spot. Compared with using a conventional imaging system to obtain the left-handed polarization component image, the right-handed polarization component image, and the axial depth of the measured object, the super-structured lens adopted in this application is more convenient and more accurate, which facilitates subsequent high-precision three-dimensional imaging of the object from the three-dimensional imaging auxiliary information.
In summary, the super-structured lens, the three-dimensional imaging system, the three-dimensional imaging method, the electronic device and the storage medium provided by the present application have the following advantages:
the super-structure lens integrates the functions of optical elements such as a phase plate, a polarizer, a Fourier lens and the like in the traditional three-dimensional imaging system, retains the functions of the traditional three-dimensional imaging system and is beneficial to the integration and the miniaturization of imaging equipment;
the depth information of the object to be measured can be converted into plane information so as to realize real-time recording;
after being integrated with an image sensor, the super-structured lens not only provides rich details such as the surface texture of the measured object, but also improves the depth resolution.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. The electronic device comprises a processor 301 and a memory 302, which are interconnected and communicate with each other via a communication bus 303 and/or another form of connection mechanism (not shown). The memory 302 stores computer-readable instructions executable by the processor 301; when the electronic device runs, the processor 301 executes these instructions to perform the method in any optional implementation of the above embodiments and thereby implement the following functions: irradiating the measured object with a modulated illuminating light wave to obtain a left-handed polarization component image and a right-handed polarization component image; acquiring a first Gaussian spot and a second Gaussian spot from the left-handed and right-handed polarization component images; and acquiring the axial depth from the measured object to the super-structured lens from the first Gaussian spot and the second Gaussian spot.
The present application provides a storage medium having stored thereon a computer program which, when executed by a processor, performs the method in any optional implementation of the above embodiments so as to implement the following functions: irradiating the measured object with a modulated illuminating light wave to obtain a left-handed polarization component image and a right-handed polarization component image; acquiring a first Gaussian spot and a second Gaussian spot from the left-handed and right-handed polarization component images; and acquiring the axial depth from the measured object to the super-structured lens from the first Gaussian spot and the second Gaussian spot. The storage medium may be implemented by any type of volatile or nonvolatile storage device or a combination thereof, such as a Static Random Access Memory (SRAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), an Erasable Programmable Read-Only Memory (EPROM), a Programmable Read-Only Memory (PROM), a Read-Only Memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disk.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (9)

1. A super-structured lens comprising a substrate (100) and a plurality of micro-nano structures (110), the micro-nano structures (110) being arranged on one side of the substrate (100), characterized in that the plurality of micro-nano structures (110) form a micro-nano array group; when an incident light wave of arbitrary polarization passes through the super-structured lens, the micro-nano array group decomposes the incident light wave into a left-handed polarization component and a right-handed polarization component carrying depth information of the measured object, and the left-handed polarization component and the right-handed polarization component are respectively focused on different focal planes; the total phase distribution of the super-structured lens is defined by a set of formulas that are rendered as equation images in the original publication and are not reproduced here, in which the quantities are: the total phase distributions of the super-structured lens, with L marking the left-handed polarization component and R marking the right-handed polarization component; the position coordinates of the focusing point of the left-handed polarization component; the position coordinates of the focusing point of the right-handed polarization component; a Fresnel transform function; a phase extraction function; the first phase distributions; the spectral modulation functions, being the phase distribution of a defocus measurement function such as a single-helix point spread function, a double-helix point spread function, or a Tetrapod vortex rotation function; the second phase distributions; the wavelength of the incident light wave in vacuum; the ambient refractive index n; the first focal length; the second focal length; and the coordinates (x, y) of the micro-nano structure (110) relative to the center of the super-structured lens.
2. The super-structured lens according to claim 1, wherein the micro-nano structure (110) is made of TiO2, Si, GaN, Si3N4, Ge, PbTe, ZnSe, or CaF.
3. A three-dimensional imaging system comprising the super-structured lens of any one of claims 1-2 and an image sensor (200), wherein the image sensor (200) is disposed in parallel on the side of the super-structured lens on which the micro-nano structure (110) is disposed.
4. A three-dimensional imaging method, characterized in that, based on the three-dimensional imaging system according to claim 3, it comprises the following steps:
s1, irradiating a measured object by adopting modulated irradiation light waves to obtain a left-handed polarization component image and a right-handed polarization component image;
s2, acquiring a first Gaussian spot and a second Gaussian spot according to the left-handed polarization component image and the right-handed polarization component image;
and S3, acquiring the axial depth from the measured object to the super-structured lens according to the first Gaussian spot and the second Gaussian spot.
5. The three-dimensional imaging method according to claim 4, wherein the spectral modulation function modulating the illuminating light wave is a double-helix point spread function.
6. The three-dimensional imaging method according to claim 4, wherein step S2 comprises:
s201, preprocessing the left-handed polarization component image and the right-handed polarization component image by using an image optimization processing algorithm to obtain a first clear image and a second clear image;
s202, respectively obtaining a first Gaussian spot and a second Gaussian spot according to the first clear image and the second clear image.
7. The three-dimensional imaging method according to claim 4, wherein step S3 comprises:
s301, acquiring first position information of a first geometric center point of a first Gaussian spot and second position information of a second geometric center point of a second Gaussian spot;
s302, acquiring a rotation angle between the first Gaussian spot and the second Gaussian spot according to the first position information and the second position information;
and S303, inquiring corresponding defocusing amount in a pre-calibrated defocusing amount-rotation angle relation coordinate graph according to the rotation angle, and taking the corresponding defocusing amount as the axial depth from the measured object to the super-structure lens.
8. An electronic device comprising a processor and a memory, the memory storing computer readable instructions which, when executed by the processor, perform the steps of the three-dimensional imaging method according to any one of claims 4 to 7.
9. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, performs the steps of the three-dimensional imaging method according to any one of claims 4-7.
CN202211254201.5A 2022-10-13 2022-10-13 Super-structured lens, three-dimensional imaging system, method, electronic device, and storage medium Active CN115327684B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211254201.5A CN115327684B (en) 2022-10-13 2022-10-13 Super-structured lens, three-dimensional imaging system, method, electronic device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211254201.5A CN115327684B (en) 2022-10-13 2022-10-13 Super-structured lens, three-dimensional imaging system, method, electronic device, and storage medium

Publications (2)

Publication Number Publication Date
CN115327684A CN115327684A (en) 2022-11-11
CN115327684B true CN115327684B (en) 2023-01-31

Family

ID=83914193

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211254201.5A Active CN115327684B (en) 2022-10-13 2022-10-13 Super-structured lens, three-dimensional imaging system, method, electronic device, and storage medium

Country Status (1)

Country Link
CN (1) CN115327684B (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107340559B (en) * 2017-07-04 2019-07-23 北京理工大学 High efficiency and broad band circular polarization switching device and method based on super clever surface
CN111258059B (en) * 2020-01-21 2021-09-03 中国科学院上海微系统与信息技术研究所 Flexible mobile phone camera optical lens and manufacturing method thereof
WO2021234924A1 (en) * 2020-05-21 2021-11-25 日本電信電話株式会社 Image capturing element and image capturing device
CN113655548A (en) * 2021-07-08 2021-11-16 湖南大学 Optical edge detection design method and device based on super-structured surface
CN114280707B (en) * 2022-03-03 2022-06-03 季华实验室 Full-polarization medium super-structured lens and use method thereof
CN114595636A (en) * 2022-03-11 2022-06-07 清华大学 Monocular snapshot type depth polarization four-dimensional imaging method and system
CN114791670B (en) * 2022-05-13 2023-10-24 华中科技大学 Super-surface-based polarized imaging lens, design method and detection system

Also Published As

Publication number Publication date
CN115327684A (en) 2022-11-11

Similar Documents

Publication Publication Date Title
Luo et al. Computational imaging without a computer: seeing through random diffusers at the speed of light
CN111367088B (en) Orthogonal polarized light imaging diffraction optical device based on super-structured surface
Quirin et al. Depth estimation and image recovery using broadband, incoherent illumination with engineered point spread functions
Li et al. Development status and key technologies of polarization imaging detection
CN110531530B (en) Rapid calculation method for realizing tight focusing of partially coherent light
Zhou et al. Remote phosphor technology for white LED applications: advances and prospects
ZHANG et al. 3D small-field surface imaging based on microscopic fringe projection profilometry: a review
US20230289989A1 (en) Monocular snapshot four-dimensional imaging method and system
Sukharevsky et al. Manipulation of backscattering from a dielectric cylinder of triangular cross-section using the interplay of GO-like ray effects and resonances
Duan et al. Low-complexity adaptive radius outlier removal filter based on PCA for lidar point cloud denoising
Zhang et al. Optical remote sensor for cloud and aerosol from space: past, present and future
CN115327684B (en) Super-structured lens, three-dimensional imaging system, method, electronic device, and storage medium
CN105698677A (en) Surface Plasmon-based four quadrant detector
Blanche et al. A 100,000 scale factor radar range
Estrada et al. DeblurGAN-C: image restoration using GAN and a correntropy based loss function in degraded visual environments
Shao et al. Transparent Shape from a Single View Polarization Image
Chizhevskaya et al. Comparative features of cylindrical electromagnetic black holes with positive and negative refractive indexes
Marks et al. Cone-beam tomography with a digital camera
CN112213704B (en) Target scattering cross section calculation method and device
Rachon et al. Enhanced sub-wavelength focusing by double-sided lens with phase correction in THz range
Zhou et al. Research on autofocus recognition of the LAMOST fiber view camera system under front and back illumination
Elmalem et al. All-optical athermalization of infrared imaging systems using thermally dependent binary phase masks
Huang et al. Active imaging through dense fog by utilizing the joint polarization defogging and denoising optimization based on range-gated detection
Cai et al. Direct calculation of tightly focused field in an arbitrary plane
Kakade et al. Optimising performance of a confocal fluorescence microscope with a differential pinhole

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant