CN112179504A - Single-frame focal plane light intensity image deep learning phase difference method based on grating modulation - Google Patents


Info

Publication number
CN112179504A
CN112179504A (application CN202011031209.6A)
Authority
CN
China
Prior art keywords
focal plane
grating
far
field
short
Prior art date
Legal status
Pending
Application number
CN202011031209.6A
Other languages
Chinese (zh)
Inventor
邱学晶
赵旺
杨超
王帅
许冰
Current Assignee
Institute of Optics and Electronics of CAS
Original Assignee
Institute of Optics and Electronics of CAS
Priority date
Filing date
Publication date
Application filed by Institute of Optics and Electronics of CAS filed Critical Institute of Optics and Electronics of CAS
Priority to CN202011031209.6A priority Critical patent/CN112179504A/en
Publication of CN112179504A publication Critical patent/CN112179504A/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J9/00 Measuring optical phase difference; Determining degree of coherence; Measuring optical wavelength
    • G01J2009/002 Wavefront phase distribution
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G06N3/08 Learning methods

Abstract

The invention discloses a single-frame focal plane light intensity image deep learning phase difference method based on grating modulation. A data set that fully samples the sample space is selected for a convolutional neural network (CNN) to fit the mapping between the far-field light spot and the near-field wavefront phase; after the network training converges, a far-field light spot image is input and the corresponding wavefront aberration is obtained.

Description

Single-frame focal plane light intensity image deep learning phase difference method based on grating modulation
Technical Field
The invention relates to the technical field of phase inversion methods, in particular to a single-frame focal plane light intensity image deep learning phase difference method based on grating modulation.
Background
The phase inversion technique reconstructs the wavefront phase directly from the far-field light intensity distribution; it requires no wavefront sensor and is insensitive to the environment. The Gerchberg-Saxton (GS) algorithm is a classical phase inversion method: using angular-spectrum propagation theory, it iterates between the near-field and far-field complex amplitudes, which are related by a Fourier transform, to estimate the wavefront aberration. The GS algorithm is simple in structure and easy to implement, but because the same far field corresponds to multiple possible wavefronts (the multiple-solution problem), it easily falls into local extrema and its convergence accuracy is low.
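The GS loop described here can be sketched in a few lines of NumPy. The pupil mask, grid size, and iteration count below are illustrative assumptions, not values taken from the patent:

```python
import numpy as np

def gs_phase_retrieval(far_intensity, pupil_mask, n_iter=100, seed=0):
    """Classic Gerchberg-Saxton loop: alternate between the pupil (near field)
    and focal (far field) planes, enforcing the known amplitude constraint
    in each plane while keeping the current phase estimate."""
    rng = np.random.default_rng(seed)
    far_amp = np.sqrt(far_intensity)
    # Start from a random phase guess in the pupil.
    phase = rng.uniform(-np.pi, np.pi, pupil_mask.shape)
    for _ in range(n_iter):
        near = pupil_mask * np.exp(1j * phase)
        far = np.fft.fftshift(np.fft.fft2(near))
        # Keep the far-field phase, replace the amplitude with the measurement.
        far = far_amp * np.exp(1j * np.angle(far))
        near = np.fft.ifft2(np.fft.ifftshift(far))
        phase = np.angle(near)
    return phase
```

Because the same far-field intensity can arise from several wavefronts, this loop can stagnate in a local extremum, which is exactly the multiple-solution weakness the patent describes.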
To overcome the multiple-solution problem of the GS algorithm, the phase difference method was proposed. The phase difference method requires the CCD to make multiple measurements in the in-focus and out-of-focus planes, and it still needs many iterations to converge, so its real-time performance is poor. Greenbaum et al. introduced a non-redundant mask to achieve single-frame focal plane image phase inversion without the CCD measuring the focal and defocus planes multiple times (A. Z. Greenbaum, A. Sivaramakrishnan. In-focus wavefront sensing using non-redundant mask-induced pupil diversity. Optics Express, 2016, 24(14)). However, that method requires the non-redundant mask to move in and out of the optical path repeatedly; the system structure is complex, which limits the algorithm's application in practical wavefront detection. Therefore, improving computational efficiency while guaranteeing that a single-frame focal plane far-field intensity distribution corresponds to a unique near-field wavefront is a problem that urgently needs to be solved.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: to further improve the operation speed while guaranteeing the uniqueness and recovery accuracy of the near-field wavefront phase solution inverted from the far-field light spot. The conventional phase difference method needs the CCD to measure multiple times at the focal plane and a defocus plane of the lens, and the system structure is complex. When a defocus grating is used in close contact with a short-focus lens, the ±1-order diffracted beams of the grating focus in opposite senses, at distances respectively slightly shorter and slightly longer than the focal length of the short-focus lens; the CCD can then simultaneously measure the positive-defocus, negative-defocus and focal plane far-field light intensity distributions at the focal plane of the short-focus lens. In addition, the phase difference method needs many iterations of perturbation and optimization, so its speed is limited by the number of iterations and the per-iteration computation time. Deep learning can extract deep image features automatically and has strong nonlinear fitting capability; on the basis of grating modulation, it can learn the mapping from the far-field light intensity image to the near-field wavefront and avoid the iterative computation of the traditional phase difference method. Accordingly, the invention designs a focal plane wavefront recovery sensor based on grating modulation with a simple structure, good real-time performance and high recovery accuracy; a deep learning algorithm avoids the iterative optimization of the traditional phase difference method, reduces the time consumption, and realizes fast phase inversion from a single-frame focal plane light intensity image.
The technical scheme adopted by the invention to solve the problem is as follows: a single-frame focal plane light intensity image deep learning phase difference method based on grating modulation. A data set that fully samples the sample space is selected for a convolutional neural network (CNN) to fit the mapping between the far-field light spot and the near-field wavefront phase; after the network training converges, a far-field light spot image is input and the corresponding wavefront aberration is obtained. No iterative operation is needed in solving the mapping, which reduces computation time. The specific implementation steps are as follows:
step 1: designing a wavefront sensor based on defocused grating modulation;
step 2: verifying whether the sensor designed in the step 1 can simultaneously obtain positive and negative defocusing and far-field light intensity distribution of a focal plane on the focal plane of the short-focus lens;
Step 3: if step 2 succeeds, collect far-field light spot and near-field wavefront data modulated by the defocus grating according to step 1, using the focal plane far-field images measured by the CCD as samples and the Zernike mode coefficients of the corresponding near-field wavefronts as labels in the data set; if step 2 fails, repeat step 1 and redesign the sensor until step 2 succeeds;
Step 4: configuring a deep learning environment and building a CNN;
Step 5: randomly extract 80% of the samples in the data set as a training set for the network to learn the mapping between far-field light spots and near-field wavefronts; randomly extract half of the remaining 20% of samples as a validation set for tuning the network hyperparameters and verifying the effectiveness of the algorithm; keep the final 10% of samples as a test set for unbiased estimation.
The defocus grating in step 1 can be regarded as an off-axis Fresnel zone plate: it has symmetrically distributed ±1-order diffraction axes carrying a pair of conjugate focal lengths. The defocus grating is used in close contact with the short-focus lens, and the focal length of the short-focus lens is far smaller than that of the defocus grating.
When the defocus grating and the short-focus lens are used in close contact as in step 2, the grating finely adjusts the focusing power of the lens: the focal lengths of the +1-order and -1-order diffracted beams are respectively slightly shorter and slightly longer than the focal length of the lens, so the cross-sections of the ±1-order beams at the focal plane of the short-focus lens are symmetric positive and negative defocus planes. When the CCD is placed at the focal plane of the short-focus lens, it can therefore simultaneously measure the far-field light intensity distributions of the incident wavefront at the focal plane and at the two symmetric defocus planes.
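A minimal numeric sketch of this close-contact combination, treating the ±1 diffraction orders as thin lenses of focal length ±7.5 m stacked on the 200 mm lens (the thin-elements-in-contact formula is standard optics, not quoted from the patent):

```python
def combined_focal_length(f_lens_mm, f_grating_mm, order):
    """Thin-elements-in-contact approximation: the m-th diffraction order of a
    defocus grating acts as a weak lens of focal length f_grating / m, so the
    combined power is 1/f_eff = 1/f_lens + m / f_grating."""
    return 1.0 / (1.0 / f_lens_mm + order / f_grating_mm)

# Values from the embodiment: 200 mm short-focus lens, 7.5 m grating focal length.
f_plus = combined_focal_length(200.0, 7500.0, +1)   # slightly shorter than 200 mm
f_zero = combined_focal_length(200.0, 7500.0, 0)    # unchanged: in-focus image
f_minus = combined_focal_length(200.0, 7500.0, -1)  # slightly longer than 200 mm
```

With these parameters the ±1 orders focus at about 194.8 mm and 205.5 mm, so the CCD at the 200 mm focal plane sees symmetric positive and negative defocus spots next to the in-focus spot.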
In step 3, the number of samples of the data set should be at least ten thousand, that is, the data set should be fully sampled in the sample space.
The deep learning network in step 4 may be a CNN or another deep learning network.
In step 5, the division of the training, validation and test sets may be changed according to the size of the data set.
Compared with the prior art, the invention has the advantages that:
(1) the invention overcomes the multiple-solution problem of traditional single-frame light intensity phase inversion algorithms, and removes the need, present in the traditional phase difference method, for the CCD to measure the far-field light intensity distribution multiple times at the focal and defocus planes;
(2) compared with traditional phase inversion methods, the CNN directly fits the mapping between the far-field light spot distribution and the near-field wavefront to recover the wavefront phase, avoiding the iterative process and improving computational efficiency.
Drawings
FIG. 1 is a schematic diagram of the operating principle of the control algorithm of the present invention;
FIG. 2 is a flow chart of the operation of the present invention;
FIG. 3 is a schematic diagram of an out-of-focus grating;
FIG. 4 is a schematic diagram of a wavefront sensor based on defocused grating modulation;
FIG. 5 is a schematic diagram of the optical characteristics of an out-of-focus grating;
FIG. 6 is a diagram showing the distribution of far field light intensity in the focal plane of a short-focus lens modulated by an out-of-focus grating;
fig. 7 is a schematic diagram of a CNN architecture adopted in the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings in conjunction with specific embodiments.
FIG. 1 is a schematic diagram of a working principle of a single-frame focal plane light intensity image deep learning phase difference method based on grating modulation. Fig. 2 is a work flow diagram of the present invention, and the specific implementation process is as follows:
Step 1: design a wavefront sensor based on defocus grating modulation: place an off-axis Fresnel zone plate in close contact with a short-focus lens, and place a CCD at the focal plane of the short-focus lens. The relevant parameters of the defocus grating are: side length 16 mm; focal length 7.5 m; displacement of the distorted grating slits relative to regular straight-line grating slits 33.75 mm; focal length of the short-focus lens 200 mm; phase step depths of the defocus grating 0.639π and 2π respectively; window opened on the CCD 100 pixel × 100 pixel. FIG. 3 is a schematic diagram of the defocus grating, and FIG. 4 is a schematic diagram of the wavefront sensor based on defocus grating modulation;
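The distorted-grating phase profile in step 1 can be sketched as a linear carrier plus a Fresnel-zone quadratic term, whose ±1 diffraction orders then carry equal and opposite defocus. Only the 16 mm side length and 7.5 m grating focal length come from the embodiment; the grating period and wavelength below are hypothetical placeholders:

```python
import numpy as np

def defocus_grating_phase(n=512, side_mm=16.0, period_mm=0.1,
                          f_grating_mm=7500.0, wavelength_mm=632.8e-6):
    """Illustrative phase map of a defocus (distorted) grating: a linear
    carrier grating plus a quadratic Fresnel-zone term. The m-th diffraction
    order picks up m times the quadratic term, giving the +/-1 orders
    focusing power of +/-1/f_grating. Period and wavelength are assumed."""
    coords = np.linspace(-side_mm / 2, side_mm / 2, n)
    x, y = np.meshgrid(coords, coords)
    carrier = 2 * np.pi * x / period_mm                          # straight-line grating
    defocus = np.pi * (x**2 + y**2) / (wavelength_mm * f_grating_mm)
    return (carrier + defocus) % (2 * np.pi)                     # wrapped to [0, 2*pi)
```

Thresholding this wrapped phase at π would give the binary step grating the embodiment describes; the quadratic term is what bends the straight grating lines into off-axis Fresnel zones.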
Step 2: verify whether the sensor designed in step 1 can simultaneously obtain the positive-defocus, negative-defocus and focal plane far-field light intensity distributions at the focal plane of the short-focus lens. FIG. 5 is a schematic diagram of the optical characteristics of the defocus grating: when the grating and the short-focus lens are used in close contact, the positive-defocus, negative-defocus and focal plane far-field intensity distributions can be measured simultaneously at the focal plane of the short-focus lens. FIG. 6(a1) shows the far-field light intensity distribution at the focal plane of the short-focus lens, and FIGS. 6(a2), 6(a3) and 6(a4) are respectively the positive-defocus spot, focal plane spot and negative-defocus spot extracted from FIG. 6(a1);
Step 3: if step 2 succeeds, collect far-field light spot and near-field wavefront data modulated by the defocus grating according to step 1, using the focal plane far-field images measured by the CCD as samples and the Zernike mode coefficients of the corresponding near-field wavefronts as labels in the data set; if step 2 fails, repeat step 1 and redesign the sensor until step 2 succeeds;
Step 4: configure a deep learning environment and build a CNN. FIG. 7 is a schematic diagram of the CNN architecture of the invention. The CNN comprises 11 layers in total: 1 input layer, 4 convolutional layers, 4 pooling layers and 2 fully connected layers. The input layer takes samples of size 100 × 100 × 3, the 3 channels being the positive-defocus, negative-defocus and focal plane far-field spots. The convolution kernel sizes of the 4 convolutional layers are respectively 5 × 5, 4 × 4 and 4 × 4, and their channel counts are respectively 16, 32 and 32. Max pooling is used, with a stride of 2 in each pooling layer. The two fully connected layers have 300 and 20 nodes respectively, and the network outputs Zernike coefficients of orders 4 to 23. The CNN uses the Adam optimizer with an initial learning rate of 10⁻³. To avoid overfitting, batch normalization is applied to each convolutional layer. The number of epochs is set to 200 and the batch size to 100.
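A PyTorch sketch of the architecture described in step 4. The translated text lists only three kernel sizes and three channel counts for the four convolutional layers, so the repeated entries below (a second 5 × 5 kernel, a second 16-channel stage) are assumptions, as is the use of ReLU activations:

```python
import torch
import torch.nn as nn

class FocalPlaneCNN(nn.Module):
    """Sketch of the patent's CNN: 3-channel 100x100 input (positive-defocus,
    negative-defocus, focal plane spots), four conv + batch-norm + max-pool
    stages, then two fully connected layers producing 20 Zernike coefficients
    (orders 4 to 23). Kernel/channel values marked below are assumed."""
    def __init__(self):
        super().__init__()
        # (in_channels, out_channels, kernel); second and fourth rows assumed.
        specs = [(3, 16, 5), (16, 16, 5), (16, 32, 4), (32, 32, 4)]
        stages = []
        for c_in, c_out, k in specs:
            stages += [nn.Conv2d(c_in, c_out, k, padding="same"),
                       nn.BatchNorm2d(c_out),   # "batch regularization" in the text
                       nn.ReLU(),
                       nn.MaxPool2d(2)]         # stride-2 pooling per the patent
        self.features = nn.Sequential(*stages)
        # 100 -> 50 -> 25 -> 12 -> 6 after four stride-2 pools.
        self.head = nn.Sequential(nn.Flatten(),
                                  nn.Linear(32 * 6 * 6, 300), nn.ReLU(),
                                  nn.Linear(300, 20))

    def forward(self, x):
        return self.head(self.features(x))
```

Training would pair this with `torch.optim.Adam(model.parameters(), lr=1e-3)` and a mean-squared-error loss on the Zernike coefficient labels, matching the optimizer and learning rate stated above.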
Step 5: randomly extract 80% of the samples in the data set as a training set for the network to learn the mapping between far-field light spots and near-field wavefronts; randomly extract half of the remaining 20% of samples as a validation set for tuning the network hyperparameters and verifying the effectiveness of the algorithm; keep the final 10% of samples as a test set for unbiased estimation.
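The 80/10/10 split in step 5 can be sketched as a single shuffled partition of sample indices; the random seed and the ten-thousand-sample count (the minimum the patent suggests) are illustrative:

```python
import numpy as np

def split_dataset(n_samples, seed=0):
    """80/10/10 split: shuffle the indices once, then carve out training,
    validation, and test subsets in that order."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    n_train = int(0.8 * n_samples)
    n_val = int(0.1 * n_samples)
    return (idx[:n_train],                       # training set (80%)
            idx[n_train:n_train + n_val],        # validation set (10%)
            idx[n_train + n_val:])               # test set (10%)

train_idx, val_idx, test_idx = split_dataset(10000)
```

Shuffling once before partitioning keeps the three sets disjoint, so the test set remains an unbiased estimate as the patent requires.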
After the network is trained to convergence, inputting a single-frame focal plane light intensity image is enough for the network to output the near-field wavefront information corresponding to the far-field light spot. No iteration is involved, so the computation speed is greatly improved; simulation results show that a single wavefront recovery can be completed in less than 0.6 milliseconds.
The above description is only one embodiment of the present invention, but the scope of the invention is not limited thereto; modifications or substitutions that a person skilled in the art can readily conceive within the technical scope of the invention are included in the scope of the invention.

Claims (6)

1. A single-frame focal plane light intensity image deep learning phase difference method based on grating modulation is characterized by comprising the following steps:
step 1: designing a wavefront sensor based on defocused grating modulation;
step 2: verifying whether the sensor designed in the step 1 can simultaneously obtain positive and negative defocusing and far-field light intensity distribution of a focal plane on the focal plane of the short-focus lens;
Step 3: if step 2 succeeds, collecting far-field light spot and near-field wavefront data modulated by the defocus grating according to step 1, using the focal plane far-field images measured by the CCD as samples and the Zernike mode coefficients of the corresponding near-field wavefronts as labels in the data set; if step 2 fails, repeating step 1 and redesigning the sensor until step 2 succeeds;
Step 4: configuring a deep learning environment and building a convolutional neural network (CNN);
Step 5: randomly extracting 80% of the samples in the data set as a training set for the network to learn the mapping between far-field light spots and near-field wavefronts; randomly extracting half of the remaining 20% of samples as a validation set for tuning the network hyperparameters and verifying the effectiveness of the algorithm; keeping the final 10% of samples as a test set for unbiased estimation.
2. The method according to claim 1, characterized in that: the defocus grating in step 1 can be regarded as an off-axis Fresnel zone plate with symmetrically distributed ±1-order diffraction axes carrying a pair of conjugate focal lengths; the defocus grating is used in close contact with the short-focus lens, and the focal length of the short-focus lens is far smaller than that of the defocus grating.
3. The method according to claim 1, characterized in that: when the defocus grating and the short-focus lens are used in close contact in step 2, the grating finely adjusts the focusing power of the lens; the focal lengths of the +1-order and -1-order diffracted beams are respectively slightly shorter and slightly longer than the focal length of the lens, so the cross-sections of the ±1-order beams at the focal plane of the short-focus lens are symmetric positive and negative defocus planes; when the CCD is placed at the focal plane of the short-focus lens, it can therefore simultaneously measure the far-field light intensity distributions of the incident wavefront at the focal plane and at the two symmetric defocus planes.
4. The method according to claim 1, characterized in that: the number of samples in the data set in step 3 should be at least ten thousand, i.e. the sample space should be sufficiently sampled.
5. The method according to claim 1, characterized in that: the deep learning network in step 4 may be a CNN or another deep learning network.
6. The method according to claim 1, characterized in that: the division of the training, validation and test sets in step 5 may be changed appropriately according to the size of the data set.
CN202011031209.6A (priority/filing date 2020-09-27): Single-frame focal plane light intensity image deep learning phase difference method based on grating modulation. Status: Pending.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011031209.6A CN112179504A (en) 2020-09-27 2020-09-27 Single-frame focal plane light intensity image depth learning phase difference method based on grating modulation


Publications (1)

Publication Number Publication Date
CN112179504A 2021-01-05

Family

ID=73945131


Country Status (1)

Country Link
CN (1) CN112179504A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113158487A (en) * 2021-05-08 2021-07-23 大连海事大学 Wavefront phase difference detection method based on long-short term memory depth network

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102095503A (en) * 2010-11-30 2011-06-15 中国科学院国家天文台南京天文光学技术研究所 Wavefront detection and reconstruction method based on differential sensor
CN102288305A (en) * 2011-07-18 2011-12-21 中国科学院光电技术研究所 Wave-front sensor of self-adaptive optical system and detecting method thereof
CN102331303A (en) * 2011-08-05 2012-01-25 中国科学院光电技术研究所 Grating-based phase difference wavefront sensor
CN102607719A (en) * 2011-06-24 2012-07-25 北京理工大学 Wave-front aberration detection device based on transverse shearing interference for beam expanding collimation system
CN107462150A (en) * 2017-07-19 2017-12-12 哈尔滨工程大学 Double-view field digital holographic detection device and method based on One Dimension Periodic grating and point diffraction
CN108197693A (en) * 2017-09-21 2018-06-22 中国科学院长春光学精密机械与物理研究所 Compression evolution Hybrid Particle Swarm Optimization based on phase difference wavefront reconstruction
CN109031654A (en) * 2018-09-11 2018-12-18 安徽农业大学 A kind of adaptive optics bearing calibration and system based on convolutional neural networks
CN111626997A (en) * 2020-05-21 2020-09-04 浙江大学 Method for directly detecting optical distortion phase by high-speed single image based on deep learning



Similar Documents

Publication Publication Date Title
CN110956256B (en) Method and device for realizing Bayes neural network by using memristor intrinsic noise
CN109459923B (en) Holographic reconstruction algorithm based on deep learning
CN113158487B (en) Wavefront phase difference detection method based on long-short term memory depth network
CN107562992B (en) Photovoltaic array maximum power tracking method based on SVM and particle swarm algorithm
CN111579097B (en) High-precision optical scattering compensation method based on neural network
JP2022539659A (en) Method and apparatus for learning and testing an object detection network that detects objects on images using attention maps
Sun et al. Identification and adaptive control of a high-contrast focal plane wavefront correction system
CN111582435A (en) Diffraction depth neural network system based on residual error network
CN113033796A (en) Image identification method of all-optical nonlinear diffraction neural network
CN107329233A (en) A kind of droplet type PCR instrument Atomatic focusing method based on neutral net
CN112179504A (en) Single-frame focal plane light intensity image depth learning phase difference method based on grating modulation
Cho et al. One-shot neural architecture search via compressive sensing
CN106324854B (en) A kind of Phase-retrieval method based on the rectangular diffraction element of binary
CN113222250B (en) High-power laser device output waveform prediction method based on convolutional neural network
Markley et al. Physics-based learned diffuser for single-shot 3d imaging
CN112099229B (en) High-speed self-adaptive optical closed-loop control method based on far field
CN114022730B (en) Point target phase retrieval method based on self-supervision learning neural network
CN112179503A (en) Deep learning wavefront restoration method based on sparse subaperture shack-Hartmann wavefront sensor
CN111273533A (en) Coaxial digital holographic automatic focusing method and system
Petković et al. RETRACTED: Adaptive neuro-fuzzy prediction of modulation transfer function of optical lens system
CN112668797B (en) Long-short-period traffic prediction method
Zhao et al. Deep learning in power systems
CN112197876A (en) Single far-field type depth learning wavefront restoration method based on four-quadrant discrete phase modulation
CN114880953A (en) Rapid wavefront recovery method for four-step phase type Fresnel zone plate
CN114964524A (en) Target imaging wavefront phase restoration method based on defocused grating and neural network expansion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210105