CN115561825A - Complex wavefront detection technology based on phase difference method and deep neural network - Google Patents

Complex wavefront detection technology based on phase difference method and deep neural network

Info

Publication number
CN115561825A
CN115561825A (application CN202211213030.1A)
Authority
CN
China
Prior art keywords
phase difference
neural network
wavefront
model
training
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211213030.1A
Other languages
Chinese (zh)
Inventor
刘辉
金振宇
季凯帆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yunnan Astronomical Observatory of CAS
Original Assignee
Yunnan Astronomical Observatory of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yunnan Astronomical Observatory of CAS filed Critical Yunnan Astronomical Observatory of CAS
Priority to CN202211213030.1A priority Critical patent/CN115561825A/en
Publication of CN115561825A publication Critical patent/CN115561825A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01VGEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V8/00Prospecting or detecting by optical means
    • G01V8/10Detecting, e.g. by using light barriers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Geophysics (AREA)
  • Data Mining & Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Testing Of Optical Devices Or Fibers (AREA)

Abstract

The invention discloses a complex wavefront detection technology based on a phase difference (PD) method and a deep neural network, comprising the following steps: collecting image information, simulating the wavefront phase difference, training a neural network computation model, and model prediction. The advantage of the invention is that the forward parallel computation of the neural network model replaces the nonlinear optimization computation of the existing PD method; that is, a simple forward (one-way) computation replaces the original complex iterative computation, yielding a general-purpose wavefront detection algorithm model. This one-way computation reduces the solution time of the nonlinear optimization iteration from the order of minutes to the order of milliseconds, satisfies the requirement of quasi-real-time detection of complex dynamic wavefronts, and greatly expands the fields of application of the PD method.

Description

Complex wavefront detection technology based on phase difference method and deep neural network
Technical Field
The invention relates to the technical field of accurate wavefront phase detection, and in particular to a complex wavefront detection technology based on a phase difference method and a deep neural network.
Background
Accurate detection of the wavefront phase is a key technique for adaptive optics and high-resolution imaging. In 1979, Gonsalves proposed phase diversity (PD), which adds, in addition to the focal-plane image, an imaging channel with a known aberration (e.g. defocus) and uses the two (or more) images to remove the conjugate ambiguity of phase retrieval, thereby reconstructing the wavefront phase information. The PD method has a simple optical structure and low cost, is suitable for point targets, extended targets, coherent light, incoherent light and other application scenarios, and is widely used in high-resolution optical imaging.
At present, the core algorithm of phase difference wavefront detection is the optimization of a nonlinear cost function, which faces the following main problems:
1. a regularization term must be constructed from prior information about the wavefront phase to avoid falling into a local optimum;
2. the iterative computation of the nonlinear optimization is time-consuming.
The PD method as currently used therefore cannot meet the requirement of real-time detection of a dynamically changing wavefront, and the present invention has been made in view of these problems.
Disclosure of Invention
To solve the above problems, the invention provides a complex wavefront detection technology based on a phase difference method and a deep neural network, which overcomes the shortcomings of the prior art.
The technical scheme of the invention for achieving this aim is as follows. The complex wavefront detection technology based on the phase difference method and the deep neural network comprises the following steps: collecting image information, simulating the wavefront phase difference, training a neural network computation model, and model prediction;
collecting image information: collecting focal plane and out-of-focus image information of the same imaging target;
wavefront phase difference simulation: simulating wavefront phase difference samples based on a Kolmogorov atmospheric turbulence model, calculating the point spread functions (PSFs) of the focal plane and the defocused plane from each wavefront phase difference sample, generating the focal-plane and defocused images of the corresponding sample from the PSFs, and forming a neural-network training sample pair from the focal-plane and defocused images and the corresponding wavefront phase difference;
training the neural network computation model: first performing a data transformation, then inputting the transformed data into a hybrid neural network for training;
data transformation: fourier transforming the focal image and the out-of-focus image to obtain frequency domain representation thereof: i is f And I d (ii) a By I f And I d Is calculated to obtain F 1 ,F 2 And F 3
hybrid network training: taking the real part Re[F_1] and imaginary part Im[F_1] of F_1, together with F_2 and F_3, as input, and the Zernike-polynomial fitting result of the wavefront phase difference as output, and training the hybrid neural network;
model prediction: acquiring focal-plane and defocused images, applying the data transformation, and inputting the result into the hybrid neural network to obtain the Zernike-polynomial fitting result of the wavefront phase difference as the prediction.
The point spread functions (PSFs) of the focal plane and the defocused plane used in the wavefront phase difference simulation are calculated from the phase difference method imaging optical path.
The number of training samples in the wavefront phase difference simulation is not less than 20,000.
In the data transformation performed when training the neural network computation model, F_1, F_2 and F_3 are calculated according to formulas given as images in the original filing; in these formulas, '*' denotes taking the complex conjugate and '| |' denotes the modulus operation.
The complex wavefront detection technology based on the phase difference method and the deep neural network produced by the above technical scheme trains the neural network computation model with simulated wavefront phase samples, so that the model learns a general mapping from focal-plane and defocused images to the wavefront phase information. An approximation of the dynamic complex wavefront phase is then obtained from this computation model in real time. The main points are as follows.
1. The forward parallel computation of the neural network model replaces the nonlinear optimization computation of the existing PD method; that is, a simple forward (one-way) computation replaces the original complex iterative computation, yielding a general-purpose wavefront detection algorithm model.
2. The one-way computation of the neural network model reduces the solution time of the nonlinear optimization iteration from the order of minutes to the order of milliseconds, satisfies the requirement of quasi-real-time detection of complex dynamic wavefronts, and greatly expands the fields of application of the PD method.
Drawings
Fig. 1 is the phase difference method (PD) imaging optical path diagram of the complex wavefront detection technology based on the phase difference method and the deep neural network according to the present invention.
Fig. 2 is a flowchart of PD wavefront detection with the deep neural network in the complex wavefront detection technology based on the phase difference method and the deep neural network according to the present invention.
Detailed Description
The invention is described in detail below with reference to the following drawings:
the invention discloses a novel method for predicting complex wavefront phase difference (such as earth atmospheric turbulence phase) by using focal plane and defocusing image information of the same imaging target through a depth neural network.
The method trains a neural network computation model on simulated wavefront phase samples, so that the model learns a general mapping from focal-plane and defocused images to the wavefront phase information. An approximation of the dynamic complex wavefront phase is then obtained from this computation model in real time. The method mainly comprises the following three steps:
1. wavefront phase difference simulation
1.1 generating simulated wavefront phase difference samples (generally not less than 20,000 samples) based on a Kolmogorov atmospheric turbulence model;
1.2 calculating the point spread functions (PSFs) of the focal plane and the defocused plane from the wavefront phase difference samples according to the principle of Fig. 1;
1.3 generating the focal-plane and defocused images of the corresponding wavefront phase difference sample from the PSFs;
1.4 the focal-plane and defocused images, together with the corresponding wavefront phase difference, form a training sample pair for the neural network (a minimal simulation sketch is given after this list).
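This simulation step can be illustrated with a minimal Python sketch. It assumes an FFT-based Kolmogorov phase-screen generator, a circular pupil, a quadratic defocus term for the diversity channel, and illustrative values for the grid size, Fried parameter and defocus amplitude; none of these particulars are specified in the filing.

    import numpy as np

    N = 128          # pupil grid size (assumption)
    r0_pix = 20.0    # Fried parameter in pixels (assumption)

    def kolmogorov_phase_screen(n, r0, rng):
        """Random phase screen with a Kolmogorov power spectrum (FFT filtering method)."""
        fx = np.fft.fftfreq(n)
        kx, ky = np.meshgrid(fx, fx)
        k = np.sqrt(kx**2 + ky**2)
        k[0, 0] = 1.0                              # avoid division by zero at the piston frequency
        psd = 0.023 * r0**(-5.0 / 3.0) * k**(-11.0 / 3.0)
        psd[0, 0] = 0.0                            # remove piston
        noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        screen = np.fft.ifft2(noise * np.sqrt(psd)) * n
        return screen.real                          # phase in radians

    def psf_from_phase(phase, pupil):
        """Point spread function |FFT{pupil * exp(i*phase)}|^2, normalized to unit sum."""
        field = pupil * np.exp(1j * phase)
        psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
        return psf / psf.sum()

    rng = np.random.default_rng(0)
    yy, xx = np.mgrid[-N//2:N//2, -N//2:N//2]
    rho = np.sqrt(xx**2 + yy**2) / (N // 2)
    pupil = (rho <= 1.0).astype(float)

    # known diversity aberration: a fixed defocus phase (the 3-rad peak is an assumption)
    defocus = 3.0 * (2.0 * rho**2 - 1.0) * pupil

    phase = kolmogorov_phase_screen(N, r0_pix, rng) * pupil   # 1.1 simulated wavefront sample
    psf_focal = psf_from_phase(phase, pupil)                   # 1.2 focal-plane PSF
    psf_defoc = psf_from_phase(phase + defocus, pupil)         # 1.2 defocused-plane PSF

    # 1.3 focal / defocused images of an extended object: convolution via the FFT
    obj = rng.random((N, N))                                    # placeholder extended target
    img_focal = np.real(np.fft.ifft2(np.fft.fft2(obj) * np.fft.fft2(np.fft.ifftshift(psf_focal))))
    img_defoc = np.real(np.fft.ifft2(np.fft.fft2(obj) * np.fft.fft2(np.fft.ifftshift(psf_defoc))))
    # 1.4 (img_focal, img_defoc) plus the Zernike fit of `phase` form one training pair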
2. Model training
2.1 data transformation:
The focal-plane image and the defocused image are Fourier transformed to obtain their frequency-domain representations I_f and I_d; F_1, F_2 and F_3 are then computed from I_f and I_d according to formulas given as images in the original filing, where '*' denotes taking the complex conjugate and '| |' denotes the modulus operation (a sketch of this transformation follows);
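The data transformation of step 2.1 can be sketched as below. The exact expressions for F_1, F_2 and F_3 appear only as formula images in the filing; the cross term and modulus terms used here are assumptions consistent with the stated use of the complex conjugate and the modulus, not the patented formulas themselves.

    import numpy as np

    def pd_features(img_focal, img_defocus, eps=1e-8):
        I_f = np.fft.fft2(img_focal)          # frequency-domain representation of the focal image
        I_d = np.fft.fft2(img_defocus)        # frequency-domain representation of the defocused image
        F1 = I_f * np.conj(I_d)               # assumed: complex cross term (uses the conjugate)
        F2 = np.abs(I_f)                      # assumed: modulus of the focal spectrum
        F3 = np.abs(I_d)                      # assumed: modulus of the defocused spectrum
        # 4-channel real-valued input for the hybrid network: Re[F1], Im[F1], F2, F3
        x = np.stack([F1.real, F1.imag, F2, F3], axis=0)
        return x / (np.abs(x).max() + eps)    # simple normalization (assumption)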
2.2 The real part Re[F_1] and imaginary part Im[F_1] of F_1, together with F_2 and F_3, are taken as input, and the Zernike-polynomial fitting result of the wavefront phase difference as output, to train the hybrid neural network of Fig. 2 (a training sketch follows).
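A minimal training sketch for step 2.2, assuming PyTorch, is given below. The actual hybrid architecture is shown only in Fig. 2 of the filing; the small convolutional network with a fully connected head, the 35 Zernike coefficients, and the optimizer settings are illustrative stand-ins.

    import torch
    import torch.nn as nn

    class HybridNet(nn.Module):
        def __init__(self, n_zernike=35):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(4, 32, 3, stride=2, padding=1), nn.ReLU(),   # input: Re[F1], Im[F1], F2, F3
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(4),
            )
            self.head = nn.Sequential(
                nn.Flatten(),
                nn.Linear(128 * 4 * 4, 256), nn.ReLU(),
                nn.Linear(256, n_zernike),                              # output: Zernike coefficients
            )

        def forward(self, x):
            return self.head(self.features(x))

    def train(model, loader, epochs=50, lr=1e-3):
        """loader yields (x, z): 4-channel transformed inputs and Zernike coefficient labels."""
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = nn.MSELoss()
        for _ in range(epochs):
            for x, z in loader:
                opt.zero_grad()
                loss = loss_fn(model(x), z)
                loss.backward()
                opt.step()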
3. Model prediction
The optical system shown in Fig. 1 collects the focal-plane and defocused images; after the data transformation of step 2.1 they are input into the hybrid neural network, and the Zernike-polynomial fitting result of the wavefront phase difference is obtained as the estimate (a prediction sketch follows).
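The prediction step can be sketched as below, reusing pd_features from the data-transformation sketch above; the zernike_basis argument (a stack of Zernike mode maps over the pupil) is a hypothetical helper introduced only for illustration. The single forward pass corresponds to the millisecond-level, one-way computation described earlier.

    import numpy as np
    import torch

    def predict_wavefront(model, img_focal, img_defocus, zernike_basis):
        x = torch.from_numpy(pd_features(img_focal, img_defocus)).float().unsqueeze(0)
        with torch.no_grad():
            coeffs = model(x).squeeze(0).numpy()          # Zernike coefficients of the phase difference
        return np.tensordot(coeffs, zernike_basis, 1)     # wavefront map = sum_i a_i * Z_i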
A comparison of results before and after the improvement is given in Table 1, which appears as an image in the original filing.
According to the complex wavefront detection technology based on the phase difference method and the deep neural network, the neural network computation model is trained with simulated wavefront phase samples, so that the model learns a general mapping from focal-plane and defocused images to the wavefront phase information, and an approximation of the dynamic complex wavefront phase is obtained from this computation model in real time. The forward parallel computation of the neural network model replaces the nonlinear optimization computation of the existing PD method; that is, a simple forward (one-way) computation replaces the original complex iterative computation, yielding a general-purpose wavefront detection algorithm model. This one-way computation reduces the solution time of the nonlinear optimization iteration from the order of minutes to the order of milliseconds, satisfies the requirement of quasi-real-time detection of complex dynamic wavefronts, and greatly expands the fields of application of the PD method.
The technical solutions described above represent only the preferred embodiments of the present invention; modifications that those skilled in the art may make to parts of these solutions, while still embodying the principles of the present invention, fall within the scope of protection of the present invention.

Claims (5)

1. The complex wavefront detection technology based on the phase difference method and the deep neural network, characterized by comprising the following steps: collecting image information, simulating the wavefront phase difference, training a neural network computation model, and model prediction;
collecting image information: collecting focal plane and out-of-focus image information of the same imaging target;
wavefront phase difference simulation: simulating wavefront phase difference samples based on a Kolmogorov atmospheric turbulence model, calculating the point spread functions (PSFs) of the focal plane and the defocused plane from each wavefront phase difference sample, generating the focal-plane and defocused images of the corresponding sample from the PSFs, and forming a neural-network training sample pair from the focal-plane and defocused images and the corresponding wavefront phase difference;
training the neural network computation model: first performing a data transformation, then inputting the transformed data into a hybrid neural network for training;
data transformation: focal and out-of-focus imagesThe line fourier transform yields their frequency domain representation: i is f And I d (ii) a Through I f And I d Is calculated to obtain F 1 ,F 2 And F 3
hybrid network training: taking the real part Re[F_1] and imaginary part Im[F_1] of F_1, together with F_2 and F_3, as input, and the Zernike-polynomial fitting result of the wavefront phase difference as output, and training the hybrid neural network;
model prediction: acquiring focal-plane and defocused images, applying the data transformation, and inputting the result into the hybrid neural network to obtain the Zernike-polynomial fitting result of the wavefront phase difference as the prediction.
2. The complex wavefront detection technology based on the phase difference method and the deep neural network of claim 1, characterized in that the point spread functions (PSFs) of the focal plane and the defocused plane in the wavefront phase difference simulation are calculated from the phase difference method imaging optical path.
3. The complex wavefront detection technology based on the phase difference method and the deep neural network of claim 2, characterized in that the number of training samples in the wavefront phase difference simulation is not less than 20,000.
4. The complex wavefront detection technology based on the phase difference method and the deep neural network of claim 1, wherein in the data transformation performed when training the neural network computation model, F_1, F_2 and F_3 are calculated according to the formulas given as images in the original filing.
5. The complex wavefront detection technology based on the phase difference method and the deep neural network of claim 4, wherein '*' in the calculation formulas of the data transformation denotes taking the complex conjugate, and '| |' denotes the modulus operation.
CN202211213030.1A 2022-09-29 2022-09-29 Complex wavefront detection technology based on phase difference method and deep neural network Pending CN115561825A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211213030.1A CN115561825A (en) 2022-09-29 2022-09-29 Complex wavefront detection technology based on phase difference method and deep neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211213030.1A CN115561825A (en) 2022-09-29 2022-09-29 Complex wavefront detection technology based on phase difference method and deep neural network

Publications (1)

Publication Number Publication Date
CN115561825A true CN115561825A (en) 2023-01-03

Family

ID=84745699

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211213030.1A Pending CN115561825A (en) 2022-09-29 2022-09-29 Complex wavefront detection technology based on phase difference method and deep neural network

Country Status (1)

Country Link
CN (1) CN115561825A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107843982A (en) * 2017-12-01 2018-03-27 长春理工大学 Based on real-time phase difference technology without Wavefront detecting adaptive optics system
CN107843982B (en) * 2017-12-01 2024-03-08 长春理工大学 Wave front-free detection self-adaptive optical system based on real-time phase difference technology

Similar Documents

Publication Publication Date Title
CN100583144C (en) Multi-frame self-adaption optical image high resolution restoration method using wave front data
WO2022000857A1 (en) Dataset establishment method, vehicle, and storage medium
CN109031654A An adaptive optics correction method and system based on convolutional neural networks
CN115561825A (en) Complex wavefront detection technology based on phase difference method and deep neural network
CN113158487B (en) Wavefront phase difference detection method based on long-short term memory depth network
CN111221123A (en) Wavefront-sensor-free self-adaptive optical correction method based on model
CN111968047A (en) Adaptive optical image blind restoration method based on generating type countermeasure network
Zhang et al. Imaging through the atmosphere using turbulence mitigation transformer
CN106534614A (en) Rapid movement compensation method of moving target detection under mobile camera
CN116343334A (en) Motion recognition method of three-stream self-adaptive graph convolution model fused with joint capture
CN114120263A (en) Image processing apparatus, recording medium, and image processing method
CN114202473A (en) Image restoration method and device based on multi-scale features and attention mechanism
CN106991659A A multi-frame adaptive optics image restoration method adapted to atmospheric turbulence changes
CN116188265A (en) Space variable kernel perception blind super-division reconstruction method based on real degradation
CN115524018A (en) Solving method and system for phase difference wavefront detection
Estrada et al. Multi-frame image fusion using a machine learning-based weight mask predictor for turbulence-induced image degradation
CN115079175A (en) Synthetic aperture radar undersampling imaging method based on SE-Unet
CN114821228A (en) Depth image output model training method, depth image obtaining method and device
Wenjie et al. Research on super-resolution reconstruction algorithm of remote sensing image based on generative adversarial networks
CN113112522A (en) Twin network target tracking method based on deformable convolution and template updating
CN115019125A (en) Self-supervision extended target aberration detection method based on convolutional neural network
Senthilkumar et al. Radial Basis Function Networks for image restoration with stochastic normalizations as Bayesian learning in deep conventional neural network
CN113192146B (en) Method for eliminating defocusing error in grid line projection phase shift technology based on deep learning
CN115311185B (en) High-resolution refocusing method for ISAR defocused image of maneuvering target
CN114332187B (en) Monocular target ranging method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination