CN115222657A - Lens aberration prediction method and device, electronic device and storage medium - Google Patents
- Publication number: CN115222657A (application CN202210605185.3A)
- Authority
- CN
- China
- Prior art keywords
- pictures
- aberration
- visual angles
- neural network
- lens
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T7/00 — Image analysis
- G06T7/0002 — Inspection of images, e.g. flaw detection
- G06T5/00 — Image enhancement or restoration
- G06T5/80 — Geometric correction
- G06T2207/10052 — Images from lightfield camera (indexing scheme: image acquisition modality)
- G06T2207/20084 — Artificial neural networks [ANN] (indexing scheme: special algorithmic details)
- G06T2207/30168 — Image quality inspection (indexing scheme: subject/context of image processing)
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Studio Devices (AREA)
Abstract
The present application relates to the field of image data processing, and in particular to a method and apparatus for predicting the aberration of a lens, an electronic device, and a storage medium. The method includes: acquiring pictures from a plurality of viewing angles, where the pictures are captured by the same imaging system and contain different angular-domain information; inputting the pictures of the plurality of viewing angles into a preset neural network, extracting at least one viewing-angle feature per angle, and predicting the aberration introduced by the optical lens or the shooting environment based on the at least one feature. This avoids the technical problems of the related art, whose physical-optics basis makes the simulated propagation process complex, the computation expensive, and the robustness poor.
Description
Technical Field
The present disclosure relates to the field of image data processing technologies, and in particular, to a method and an apparatus for predicting an aberration of a lens, an electronic device, and a storage medium.
Background
Benefiting from the rapid growth of the semiconductor industry, two-dimensional imaging sensors have revolutionized almost every field, including industrial inspection, mobile devices, autonomous driving, surveillance, medical diagnostics, biology, and astronomy. The pixel count of digital sensors has grown rapidly over the past decade; however, the practical performance of most imaging systems is now bottlenecked by the optics rather than the electronics, since imperfect lenses or environmental disturbances introduce optical aberrations that degrade imaging accuracy.
AO (adaptive optics) achieves active aberration correction through deformable-mirror arrays or spatial light modulators, but current AO systems are typically complex, bulky, and expensive, and cannot be adapted to lightweight systems or portable devices.
Scanning light-field imaging provides a solution for aberration estimation on lightweight systems and portable devices; based on a digital adaptive optics (DAO) algorithm, the related art achieves robust, general, high-performance 3D imaging with a large SBP (space-bandwidth product) at low cost. Specifically, the related art simulates the forward and backward propagation of light physically: it sets an initial aberration value, computes the lens point spread function, recovers a high-resolution picture by deconvolution, re-projects that picture through the point spread function to predict what the sensor should record, compares the prediction with the actual sensor signal, corrects the aberration, and iterates until convergence.
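The iterative loop described above can be illustrated with a minimal sketch. This is not the related art's actual implementation: it stands in a Richardson-Lucy deconvolution (a hypothetical substitute for the deconvolution step) with a toy Gaussian PSF, and it shows why the process is costly — every iteration performs two full convolutions and none can be skipped or parallelized away.

```python
import numpy as np

def fft_convolve(img, psf):
    """Circular convolution via FFT (assumes periodic boundaries)."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))

def fft_correlate(img, psf):
    """Adjoint of the circular convolution (correlation)."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(psf))))

def richardson_lucy(measured, psf, n_iter=50):
    """Iterative deconvolution: each iteration re-blurs the estimate,
    compares it with the measurement, and applies a multiplicative
    correction — the expensive loop the text refers to."""
    estimate = np.full_like(measured, measured.mean())
    for _ in range(n_iter):
        reblurred = fft_convolve(estimate, psf) + 1e-12
        estimate = estimate * fft_correlate(measured / reblurred, psf)
    return estimate

# Toy example: blur a point source with a Gaussian PSF, then deconvolve.
n = 32
yy, xx = np.mgrid[:n, :n]
psf = np.exp(-((xx - n // 2) ** 2 + (yy - n // 2) ** 2) / 8.0)
psf = np.fft.ifftshift(psf / psf.sum())      # center the PSF at (0, 0)
truth = np.zeros((n, n)); truth[16, 16] = 1.0
measured = fft_convolve(truth, psf)
restored = richardson_lucy(measured, psf)
```

After 50 iterations the estimate has sharpened back toward the point source while conserving total flux, at the price of a hundred FFT-based convolutions — for real sensor resolutions this dominates the runtime.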
However, because the related art is grounded in physical optics, the forward and backward propagation is complex, computer simulation is slow, convergence is not guaranteed, and the iterations differ across schemes, giving poor robustness. The deconvolution step is time- and energy-consuming, and because it sits inside the iteration it must be repeated many times and is hard to parallelize. A plain neural network such as a 3D CNN, made to carry the whole process, is overburdened: its prediction of the point spread function or the aberration is unstable or inaccurate, so the reconstruction quality suffers. Improvement is therefore urgently needed.
Disclosure of Invention
The present application is based on the recognition and discovery by the inventors of the following problems:
for a gigapixel sensor, the number of effective pixels of an ordinary imaging system is typically limited to the megapixel level, mainly because optical aberrations caused by imperfect lenses or environmental disturbances spread the light emanating from a single point over a large area of the two-dimensional sensor. Meanwhile, projecting a 3D scene onto a 2D plane discards degrees of freedom of the LF (light field), such as depth and local coherence. Obtaining high-density depth maps with integrated sensors has therefore remained a challenge.
To solve the above problems, optical engineers cascade multiple precision-engineered lenses to correct aberrations sequentially and approximate a perfect imaging system. However, the difficulty of optical design and manufacture grows exponentially with SBP, and the diffraction limit caps the number of usable pixels, so high-performance incoherent imaging systems with a large effective SBP, such as large-aperture telescopes and half mirrors, are typically very expensive and bulky. To alleviate this, superlenses and freeform optics can manufacture optimized lens surfaces with sufficient machining accuracy over large scales.
In addition, image-deblurring algorithms designed for two-dimensional sensors can improve image contrast given an accurate estimate of the PSF (point spread function), and PSF engineering with a coded aperture retains more information by reducing zeros in the frequency domain. However, these methods struggle to recover the high-frequency information lost to a low MTF (modulation transfer function), and they usually require data priors and accurate PSF estimation, especially for spatially non-uniform aberrations. They also remain very sensitive to dynamic environmental aberrations when the DOF (depth of field) is small.
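As a hedged illustration of PSF-based deblurring (a generic Wiener filter, not any specific method from the text; the noise-to-signal ratio is an assumed constant), the sketch below shows why a low MTF is fatal: where the transfer function |H| approaches zero, the filter gain collapses and the corresponding frequencies stay lost.

```python
import numpy as np

def wiener_deconvolve(measured, psf, nsr=1e-3):
    """Frequency-domain deblurring with a known PSF. The gain
    conj(H)/(|H|^2 + nsr) tends to zero wherever the MTF |H| does,
    so frequencies killed by a low MTF cannot be recovered."""
    H = np.fft.fft2(psf)
    gain = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(np.fft.fft2(measured) * gain))

# Toy example: blur a point source with a Gaussian PSF, then deblur.
n = 32
yy, xx = np.mgrid[:n, :n]
psf = np.exp(-((xx - n // 2) ** 2 + (yy - n // 2) ** 2) / 8.0)
psf = np.fft.ifftshift(psf / psf.sum())      # center the PSF at (0, 0)
truth = np.zeros((n, n)); truth[16, 16] = 1.0
measured = np.real(np.fft.ifft2(np.fft.fft2(truth) * np.fft.fft2(psf)))
restored = wiener_deconvolve(measured, psf)
```

The restored point is sharper than the measurement but still band-limited: the Gaussian PSF's MTF decays toward zero at high frequencies, and those components are suppressed by the filter rather than recovered.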
AO achieves active aberration correction through a deformable-mirror array or a spatial light modulator by steering rays emitted from a point at different angles to the same location on the 2D sensor; the aberrated wavefront can be measured with guide stars and a wavefront sensor, or estimated by iterative updates against a chosen evaluation metric. AO has been highly successful in astronomy and microscopy and has enabled major scientific discoveries. However, spatially non-uniform aberrations introduced by the 3D heterogeneity of the refractive index make the effective FOV (field of view) of current AO methods very small; for ground-based telescopes in particular, aberrations caused by atmospheric turbulence limit the corrected field of AO to around 40 arcseconds, which is unsuitable for wide-field survey telescopes. Moreover, current AO systems are generally complex, bulky, and expensive, and cannot be applied to lightweight systems or portable devices.
The present application provides a method and apparatus for predicting the aberration introduced by an optical lens or the shooting environment, an electronic device, and a storage medium, aiming to solve the technical problems of the related art: grounded in physical optics, its propagation simulation is complex, time- and energy-consuming, and lacks robustness, so the reconstruction cannot meet expectations.
An embodiment of the first aspect of the present application provides a method for predicting the aberration of a lens, including the following steps: acquiring pictures from a plurality of viewing angles, where the pictures are captured by the same imaging system and contain different angular-domain information; and inputting the pictures of the plurality of viewing angles into a preset neural network, extracting at least one viewing-angle feature per angle, and predicting the aberration introduced by the optical lens or the shooting environment based on the at least one feature.
Optionally, in an embodiment of the present application, the method further includes: reconstructing an aberration-free high-resolution picture from the pictures of the plurality of viewing angles and the actual point spread function.
Optionally, in an embodiment of the present application, predicting the aberration introduced by the optical lens or the shooting environment based on the at least one viewing-angle feature includes: fitting Zernike polynomials with preset Zernike coefficients to the aberration, generating aberrations at different resolutions.
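The Zernike fitting step can be sketched as follows. The choice of modes and the Noll-style normalization are illustrative assumptions (the patent does not specify them): a coefficient vector weights the first few Zernike polynomials on the unit disk, and evaluating the same polynomials on a finer grid yields the aberration at a higher resolution.

```python
import numpy as np

def zernike_phase(coeffs, n=64):
    """Weighted sum of the first six Zernike polynomials (piston,
    tilt x/y, defocus, oblique/vertical astigmatism), evaluated on
    an n x n grid over the unit disk; zero outside the pupil."""
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    r2 = x ** 2 + y ** 2
    basis = [
        np.ones_like(x),                 # Z1: piston
        2 * x,                           # Z2: tilt x
        2 * y,                           # Z3: tilt y
        np.sqrt(3) * (2 * r2 - 1),       # Z4: defocus
        2 * np.sqrt(6) * x * y,          # Z5: oblique astigmatism
        np.sqrt(6) * (x ** 2 - y ** 2),  # Z6: vertical astigmatism
    ]
    phase = sum(c * b for c, b in zip(coeffs, basis))
    return np.where(r2 <= 1.0, phase, 0.0)

# A pure-defocus aberration at two resolutions from the same coefficients.
low_res = zernike_phase([0.0, 0.0, 0.0, 1.0, 0.0, 0.0], n=64)
high_res = zernike_phase([0.0, 0.0, 0.0, 1.0, 0.0, 0.0], n=128)
```

The coefficient vector is resolution-independent, which is exactly what makes this representation convenient for generating aberration maps at whatever resolution the reconstruction needs.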
Optionally, in an embodiment of the present application, acquiring the pictures of the multiple viewing angles includes: scanning data of the light-field multi-view camera to obtain a plurality of data pictures; and rearranging the data pictures to obtain the pictures of the multiple viewing angles.
Optionally, in an embodiment of the present application, the preset neural network is a ResNet3D neural network.
In a second aspect, an embodiment of the present application provides an apparatus for predicting the aberration of a lens, including: an acquisition module configured to acquire pictures from a plurality of viewing angles, where the pictures are captured by the same imaging system and contain different angular-domain information; and a prediction module configured to input the pictures of the plurality of viewing angles into a preset neural network, extract at least one viewing-angle feature per angle, and predict the aberration introduced by the optical lens or the shooting environment based on the at least one feature.
Optionally, in an embodiment of the present application, the apparatus further includes: a reconstruction module configured to reconstruct an aberration-free high-resolution picture from the pictures of the plurality of viewing angles and the actual point spread function.
Optionally, in an embodiment of the present application, the prediction module includes: a fitting unit configured to fit Zernike polynomials with preset Zernike coefficients to the aberration, generating aberrations at different resolutions.
Optionally, in an embodiment of the present application, the acquisition module includes: a scanning unit configured to scan data of the light-field multi-view camera to obtain a plurality of data pictures; and a processing unit configured to rearrange the data pictures into the pictures of the multiple viewing angles.
Optionally, in an embodiment of the present application, the preset residual neural network is a ResNet3D neural network.
An embodiment of a third aspect of the present application provides an electronic device, including: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor executing the program to implement the aberration prediction method of the lens as described in the above embodiments.
A fourth aspect of the present application provides a computer-readable storage medium storing computer instructions for causing a computer to execute the method for predicting the aberration of a lens according to the foregoing embodiment.
According to embodiments of the present application, a residual neural network predicts the lens aberration from the multi-view pictures, and aberration fitting yields a higher-resolution aberration; picture reconstruction is thus fast, iteration-free, and light on memory while preserving parallax-estimation accuracy, and the system is robust. This solves the technical problems of the related art, whose physical-optics basis makes the simulated propagation process complex, the computation expensive, and the robustness poor, so that the reconstruction cannot meet expectations.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a flowchart illustrating an aberration prediction method for a lens according to an embodiment of the present disclosure;
FIG. 2 is a flowchart illustrating a method for aberration prediction of a lens according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an aberration prediction apparatus for a lens according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application and should not be construed as limiting the present application.
A method, an apparatus, an electronic device, and a storage medium for predicting the aberration of a lens according to embodiments of the present application are described below with reference to the drawings. The method is proposed to solve the technical problems noted in the Background: the related art, being grounded in physical optics, has a complex simulated propagation process, high computational complexity, and poor robustness, so its reconstruction is time- and energy-consuming and cannot meet expectations.
Specifically, fig. 1 is a flowchart illustrating a method for predicting aberration of a lens according to an embodiment of the present disclosure.
As shown in fig. 1, the aberration prediction method of the lens includes the following steps:
in step S101, pictures of multiple viewing angles are acquired, wherein the pictures of multiple viewing angles are acquired by the same imaging system and include different angle domain information.
It can be understood that optical aberration caused by the lens or environmental interference is the main reason an imaging device's effective pixel count is low; for example, projecting a 3D scene onto a 2D plane under the same light source loses degrees of freedom of the LF, such as depth and local coherence.
Optionally, in an embodiment of the present application, acquiring pictures from multiple viewing angles includes: scanning data of the light-field multi-view camera to obtain a plurality of data pictures; and rearranging the data pictures to obtain pictures of the multiple viewing angles.
In actual execution, embodiments of the present application scan data from the light-field multi-view camera to obtain a plurality of data pictures, rearrange them into pictures of the multiple viewing angles, and then splice the multi-view pictures into a two-dimensional image or stack them into a three-dimensional stack.
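A minimal sketch of the rearrangement, under the assumption that the raw frame interleaves the viewing angles in repeating u × v pixel blocks (real scanning light-field layouts may differ): one function extracts the per-view stack, another splices the views into a single two-dimensional mosaic, mirroring the two representations mentioned above.

```python
import numpy as np

def raw_to_views(raw, u, v):
    """Rearrange a raw frame with u x v interleaved angular samples
    into a 3D stack of u*v single-view pictures."""
    H, W = raw.shape
    h, w = H // u, W // v
    views = raw.reshape(h, u, w, v).transpose(1, 3, 0, 2)  # (u, v, h, w)
    return views.reshape(u * v, h, w)

def views_to_mosaic(views, u, v):
    """Splice the view stack into one 2D image, views side by side."""
    _, h, w = views.shape
    return views.reshape(u, v, h, w).transpose(0, 2, 1, 3).reshape(u * h, v * w)

# Toy raw frame: every angular offset carries a distinct constant value,
# so each extracted view must be constant.
u = v = 2
yy, xx = np.mgrid[:6, :6]
raw = (yy % u) * v + (xx % v)
views = raw_to_views(raw, u, v)
mosaic = views_to_mosaic(views, u, v)
```

Either representation (3D stack or 2D mosaic) can then be fed to the network; the stack keeps the angular dimension explicit, which suits a 3D architecture.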
In step S102, the pictures of multiple view angles are input into a preset neural network, at least one view angle feature of each view angle is extracted, and an aberration caused by the optical lens or the shooting environment is predicted based on the at least one view angle feature.
It can be understood that existing neural networks suffer from gradient explosion, gradient vanishing, and similar problems as network depth increases: training becomes harder and harder, the network is overburdened when made to carry the whole process, its prediction of the point spread function or the aberration is unstable, and the reconstruction quality of the picture cannot be guaranteed.
Therefore, in embodiments of the present application, the multi-view images obtained by rearranging the pictures of the multiple viewing angles can be spliced into a two-dimensional image or stacked into a three-dimensional stack and then input to a preset residual neural network, which extracts at least one feature per viewing angle and predicts the corresponding aberration of the camera.
Optionally, in an embodiment of the present application, predicting the aberration introduced by the optical lens or the shooting environment based on the at least one viewing-angle feature includes: fitting Zernike polynomials with preset Zernike coefficients to the aberration, generating aberrations at different resolutions.
Optionally, in an embodiment of the present application, the preset residual neural network is a ResNet3D neural network.
In actual implementation, the preset residual neural network of the embodiments of the present application may be a ResNet3D network such as ResNet34, used here to predict the planar gradient of the aberration. Residual networks are ordinarily better suited to two-dimensional images, since 3D models are usually represented as grid data lacking a regular, hierarchical structure; a ResNet3D network extends residual processing to three dimensions, effectively exploiting the correlation between the scanned light-field multi-view pictures and aiding the subsequent reconstruction of high-quality images.
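The residual principle this relies on can be sketched with toy 1×1×1 "convolutions" on a (channels, depth, height, width) volume. This is only an illustration of the skip connection, not the patent's network: a real ResNet3D block uses 3×3×3 kernels, batch normalization, and learned weights, none of which appear here.

```python
import numpy as np

def channel_mix(x, w):
    """Toy 1x1x1 3D 'convolution': mix channels at every voxel.
    x has shape (C, D, H, W); w has shape (C_out, C)."""
    return np.tensordot(w, x, axes=([1], [0]))

def residual_block(x, w1, w2):
    """Basic residual block: two convolutions plus the identity
    shortcut, so deep stacks can still pass gradients through
    the skip path instead of vanishing or exploding."""
    h = np.maximum(channel_mix(x, w1), 0.0)          # conv + ReLU
    return np.maximum(channel_mix(h, w2) + x, 0.0)   # skip connection

# With identity/zero weights the block reduces to the identity map,
# showing why residual layers are easy to optimize: doing nothing
# is the default, and the convolutions only learn the correction.
x = np.ones((2, 3, 4, 4))
out = residual_block(x, np.eye(2), np.zeros((2, 2)))
```

The skip connection is what lets a deep 3D network stay trainable on the stacked multi-view volume, addressing the gradient problems mentioned above.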
Further, according to the embodiment of the application, the image reconstruction can be realized according to the predicted aberration.
Optionally, in an embodiment of the present application, the method further includes: reconstructing the camera's original picture from the pictures of the plurality of viewing angles and the actual point spread function.
In actual execution, embodiments of the present application compute the actual point spread function from the ideal point spread function and the predicted aberration picture, then deconvolve with the actual point spread function to obtain the picture.
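A hedged Fourier-optics sketch of how an actual PSF can be computed from a predicted aberration: the incoherent PSF is the squared magnitude of the Fourier transform of the pupil function carrying the aberration phase. The grid size, pupil radius, and defocus example are illustrative assumptions, not values from the patent.

```python
import numpy as np

def psf_from_aberration(phase, pupil):
    """Incoherent PSF = |FFT(pupil * exp(i * phase))|^2, normalized
    to unit energy. Zero phase gives the diffraction-limited PSF."""
    field = pupil * np.exp(1j * phase)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2
    return psf / psf.sum()

# Circular aperture; compare the ideal PSF with a defocused one.
n = 64
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
r2 = x ** 2 + y ** 2
pupil = (r2 <= 0.25).astype(float)
psf_ideal = psf_from_aberration(np.zeros((n, n)), pupil)
psf_defocus = psf_from_aberration(3.0 * (2 * r2 - 1), pupil)
```

Aberration lowers and spreads the PSF peak (the Strehl ratio drops), which is exactly the effect the predicted aberration lets the deconvolution step compensate for.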
The working principle of the aberration prediction method of the lens according to the embodiment of the present application is described in detail with reference to fig. 2.
As shown in fig. 2, the embodiment of the present application may include the following steps:
step S201: a scanning light field camera sensor collects data.
Step S202: rearrange the multi-view pictures and splice them into a two-dimensional image or stack them into a three-dimensional stack. It can be understood that optical aberration caused by the lens or environmental interference is the main reason an imaging device's effective pixel count is low, and projecting a 3D scene onto a 2D plane under the same light source loses degrees of freedom of the LF, such as depth and local coherence.
Step S203: predict the aberration plane gradient with ResNet3D. The pictures of the multiple viewing angles are input to a residual neural network such as ResNet3D, which extracts the features of each viewing angle and predicts the aberration picture.
Step S204: calculate the actual point spread function. From the ideal point spread function and the predicted aberration picture, the actual point spread function is computed; deconvolution with the actual point spread function then yields the picture.
Step S205: calculate the actual picture. Using the actual point spread function and the deconvolved multi-view pictures, the actual picture is generated, achieving high-precision picture reconstruction.
According to the lens-aberration prediction method of the embodiments of the present application, a residual neural network predicts the lens aberration from the multi-view images, achieving fast, iteration-free, low-memory picture reconstruction while preserving parallax-estimation accuracy and system robustness. This solves the technical problems of the related art, whose physical-optics basis makes the simulated propagation process complex, the computation expensive, and the robustness poor, so that the reconstruction cannot meet expectations.
Next, an aberration prediction apparatus of a lens proposed according to an embodiment of the present application is described with reference to the drawings.
Fig. 3 is a block diagram illustrating an aberration prediction apparatus of a lens according to an embodiment of the present application.
As shown in fig. 3, the aberration predicting apparatus 10 of the lens includes: an acquisition module 100 and a prediction module 200.
Specifically, the acquiring module 100 is configured to acquire pictures of multiple viewing angles, where the pictures of the multiple viewing angles are acquired by the same imaging system and include different angle domain information.
The prediction module 200 is configured to input pictures of multiple view angles into a preset neural network, extract at least one view angle feature of each view angle, and predict an aberration caused by an optical lens or a shooting environment based on the at least one view angle feature.
Optionally, in an embodiment of the present application, the aberration predicting apparatus 10 for a lens further includes: and a reconstruction module.
The reconstruction module is used for reconstructing a high-resolution picture without aberration influence according to pictures of a plurality of visual angles and an actual point spread function.
Optionally, in an embodiment of the present application, the prediction module 200 includes: and a fitting unit.
The fitting unit is used for fitting the Zernike polynomials by using the aberration and preset Zernike polynomial coefficients to generate the aberrations with different resolutions.
Optionally, in an embodiment of the present application, the obtaining module 100 includes: a scanning unit and a processing unit.
The scanning unit is used for scanning data of the light field multi-view camera to obtain a plurality of data pictures.
And the processing unit is used for rearranging the plurality of data pictures to obtain pictures of a plurality of visual angles.
Optionally, in an embodiment of the present application, the preset residual neural network is a ResNet3D neural network.
It should be noted that the above explanation of the embodiment of the aberration prediction method for a lens is also applicable to the aberration prediction apparatus for a lens in this embodiment, and is not repeated herein.
According to the lens-aberration prediction apparatus of the embodiments of the present application, a residual neural network predicts the lens aberration from the multi-view images, achieving fast, iteration-free, low-memory picture reconstruction while preserving parallax-estimation accuracy and system robustness. This solves the technical problems of the related art, whose physical-optics basis makes the simulated propagation process complex, the computation expensive, and the robustness poor, so that the reconstruction cannot meet expectations.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device may include:
The processor 402 implements the aberration prediction method of the lens provided in the above-described embodiment when executing the program.
Further, the electronic device further includes:
a communication interface 403 for communication between the memory 401 and the processor 402.
A memory 401 for storing computer programs operable on the processor 402.
If the memory 401, the processor 402 and the communication interface 403 are implemented independently, the communication interface 403, the memory 401 and the processor 402 may be connected to each other through a bus and perform communication with each other. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 4, but this does not indicate only one bus or one type of bus.
Alternatively, in practical implementation, if the memory 401, the processor 402 and the communication interface 403 are integrated on a chip, the memory 401, the processor 402 and the communication interface 403 may complete communication with each other through an internal interface.
The present embodiment also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the aberration prediction method of a lens as above.
In the description herein, reference to "one embodiment", "some embodiments", "an example", "a specific example", or "some examples" means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, such phrases do not necessarily refer to the same embodiment or example, and the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or N embodiments or examples; those skilled in the art may combine the embodiments or examples described in this specification, and their features, as long as they do not contradict each other.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features referred to. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "N" means at least two, e.g., two or three, unless specifically limited otherwise.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or N executable instructions for implementing the steps of a custom logic function or process. The scope of the preferred embodiments of the present application includes alternate implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order depending on the functionality involved, as would be understood by those skilled in the art to which the embodiments of the present application pertain.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a processor-containing system, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include the following: an electrical connection (electronic device) having one or N wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM or flash memory), an optical fiber device, and a portable Compact Disc Read-Only Memory (CD-ROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the N steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be implemented by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, performs one or a combination of the steps of the method embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware, or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. While embodiments of the present application have been shown and described above, it will be understood that the above embodiments are exemplary and should not be construed as limiting the present application and that changes, modifications, substitutions and alterations in the above embodiments may be made by those of ordinary skill in the art within the scope of the present application.
Claims (11)
1. A method for predicting aberration of a lens, comprising the steps of:
acquiring pictures of a plurality of visual angles, wherein the pictures of the plurality of visual angles are acquired by the same imaging system and comprise different angle domain information;
and inputting the pictures of the plurality of visual angles into a preset neural network, extracting at least one visual angle feature of each visual angle, and predicting aberration caused by the optical lens or the shooting environment based on the at least one visual angle feature.
2. The method of claim 1, further comprising:
and reconstructing a high-resolution picture without aberration influence according to the pictures of the plurality of visual angles and the actual point spread function.
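The patent does not specify how the aberration-free high-resolution picture is reconstructed from the views and the actual point spread function. A common choice for PSF-based restoration is Wiener deconvolution; the sketch below applies it to a single view, assuming a known, shift-invariant PSF (the function name and the `snr` parameter are illustrative, not from the patent).

```python
import numpy as np

def wiener_deconvolve(image, psf, snr=100.0):
    """Deconvolve a single view with a known point spread function.

    A minimal Wiener-filter sketch; the patent leaves the reconstruction
    algorithm unspecified, so this only illustrates one common option.
    """
    # Zero-pad the PSF to the image size and shift its centre to the origin.
    psf_pad = np.zeros_like(image, dtype=float)
    kh, kw = psf.shape
    psf_pad[:kh, :kw] = psf
    psf_pad = np.roll(psf_pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))

    H = np.fft.fft2(psf_pad)
    G = np.fft.fft2(image)
    # Wiener filter: F = H* G / (|H|^2 + 1/SNR)
    F = np.conj(H) * G / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft2(F))

# Deconvolving with a delta-function PSF should return the view unchanged.
img = np.random.default_rng(0).random((32, 32))
psf = np.zeros((5, 5))
psf[2, 2] = 1.0
restored = wiener_deconvolve(img, psf, snr=1e6)
```

With a real (blurred) view, `psf` would come from the per-view PSF calibration of the imaging system rather than a delta function.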
3. The method of claim 1, wherein predicting aberration caused by the optical lens or the shooting environment based on the at least one visual angle feature comprises:
and fitting Zernike polynomials using the aberration and preset Zernike polynomial coefficients to generate aberrations at different resolutions.
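The fitting step in claim 3 can be pictured as evaluating a wavefront aberration map from preset Zernike coefficients on pupil grids of different resolutions. The sketch below uses only the first four modes (piston, x-tilt, y-tilt, defocus); the patent does not fix which terms the preset coefficients cover, so this basis is an illustrative assumption.

```python
import numpy as np

def zernike_wavefront(coeffs, resolution):
    """Evaluate a wavefront map from the first few Zernike terms.

    `coeffs` holds coefficients for (piston, tilt-x, tilt-y, defocus),
    a hypothetical subset chosen only for illustration.
    """
    y, x = np.mgrid[-1:1:1j * resolution, -1:1:1j * resolution]
    r2 = x ** 2 + y ** 2
    # First Zernike modes on the unit disc (unnormalised).
    basis = [np.ones_like(x), x, y, 2.0 * r2 - 1.0]
    wavefront = sum(c * b for c, b in zip(coeffs, basis))
    wavefront[r2 > 1.0] = 0.0  # restrict the aberration to the unit pupil
    return wavefront

# The same coefficients rendered at two resolutions, as claim 3 suggests.
low = zernike_wavefront([0.0, 0.1, -0.2, 0.5], 32)
high = zernike_wavefront([0.0, 0.1, -0.2, 0.5], 256)
```

Because the coefficients, not the sampled map, carry the aberration, a single predicted coefficient vector can be re-rendered at whatever resolution a downstream correction step needs.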
4. The method of claim 1, wherein the acquiring the pictures of the plurality of visual angles comprises:
scanning data of the light field multi-view camera to obtain a plurality of data pictures;
and rearranging the data pictures to obtain the pictures of the visual angles.
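The rearrangement in claim 4 can be sketched as regrouping the raw light-field picture so that pixels at the same position under every microlens form one view. The layout below assumes an `n_views` x `n_views` pixel block per microlens, a simplification; real light-field cameras also require decoding and calibration steps the patent leaves to the imaging system.

```python
import numpy as np

def rearrange_to_views(raw, n_views):
    """Rearrange a light-field raw picture into per-view pictures.

    Assumes an `n_views` x `n_views` pixel block under each microlens,
    so pixel (u, v) inside every block belongs to view (u, v).
    """
    h, w = raw.shape
    assert h % n_views == 0 and w % n_views == 0
    # Split into microlens blocks, then gather same-offset pixels per view.
    blocks = raw.reshape(h // n_views, n_views, w // n_views, n_views)
    views = blocks.transpose(1, 3, 0, 2)  # (u, v, rows, cols)
    return views

# A 6x6 raw picture with 2x2 views yields four 3x3 view pictures.
raw = np.arange(36).reshape(6, 6)
views = rearrange_to_views(raw, 2)
```

The resulting `(u, v, rows, cols)` stack is the multi-view input the claimed neural network consumes, with the angular domain information carried along the first two axes.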
5. The method of any one of claims 1-4, wherein the preset neural network is a ResNet3D neural network.
6. An aberration prediction apparatus for a lens, comprising:
an acquisition module configured to acquire pictures of a plurality of visual angles, wherein the pictures of the plurality of visual angles are acquired by the same imaging system and comprise different angle domain information;
and a prediction module configured to input the pictures of the plurality of visual angles into a preset neural network, extract at least one visual angle feature of each visual angle, and predict aberration caused by the optical lens or the shooting environment based on the at least one visual angle feature.
7. The apparatus of claim 6, further comprising:
and a reconstruction module configured to reconstruct a high-resolution picture without aberration influence according to the pictures of the plurality of visual angles and the actual point spread function.
8. The apparatus of claim 6, wherein the acquisition module comprises:
a scanning unit configured to perform data scanning on the light field multi-view camera to obtain a plurality of data pictures;
and a processing unit configured to rearrange the plurality of data pictures to obtain the pictures of the plurality of visual angles.
9. The apparatus according to any one of claims 6-8, wherein the preset neural network is a ResNet3D neural network.
10. An electronic device, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor executes the program to implement the lens aberration prediction method according to any one of claims 1 to 5.
11. A computer-readable storage medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the lens aberration prediction method according to any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210605185.3A CN115222657A (en) | 2022-05-30 | 2022-05-30 | Lens aberration prediction method and device, electronic device and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115222657A true CN115222657A (en) | 2022-10-21 |
Family
ID=83608606
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210605185.3A Pending CN115222657A (en) | 2022-05-30 | 2022-05-30 | Lens aberration prediction method and device, electronic device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115222657A (en) |
- 2022-05-30: Application CN202210605185.3A filed (publication CN115222657A, status Pending)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11861813B2 (en) | Image distortion correction method and apparatus | |
JP4782899B2 (en) | Parallax detection device, distance measuring device, and parallax detection method | |
EP3522522A1 (en) | Methods and apparatus for superpixel modulation | |
KR100914211B1 (en) | Distorted image correction apparatus and method | |
JPWO2017195801A1 (en) | Calibration apparatus, calibration method, optical apparatus, photographing apparatus, projection apparatus, measurement system, and measurement method | |
CN113256741B (en) | Lens calibration method and device for scanning light field imaging system | |
JP2021503133A (en) | Systems and methods for exogenous calibration of cameras and diffractive optics | |
JP2004328506A (en) | Imaging apparatus and image recovery method | |
KR20090004428A (en) | A method and a system for optical design and an imaging device using an optical element with optical aberrations | |
CN114494258B (en) | Lens aberration prediction and image reconstruction method and device | |
WO2011137140A1 (en) | Range measurement using a coded aperture | |
JP2017208641A (en) | Imaging device using compression sensing, imaging method, and imaging program | |
CN115086550B (en) | Meta imaging system | |
CN109859313B (en) | 3D point cloud data acquisition method and device, and 3D data generation method and system | |
JP2006094468A (en) | Imaging device and imaging method | |
CN115222657A (en) | Lens aberration prediction method and device, electronic device and storage medium | |
WO2024053046A1 (en) | Information processing system, endoscope system, trained model, information storage medium, and information processing method | |
EP3766046A1 (en) | Camera calibration and/or use of a calibrated camera | |
JP2018133064A (en) | Image processing apparatus, imaging apparatus, image processing method, and image processing program | |
JP2020030569A (en) | Image processing method, image processing device, imaging device, lens device, program, and storage medium | |
CN115209000A (en) | Dynamic phase difference estimation method and system for remote sensing imaging | |
CN115185078A (en) | Incoherent aperture synthetic aberration correction method and device | |
KR102680384B1 (en) | Method and apparatus for correction of aberration | |
JP2018081378A (en) | Image processing apparatus, imaging device, image processing method, and image processing program | |
JP4091455B2 (en) | Three-dimensional shape measuring method, three-dimensional shape measuring apparatus, processing program therefor, and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||