The present application claims priority from Chinese patent application No. 201910876997.X, filed with the China National Intellectual Property Administration on September 17, 2019, the entire contents of which are incorporated herein by reference.
Disclosure of Invention
The invention provides an image processing method and a method for adjusting a diffraction screen structure based on a point spread function, which aim to solve the problem that, in the prior art, an image obtained by a diffraction imaging system carries a diffraction pattern and cannot truly reflect the information of the target object.
In a first aspect, the present invention provides an image processing method, including:
the diffraction imaging system images a target object and acquires an original image, wherein the original image is an image with a diffraction pattern;
obtaining a point spread function of the diffractive imaging system;
and restoring and generating a target image from the original image according to the point spread function, wherein the target image is an image of a target object.
Further, the method further comprises:
and generating the target image according to the intensity of each level of diffraction peak of the point spread function and the position of each level of corresponding diffraction peak.
Further, the obtaining a point spread function of the diffractive imaging system includes:
obtaining a light intensity distribution function behind the screen according to the light wave function and the diffraction screen function;
carrying out Fourier transform on the light intensity distribution function to obtain an angular spectrum;
determining a transfer function of a frequency domain according to the angular spectrum and the free propagation function;
and carrying out inverse Fourier transform on the transfer function of the frequency domain to generate a point spread function of the diffraction imaging system.
Further, after the light intensity distribution function behind the screen is obtained, the method may further comprise:
and if the diffraction imaging system comprises a lens arranged behind the diffraction screen, determining the transfer function of the frequency domain according to the light intensity distribution function behind the screen, the lens function and the free propagation function.
Further, after the light intensity distribution function behind the screen is obtained, the method may further comprise:
and if the diffraction imaging system comprises a zone plate arranged behind a diffraction screen, determining a transfer function of a frequency domain according to a light intensity distribution function behind the screen, a zone plate function and a free propagation function, wherein the zone plate function is a function for expressing the structure of the zone plate.
Further, the obtaining a point spread function of the diffractive imaging system includes:
acquiring a first image, wherein the first image is obtained by a diffraction imaging system under a standard light source;
determining diffraction peak intensities at each level of the first image;
determining a first spread function according to the diffraction peak intensities of all levels of the first image, wherein the first spread function is the convolution of the shape of the standard light source and the point spread function;
and performing deconvolution on the first spread function according to the shape of the standard light source to obtain the point spread function of the diffraction imaging system.
Further, the obtaining a point spread function of the diffractive imaging system includes:
acquiring a preset number of original images;
acquiring the position and the intensity of each level of diffraction peak of each original image;
fitting the positions and intensities of all levels of diffraction peaks of all original images;
and determining a point spread function of the diffraction imaging system according to the fitting result.
Further, the method further comprises:
if the original image is an overexposed image, determining the intensity relation between all levels of diffraction peaks of the original image according to the point spread function;
determining the true secondary diffraction peak intensity of the overexposed image;
and determining the true main peak intensity of the overexposed region in the overexposed image according to the intensity relation and the true secondary diffraction peak intensity.
Further, the determining the true secondary diffraction peak intensity of the overexposed image comprises:
acquiring the average value of the image intensity around the secondary diffraction peak in the overexposed image;
and determining the true secondary diffraction peak intensity of the overexposed image according to the difference value between the secondary diffraction peak intensity in the overexposed image and the average value.
Further, the method further comprises:
if the original image is an overexposure image, acquiring a first imaging parameter during overexposure imaging;
adjusting a first imaging parameter of the diffractive imaging system to a second imaging parameter;
acquiring a second image according to the second imaging parameter, wherein the second image is a non-overexposed image;
determining a main peak intensity in the second image;
and determining the real main peak intensity of an overexposed area in the overexposed image according to the main peak intensity in the second image, the first imaging parameter and the second imaging parameter.
Further, when the original image is a color image, the method includes:
acquiring a single-color original image of each color channel;
obtaining a point spread function corresponding to each color channel;
restoring and generating a monochromatic target image from the corresponding monochromatic original image according to the point spread function;
and synthesizing a color target image according to the generated single-color target image corresponding to each color channel.
Further, the method further comprises:
if the acquired single-color original images comprise both an overexposed image and a non-overexposed image, acquiring a third imaging parameter and a fourth imaging parameter, wherein the third imaging parameter is the imaging parameter of the overexposed image in the single-color original images, and the fourth imaging parameter is the imaging parameter of the non-overexposed image in the single-color original images;
determining a main peak intensity of the non-overexposed image in the single-color original images;
and determining the true main peak intensity of an overexposed area in the overexposed image in the single-color original images according to the main peak intensity of the non-overexposed image, the third imaging parameter and the fourth imaging parameter.
In a second aspect, the present invention provides a method for adjusting a diffraction screen structure based on a point spread function, comprising:
acquiring an original image;
if the original image is an image with a diffraction pattern, adjusting a diffraction screen function;
obtaining a light intensity distribution function behind the screen according to the light wave function and the adjusted diffraction screen function;
carrying out Fourier transform on the light intensity distribution function to obtain an angular spectrum;
determining a transfer function of a frequency domain according to the angular spectrum and the free propagation function;
carrying out inverse Fourier transform on the transfer function of the frequency domain to generate an adjusted point spread function of the diffraction imaging system;
generating a first target image according to the adjusted point spread function of the diffraction imaging system and the original image;
and if the diffraction pattern in the first target image is weakened relative to the diffraction pattern in the original image, determining the diffraction screen structure according to the adjusted diffraction screen function.
Further, the adjusting the diffraction screen function includes:
adjusting the diffraction screen structure to be a non-periodic structure;
and obtaining the adjusted diffraction screen function according to the adjusted diffraction screen structure.
In a third aspect, the present invention provides a method for adjusting a diffraction screen structure based on a point spread function, comprising:
acquiring an original image;
obtaining a light intensity distribution function behind the screen according to the light wave function and the diffraction screen function;
carrying out Fourier transform on the light intensity distribution function to obtain an angular spectrum;
determining a transfer function of a frequency domain according to the angular spectrum and the free propagation function;
if the value of the transfer function of the frequency domain at the low frequency does not meet the range of a first preset threshold value, adjusting the diffraction screen function;
generating an adjusted frequency domain transfer function according to the adjusted diffraction screen function;
and if the value of the transfer function of the adjusted frequency domain at the low frequency meets a first preset threshold range, determining the diffraction screen structure according to the adjusted diffraction screen function.
According to the image processing method and the method for adjusting the diffraction screen structure based on the point spread function, provided by the embodiment of the invention, the point spread function is utilized, so that the diffraction pattern in the original image can be weakened or even eliminated to obtain the target image, and the information of the target object is truly reflected.
Detailed Description
The technical solutions in the embodiments of the present invention will be described in detail below with reference to the accompanying drawings in the embodiments of the present invention.
Referring to fig. 1, fig. 1 is a schematic view illustrating a workflow of an image processing method according to an embodiment of the present invention. The image processing method comprises the following steps:
step 101, a diffraction imaging system images a target object and acquires an original image, wherein the original image is an image with a diffraction pattern.
As described in the foregoing background, a diffraction imaging system is an optical system that relies mainly on the wave nature of light for imaging. When imaging with such a system, light emitted from a point on the target is partially blocked at a diffraction screen (an arrangement of light-transmitting portions), so that a diffraction pattern, such as diffraction fringes, is formed.
And 102, obtaining a point spread function of the diffraction imaging system.
The point spread function (PSF) is the light-field distribution of the output image of an optical system when the input object is a point light source.
The point spread function of the diffractive imaging system can be obtained in a variety of ways. For example:
the first implementation mode comprises the following steps: obtaining a light intensity distribution function behind the screen according to the light wave function and the diffraction screen function; carrying out Fourier transform on the light intensity distribution function to obtain an angular spectrum; determining a transfer function of a frequency domain according to the angular spectrum and the free propagation function; and carrying out inverse Fourier transform on the transfer function of the frequency domain to generate a point spread function of the diffraction imaging system.
For example, in the diffraction imaging system described in the background art, a light-transmitting portion is provided in a display area of a display screen, and a camera is arranged below the light-transmitting portion, so that light emitted by an object above the light-transmitting portion passes through the light-transmitting portion and is collected by the camera. The arrangement of the light-transmitting portions is equivalent to a diffraction screen, and the diffraction screen function describes the structure of the diffraction screen, such as the number, shape and size of the light-transmitting portions on the screen. It should be noted that the diffraction imaging system is not limited to the above exemplary structure; it includes all optical systems that rely mainly on the wave nature of light for imaging.
A unit plane wave or spherical wave U(x, y) is multiplied by the diffraction screen function T(x, y) to obtain the light intensity distribution function behind the screen; a Fourier transform of this function gives the angular spectrum; the angular spectrum is multiplied by the free propagation function to obtain the frequency-domain transfer function H; and an inverse Fourier transform of H generates the point spread function of the diffraction imaging system.
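The calculation above can be sketched numerically. The following is a minimal illustration, not the application's own code: the function name, grid size, wavelength, pixel pitch and propagation distance z are hypothetical, and the free propagation function is assumed to be the standard angular-spectrum propagator.

```python
import numpy as np

def psf_angular_spectrum(T, wavelength, pixel, z):
    """Intensity PSF of a diffraction screen T(x, y) propagated a distance z."""
    n = T.shape[0]
    U = np.ones_like(T, dtype=complex)              # unit plane wave U(x, y)
    field = U * T                                   # field just behind the screen
    spectrum = np.fft.fft2(field)                   # angular spectrum
    fx = np.fft.fftfreq(n, d=pixel)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.where(arg > 0, np.exp(1j * kz * z), 0)   # free-propagation transfer function
    psf = np.fft.ifft2(spectrum * H)                # back to the spatial domain
    return np.abs(psf) ** 2                         # intensity PSF

# Illustrative screen: a single vertical slit on a 256 x 256 grid.
T = np.zeros((256, 256))
T[:, 120:136] = 1.0
psf = psf_angular_spectrum(T, wavelength=550e-9, pixel=2e-6, z=1e-3)
```

A lens or zone plate behind the screen would enter this pipeline as an additional multiplicative function before the inverse transform, matching the variants described below.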
Fig. 2 to 5 are images of the obtained point spread function. Fig. 2 and 4 are two-dimensional images of the point spread function, in which bright portions correspond to high PSF values and dark portions to low PSF values; for example, the white area A corresponds to high PSF values and the gray area B to low PSF values. Fig. 3 and 5 are one-dimensional images of the point spread function, in which the abscissa represents position information, such as lateral position, and the ordinate represents diffraction peak intensity values.
In a first implementation manner, if the diffraction imaging system includes a lens disposed behind a diffraction screen, after the light intensity distribution function behind the screen is obtained, a transfer function of a frequency domain is determined according to the light intensity distribution function behind the screen, the lens function, and a free propagation function.
For example, consider a diffraction imaging system in which a light-transmitting portion is provided in the display area of a display screen, a camera is arranged below the light-transmitting portion, and a lens corresponding to the light-transmitting portion is arranged between the diffraction screen and the camera; light emitted by a target object above the light-transmitting portion passes through the light-transmitting portion and the lens and is collected by the camera. If the diffraction imaging system comprises a lens arranged behind the diffraction screen, then, when obtaining the point spread function of the system, after the light intensity distribution function behind the screen is obtained, the transfer function of the frequency domain is determined according to the light intensity distribution function behind the screen, the lens function and the free propagation function.
In a first implementation manner, if the diffraction imaging system includes a zone plate disposed behind a diffraction screen, after obtaining the light intensity distribution function behind the screen, a transfer function of a frequency domain is determined according to the light intensity distribution function behind the screen, the zone plate function, and a free propagation function, where the zone plate function is a function for representing a structure of the zone plate.
For example, consider a diffraction imaging system in which a light-transmitting portion is provided in the display area of a display screen, a camera is arranged below the light-transmitting portion, and a zone plate corresponding to the light-transmitting portion is arranged between the diffraction screen and the camera; the zone plate comprises light-blocking bands and light-transmitting bands, and light emitted by a target object above the light-transmitting portion passes through the light-transmitting portion and the light-transmitting bands and is collected by the camera. If the diffraction imaging system comprises a zone plate arranged behind the diffraction screen, then, when obtaining the point spread function of the system, after the light intensity distribution function behind the screen is obtained, the transfer function of the frequency domain is determined according to the light intensity distribution function behind the screen, the zone plate function and the free propagation function, where the zone plate function represents the structure of the zone plate. In this method, since the distance between the diffraction screen and the zone plate is short, the multiplication by the free propagation function between the diffraction screen and the zone plate is omitted.
The first implementation obtains the point spread function of the diffraction imaging system by theoretical calculation, but each component of the system carries some error from manufacturing and assembly. For this reason, a second and a third implementation are provided.
In a second implementation manner, the method for obtaining the point spread function of the diffraction imaging system, as shown in fig. 6, includes the following steps:
step 201, acquiring a first image, where the first image is an image obtained by a diffraction imaging system under a standard light source.
The first image is obtained by imaging under a standard light source with controllable brightness in a darkroom. A standard light source is one whose light-emitting properties are known: its shape is known, its emission intensity is constant (for convenience of calculation), and, in the case of a color light source, its spectrum is known.
Step 202, determining the diffraction peak intensity of each level of the first image.
From the first image obtained, the intensity of the diffraction peak of each level of said first image can be determined by image analysis thereof.
Step 203, determining a first spread function according to the diffraction peak intensities of the first image at each level, wherein the first spread function is the convolution of the shape of the standard light source with the point spread function.
Step 204, performing deconvolution on the first spread function according to the shape of the standard light source to obtain the point spread function of the diffraction imaging system.
In the second implementation, imaging is performed under a standard light source to obtain the spread function corresponding to that light source, called the first spread function for convenience of description. The first spread function is the convolution of the shape of the standard light source with the point spread function; deconvolving the first spread function by the shape of the standard light source therefore removes the influence of the source shape and yields the point spread function corresponding to a point light source.
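The deconvolution step can be sketched as follows. This is only one illustrative way to perform it (regularized frequency-domain division with a hypothetical `eps` parameter); the application does not prescribe a particular deconvolution algorithm, and the function name is an assumption.

```python
import numpy as np

def deconvolve_source_shape(first_spread, source_shape, eps=1e-3):
    """Recover the PSF from the measured first spread function, given the
    known shape of the standard light source (first_spread = source * PSF)."""
    F = np.fft.fft2(first_spread)
    S = np.fft.fft2(source_shape, s=first_spread.shape)
    # Regularized (Wiener-style) division keeps the result stable where S is small
    psf = np.fft.ifft2(F * np.conj(S) / (np.abs(S) ** 2 + eps))
    return np.real(psf)
```

For a point-like source the division is nearly the identity, so the measured spread function is returned essentially unchanged, as expected.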
In a third implementation manner, the method for obtaining the point spread function of the diffraction imaging system includes the following steps:
acquiring a preset number of original images;
acquiring the position and the intensity of each level of diffraction peak of each original image;
fitting the positions and intensities of all levels of diffraction peaks of all original images;
and determining a point spread function of the diffraction imaging system according to the fitting result.
The third implementation determines the point spread function of the diffractive imaging system by statistical fitting. The positions and intensities of all levels of diffraction peaks are obtained by analyzing a sufficient number of original images, and fitting these positions and intensities yields the point spread function of the diffraction imaging system.
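The statistical approach might be sketched as below. The simple local-maximum peak detector and per-position averaging are hypothetical simplifications of the fitting process, which the application leaves unspecified.

```python
import numpy as np

def find_peaks_1d(profile):
    """Return (position, intensity) of local maxima in a 1-D intensity profile."""
    p = np.asarray(profile, dtype=float)
    return [(i, p[i]) for i in range(1, len(p) - 1)
            if p[i] > p[i - 1] and p[i] >= p[i + 1]]

def psf_from_statistics(profiles):
    """Pool diffraction-peak measurements from many original-image profiles
    and average the intensity observed at each peak position."""
    pooled = {}
    for prof in profiles:
        for pos, inten in find_peaks_1d(prof):
            pooled.setdefault(pos, []).append(inten)
    return {pos: float(np.mean(v)) for pos, v in sorted(pooled.items())}
```

A production system would fit peak positions sub-pixel and reject outliers, but the pooling-then-averaging structure is the same.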
When the point spread function of the diffraction imaging system is obtained using the second or third implementation, the main factors affecting the accuracy of the PSF are first identified: the exposure level during imaging and the influence of noise.
Therefore, based on these influencing factors, when obtaining the point spread function with the second or third implementation, the diffraction imaging system must avoid overexposure when acquiring the first image in order to obtain an accurate point spread function.
Further, to obtain an accurate point spread function, a plurality of first images may be collected in succession and steps 201 to 204 performed on each, yielding one point spread function per first image; the average of these point spread functions is then taken, eliminating the influence of random sensor noise and making the resulting point spread function more accurate.
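The averaging step is a one-liner; `average_psfs` is a hypothetical helper name used here purely for illustration.

```python
import numpy as np

def average_psfs(psf_list):
    """Average several per-image PSF estimates to suppress random sensor noise."""
    return np.stack([np.asarray(p, dtype=float) for p in psf_list]).mean(axis=0)
```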
Further, to obtain an accurate point spread function, determining the point spread function at the valley portions away from the main peak is also meaningful. Specifically, by adjusting imaging parameters, such as increasing the gain value and the exposure time, points where the image sensor reading was originally zero acquire a non-zero reading, so the diffraction peak intensity at the valley portions can be determined accurately. Since the dynamic range of a typical image sensor is only 0 to 255, and the ratio of the main peak intensity to the secondary diffraction peak intensity is usually about 50, when the main peak reads 200 on the sensor the corresponding secondary diffraction peak reads only 4; any value between 3.5 and 4.5 is reported as 4, so the relative error is large. In addition, no diffraction peak intensity reading below 1 can be detected, so the gain value and exposure time should be increased to make the valley readings larger. For example, if the gain is increased 10-fold and the exposure time 5-fold, readings increase 5 × 10 = 50 times, and the secondary diffraction peak reading reaches 4 × 50 = 200, so the diffraction peak intensity at the valley portion can be determined accurately. If the corresponding main peak is then overexposed, its true intensity can be determined from the changes in gain and exposure time: for example, obtain the main peak intensity from a non-overexposed image taken without the increased gain and exposure time, and then determine the true main peak intensity of the overexposed image from that intensity, the gain multiple and the exposure-time multiple.
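The gain/exposure bookkeeping above reduces to a pair of small helpers. The names and the linear-scaling assumption (sensor readings proportional to gain times exposure time) are illustrative, not from the application.

```python
def scaled_reading(base_reading, gain_factor, exposure_factor):
    """Sensor reading after boosting gain and exposure time.
    Assumes the reading scales linearly with both factors."""
    return base_reading * gain_factor * exposure_factor

def true_main_peak(boosted_reading, gain_factor, exposure_factor):
    """Main-peak intensity at the original settings, inferred from a reading
    taken at the boosted settings."""
    return boosted_reading / (gain_factor * exposure_factor)
```

With the document's numbers, a valley reading of 4 at 10x gain and 5x exposure becomes 4 × 50 = 200, well inside the sensor's range.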
And 103, restoring and generating a target image from the original image according to the point spread function, wherein the target image is an image of a target object.
The point spread function is utilized to restore the original image into the target image which can truly reflect the information of the target object, and the method comprises a plurality of implementation modes. For example:
in a first implementation manner, the target image is generated according to the intensities of diffraction peaks of each level of the point spread function and the positions of the diffraction peaks of each level.
When the point spread function is known, the intensity of the diffraction peak at each level and the position of the diffraction peak corresponding to each level are known, and therefore, the target image can be generated from the intensity of the diffraction peak at each level and the position of the diffraction peak corresponding to each level.
In a second implementation manner, in a case that the point spread function is known, a target image is generated by restoring from the original image by using a frequency domain method such as inverse filtering or wiener filtering.
First, the generation of a target image by restoring the original image using an inverse filtering method will be described.
The process of imaging the target object by the diffractive imaging system can be simply expressed as expression (1):
f(x,y)=o(x,y)*h(x,y) (1)
wherein f (x, y) represents an original image; o (x, y) represents an object plane light intensity distribution function; h (x, y) represents the point spread function, which is a convolution.
The fourier transform form of expression (1) is expression (2):
F(u,v)=O(u,v)H(u,v) (2)
Rearranging expression (2) gives expression (3):
O(u,v)=F(u,v)/H(u,v) (3)
Then, taking the inverse Fourier transform of expression (3) gives expression (4):
o(x,y)=IFT[F(u,v)/H(u,v)] (4)
where IFT denotes the inverse Fourier transform.
From this it can be seen that, theoretically, if h(x, y) or H(u, v) is known and the imaging process is free of overexposure and noise, o(x, y) can be perfectly reproduced. This method of restoring the target image from the original image is called inverse filtering.
However, in a realistic imaging process F(u, v) typically contains noise. As H(u, v) approaches zero, inverse filtering greatly amplifies the noise in F(u, v); the inverse filter is then highly sensitive to noise and effectively unusable. Since H(u, v) in reality inevitably approaches zero at high frequencies, the simple inverse filter may be replaced by a Wiener filter, as shown in expression (5):
O(u,v)=[H*(u,v)/(|H(u,v)|^2+K(u,v))]F(u,v) (5)
where H*(u, v) represents the complex conjugate of H(u, v), and K(u, v) represents a correction function based on the noise-to-signal ratio.
The above method of restoring the target image from the original image based on expression (5) is called Wiener filtering.
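Expression (5) can be sketched in a few lines. Here K is taken as a scalar noise-to-signal constant, a simplifying assumption in place of the general correction function K(u, v), and the function name is hypothetical.

```python
import numpy as np

def wiener_restore(f, h, K=0.01):
    """Restore the target image o(x, y) from the original image f(x, y)
    and the PSF h(x, y), per expression (5) with scalar K."""
    F = np.fft.fft2(f)
    H = np.fft.fft2(h, s=f.shape)
    O = np.conj(H) / (np.abs(H) ** 2 + K) * F   # expression (5)
    return np.real(np.fft.ifft2(O))
```

With K = 0 this reduces to the inverse filter of expressions (3)-(4); a positive K caps the amplification where H(u, v) is near zero.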
The Wiener filtering method is suitable for the case in which the original image has no overexposure but is affected by noise. In practice, however, overexposed points are easily produced in the original image when intense light falls within the imaging area. Overexposure occurs when the energy at the corresponding image-space position is too high and exceeds the upper limit of the image sensor's representable range, so the output value is truncated; the true light intensity at the overexposed point therefore cannot be known, the true image cannot be restored, and the diffraction pattern cannot be accurately eliminated or weakened.
For example, suppose a captured sunset image is overexposed, and a secondary diffraction peak of the overexposed point happens to coincide with a darker building A beside the sunset. Say the original image intensity reading of building A is 10; because the secondary diffraction peak of the overexposed point overlaps building A, its diffraction peak intensity reading becomes 30. From the PSF, the intensity ratio between the main peak and the secondary diffraction peak is known, for example 100:2. The secondary diffraction peak intensity of the overexposed point can therefore be determined from this ratio and the main peak intensity: if the main peak intensity is 1000, the corresponding secondary diffraction peak intensity of the overexposed point is 20. Subtracting this secondary diffraction peak intensity (20) from the diffraction peak intensity reading at building A (30) eliminates the influence of the secondary diffraction peak of the overexposed point and recovers the original image intensity of building A as 10.
The above is a method of restoring the target image when the original image is overexposed. It assumes that the true main peak intensity of the overexposed region (1000 in the example) is known, eliminates the influence of the secondary diffraction peak of the overexposed point, and then restores the original image to the target image. In practice, however, when the original image is overexposed, the overexposed point exceeds the upper limit of the image sensor's representable range and the output value is truncated, so the true light intensity of the overexposed point, that is, the true main peak intensity of the overexposed region, cannot be known. Consequently, the true secondary diffraction peak intensity of the overexposed point cannot be determined, its influence cannot be eliminated, the diffraction pattern cannot be eliminated or weakened, and the original image cannot be restored. Therefore, when the original image is an overexposed image, the key to restoring it to the target image is to determine the true main peak intensity of the overexposed region.
Two implementations for determining the true main peak intensity of an overexposed region in an overexposed image are provided below:
the first implementation, as shown in fig. 7, includes the following steps:
step 301, if the original image is an overexposed image, determining an intensity relationship between diffraction peaks of each level of the original image according to the point spread function.
The intensity relationship between each level of diffraction peak of the original image includes the relationship of the true secondary diffraction peak intensity to the true primary peak intensity. Wherein, the true secondary diffraction peak intensity refers to the true secondary diffraction peak intensity of the overexposed point in the overexposed image.
And step 302, determining the true secondary diffraction peak intensity of the overexposed image.
And step 303, determining the true main peak intensity of the overexposed region in the overexposed image according to the intensity relation and the true secondary diffraction peak intensity.
In the above steps, after the true secondary diffraction peak intensity of the overexposed image is determined, the true main peak intensity can be calculated by using the relationship between the true secondary diffraction peak intensity and the true main peak intensity.
Further, in the first implementation manner, the method for determining the true secondary diffraction peak intensity of the overexposed image may be implemented by obtaining an average value of image intensities around the secondary diffraction peak in the overexposed image; and determining the true secondary diffraction peak intensity of the overexposed image according to the difference value between the secondary diffraction peak intensity in the overexposed image and the average value.
The secondary diffraction peak intensity in the overexposed image is the intensity indicated by the image sensor, that is, the superposition of the secondary diffraction peak of the overexposed point and the original image intensity at that location. In the example above, the diffraction peak intensity reading of building A became 30 because the secondary diffraction peak of the overexposed point coincided with building A; this reading of 30 is the secondary diffraction peak intensity in the overexposed image. The average of the image intensities around the secondary diffraction peak estimates the original image intensity at that location, which corresponds to determining that the original image intensity of building A is 10. Finally, the difference between the secondary diffraction peak intensity in the overexposed image and this average is the true secondary diffraction peak intensity of the overexposed image.
The overexposed image comprises a main peak, a larger secondary diffraction peak (the secondary diffraction peak in the overexposed image), and other smaller main peaks. The image intensity around the secondary diffraction peak refers to those smaller main peaks (corresponding to regions with little brightness change). To delimit the specific range, a preset intensity threshold may be defined, and the main peaks satisfying the preset intensity threshold are taken as the image intensity around the secondary diffraction peak in the overexposed image.
For example, a star in the night sky is photographed to obtain an overexposed image containing a black sky and a star in an overexposed region. Under the influence of the overexposed point, a higher secondary diffraction peak, that is, the secondary diffraction peak in the overexposed image, appears near the overexposed point. Since only the night sky and the star exist in the image, it can be determined that the original image intensity at the higher secondary diffraction peak should be the same as the image intensity of the surrounding black sky. Therefore, the original image intensity at the higher secondary diffraction peak is determined by obtaining an average value of the image around it, the true secondary diffraction peak intensity of the overexposed image is obtained by subtracting this original image intensity from the secondary diffraction peak intensity in the overexposed image, and finally the true main peak intensity can be calculated through the intensity relationship.
In another example, the sun in a blue sky is photographed to obtain an overexposed image. The brightness of the blue sky in the overexposed image does not change much; for example, the image intensity reading of the blue sky is 10, but near the overexposed point the diffraction peak intensity reading rises to 30. Since most of the blue sky has the same brightness, by obtaining the average value of the image intensity around the higher secondary diffraction peak (whose reading is 30), the original image intensity at the higher secondary diffraction peak can be determined to be the average value 10. The true secondary diffraction peak intensity of the overexposed image is therefore 20 (30 minus 10), and finally the true main peak intensity can be calculated through the intensity relationship.
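The arithmetic of the two examples above can be sketched as follows. The function names are illustrative, and `ratio_main_to_secondary` is a hypothetical value standing in for the intensity relationship derived from the point spread function:

```python
def true_secondary_peak(reading, surrounding_avg):
    # Sensor reading at the secondary diffraction peak minus the estimated
    # original image intensity (average of the surrounding pixels).
    return reading - surrounding_avg

def true_main_peak(secondary_true, ratio_main_to_secondary):
    # The point spread function fixes the intensity ratio between the main
    # peak and the secondary diffraction peak, so the true main peak can be
    # recovered by scaling the true secondary peak.
    return secondary_true * ratio_main_to_secondary

# Blue-sky example: reading 30, surrounding average 10 -> true secondary peak 20.
secondary = true_secondary_peak(30, 10)
# With a hypothetical 100:1 main-to-secondary ratio, the true main peak is 2000.
main = true_main_peak(secondary, 100)
```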
The second implementation, as shown in fig. 8, includes the following steps:
Step 401, if the original image is an overexposed image, acquiring a first imaging parameter during overexposure imaging.
Step 402, adjusting the first imaging parameter of the diffraction imaging system to be a second imaging parameter.
Step 403, acquiring a second image according to the second imaging parameter, wherein the second image is a non-overexposed image.
Step 404, determining the main peak intensity in the second image.
Step 405, determining the true main peak intensity of the overexposed region in the overexposed image according to the main peak intensity in the second image, the first imaging parameter and the second imaging parameter.
In the above steps 401 to 405, two frames of images are acquired: an overexposed image and a non-overexposed image. The overexposed image is acquired using the first imaging parameter, and the non-overexposed image (the second image) is acquired using the second imaging parameter. The imaging parameters include a gain value and an exposure time, and the gain value and the exposure time in the second imaging parameter used to acquire the non-overexposed image are smaller than those in the first imaging parameter; for example, the gain value in the second imaging parameter is reduced by 10 times compared with that in the first imaging parameter, and the exposure time is reduced by 5 times. Since the second image is not overexposed, the true main peak intensity can be read from it. If the main peak intensity in the second image is 50, and the first imaging parameter is 50 times larger than the second imaging parameter (the product of the gain multiple and the exposure time multiple), then the true main peak intensity of the overexposed region in the overexposed image is determined to be 2500 (the product of the main peak intensity in the second image and the imaging parameter multiple).
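A minimal sketch of this scaling, using the ratios from the example (10x gain, 5x exposure time); the function name is illustrative:

```python
def restore_main_peak(second_image_peak, gain_ratio, exposure_ratio):
    # The non-overexposed second image was taken with gain and exposure time
    # reduced by the given factors; scale its main-peak reading back up by
    # the product of the two factors to recover the true main peak intensity.
    return second_image_peak * gain_ratio * exposure_ratio

# Example from the text: main peak 50 in the second image, 10x gain and 5x
# exposure reduction -> true main peak 2500 in the overexposed image.
true_peak = restore_main_peak(50, gain_ratio=10, exposure_ratio=5)
```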
When the original image is a color image, the image processing method comprises the following steps:
a monochromatic raw image for each color channel is acquired.
And obtaining the point spread function corresponding to each color channel. When the point spread function of any color channel is obtained, any method for obtaining the point spread function in the above embodiments may be adopted, which is not described herein again.
And restoring and generating a monochromatic target image from the corresponding monochromatic original image according to the point spread function. When a monochrome target image is restored and generated from a corresponding monochrome original image, any method for restoring the original image to generate the target image in the foregoing embodiments may be adopted, and details are not repeated here.
And synthesizing a color target image according to the generated single-color target image corresponding to each color channel.
The color image comprises red, green and blue channels; after the monochromatic target images corresponding to the red, green and blue channels are respectively generated, the color target image is synthesized through image processing.
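The per-channel restoration can be sketched with NumPy. The patent does not prescribe the deconvolution algorithm at this point; a regularized (Wiener-style) inverse filter, consistent with the inverse filtering and Wiener filtering methods mentioned elsewhere in the description, is used below as one plausible choice, and the function names are illustrative:

```python
import numpy as np

def deconvolve_channel(channel, psf, eps=1e-3):
    # Restore one color channel in the frequency domain using a regularized
    # inverse filter built from that channel's point spread function.
    H = np.fft.fft2(np.fft.ifftshift(psf), s=channel.shape)
    G = np.fft.fft2(channel)
    F = G * np.conj(H) / (np.abs(H) ** 2 + eps)  # regularized inverse filter
    return np.real(np.fft.ifft2(F))

def restore_color(raw_rgb, psfs):
    # Restore each channel with its own PSF, then restack into a color image.
    channels = [deconvolve_channel(raw_rgb[..., c], psfs[c]) for c in range(3)]
    return np.stack(channels, axis=-1)
```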
Further, when the original image is a color image and the color image is overexposed, in addition to determining the true main peak intensity of the overexposed region in the overexposed image by using the two implementations provided in the above embodiments, the true main peak intensity of the overexposed region in the overexposed image may also be determined by using the third implementation.
The third implementation mode comprises the following steps:
And if the acquired monochromatic original images comprise both an overexposed image and a non-overexposed image, acquiring a third imaging parameter and a fourth imaging parameter, wherein the third imaging parameter is the imaging parameter of the overexposed image among the monochromatic original images, and the fourth imaging parameter is the imaging parameter of the non-overexposed image among the monochromatic original images.
Determining the main peak intensity of the non-overexposed image among the monochromatic original images.
And determining the true main peak intensity of the overexposed region in the overexposed image among the monochromatic original images according to the main peak intensity of the non-overexposed image, the third imaging parameter and the fourth imaging parameter.
If the three monochromatic original images corresponding to the three color channels of the color image include both an overexposed image and a non-overexposed image, the two images are equivalent to images acquired with different imaging parameters, for example with different exposure times. Based on this, the second implementation manner for determining the true main peak intensity of the overexposed region can be applied to determine the true main peak intensity of the overexposed region in the monochromatic original image of the color image, which is not described herein again.
The above embodiment section describes an image processing method for an original image with a diffraction pattern generated by a diffraction imaging system, and by the above image processing method, the diffraction pattern in the original image can be weakened or even eliminated to obtain a target image, so as to truly reflect the information of a target object.
Based on the image processing method in the above embodiment, the diffraction screen function can be deduced in reverse from the obtained point spread function, so as to determine the structure of the diffraction screen, that is, the number, shape and size of the holes on the diffraction screen. Therefore, the present application further provides a method for adjusting a diffraction screen structure based on a point spread function, as shown in fig. 9, including the following steps:
step 501, obtaining an original image.
Step 502, if the original image is an image with a diffraction pattern, adjusting a diffraction screen function.
Step 503, obtaining a light intensity distribution function behind the screen according to the light wave function and the adjusted diffraction screen function.
Step 504, carrying out Fourier transform on the light intensity distribution function to obtain an angular spectrum.
Step 505, determining a transfer function of a frequency domain according to the angular spectrum and the free propagation function.
Step 506, performing inverse Fourier transform on the transfer function of the frequency domain to generate an adjusted point spread function of the diffraction imaging system.
Step 507, generating a first target image according to the adjusted point spread function of the diffraction imaging system and the original image.
Step 508, if the diffraction pattern in the first target image is weakened relative to the diffraction pattern in the original image, determining the diffraction screen structure according to the adjusted diffraction screen function.
The point spread function of the diffraction imaging system is calculated according to the adjusted diffraction screen function, and the method specifically comprises the following steps: obtaining a light intensity distribution function behind the screen according to the light wave function and the adjusted diffraction screen function; carrying out Fourier transform on the light intensity distribution function to obtain an angular spectrum; determining a transfer function of a frequency domain according to the angular spectrum and the free propagation function; performing inverse fourier transform on the transfer function to generate an adjusted point spread function of the diffraction imaging system, which may be referred to in the description of the first implementation manner for obtaining the point spread function of the diffraction imaging system in the above steps and is not described herein again.
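The pipeline restated above (field behind the screen, Fourier transform to the angular spectrum, multiplication by the free propagation function, inverse transform) can be sketched with NumPy's FFT. This is a common angular-spectrum formulation offered as an approximation of the described steps, not the patent's exact computation; it assumes a unit plane wave illuminating a screen function sampled at pitch `dx`, and the quantity transformed is the complex field behind the screen:

```python
import numpy as np

def psf_from_screen(screen, wavelength, z, dx):
    # Field just behind the screen: unit plane wave times the screen function.
    field = screen.astype(complex)
    n = screen.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                  # spatial frequency axis
    FX, FY = np.meshgrid(fx, fx)
    arg = 1 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * z * kz)                       # free propagation function
    spectrum = np.fft.fft2(field)                 # angular spectrum
    out = np.fft.ifft2(spectrum * H)              # field after propagating z
    psf = np.abs(out) ** 2                        # intensity at the sensor
    return psf / psf.sum()                        # normalized point spread function
```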
In the above steps 501 to 508, the diffraction screen function is adjusted, the adjusted point spread function of the diffraction imaging system is generated according to the adjusted diffraction screen function, and a first target image is generated. The generated first target image is then judged: if the diffraction pattern in the first target image is weakened, the diffraction screen structure is determined according to the adjusted diffraction screen function; if the diffraction pattern in the first target image is not weakened, the diffraction screen function is readjusted until the diffraction pattern in the first target image is weakened. The diffraction screen structure includes the number, shape and size of the light-transmitting holes on the diffraction screen, such as the arrangement of the light-transmitting portions of the diffraction imaging system in the background art.
Wherein the weakening of the diffraction pattern in the first target image means that the diffraction pattern in the first target image is reduced relative to the diffraction pattern in the original image. Specifically, the judgment criterion for the weakening of the diffraction pattern in the first target image may be any one or more of the following: the secondary diffraction peak of the first target image is flatter than that of the original image; the secondary diffraction peak value of the first target image is smaller than that of the original image; the distance between the secondary diffraction peak and the main peak of the first target image is larger than that of the original image; or the secondary diffraction peak and the main peak of the first target image do not overlap.
Further, the weakening degree of the diffraction pattern in the first target image can be set according to actual needs; when the weakening degree of the diffraction pattern in the first target image reaches a preset weakening degree, the diffraction screen structure is determined according to the adjusted diffraction screen function. The preset weakening degree can be set according to the judgment criteria for weakening of the diffraction pattern. For example, the preset weakening degree may be that the secondary diffraction peak of the first target image does not overlap with the main peak; or that the height of the secondary diffraction peak of the first target image satisfies a preset height threshold; or that the secondary diffraction peak of the first target image satisfies a preset intensity threshold; or that the distance between the secondary diffraction peak and the main peak of the first target image satisfies a preset size threshold.
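One of the criteria listed above can be expressed as a simple predicate; the function name and threshold value below are illustrative, not taken from the patent:

```python
def diffraction_weakened(orig_secondary_peak, new_secondary_peak, intensity_threshold=None):
    # Criterion: the secondary diffraction peak value of the first target
    # image is smaller than in the original image, and (optionally) also
    # satisfies a preset intensity threshold.
    if new_secondary_peak >= orig_secondary_peak:
        return False
    return intensity_threshold is None or new_secondary_peak <= intensity_threshold
```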
In the research process, the applicant found that when the adjusted diffraction screen structure is an aperiodic structure, the imaged image exhibits fewer diffraction patterns. Based on this, the diffraction screen structure can be adjusted to an aperiodic structure; the adjusted diffraction screen function is then obtained according to the adjusted diffraction screen structure, and steps 503 to 508 are executed.
The aperiodic structure includes any of the following: the arrangement of the light-transmitting holes on the diffraction screen is aperiodic, the shapes of the light-transmitting holes on the diffraction screen vary aperiodically, or the sizes of the light-transmitting holes on the diffraction screen vary aperiodically. As shown in fig. 10-12, fig. 10 is a schematic diagram of an implementation in which a plurality of light-transmitting portions of the same size are arranged aperiodically: in fig. 10, on the display region 1, the spacing distance between the first light-transmitting portion 2 and the second light-transmitting portion 2 is s, the spacing distance between the second and the third light-transmitting portion 2 is s/2, and the spacing distance between the third and the fourth light-transmitting portion 2 is s/3. Fig. 11 is a schematic view of an implementation in which a plurality of light-transmitting portions of different sizes are arranged aperiodically: in fig. 11, the spacing distances s between the light-transmitting portions 2 on the display region 1 are the same, but the diameters d1, d2, d3 and d4 of the light-transmitting portions 2 are different and change irregularly. Fig. 12 is a schematic view of an implementation in which a plurality of light-transmitting portions of different shapes are arranged aperiodically: in fig. 12, the shapes of the light-transmitting portions 2 on the display region 1 are different and change irregularly.
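The spacing pattern of fig. 10 can be illustrated by generating hole positions with harmonically decreasing gaps (s, s/2, s/3, ...); the function name and sample values are illustrative:

```python
def aperiodic_positions(start, s, count):
    # Positions of light-transmitting holes with successive spacings
    # s, s/2, s/3, ... as in the fig. 10 example, so that no spacing
    # repeats (an aperiodic layout).
    positions = [start]
    for k in range(1, count):
        positions.append(positions[-1] + s / k)
    return positions

# With s = 6, four holes land at 0, 6, 9 and 11; every gap is distinct.
holes = aperiodic_positions(0, 6, 4)
```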
Based on the method for obtaining the point spread function in the foregoing embodiment, the present application also provides another method for adjusting the diffraction screen structure based on the point spread function, as shown in fig. 13, including the following steps:
step 601, obtaining an original image.
Step 602, obtaining a light intensity distribution function behind the screen according to the light wave function and the diffraction screen function.
Step 603, carrying out Fourier transform on the light intensity distribution function to obtain an angular spectrum.
Step 604, determining a transfer function of a frequency domain according to the angular spectrum and the free propagation function.
For the above steps 601 to 604 (obtaining the original image; obtaining the light intensity distribution function behind the screen according to the light wave function and the diffraction screen function; carrying out Fourier transform on the light intensity distribution function to obtain the angular spectrum; and determining the transfer function of the frequency domain according to the angular spectrum and the free propagation function), reference may be made to the description of the first implementation manner of obtaining the point spread function of the diffraction imaging system, and details are not described here again.
Step 605, if the value of the transfer function of the frequency domain at the low frequency does not satisfy a first preset threshold range, adjusting the diffraction screen function.
In the first implementation of obtaining the point spread function of the diffraction imaging system, the transfer function in the frequency domain is determined and then subjected to inverse Fourier transform, so the transfer function needs to be invertible. As can be seen from the foregoing description of the inverse filtering and Wiener filtering methods, in order for the transfer function to be invertible, its low-frequency portion should stay as far from zero as possible; when the transfer function has values close to zero at low frequencies, the obtained point spread function is not ideal, so the diffraction screen function is adjusted in that case. For example, the low frequency may be defined as a frequency lower than 20 line pairs per centimeter at an object distance of 40 centimeters (a common object distance for self-portrait shooting), which corresponds to a resolution of 0.5 millimeters and is sufficiently clear for self-portraits. The range close to zero is the range that does not satisfy the first preset threshold range; for example, the first preset threshold range in step 605 may be the range where the transfer function value is greater than 0.3, that is, if the value of the transfer function at the low frequency is less than or equal to 0.3, the diffraction screen function is adjusted.
As shown in fig. 14, which plots the transfer function value against frequency, at low frequencies (frequencies between 0 and 100) the transfer function exhibits values close to zero (in fig. 14 the value of the transfer function at low frequencies is less than 0.2), so the value of the transfer function at low frequencies does not satisfy the first preset threshold range, and the diffraction screen function needs to be adjusted.
And 606, generating a transfer function of the adjusted frequency domain according to the adjusted diffraction screen function.
Step 607, if the value of the adjusted frequency domain transfer function at the low frequency meets the first preset threshold range, determining the diffraction screen structure according to the adjusted diffraction screen function.
The purpose of the above steps 601 to 607 is to keep the transfer function far from zero in the low-frequency portion: when the value of the transfer function at low frequency does not satisfy the first preset threshold range, the diffraction screen function is adjusted and the transfer function of the adjusted frequency domain is generated according to the adjusted diffraction screen function; when the value of the adjusted transfer function at low frequency satisfies the first preset threshold range, the diffraction screen structure is determined according to the adjusted diffraction screen function. An image formed using the finally determined diffraction screen structure contains fewer diffraction patterns.
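The low-frequency check of steps 605 and 607 can be sketched as follows, using the illustrative 0.3 threshold from the example above; `freqs` and `H` are assumed to be the sampled frequency axis and the corresponding transfer function values:

```python
import numpy as np

def low_freq_ok(freqs, H, low_freq_cutoff, min_value=0.3):
    # The transfer function must stay above min_value over the whole
    # low-frequency band; otherwise the diffraction screen function
    # should be adjusted and the check repeated.
    band = np.abs(np.asarray(H))[np.asarray(freqs) <= low_freq_cutoff]
    return bool(band.min() > min_value)
```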
Corresponding to the image processing method, the embodiment of the invention also discloses an image processing system, which comprises:
the imaging module is used for imaging a target object by the diffraction imaging system and acquiring an original image, wherein the original image is an image with a diffraction pattern;
a first obtaining module for obtaining a point spread function of the diffractive imaging system;
and the first restoring module is used for restoring and generating a target image from the original image according to the point spread function, wherein the target image is an image of a target object.
Further, the system further comprises:
and the first generation module is used for generating the target image according to the intensities of all levels of diffraction peaks of the point spread function and the positions of the corresponding diffraction peaks.
Further, the first obtaining module includes:
the obtaining submodule is used for obtaining a light intensity distribution function behind the screen according to the light wave function and the diffraction screen function;
the first Fourier transform module is used for carrying out Fourier transform on the light intensity distribution function to obtain an angular spectrum;
the first determining module is used for determining a transfer function of a frequency domain according to the angular spectrum and the free propagation function;
and the inverse Fourier transform module is used for performing inverse Fourier transform on the transfer function of the frequency domain to generate a point spread function of the diffraction imaging system.
Further, the system also includes,
and the second determining module is used for determining the transfer function of the frequency domain according to the light intensity distribution function behind the screen, the lens function and the free propagation function after the light intensity distribution function behind the screen is obtained and when the diffraction imaging system comprises a lens arranged behind the diffraction screen.
Further, the system also includes,
and the third determining module is used for determining a transfer function of a frequency domain according to the light intensity distribution function behind the screen, the zone plate function and the free propagation function after the light intensity distribution function behind the screen is obtained and when the diffraction imaging system comprises a zone plate arranged behind the diffraction screen, wherein the zone plate function is a function for representing the structure of the zone plate.
Further, the first obtaining module includes:
the first acquisition module is used for acquiring a first image, wherein the first image is obtained by a diffraction imaging system under a standard light source;
a fourth determining module, configured to determine diffraction peak intensities of respective orders of the first image;
a fifth determining module, configured to determine a first diffusion function according to the diffraction peak intensities of each level of the first image, where the first diffusion function is a convolution of a point diffusion function and a shape of the standard light source;
and the deconvolution module is used for performing deconvolution on the first diffusion function according to the shape of the standard light source to obtain a point diffusion function of the diffraction imaging system.
Further, the first obtaining module includes:
the second acquisition module is used for acquiring a preset number of original images;
the third acquisition module is used for acquiring the position and the intensity of each level of diffraction peak of each original image;
the fitting module is used for fitting the positions and the intensities of all levels of diffraction peaks of all the original images;
and the sixth determining module is used for determining a point spread function of the diffraction imaging system according to the fitting result.
Further, the system further comprises:
a seventh determining module, configured to determine, according to the point spread function, an intensity relationship between diffraction peaks of each level of the original image when the original image is an overexposed image;
an eighth determining module, configured to determine a true secondary diffraction peak intensity of the overexposed image;
and the ninth determining module is used for determining the real main peak intensity of the overexposed area in the overexposed image according to the intensity relation and the real secondary diffraction peak intensity.
Further, the eighth determining module includes:
the first acquisition submodule is used for acquiring the average value of the image intensity around the secondary diffraction peak in the overexposed image;
and the first determining submodule is used for determining the true secondary diffraction peak intensity of the overexposed image according to the difference value between the secondary diffraction peak intensity in the overexposed image and the average value.
Further, the system further comprises:
the fourth acquisition module is used for acquiring a first imaging parameter during overexposure imaging when the original image is an overexposure image;
the first adjusting module is used for adjusting the first imaging parameters of the diffraction imaging system into second imaging parameters;
a second obtaining module, configured to obtain a second image according to the second imaging parameter, where the second image is a non-overexposed image;
a tenth determining module for determining the intensity of the main peak in the second image;
and the eleventh determining module is used for determining the real main peak intensity of the overexposed area in the overexposed image according to the main peak intensity in the second image, the first imaging parameter and the second imaging parameter.
Further, when the original image is a color image, the system includes:
the fifth acquisition module is used for acquiring a single-color original image of each color channel;
the second obtaining module is used for obtaining a point spread function corresponding to each color channel;
the second restoring module is used for restoring and generating a monochromatic target image from the corresponding monochromatic original image according to the point spread function;
and the synthesis module is used for synthesizing the color target image according to the generated single-color target image corresponding to each color channel.
Further, the system comprises:
a sixth obtaining module, configured to obtain a third imaging parameter and a fourth imaging parameter when the obtained monochrome original images include both an overexposed image and a non-overexposed image, where the third imaging parameter is an imaging parameter of the overexposed image among the monochrome original images, and the fourth imaging parameter is an imaging parameter of the non-overexposed image among the monochrome original images;
a twelfth determining module, configured to determine the main peak intensity of the non-overexposed image among the monochrome original images;
and a thirteenth determining module, configured to determine the true main peak intensity of the overexposed region in the overexposed image among the monochrome original images according to the main peak intensity of the non-overexposed image, the third imaging parameter and the fourth imaging parameter.
Corresponding to the method for adjusting the diffraction screen structure based on the point spread function, the embodiment of the invention also discloses a system for adjusting the diffraction screen structure based on the point spread function, which comprises the following steps:
a seventh obtaining module, configured to obtain an original image;
the second adjusting module is used for adjusting the diffraction screen function when the original image is an image with a diffraction pattern;
the third obtaining module is used for obtaining a light intensity distribution function behind the screen according to the light wave function and the adjusted diffraction screen function;
the second Fourier transform module is used for carrying out Fourier transform on the light intensity distribution function to obtain an angular spectrum;
a fourteenth determining module, configured to determine a transfer function of a frequency domain according to the angular spectrum and a free propagation function;
the second inverse Fourier transform module is used for performing inverse Fourier transform on the transfer function of the frequency domain to generate an adjusted point spread function of the diffraction imaging system;
the second generation module is used for generating a first target image according to the adjusted point spread function of the diffraction imaging system and the original image;
and the third adjusting module is used for determining the diffraction screen structure according to the adjusted diffraction screen function when the diffraction pattern in the first target image is weakened relative to the diffraction pattern in the original image.
Further, the second adjusting module comprises:
the adjusting submodule is used for adjusting the diffraction screen structure to be a non-periodic structure;
and the second determining submodule is used for obtaining the adjusted diffraction screen function according to the adjusted diffraction screen structure.
Corresponding to the method for adjusting the diffraction screen structure based on the point spread function, the embodiment of the invention also discloses another system for adjusting the diffraction screen structure based on the point spread function, which comprises the following steps:
the eighth acquisition module is used for acquiring an original image;
the fourth obtaining module is used for obtaining a light intensity distribution function behind the screen according to the light wave function and the diffraction screen function;
the third Fourier transform module is used for carrying out Fourier transform on the light intensity distribution function to obtain an angular spectrum;
a fifteenth determining module, configured to determine a transfer function of a frequency domain according to the angular spectrum and a free propagation function;
the fourth adjusting module is used for adjusting the diffraction screen function when the value of the transfer function of the frequency domain at the low frequency does not meet the range of a first preset threshold value;
the third generation module is used for generating the transfer function of the adjusted frequency domain according to the adjusted diffraction screen function;
and the fifth adjusting module is used for determining the diffraction screen structure according to the adjusted diffraction screen function if the value of the transfer function of the adjusted frequency domain at the low frequency meets the first preset threshold range.
In a specific implementation, the present invention further provides a computer storage medium, where the computer storage medium may store a program, and the program, when executed, may perform some or all of the steps in the embodiments of the image processing method provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM) or a random access memory (RAM).
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be essentially or partially implemented in the form of a software product, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.
The same and similar parts among the various embodiments in this specification may be referred to each other. In particular, since the system embodiments are substantially similar to the method embodiments, their description is relatively simple, and for relevant points reference may be made to the description in the method embodiments.
The above-described embodiments of the present invention should not be construed as limiting the scope of the present invention.