CN109410143B - Image enhancement method and device, electronic equipment and computer readable medium - Google Patents

Image enhancement method and device, electronic equipment and computer readable medium

Info

Publication number
CN109410143B
Authority
CN
China
Prior art keywords
image
generate
function
standard
corrected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811289608.5A
Other languages
Chinese (zh)
Other versions
CN109410143A (en)
Inventor
朱兴杰
刘岩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Taikang Insurance Group Co Ltd
Original Assignee
Taikang Insurance Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Taikang Insurance Group Co Ltd filed Critical Taikang Insurance Group Co Ltd
Priority to CN201811289608.5A priority Critical patent/CN109410143B/en
Publication of CN109410143A publication Critical patent/CN109410143A/en
Application granted granted Critical
Publication of CN109410143B publication Critical patent/CN109410143B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G06T 5/73
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/10: Image enhancement or restoration by non-spatial domain filtering
    • G06T 5/20: Image enhancement or restoration by the use of local operators
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10024: Color image
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20048: Transform domain processing
    • G06T 2207/20056: Discrete and fast Fourier transform [DFT, FFT]
    • G06T 2207/20081: Training; Learning
    • G06T 2207/20084: Artificial neural networks [ANN]

Abstract

The disclosure relates to an image enhancement method, an image enhancement device, an electronic device and a computer readable medium, and relates to the field of image processing. The method comprises the following steps: preprocessing an original image to generate a standard image; performing color correction on the standard image to generate a corrected image; and deblurring the corrected image by wiener filtering to obtain an enhanced image. The image enhancement method, the image enhancement device, the electronic device and the computer readable medium can improve the image enhancement effect, efficiently solve the problem of structuring image data, improve authentication efficiency in face recognition, and save a large amount of labor cost.

Description

Image enhancement method and device, electronic equipment and computer readable medium
Technical Field
The present disclosure relates to the field of computer information processing, and in particular, to an image enhancement method and apparatus, an electronic device, and a computer-readable medium.
Background
With the popularization of various digital instruments and digital products, images and videos become the most common information carriers in human activities, and the images and videos contain a large amount of information of objects, so that the images and videos become the main ways for people to obtain external original information. However, due to the influence of the shooting operation or shooting environment factors, the obtained image often has the phenomena of blurring, distortion and the like. In addition, in the processes of generating, transmitting, recording and storing images, the images are inevitably affected by various adverse factors due to the incompleteness of an imaging system, a transmission medium and a recording device, so that the image information is lost and the quality of the images is reduced. Image enhancement is the process of restoring sharp images by processing blurred images. Image enhancement techniques have been a focus of image processing and computer vision research.
However, images usually contain noise, and in the image enhancement process both the blurring factor and the sharp image are unknown while the only known quantity is the observed blurred image, so there are more unknowns than knowns when the problem is solved. Image enhancement is therefore an ill-posed problem with considerable unreliability.
Therefore, a new image enhancement method, apparatus, electronic device and computer readable medium are needed.
The above information disclosed in this background section is only for enhancement of understanding of the background of the disclosure, and therefore it may contain information that does not constitute prior art already known to a person of ordinary skill in the art.
Disclosure of Invention
In view of this, the present disclosure provides an image enhancement method, an image enhancement device, an electronic device, and a computer-readable medium, which can improve an image enhancement effect, efficiently solve the problem of image data structuring processing, improve authentication efficiency during face recognition, and save a large amount of labor cost.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to an aspect of the present disclosure, an image enhancement method is provided, which includes: preprocessing an original image to generate a standard image; performing color correction on the standard image to generate a corrected image; and deblurring the corrected image by wiener filtering to obtain an enhanced image.
In an exemplary embodiment of the present disclosure, preprocessing the original image to generate the standard image includes at least one of: normalizing the original image to generate a standard image; standardizing the original image to generate a standard image; and adjusting the resolution of the original image to generate a standard image.
In an exemplary embodiment of the present disclosure, performing color correction on the standard image to generate a corrected image includes: converting the standard image from the three-primary-color space to the YCbCr space; and performing color correction on the standard image in the YCbCr space to generate the corrected image.
In an exemplary embodiment of the present disclosure, performing color correction on the standard image in the YCbCr space to generate the corrected image includes: dividing the standard image in the YCbCr space into a plurality of sub-images according to a preset rule; determining the variances of the plurality of sub-images; determining a reference point matrix from the variances of the plurality of sub-images; and performing color correction on the standard image in the YCbCr space using the reference point matrix to generate the corrected image.
In an exemplary embodiment of the present disclosure, deblurring the corrected image by wiener filtering to obtain an enhanced image includes: carrying out Fourier transformation on the corrected image to generate a frequency domain image; determining a point spread function; determining a wiener filter function through the point spread function; and deblurring the corrected image through a wiener filter function to obtain an enhanced image.
In an exemplary embodiment of the present disclosure, determining the wiener filter function from the point spread function includes: determining a frequency domain function and a complex conjugate function of the point spread function through a Fourier transform; and constructing the wiener filter function from the frequency domain function and the complex conjugate function:
W(u, v) = H*(u, v) / (|H(u, v)|² + k)
where H(u, v) is the frequency domain function of the point spread function, H*(u, v) is the complex conjugate function of the point spread function, and k is the tuning parameter.
In an exemplary embodiment of the present disclosure, deblurring the corrected image by the wiener filter function to obtain an enhanced image includes: deblurring the corrected image through a wiener filter function to obtain a filtered image; and inverse fourier transforming the filtered image to obtain the enhanced image.
According to an aspect of the present disclosure, an image enhancement apparatus is provided, the apparatus including: the preprocessing module is used for preprocessing the original image to generate a standard image; the color correction module is used for performing color correction on the standard image to generate a corrected image; and the image enhancement module is used for carrying out deblurring processing on the corrected image through wiener filtering so as to obtain an enhanced image.
According to an aspect of the present disclosure, an electronic device is provided, the electronic device including: one or more processors; and storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method as above.
According to an aspect of the disclosure, a computer-readable medium is proposed, on which a computer program is stored, which program, when being executed by a processor, carries out the method as above.
According to the image enhancement method, the image enhancement device, the electronic equipment and the computer readable medium, the image enhancement effect can be improved, the problem of image data structuring processing can be solved efficiently, the authentication efficiency in face recognition can be improved, and meanwhile, a large amount of labor cost can be saved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings. The drawings described below are merely some embodiments of the present disclosure, and other drawings may be derived from those drawings by those of ordinary skill in the art without inventive effort.
FIG. 1 is a flow chart illustrating a method of image enhancement according to an exemplary embodiment.
Fig. 2 is a diagram illustrating an application scenario of an image enhancement method according to an exemplary embodiment.
Fig. 3 is a diagram illustrating an application scenario of an image enhancement method according to an exemplary embodiment.
Fig. 4 is a flowchart illustrating an image enhancement method according to another exemplary embodiment.
Fig. 5 is a flowchart illustrating an image enhancement method according to another exemplary embodiment.
Fig. 6 is a flowchart illustrating an image enhancement method according to another exemplary embodiment.
Fig. 7 is a flowchart illustrating an image enhancement method according to another exemplary embodiment.
Fig. 8 is a block diagram illustrating an image enhancement apparatus according to an exemplary embodiment.
FIG. 9 is a block diagram illustrating an electronic device in accordance with an example embodiment.
FIG. 10 is a schematic diagram illustrating a computer-readable storage medium according to an example embodiment.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals denote the same or similar parts in the drawings, and thus, a repetitive description thereof will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations, or operations have not been shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
It will be understood that, although the terms first, second, third, etc. may be used herein to describe various components, these components should not be limited by these terms. These terms are used to distinguish one element from another. Thus, a first component discussed below may be termed a second component without departing from the teachings of the disclosed concept. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It is to be understood by those skilled in the art that the drawings are merely schematic representations of exemplary embodiments, and that the blocks or processes shown in the drawings are not necessarily required to practice the present disclosure and are, therefore, not intended to limit the scope of the present disclosure.
The inventors of the present application have found that digital image enhancement technology draws on several disciplines, including pattern recognition, machine learning, artificial intelligence, and visual neurophysiology. Most existing image enhancement methods are based on a probabilistic prior model of the image, i.e. for a given blurred image and blur kernel, a maximum a posteriori probability estimate of the latent sharp image is sought. Image enhancement, as an inverse problem, estimates the original sharp image from the observed blurred image. Because the blurring factor and the sharp image are both unknown in the image enhancement process, and the only known quantity is the observed blurred image, the unknowns outnumber the knowns when solving the problem, and image enhancement is therefore unreliable. At present, blur parameters are mainly estimated on the image spectrum using spatial-domain and frequency-domain parameter estimation methods based on high-frequency energy analysis, the Hough transform and the Radon transform, and a sharp image is then obtained by methods such as constrained least squares filtering, wiener filtering and inverse filtering. However, since images are usually noisy and blur parameter estimates are generally not accurate enough, these deconvolution algorithms typically yield sharp images with various artifacts.
Based on the above problems, a new image enhancement method is proposed herein, which solves some technical difficulties encountered in the prior art solutions to some extent.
The following is a detailed description of the image enhancement method in the present application:
FIG. 1 is a flow chart illustrating a method of image enhancement according to an exemplary embodiment. The image enhancement method 10 includes at least steps S102 to S106.
As shown in fig. 1, in S102, the original image is preprocessed to generate a standard image. The image resolution can vary greatly depending on the source of the input image. For consistency in subsequent processing, the original image needs to be preprocessed.
In one embodiment, preprocessing the original image to generate the standard image comprises at least one of: normalizing the original image to generate a standard image; standardizing the original image to generate a standard image; and adjusting the resolution of the original image to generate a standard image.
In S104, the standard image is color-corrected to generate a corrected image. For example, the image may be color-corrected by a white balance algorithm; this step effectively handles unclear image quality caused by color cast. Specifically, the step includes: converting the standard image from the three-primary-color space to the YCbCr space; and performing color correction on the standard image in the YCbCr space to generate the corrected image.
The three-primary-color space RGB (red, green, blue) is an industry color standard in which various colors are obtained by varying the red (R), green (G) and blue (B) channels and superimposing them on one another. This standard covers almost all colors perceivable by human vision and is one of the most widely used color systems at present.
The YCbCr color space takes luminance as its main component and is commonly used in video processing and digital photography systems. In YCbCr, Y is the luminance component, Cb is the blue chrominance component, and Cr is the red chrominance component.
Since the human eye is more sensitive to the Y component, reducing the chrominance components by sub-sampling is not perceived as a change in image quality. Therefore, after the original image is converted from the three-primary-color space into the YCbCr space, storing it in the YCbCr color space saves storage space while keeping the image sharpness.
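As a sketch of this idea only (the conversion coefficients below are the common BT.601 full-range values and the 2:1 chroma subsampling factor is an assumption for illustration; the patent does not prescribe either), the following Python snippet converts an RGB image to YCbCr and stores the chrominance planes at reduced resolution:

    import numpy as np

    def rgb_to_ycbcr(rgb):
        # BT.601 full-range RGB -> YCbCr conversion (assumed variant)
        rgb = rgb.astype(np.float32)
        y  =  0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
        cb = -0.1687 * rgb[..., 0] - 0.3313 * rgb[..., 1] + 0.5 * rgb[..., 2] + 128.0
        cr =  0.5 * rgb[..., 0] - 0.4187 * rgb[..., 1] - 0.0813 * rgb[..., 2] + 128.0
        return y, cb, cr

    def subsample_chroma(cb, cr, factor=2):
        # keep full-resolution luma, store chroma at reduced resolution
        return cb[::factor, ::factor], cr[::factor, ::factor]

    rgb = (np.random.rand(480, 640, 3) * 255).astype(np.uint8)  # stand-in for an input image
    y, cb, cr = rgb_to_ycbcr(rgb)
    cb_s, cr_s = subsample_chroma(cb, cr)
    print(y.size + cb_s.size + cr_s.size, "samples instead of", rgb.size)  # roughly half the samples

With a factor of 2 in both directions the chrominance planes shrink to a quarter of their original size, which is where the storage saving described above comes from.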
In S106, the corrected image is deblurred by wiener filtering to obtain an enhanced image. For example: first, a Fourier transform is performed on the corrected image to generate a frequency domain image; then a point spread function is determined; a wiener filter function is determined from the point spread function; and the corrected image is then deblurred by the wiener filter function to obtain the enhanced image.
Here, PSF is short for point spread function, a function describing the ability of an optical system to resolve a point source: a point source forms an enlarged image spot by diffraction after passing through any optical system. By measuring the point spread function of the system, image information can be extracted more accurately. In this embodiment, the wiener filter function can be determined from different point spread functions, which may be, for example, geometric (no diffraction effect) spot diagrams, Fast Fourier Transform (FFT) PSFs, or Huygens PSFs based on diffraction effects, but the application is not limited thereto.
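For illustration only, one simple possibility is a Gaussian blur PSF; the kernel size and sigma below are assumptions, not values fixed by the patent. The sketch builds the PSF and its frequency-domain function H(u, v):

    import numpy as np

    def gaussian_psf(size=15, sigma=2.0):
        # centered Gaussian point spread function, normalized to sum to 1
        y, x = np.mgrid[:size, :size]
        c = (size - 1) / 2.0
        psf = np.exp(-((x - c) ** 2 + (y - c) ** 2) / (2.0 * sigma ** 2))
        return psf / psf.sum()

    def psf_to_otf(psf, shape):
        # pad the PSF to the image size, shift its center to the origin, then take the 2-D FFT
        padded = np.zeros(shape)
        padded[:psf.shape[0], :psf.shape[1]] = psf
        padded = np.roll(padded, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
        return np.fft.fft2(padded)

    H = psf_to_otf(gaussian_psf(), (480, 640))  # H(u, v) for a 480 x 640 image

H(u, v) and its complex conjugate are exactly the quantities used to build the wiener filter function below.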
In one embodiment, determining the wiener filter function from the point spread function comprises: determining a frequency domain function and a complex conjugate function of the point spread function through a Fourier transform; and constructing the wiener filter function from the frequency domain function and the complex conjugate function:
W(u, v) = H*(u, v) / (|H(u, v)|² + k)
The spectrum F(u, v) of the deblurred, filtered image can then be written as:
F(u, v) = W(u, v) G(u, v) = [H*(u, v) / (|H(u, v)|² + k)] G(u, v)
where |H(u, v)|² = H(u, v) H*(u, v) and G(u, v) is the frequency domain image.
Here H(u, v) is the frequency domain function of the point spread function, H*(u, v) is the complex conjugate function of the point spread function, and k is the tuning parameter. In practical applications, a suitable k value is selected according to the processing effect. For example, k may be presented as an adjustable coefficient on the user display so that its value can be set in real time based on how the enhanced image looks; the enhancement result is then recomputed in real time for the k value chosen by the user.
According to the image enhancement method disclosed by the disclosure, the original image is subjected to preprocessing and white balance processing, and then the image is deblurred in a wiener filtering mode to obtain an enhanced image, so that the image enhancement effect can be improved.
Fig. 2 is a diagram illustrating an application scenario of an image enhancement method according to an exemplary embodiment.
As shown in fig. 2, in the field of insurance claim settlement, a client needs to upload image information such as an identity card during the claim process, and this information often picks up noise during shooting and transmission; when structured recognition is performed on the digital image, poor image quality prevents the content of the image from being recognized in time. As the amount of information increases, manual item-by-item checking becomes increasingly difficult to manage.
According to the image enhancement method disclosed by the invention, the image enhancement processing is firstly carried out on the certificate image to be recognized, and the subsequent structured recognition is carried out according to the enhanced image, so that the efficiency and the real-time performance of character recognition processing can be improved, the problem of data structured processing can be solved very efficiently, and a large amount of labor cost is saved.
Fig. 3 is a diagram illustrating an application scenario of an image enhancement method according to an exemplary embodiment.
As shown in fig. 3, as the degree of informatization increases, customers are more and more willing to purchase insurance through mobile terminals. After a customer purchases insurance online, face recognition is often required to verify the customer information, and the face image often contains noise due to various factors during shooting or uploading of the customer's face picture. In such situations, verification is usually performed manually or the customer has to be visited offline.
In order to improve the one-pass authentication rate and reduce manual intervention when a client purchases insurance, the image enhancement method of the present disclosure first performs image enhancement on the face photo to be recognized and then continues face recognition on the enhanced image. This can greatly improve the authentication pass rate, maximize customer satisfaction, and substantially reduce labor cost.
It should be clearly understood that this disclosure describes how to make and use particular examples, but the principles of this disclosure are not limited to any details of these examples. Rather, these principles can be applied to many other embodiments based on the teachings of the present disclosure.
Fig. 4 is a flowchart illustrating an image enhancement method according to another exemplary embodiment. The flowchart 40 of the image enhancement method shown in fig. 4 is a detailed description of "preprocessing the original image to generate the standard image" in the flowchart 10 of the image enhancement method shown in fig. 1.
In S402, the original image is normalized to generate a standard image. Image normalization converts the original image to be processed into a unique standard form through a series of transformations; this standard form is invariant to affine transformations such as translation, rotation and scaling. In other words, a set of parameters is found using the invariant moments of the image so that the influence of such transformations on the image can be eliminated.
In one embodiment, the normalization process of the image may be, for example: the moments of the image that are invariant to affine transformations are used to determine the parameters of a transformation function, and the transformation function determined by these parameters is then used to transform the original image into an image in standard form. Image normalization makes the image resistant to attacks by geometric transformations, because the invariants in the image can be found to recognize that the images were originally the same or belong to the same series.
In S404, the original image is standardized to generate a standard image. The standardization process may be, for example, a whitening process. The purpose of whitening is to remove redundant information from the input data. Image whitening can be used to process overexposed or underexposed pictures by changing the mean pixel value of the image to 0 and its variance to a unit variance of 1. For example, the mean and variance of the original image may be calculated and each pixel value of the original image transformed accordingly.
In S406, the resolution of the original image is adjusted to generate a standard image. Image resolution refers to the amount of information stored in an image, measured as the number of pixels per inch and expressed in ppi (pixels per inch). Adjusting the resolution is typically used to change the sharpness of an image.
In one embodiment, the resolution adjustment may be achieved, for example, by image scaling, i.e. resizing the digital image. Image scaling is a non-trivial process that requires a trade-off between processing efficiency and the smoothness and sharpness of the result. When the size of an image is increased, the individual pixels become more visible, so the image appears blurred; conversely, reducing an image enhances its smoothness and sharpness.
According to the image enhancement method of the present disclosure, performing the other image operations after preprocessing the original image improves the clarity and accuracy of later-stage image processing.
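A minimal sketch of this preprocessing step, assuming standardization is implemented as the whitening described above and resolution adjustment as a resize to a fixed size (the target size, the interpolation mode and the use of OpenCV are illustrative assumptions):

    import numpy as np
    import cv2  # OpenCV, used here only for resizing

    def whiten(img):
        # zero-mean, unit-variance standardization (whitening)
        img = img.astype(np.float32)
        return (img - img.mean()) / (img.std() + 1e-8)

    def preprocess(original, target_size=(640, 480)):
        # resize to a common resolution, then whiten, to obtain the "standard image"
        resized = cv2.resize(original, target_size, interpolation=cv2.INTER_AREA)
        return whiten(resized)

Normalization against geometric transformations, as described in S402, would be a further step on top of this and is omitted here.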
Fig. 5 is a flowchart illustrating an image enhancement method according to another exemplary embodiment. The flowchart 50 of the image enhancement method shown in fig. 5 is a detailed description of "color-correcting the standard image to generate a corrected image" in the flowchart 10 of the image enhancement method shown in fig. 1.
As shown in fig. 5, in S502, the standard image is converted from the three-primary-color space into the YCbCr space. The image is converted from the RGB color space to the YCbCr space by a standard conversion formula, where R, G and B represent the values of the respective color channels in the RGB color space.
In S504, the sub-image variances are calculated. Based on the YCbCr space image, the image is divided into sub-images according to a certain rule, and the mean and variance of each sub-image are calculated. Specifically, the image is divided into sub-images sub_P_i(x, y) of size h x w, and the averages Mr and Mb of the Cr and Cb components are calculated separately for each sub-image. From Mr and Mb, the variances Dr and Db of Cr and Cb are then calculated.
In S506, the white reference points are determined. Based on the variance values of the different channels, the "white reference points" in the image are calculated as follows. Let RL be the luminance matrix of the white reference points, of size w x h. If a point (i, j) is taken as a white reference point, the value of its luminance (Y component) is assigned to RL(i, j); otherwise, RL(i, j) is set to 0. The largest 10% of the luminance (Y component) values among the white reference points are then selected, and the minimum value Lu_min among them is taken.
In S508, color correction is performed on the standard image based on the reference points to generate the corrected image. The luminance matrix RL of the reference points is adjusted as follows:
if RL(i, j) < Lu_min, then RL(i, j) = 0;
otherwise, RL(i, j) = 1.
R, G and B are each multiplied by RL to obtain R2, G2 and B2, and the averages Rav, Gav and Bav of R2, G2 and B2 are calculated. The gain adjustment is then computed as
Rgain = Ymax / Rav, Ggain = Ymax / Gav, Bgain = Ymax / Bav,
where Ymax is the maximum value in the luminance image. The image is then color corrected by applying these gains to the corresponding channels (R' = Rgain · R, G' = Ggain · G, B' = Bgain · B).
According to the image enhancement method of the present disclosure, performing the subsequent image operations after color-correcting the original image improves the clarity and accuracy of later-stage image processing.
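A compact sketch of the white-balance correction of fig. 5. It abbreviates the sub-image-variance step, selects the reference points with a simple top-10%-luminance percentile, averages over the reference points, and uses the gains Ymax/Rav, Ymax/Gav, Ymax/Bav from the gain adjustment above; these simplifications are assumptions for illustration rather than the patent's exact procedure:

    import numpy as np

    def white_balance(rgb):
        # simplified white-reference-point white balance on an H x W x 3 RGB image
        rgb = rgb.astype(np.float32)
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        y = 0.299 * r + 0.587 * g + 0.114 * b          # luminance (Y component)
        lu_min = np.percentile(y, 90)                   # threshold keeping the brightest 10% of pixels
        rl = (y >= lu_min).astype(np.float32)           # RL: 1 at white reference points, 0 elsewhere
        n = max(rl.sum(), 1.0)
        r_av, g_av, b_av = (r * rl).sum() / n, (g * rl).sum() / n, (b * rl).sum() / n
        y_max = y.max()
        gains = (y_max / r_av, y_max / g_av, y_max / b_av)
        corrected = [np.clip(c * k, 0, 255) for c, k in zip((r, g, b), gains)]
        return np.stack(corrected, axis=-1).astype(np.uint8)

Averaging only over the reference points and clipping to [0, 255] are further simplifications of the per-sub-image statistics described above.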
Fig. 6 is a flowchart illustrating an image enhancement method according to another exemplary embodiment. The flowchart 60 of the image enhancement method shown in fig. 6 is a detailed description of "deblurring the corrected image by wiener filtering to obtain an enhanced image" in the flowchart 10 of the image enhancement method shown in fig. 1.
In S602, the corrected image is Fourier-transformed to generate a frequency domain image. Let the color-corrected image be Po(x, y). The extracted blurred image Po(x, y) can generally be regarded as a normal image that has passed through a certain degradation system, with random noise n(x, y) introduced during the degradation process. The blurred image Po(x, y) can then be represented by the following formula:
Po(x,y)=h(x,y)*f(x,y)+n(x,y)
where h (x, y) is a spatial description of the degradation function, i.e. the point spread function of the imaging system.
A two-dimensional Fourier transform is performed on the extracted blurred image Po(x, y) to obtain the frequency domain image G(u, v).
In S604, a point spread function is determined. The system point spread function h(x, y) is estimated and two-dimensionally Fourier-transformed to obtain the frequency domain function H(u, v) and its complex conjugate H*(u, v).
In S606, the frequency domain function and the complex conjugate function of the point spread function are determined by the Fourier transform.
In S608, the wiener filter is constructed by a frequency domain function and a complex conjugate function. Constructing the wiener filter through a frequency domain function and a complex conjugate function:
W(u, v) = H*(u, v) / (|H(u, v)|² + k)
where H(u, v) is the frequency domain function of the point spread function, H*(u, v) is the complex conjugate function of the point spread function, and k is the tuning parameter.
In S610, a filtered image is obtained by deblurring the corrected image with the wiener filter function. The wiener filter constructed above is applied to the frequency domain image obtained in S602: wiener filtering of the frequency domain image G(u, v) yields the deblurred image spectrum F(u, v).
F(u, v) = W(u, v) G(u, v) = [H*(u, v) / (|H(u, v)|² + k)] G(u, v)
where |H(u, v)|² = H(u, v) H*(u, v). In practical applications, a suitable k value is selected according to the processing effect.
In S612, the filtered image is subjected to an inverse Fourier transform to obtain the enhanced image. The inverse Fourier transform of the deblurred image spectrum F(u, v) yields the deblurred sharp image Ps(x, y).
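Putting the steps of fig. 6 together, a minimal wiener deblurring sketch is given below. The PSF handling matches the earlier Gaussian-PSF sketch and the default k = 0.01 is an arbitrary assumption; both would be tuned for a real application as described above:

    import numpy as np

    def wiener_deblur(image, psf, k=0.01):
        # FFT of the corrected image -> wiener filter -> inverse FFT
        G = np.fft.fft2(image)                        # frequency domain image G(u, v)
        padded = np.zeros(image.shape)                # pad the PSF to the image size
        padded[:psf.shape[0], :psf.shape[1]] = psf
        padded = np.roll(padded, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
        H = np.fft.fft2(padded)                       # H(u, v)
        W = np.conj(H) / (np.abs(H) ** 2 + k)         # W(u, v) = H*(u, v) / (|H(u, v)|^2 + k)
        F = W * G                                     # deblurred spectrum F(u, v)
        return np.real(np.fft.ifft2(F))               # sharp image Ps(x, y)

The function operates on a single channel; for a color image it would be applied to each channel (or to the luminance channel) separately.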
According to the image enhancement method disclosed by the disclosure, the image obtained by performing color correction and deblurring on the original image is input into the image enhancement model, and a final enhanced image is obtained, so that the image enhancement effect can be improved.
According to the image enhancement method disclosed by the invention, the image polluted by noise can be processed to a certain extent by using wiener filtering (minimum mean square error filtering) to carry out deblurring operation on the image.
Fig. 7 is a flowchart illustrating an image enhancement method according to another exemplary embodiment. In the flowchart 70 of the image enhancement method shown in fig. 7, a process of "image enhancement based on a deep convolutional neural network" is added to the flowchart 10 of the image enhancement method shown in fig. 1.
In S702, an original image is input.
In S704, the image is preprocessed.
In S706, color correction is performed based on a white balance algorithm.
In S708, an enhanced image is obtained by deblurring based on wiener filtering.
In S710, a second enhanced image is obtained based on image enhancement by the deep convolutional neural network.
In S712, a second enhanced image is output.
In one embodiment, in addition to the steps described above, the enhanced image may be input into a deep convolutional neural network model, for example, to obtain a second enhanced image after further enhancement. Training a deep convolutional neural network with image data to obtain the image enhancement model comprises: determining the number of layers of the deep convolutional neural network; determining an activation function of a deep convolutional neural network; determining a loss function of the deep convolutional neural network; and inputting the image training data and the image comparison data into the deep convolutional neural network, and acquiring the image enhancement model through training.
The image training data and the image comparison data are input into the deep convolutional neural network, and the image enhancement model is obtained through training. The relationship between blurred images and their corresponding sharp images can be learned in advance from a large amount of training data: given a set of blurred images {Pi} and their corresponding sharp images {Pi^F}, the blurred images are taken as image training data and the sharp images as image comparison data, both are input into the deep convolutional neural network with the set parameters, and the image enhancement model is obtained through training.
In one embodiment, the image enhancement model includes three types of layer structures:
For the first layer, W 3 × 3 filters may be set and a ReLU activation function is used, which may be defined as:
ReLU(x) = max(0, x)
Similarly, for the second to penultimate layers, a normalization operation may be added on top of the first-layer structure. For the last layer, the convolution can be performed using W 3 × 3 filters, producing the final enhanced image.
In one embodiment, inputting the enhanced image into the image enhancement model to obtain the second enhanced image further comprises: after each convolution calculation, boundary processing is performed on the calculation result. Specifically, after each convolution layer, the convolution result is boundary-processed so that the resolution of the output image matches that of the input image. The boundaries may, for example, be zero-padded throughout, so that no additional noise information is introduced while the original size of the image is maintained.
In one embodiment, the enhanced image is input into a convolution layer with an activation function of the image enhancement model to obtain first data; the first data is passed through a plurality of convolution layers with activation functions to obtain second data; and the second data is input into a final convolution layer to obtain the second enhanced image.
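A sketch of such an enhancement network and one training step, written with PyTorch as an assumed framework (the patent names no framework); the width W = 64, the depth, the Adam optimizer and the MSE loss against the sharp reference images are likewise assumptions consistent with the description (3 × 3 convolutions, ReLU activations, normalization in the middle layers, and zero padding so the output keeps the input resolution):

    import torch
    import torch.nn as nn

    def make_enhancement_net(num_layers=8, width=64):
        # first layer: 3x3 conv + ReLU; middle layers add normalization; last layer: plain 3x3 conv
        layers = [nn.Conv2d(3, width, kernel_size=3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(num_layers - 2):
            layers += [nn.Conv2d(width, width, kernel_size=3, padding=1),
                       nn.BatchNorm2d(width),
                       nn.ReLU(inplace=True)]
        layers.append(nn.Conv2d(width, 3, kernel_size=3, padding=1))  # padding=1 keeps the resolution
        return nn.Sequential(*layers)

    net = make_enhancement_net()
    loss_fn = nn.MSELoss()                                # assumed loss; the patent leaves it open
    optimizer = torch.optim.Adam(net.parameters(), lr=1e-4)

    blurred = torch.rand(4, 3, 128, 128)                  # stand-ins for a batch of blurred images {Pi}
    sharp = torch.rand(4, 3, 128, 128)                    # and their sharp counterparts {Pi^F}
    optimizer.zero_grad()
    loss = loss_fn(net(blurred), sharp)
    loss.backward()
    optimizer.step()

At inference time, the first enhanced image produced by the wiener filtering step would simply be passed through the trained network to obtain the second enhanced image.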
According to the image enhancement method disclosed by the invention, the noise pollution phenomenon in the original image can be reduced by carrying out the deblurring operation on the image by using the wiener filtering (minimum mean square error filtering).
According to the image enhancement method disclosed by the disclosure, the multi-step image enhancement procedure improves the effectiveness of the algorithm by enhancing blurred images from several different aspects.
according to the image enhancement method disclosed by the invention, the image enhancement method based on the deep learning method is provided, and the image enhancement effect is greatly optimized on the basis of not influencing the performance of the algorithm.
Those skilled in the art will appreciate that all or part of the steps implementing the above embodiments can be implemented as computer programs executed by a CPU. When executed by the CPU, these programs perform the functions defined by the above-described methods provided by the present disclosure. The programs may be stored in a computer readable storage medium, which may be a read-only memory, a magnetic disk, an optical disk, or the like.
Furthermore, it should be noted that the above-mentioned figures are only schematic illustrations of the processes involved in the methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods. For details not disclosed in the embodiments of the apparatus of the present disclosure, refer to the embodiments of the method of the present disclosure.
Fig. 8 is a block diagram illustrating an image enhancement apparatus according to an exemplary embodiment. The image enhancement apparatus 80 shown in fig. 8 includes: a pre-processing module 802, a color correction module 804, and an image enhancement module 806.
The preprocessing module 802 is configured to preprocess the original image to generate a standard image; for example, the original image is normalized to generate a standard image; carrying out standardization processing on the original image to generate a standard image; and adjusting the resolution of the original image to generate a standard image.
The color correction module 804 is configured to perform color correction on the standard image to generate a corrected image. For example, the image may be color-corrected by a white balance algorithm; this step effectively handles unclear image quality caused by color cast. Specifically, the step includes: converting the standard image from the three-primary-color space to the YCbCr space; and performing color correction on the standard image in the YCbCr space to generate the corrected image.
The image enhancement module 806 is configured to deblur the corrected image through wiener filtering to obtain an enhanced image; the corrected image may be fourier transformed, for example, to generate a frequency domain image; determining a point spread function; determining a wiener filter function through the point spread function; and deblurring the frequency domain image through a wiener filter function to obtain an enhanced image.
The wiener filter constructed by the frequency domain function and the complex conjugate function is as follows:
W(u, v) = H*(u, v) / (|H(u, v)|² + k)
where H (u, v) is the frequency domain function of the point spread function, H*(u, v) is the complex conjugate function of the point spread function, and k is the tuning parameter.
The image enhancement apparatus 80 may further include a training module (not shown) for training the deep convolutional neural network with image data to obtain the image enhancement model. For example, a mapping model between blurred images and their corresponding sharp images is learned in advance from a large amount of training data: given a set of blurred images {Pi} and their corresponding sharp images {Pi^F}, the data are trained through a deep convolutional neural network model to obtain the image enhancement model.
According to the image enhancement method disclosed by the disclosure, the original image is subjected to preprocessing and white balance processing, and then the image is deblurred in a wiener filtering mode to obtain an enhanced image, so that the image enhancement effect can be improved.
FIG. 9 is a block diagram illustrating an electronic device in accordance with an example embodiment.
An electronic device 200 according to this embodiment of the present disclosure is described below with reference to fig. 9. The electronic device 200 shown in fig. 9 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 9, the electronic device 200 is embodied in the form of a general purpose computing device. The components of the electronic device 200 may include, but are not limited to: at least one processing unit 210, at least one memory unit 220, a bus 230 connecting different system components (including the memory unit 220 and the processing unit 210), a display unit 240, and the like.
Wherein the storage unit stores program code executable by the processing unit 210 to cause the processing unit 210 to perform the steps according to various exemplary embodiments of the present disclosure described in the above-mentioned electronic prescription flow processing method section of the present specification. For example, the processing unit 210 may perform the steps as shown in fig. 1, 4, 5, 6 and 7.
The memory unit 220 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM)2201 and/or a cache memory unit 2202, and may further include a read only memory unit (ROM) 2203.
The storage unit 220 may also include a program/utility 2204 having a set (at least one) of program modules 2205, such program modules 2205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 230 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 200 may also communicate with one or more external devices 300 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 200, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 200 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 250. Also, the electronic device 200 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 260. The network adapter 260 may communicate with other modules of the electronic device 200 via the bus 230. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 200, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, or a network device, etc.) to execute the above method according to the embodiments of the present disclosure.
FIG. 10 schematically illustrates a computer-readable storage medium in an exemplary embodiment of the disclosure.
Referring to fig. 10, a program product 400 for implementing the above method according to an embodiment of the present disclosure is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
Program code for carrying out operations for the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
The computer readable medium carries one or more programs which, when executed by a device, cause the device to perform the following functions: preprocessing an original image to generate a standard image; performing color correction on the standard image to generate a corrected image; and deblurring the corrected image by wiener filtering to obtain an enhanced image.
Those skilled in the art will appreciate that the modules described above may be distributed in the apparatus according to the description of the embodiments, or may be modified accordingly in one or more apparatuses unique from the embodiments. The modules of the above embodiments may be combined into one module, or further split into multiple sub-modules.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a mobile terminal, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Exemplary embodiments of the present disclosure are specifically illustrated and described above. It is to be understood that the present disclosure is not limited to the precise arrangements, instrumentalities, or instrumentalities described herein; on the contrary, the disclosure is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (9)

1. An image enhancement method, comprising:
preprocessing an original image to generate a standard image;
performing color correction on the standard image to generate a corrected image; and
deblurring the corrected image through a wiener filter function to obtain a filtered image;
performing an inverse Fourier transform on the filtered image to obtain a first enhanced image;
inputting the first enhanced image into an image enhancement model to obtain a second enhanced image;
wherein the image enhancement model comprises a deep convolutional neural network model, and the inputting the first enhanced image into the image enhancement model to obtain the second enhanced image comprises:
and carrying out boundary filling processing on the convolution result of each convolution layer of the deep convolution neural network model.
2. The method of claim 1, wherein pre-processing the original image to generate the standard image comprises at least one of:
normalizing the original image to generate a standard image;
carrying out standardization processing on the original image to generate a standard image; and
adjusting the resolution of the original image to generate a standard image.
3. The method of claim 1, wherein color correcting the standard image, generating a corrected image comprises:
converting the standard image from the three-primary-color space to the YCbCr space; and
performing color correction on the standard image in the YCbCr space to generate the corrected image.
4. The method of claim 3, wherein performing color correction on the standard image in the YCbCr space to generate the corrected image comprises:
dividing the standard image in the YCbCr space into a plurality of sub-images according to a preset rule;
determining the variances of the plurality of sub-images;
determining a reference point matrix from the variances of the plurality of sub-images; and
performing color correction on the standard image in the YCbCr space using the reference point matrix to generate the corrected image.
5. The method of claim 3, wherein deblurring the corrected image by wiener filtering to obtain a filtered image comprises:
performing Fourier transform on the corrected image to generate a frequency domain image;
determining a point spread function;
determining a wiener filter function through the point spread function; and
deblurring the frequency domain image through the wiener filter function to obtain a filtered image.
6. The method of claim 5, wherein determining a wiener filter function from the point spread function comprises:
determining a frequency domain function and a complex conjugate function of the point spread function through Fourier transform;
constructing the wiener filter through a frequency domain function and a complex conjugate function:
W(u, v) = H*(u, v) / (|H(u, v)|² + k)
where H (u, v) is the frequency domain function of the point spread function, H*(u, v) is the complex conjugate function of the point spread function, and k is the tuning parameter.
7. An image enhancement apparatus, comprising:
the preprocessing module is used for preprocessing the original image to generate a standard image;
the color correction module is used for performing color correction on the standard image to generate a corrected image; and
the image enhancement module is used for carrying out deblurring processing on the corrected image through a wiener filter function to obtain a filtered image, carrying out inverse Fourier transform on the filtered image to obtain a first enhanced image, and inputting the first enhanced image into an image enhancement model to obtain a second enhanced image;
wherein the image enhancement model comprises a deep convolutional neural network model, and the inputting the first enhanced image into the image enhancement model to obtain the second enhanced image comprises:
and carrying out boundary filling processing on the convolution result of each convolution layer of the deep convolution neural network model.
8. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs;
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-6.
9. A computer-readable medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1-6.
CN201811289608.5A 2018-10-31 2018-10-31 Image enhancement method and device, electronic equipment and computer readable medium Active CN109410143B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811289608.5A CN109410143B (en) 2018-10-31 2018-10-31 Image enhancement method and device, electronic equipment and computer readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811289608.5A CN109410143B (en) 2018-10-31 2018-10-31 Image enhancement method and device, electronic equipment and computer readable medium

Publications (2)

Publication Number Publication Date
CN109410143A CN109410143A (en) 2019-03-01
CN109410143B true CN109410143B (en) 2021-03-09

Family

ID=65470999

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811289608.5A Active CN109410143B (en) 2018-10-31 2018-10-31 Image enhancement method and device, electronic equipment and computer readable medium

Country Status (1)

Country Link
CN (1) CN109410143B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112069870A (en) * 2020-07-14 2020-12-11 广州杰赛科技股份有限公司 Image processing method and device suitable for vehicle identification
CN113807246A (en) * 2021-09-16 2021-12-17 平安普惠企业管理有限公司 Face recognition method, device, equipment and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6721459B1 (en) * 1999-12-03 2004-04-13 Eastman Kodak Company Storing sharpness data using embedded carriers
CN102629371A (en) * 2012-02-22 2012-08-08 中国科学院光电技术研究所 Video image quality improvement system based on real-time blind image restoration technology
CN107301381A (en) * 2017-06-01 2017-10-27 西安电子科技大学昆山创新研究院 Recognition Method of Radar Emitters based on deep learning and multi-task learning strategy
CN107832684B (en) * 2017-10-26 2021-08-03 通华科技(大连)有限公司 Intelligent vein authentication method and system with autonomous learning capability
CN107945125B (en) * 2017-11-17 2021-06-22 福州大学 Fuzzy image processing method integrating frequency spectrum estimation method and convolutional neural network
CN108335339B (en) * 2018-04-08 2021-10-22 朱高杰 Magnetic resonance reconstruction method based on deep learning and convex set projection

Also Published As

Publication number Publication date
CN109410143A (en) 2019-03-01

Similar Documents

Publication Publication Date Title
Wang et al. Real-esrgan: Training real-world blind super-resolution with pure synthetic data
Galdran Image dehazing by artificial multiple-exposure image fusion
Chen et al. Hdrunet: Single image hdr reconstruction with denoising and dequantization
CN111275626B (en) Video deblurring method, device and equipment based on ambiguity
JP6961139B2 (en) An image processing system for reducing an image using a perceptual reduction method
Xiong et al. Unsupervised low-light image enhancement with decoupled networks
JP4443064B2 (en) Method and apparatus for performing local color correction
WO2011092696A1 (en) Method and system for generating an output image of increased pixel resolution from an input image
CN110136055B (en) Super resolution method and device for image, storage medium and electronic device
WO2014169579A1 (en) Color enhancement method and device
WO2017100971A1 (en) Deblurring method and device for out-of-focus blurred image
JP5007363B2 (en) Automatic image enhancement
Trentacoste et al. Blur‐aware image downsampling
Liang et al. Improved non-local iterative back-projection method for image super-resolution
CN109410143B (en) Image enhancement method and device, electronic equipment and computer readable medium
CN111353955A (en) Image processing method, device, equipment and storage medium
Kousha et al. Modeling srgb camera noise with normalizing flows
JP2007506321A (en) Method and system for modifying a digital image differentially quasi-regularly from pixel to pixel
US20210368088A1 (en) Systems and methods of image enhancement
JP2012023455A (en) Image processing device, image processing method, and program
CN111083359B (en) Image processing method and apparatus, electronic device, and computer-readable storage medium
CN110689486A (en) Image processing method, device, equipment and computer storage medium
Bengtsson et al. Regularized optimization for joint super-resolution and high dynamic range image reconstruction in a perceptually uniform domain
JP2017033182A (en) Image processing apparatus, imaging apparatus, and image processing program
WO2016051716A1 (en) Image processing method, image processing device, and recording medium for storing image processing program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant