CN109325920B - Haze image sharpening method and system and storable medium

Haze image sharpening method and system and storable medium

Info

Publication number
CN109325920B
CN109325920B (application CN201810886690.3A)
Authority
CN
China
Prior art keywords
haze
network parameter
image
quality
image quality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810886690.3A
Other languages
Chinese (zh)
Other versions
CN109325920A (en)
Inventor
储颖
游为麟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen University
Original Assignee
Shenzhen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen University
Priority to CN201810886690.3A
Publication of CN109325920A
Application granted
Publication of CN109325920B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/73 Deblurring; Sharpening
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30168 Image quality inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a haze image clarification method, a haze image clarification system and a storage medium. The haze image clarification method comprises the following steps: S1, randomly selecting first depth network parameters and removing haze from a haze image with them to obtain first haze-removed images and first image quality scores; S2, selecting a first high-quality depth network parameter according to the first image quality scores; S3, performing a fast bacterial swarm optimization operation on the first high-quality depth network parameter to obtain a plurality of second depth network parameters; S4, obtaining second haze-removed images and second image quality scores through the second depth network parameters; S5, selecting a second high-quality depth network parameter and judging whether it is optimal; if so, outputting the second haze-removed image corresponding to the second high-quality depth network parameter and ending the process; if not, defining the second high-quality depth network parameter as a new first high-quality depth network parameter and executing S3 again. The processing of the invention is fast and its results are accurate and objective.

Description

Haze image sharpening method and system and storable medium
Technical Field
The invention relates to the technical field of image processing, in particular to a haze image sharpening method and system and a storage medium.
Background
In recent years, haze has become severe. In images captured in hazy weather, pollutants such as inhalable and fine particles suspended in the air strongly absorb, refract and scatter light, so the images are blurred and uncomfortable for human viewing. The poor visual effect not only reduces image definition but also hinders identification of targets in the image.
Images captured in a haze environment are of poor quality, which greatly increases the difficulty of subsequent processing. In addition, hazy images severely affect road traffic monitoring and satellite remote sensing monitoring. Reducing the influence of adverse weather such as haze on imaging through image processing technology therefore effectively improves image quality and has broad application prospects. The haze problem concerns not only everyone's health but also urban safety, so studying how to clarify haze images from the perspective of image processing is an urgent problem in urban development.
Image dehazing technology aims to remove the interference of adverse factors such as fog and haze from an image, so that effective information and features are recovered and an image with a good visual effect is obtained. Current haze image clarification techniques mainly fall into two categories: image enhancement methods based on image processing and image restoration methods based on physical models. Wherein:
Image enhancement methods based on image processing enhance the contrast of an image and highlight or suppress certain information and features to reduce the interference of haze, thereby sharpening the image. These methods are simple in principle and easy to implement, but the processed image is prone to color distortion and loss of image information. In other words, such methods only reduce the interference of haze on the image; they do not remove the haze intrinsically. Representative methods are based on histogram equalization, curvelet transform, homomorphic filtering and Retinex theory.
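By way of illustration only (this is not the method of the present invention), a typical enhancement-based treatment stretches local contrast on the luminance channel, for example with contrast-limited adaptive histogram equalization; the following minimal sketch uses OpenCV and assumes a hypothetical input file hazy.jpg.

import cv2

# Minimal sketch of an enhancement-based haze reduction (illustrative only):
# apply CLAHE to the luminance channel and leave chrominance untouched.
img = cv2.imread("hazy.jpg")                              # hypothetical input path
ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)
y, cr, cb = cv2.split(ycrcb)
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
y_eq = clahe.apply(y)                                     # boost local contrast
enhanced = cv2.cvtColor(cv2.merge([y_eq, cr, cb]), cv2.COLOR_YCrCb2BGR)
cv2.imwrite("enhanced.jpg", enhanced)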
Image restoration methods based on physical models model the atmospheric scattering effect from the cause of haze formation and then remove the haze through analysis and processing. Image information and features are preserved during processing, so these methods lose little image information. However, the process is complex and is strongly affected by the haze morphology. Common methods are based on atmospheric physical models, image differences, the dark channel prior, and fusion.
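For the physical-model category, the dark channel prior mentioned above is built on a per-pixel minimum over the color channels followed by a local minimum filter; the sketch below illustrates that prior (again, not the method of the present invention), with the patch size chosen as an assumption.

import cv2
import numpy as np

def dark_channel(img_bgr: np.ndarray, patch: int = 15) -> np.ndarray:
    """Per-pixel minimum over B, G, R followed by a local minimum (erosion)."""
    min_rgb = np.min(img_bgr, axis=2)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (patch, patch))
    return cv2.erode(min_rgb, kernel)

def estimate_atmospheric_light(img_bgr: np.ndarray, dark: np.ndarray) -> np.ndarray:
    """Average the input pixels at the brightest 0.1% of dark-channel locations."""
    flat = dark.reshape(-1)
    n = max(1, flat.size // 1000)
    idx = np.argpartition(flat, -n)[-n:]
    return img_bgr.reshape(-1, 3)[idx].mean(axis=0)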
Disclosure of Invention
The technical problem to be solved by the invention is that, in the prior art, the haze image sharpening process is prone to color distortion and loss of image information, or the dehazing process is complex and its result unstable. To this end, the invention provides a haze image sharpening method, a haze image sharpening system and a storage medium.
The technical solution adopted by the invention to solve the technical problem is as follows: a haze image clarification method is constructed, comprising the following steps:
s1, randomly selecting a plurality of first depth network parameters, performing deep learning haze removal on a haze image to obtain a plurality of first haze removal images, and performing blind image quality evaluation on the first haze removal images to obtain a plurality of first image quality scores;
s2, selecting a first high-quality deep network parameter from the plurality of first deep network parameters according to the first image quality score;
s3, performing a fast bacterial swarm optimization operation on the first high-quality deep network parameters to obtain a plurality of second deep network parameters;
s4, performing deep learning haze removal on the haze images through the second depth network parameters to obtain second haze removal images, and performing blind image quality evaluation on the second haze removal images to obtain second image quality scores;
s5, selecting a second high-quality deep network parameter from the plurality of second deep network parameters according to the second image quality score, and judging whether the second high-quality deep network parameter is optimal according to the second image quality score corresponding to the second high-quality deep network parameter, if so, executing a step S6, and if not, executing a step S6-1;
s6, outputting a second haze removal image corresponding to the second high-quality deep network parameter and ending the process;
s6-1, defining the second high-quality deep network parameter as a new first high-quality deep network parameter and executing the step S3 (a minimal sketch of this loop is given below).
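A minimal Python sketch of this S1 to S6-1 loop, under stated assumptions, follows. The functions dehaze, blind_iqa_score and fbso_candidates are hypothetical stand-ins for the dehazing deep learning network, the blind image quality evaluation and the fast bacterial swarm optimization operation, and a lower quality score is assumed to mean better quality.

import numpy as np

def clarify_haze_image(haze_img, dehaze, blind_iqa_score, fbso_candidates,
                       n_init=10, dim=5, eps=1e-3, max_rounds=50):
    """Sketch of steps S1 to S6-1; a lower quality score is assumed to be better."""
    # S1: random first depth network parameters, dehaze and score
    params = [np.random.uniform(-1.0, 1.0, dim) for _ in range(n_init)]
    scores = [blind_iqa_score(dehaze(haze_img, p)) for p in params]
    # S2: first high-quality parameter = parameter with the best score
    best_p, best_s = min(zip(params, scores), key=lambda ps: ps[1])
    for _ in range(max_rounds):
        # S3: fast bacterial swarm optimization around the high-quality parameter
        candidates = fbso_candidates(best_p)
        # S4: dehaze and score with every second depth network parameter
        cand_scores = [blind_iqa_score(dehaze(haze_img, p)) for p in candidates]
        # S5: select the second high-quality parameter and test optimality
        new_p, new_s = min(zip(candidates, cand_scores), key=lambda ps: ps[1])
        converged = abs(best_s - new_s) < eps      # "first preset condition"
        # S6-1: the second high-quality parameter becomes the new first one
        best_p, best_s = new_p, new_s
        if converged:
            break
    # S6: output the haze-removed image of the final high-quality parameter
    return dehaze(haze_img, best_p)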
Preferably, in step S5, the determining, according to the second image quality score corresponding to the second high-quality deep network parameter, whether the second high-quality deep network parameter is optimal includes:
and calculating a difference value between a second image quality score corresponding to the second high-quality deep network parameter and a first image quality score corresponding to the first high-quality deep network parameter, judging whether the difference value meets a first preset condition, and if so, judging that the second high-quality deep network parameter is the optimal deep network parameter.
Preferably, in step S5, the determining, according to the second image quality score corresponding to the second high-quality deep network parameter, whether the second high-quality deep network parameter is optimal includes:
and calculating whether a second image quality score corresponding to the second high-quality deep network parameter meets a second preset condition, if so, judging that the second high-quality deep network parameter is an optimal deep network parameter.
Preferably, in the method, the first depth network parameter and the second depth network parameter are column vector parameters, and each column vector parameter includes m parameter values, where m is greater than or equal to 1.
Preferably, in the method, the first depth network parameter and the second depth network parameter are row vector parameters, and each of the row vector parameters includes n parameter values, where n is greater than or equal to 1.
Preferably, in the step S1, the blind image quality evaluation of the first haze-removed images to obtain first image quality scores includes:
blind image quality evaluation is carried out on the first haze removal images by adopting a DIIVINE algorithm so as to obtain first image quality scores;
in the step S4, the blind image quality evaluation of the second haze-removed images to obtain second image quality scores includes:
and blind image quality evaluation is carried out on the second haze removal images by adopting the DIIVINE algorithm so as to obtain second image quality scores.
Preferably, in the step S3, the fast bacterial swarm optimization operation is performed on the first high-quality deep network parameter to obtain a plurality of second deep network parameters, wherein the constraint condition of the fast bacterial swarm optimization operation satisfies:
when J_i(j+1,k,l) > J_min(j,k,l),
θ_i(j+1,k,l) = θ_b(j,k,l) + C_cc · Δθ_i(j+1,k,l)
wherein:
J_min(j,k,l) is the first image quality score corresponding to the first high-quality network parameter,
Δθ_i(j+1,k,l) is the change of the second deep network parameter relative to the first high-quality network parameter,
θ_i(j+1,k,l) is the second deep network parameter,
C_cc is an attraction factor used to characterize the change of the second deep network parameter relative to the first high-quality network parameter,
θ_b(j,k,l) is the first high-quality network parameter.
Preferably, C_cc includes a dynamic step size C(k,l), wherein the constraint condition of the dynamic step size is:
C(k,l) = L_red / n^(k+l-1)
wherein:
L_red is the initial chemotaxis step length,
n is the step-down gradient.
The invention also provides a haze image clarification system, which comprises: a processor and a memory,
the memory, for storing program instructions,
the processor is configured to perform the steps of any of the above methods in accordance with program instructions stored in the memory.
The invention also provides a computer-readable storage medium on which a computer program is stored which, when executed by a processor, carries out the steps of any of the methods described above.
Implementing the haze image clarification method, the haze image clarification system and the storage medium has the following beneficial effects: the method adapts to haze weather and objectively clarifies the haze image according to the current haze conditions; the processing is fast, and the results are accurate and objective.
Drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
FIG. 1 is a flowchart illustrating a haze image sharpening method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram showing effect comparison of the haze image sharpening method according to the present invention.
Detailed Description
For a more clear understanding of the technical features, objects and effects of the present invention, embodiments of the present invention will now be described in detail with reference to the accompanying drawings.
As shown in fig. 1, in a first embodiment of the haze image sharpening method according to the present invention, the method includes the steps of:
S1, randomly selecting a plurality of first depth network parameters, performing deep learning haze removal on a haze image to obtain a plurality of first haze-removed images, and performing blind image quality evaluation on the first haze-removed images to obtain a plurality of first image quality scores. Specifically, the haze image is input into a dehazing deep learning network and an initial haze removal is carried out through deep learning. In this initial haze removal, the initial depth network parameters of the dehazing deep learning network, that is, the first depth network parameters, are selected at random: a set of first depth network parameters is generated randomly, and the set may contain a plurality of first depth network parameters. The dehazing deep learning network performs deep learning on the haze image with each first depth network parameter to obtain the first haze-removed image corresponding to that parameter, and the image quality of each first haze-removed image is then evaluated with a blind image quality evaluation method to obtain its image quality score. It can be understood that the several first depth network parameters may be uncorrelated or only weakly correlated, which expands the coverage of the depth network parameters and reduces the processing time and resources of the whole procedure. The deep learning network may be a convolutional neural network, an autoencoder, a recurrent neural network or the like; a convolutional neural network may be adopted because it exploits the local correlation of the input data, greatly reduces the number of parameters compared with a fully connected network, and performs better in image recognition.
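As the paragraph above notes, the several first depth network parameters should be uncorrelated or weakly correlated so that they cover the parameter space well; one simple way to obtain such a set is stratified (Latin-hypercube-style) sampling, sketched below, where the parameter bounds and dimensionality are assumptions made for illustration.

import numpy as np

def initial_parameter_set(n_params, dim, low=-1.0, high=1.0, seed=0):
    """Generate n_params weakly correlated first depth network parameter vectors
    by placing one sample in each stratum of every dimension (Latin-hypercube style)."""
    rng = np.random.default_rng(seed)
    # one random point inside each of n_params equal-width strata, per dimension
    strata = (np.arange(n_params)[:, None] + rng.random((n_params, dim))) / n_params
    for d in range(dim):
        rng.shuffle(strata[:, d])          # decorrelate the dimensions
    return low + strata * (high - low)

first_params = initial_parameter_set(n_params=10, dim=5)   # 10 first parameters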
S2, selecting a first high-quality deep network parameter from the first deep network parameters according to the first image quality score; specifically, the image quality scores of the plurality of first haze removal images obtained above are compared, and the first depth network parameter corresponding to the first haze removal image with the best image quality score is selected and used as the first high-quality depth network parameter.
S3, performing a fast bacterial swarm optimization operation on the first high-quality depth network parameter to obtain a plurality of second depth network parameters. Specifically, the obtained first high-quality depth network parameter is subjected to the fast bacterial swarm optimization operation: the parameter is drawn toward the historical optimum of the swarm, and surrounding depth network parameters that are better than the first depth network parameters are searched for; these surrounding depth network parameters are defined as the second depth network parameters.
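A hedged sketch of how the surrounding second depth network parameters could be generated is shown below: each candidate takes a chemotaxis-style step of length C from the first high-quality parameter along a random unit direction. The step length and candidate count are assumptions; the attraction constraint of the fast bacterial swarm optimization itself is given later in this description together with the formula for C_cc.

import numpy as np

def generate_second_parameters(best_param, step, n_candidates=8, seed=None):
    """Propose second depth network parameters around the first high-quality
    parameter by swimming one chemotaxis step along random unit directions."""
    rng = np.random.default_rng(seed)
    candidates = []
    for _ in range(n_candidates):
        direction = rng.normal(size=best_param.shape)
        direction /= np.linalg.norm(direction)            # random tumble direction
        candidates.append(best_param + step * direction)  # swim one step of length C
    return candidates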
S4, performing deep learning haze removal on the haze image through the plurality of second depth network parameters to obtain a plurality of second haze-removed images, and performing blind image quality evaluation on the second haze-removed images to obtain a plurality of second image quality scores. Specifically, the dehazing deep learning network performs deep learning on the haze image with each selected second depth network parameter to obtain the corresponding second haze-removed image, and the image quality of each second haze-removed image is then evaluated with the blind image quality evaluation method to obtain its image quality score.
S5, selecting a second high-quality depth network parameter from the plurality of second depth network parameters according to the second image quality scores, and judging whether the second high-quality depth network parameter is optimal according to the second image quality score corresponding to it; if so, executing step S6, and if not, executing step S6-1. Specifically, the image quality scores of the second haze-removed images obtained above are compared, the second haze-removed image with the best score is selected, and the corresponding second depth network parameter is taken as the second high-quality depth network parameter. Whether the second high-quality depth network parameter is optimal is judged from the quality score of that best second haze-removed image, that is, the second image quality score corresponding to the second high-quality depth network parameter. For example, when the second image quality score is judged to meet the condition of the optimal score, the second high-quality depth network parameter can be used as the optimal depth network parameter of the haze removal process, and the operation of step S6 is performed. When the second image quality score does not meet the condition of the optimal score, the second high-quality depth network parameter is not the optimal depth network parameter of the haze removal process, and the operation of step S6-1 is performed.
S6, outputting the second haze-removed image corresponding to the second high-quality depth network parameter and ending the process. Specifically, when the above judgment shows that the second high-quality depth network parameter is the optimal depth network parameter of the haze removal process, the second haze-removed image corresponding to it is the optimal dehazed image of the haze image, so this second haze-removed image is output as the final clarified image of the clarification process.
S6-1, defining the second high-quality depth network parameter as the new first high-quality depth network parameter and executing step S3. Specifically, when the above judgment shows that the second high-quality depth network parameter is not the optimal depth network parameter of the haze removal process, it is taken as a new first high-quality depth network parameter and step S3 and the following steps are executed again: new second high-quality depth network parameters are searched for and judged until a final second high-quality depth network parameter that removes the haze optimally is obtained, and the corresponding final clarified image of the clarification process is output.
Compared with the prior art, in which the haze level of the image is observed manually and the network parameters are then selected manually, the invention can perform haze image clarification in real time. In addition, the quality of the dehazed images is evaluated with an objective blind image quality evaluation algorithm to obtain the optimal dehazed image, which is more accurate than the existing subjective evaluation. Moreover, the network parameters are selected intelligently by the fast bacterial swarm optimization algorithm, which avoids the inefficiency of a traditional exhaustive search and quickly determines the optimal depth network parameters.
Further, in step S5, determining whether the second high-quality depth network parameter is optimal according to its second image quality score includes: calculating the difference between the second image quality score corresponding to the second high-quality depth network parameter and the first image quality score corresponding to the first high-quality depth network parameter, judging whether the difference meets a first preset condition, and if so, judging that the second high-quality depth network parameter is the optimal depth network parameter. Specifically, when the difference between the two scores is very small, that is, when the two scores are very close, the dehazing effect obtained through deep learning has become stable, and searching for further depth network parameters would no longer improve it; the second high-quality depth network parameter can then be taken as the optimal depth network parameter of the haze removal process, and the corresponding second haze-removed image is output as the final clarified image.
Further, in step S5, determining whether the second high-quality depth network parameter is optimal according to its second image quality score includes: calculating whether the second image quality score corresponding to the second high-quality depth network parameter meets a second preset condition, and if so, judging that the second high-quality depth network parameter is the optimal depth network parameter. Specifically, an optimal output may be defined in advance for the haze removal process, that is, an optimal clarified image and the optimal quality score corresponding to it. When the second image quality score corresponding to the second high-quality depth network parameter reaches this optimal quality score, the second haze-removed image meets the defined target output, the second high-quality depth network parameter is judged to be the optimal depth network parameter, and the corresponding second haze-removed image is output as the final clarified image. It can be understood that different blind image quality evaluation methods have different score ranges; the scores of each algorithm satisfy monotonicity, and either the minimum or the maximum score necessarily corresponds to the best image quality. Therefore, whether the target image quality score of the clarification process is a minimum or a maximum is determined only by the type of blind image quality evaluation algorithm.
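The two judging rules above can be collected in a small helper, sketched below; the threshold values are illustrative assumptions, and a lower-is-better score is assumed, so the target comparison would be reversed for a metric where a higher score means better quality.

def is_optimal(second_score, first_score, diff_eps=1e-3, target_score=None):
    """Judge whether the second high-quality depth network parameter is optimal
    (lower score = better quality assumed)."""
    # First preset condition: the scores are so close that further searching
    # no longer changes the dehazing effect noticeably.
    if abs(second_score - first_score) < diff_eps:
        return True
    # Second preset condition: a target optimal score was defined in advance
    # and the second image quality score already reaches it.
    if target_score is not None and second_score <= target_score:
        return True
    return False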
Further, in some embodiments of the haze image sharpening method of the present invention, the first depth network parameter and the second depth network parameter are column vector parameters, each containing m parameter values, where m is greater than or equal to 1. In other embodiments, the first depth network parameter and the second depth network parameter are row vector parameters, each containing n parameter values, where n is greater than or equal to 1. Specifically, dehazing deep learning networks with different structures may have parameters that differ in number and physical meaning, for example column vector parameters, row vector parameters or matrix parameters. A row vector parameter can be written as
b_i = (b_i^1, b_i^2, …, b_i^n)
wherein:
b_i is the i-th depth network parameter vector;
b_i^n is the n-th parameter value of the i-th depth network parameter vector.
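In code, such parameter vectors are simply one-dimensional (row) or (m, 1)-shaped (column) arrays; the values below are placeholders, not parameters of any actual network.

import numpy as np

# Row vector parameter b_i = (b_i^1, ..., b_i^n), with n >= 1
b_i = np.array([0.8, 0.1, 0.05, 1.2])      # n = 4 illustrative parameter values

# Column vector parameter with m >= 1 values
c_j = b_i.reshape(-1, 1)                   # shape (m, 1), here m = 4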
Specifically, in step S1, performing blind image quality evaluation on the first haze-removed images to obtain the first image quality scores includes: performing blind image quality evaluation on the first haze-removed images with the DIIVINE algorithm to obtain the first image quality scores. In step S4, performing blind image quality evaluation on the second haze-removed images to obtain the second image quality scores includes: performing blind image quality evaluation on the second haze-removed images with the DIIVINE algorithm to obtain the second image quality scores. Specifically, in some embodiments the DIIVINE algorithm may be adopted for the blind image quality evaluation of the haze images, and in other embodiments other methods may be adopted to evaluate the first and second haze-removed images. However, the blind image quality evaluation scheme should not be changed during the process; the same scheme is used to evaluate the haze-removed images at the different stages.
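Since reference DIIVINE implementations are typically provided in MATLAB rather than Python, the sketch below only shows the role the blind quality score plays; diivine_score is a hypothetical wrapper, and any monotonic blind image quality metric could be substituted as long as the same metric is kept for every stage, as stated above.

import numpy as np

def diivine_score(dehazed_img: np.ndarray) -> float:
    """Hypothetical wrapper around a DIIVINE implementation (for example a call
    into a MATLAB engine or a reimplementation). Assumed monotonic in quality."""
    raise NotImplementedError("plug in a DIIVINE or other blind IQA backend")

def score_images(dehazed_images, iqa=diivine_score):
    """Score every haze-removed image with the SAME blind IQA metric; the
    metric is not swapped between stages of the clarification process."""
    return [iqa(img) for img in dehazed_images]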
Further, in step S3, the fast bacterial swarm optimization operation is performed on the first high-quality deep network parameter to obtain a plurality of second deep network parameters, wherein the constraint condition of the fast bacterial swarm optimization operation satisfies:
when J_i(j+1,k,l) > J_min(j,k,l),
θ_i(j+1,k,l) = θ_b(j,k,l) + C_cc · Δθ_i(j+1,k,l)
wherein:
J_min(j,k,l) is the first image quality score corresponding to the first high-quality network parameter,
Δθ_i(j+1,k,l) is the change of the second deep network parameter relative to the first high-quality network parameter,
θ_i(j+1,k,l) is the second deep network parameter,
C_cc is an attraction factor used to characterize the change of the second deep network parameter relative to the first high-quality network parameter,
θ_b(j,k,l) is the first high-quality network parameter.
Specifically, the new swarm induction mechanism allows each deep network parameter to use the experience of the surrounding first high-quality network parameter to guide its own change, which greatly shortens the search time of the algorithm in the solution space. At the same time, the new mechanism helps the deep network parameters jump out of local optima and effectively reduces the possibility that the first high-quality network parameter drifts away from the global optimum.
Further, C_cc includes a dynamic step size C(k,l), and the constraint condition of the dynamic step size is:
C(k,l) = L_red / n^(k+l-1)
wherein:
L_red is the initial chemotaxis step length,
n is the step-down gradient.
Specifically, C(k,l) decreases as the number of reproduction and elimination-dispersal events increases. When k+l is small, C(k,l) is large, which avoids spending excessive search time in a local area; as k+l increases and C(k,l) shrinks, the local search ability of the first high-quality network parameter near the global optimum is strengthened, ensuring that the algorithm finally approaches the global optimum.
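Putting the two constraints together, a hedged sketch of the attraction update with the dynamic step size is given below. The update follows the reconstruction of the formula above (theta_b is the first high-quality network parameter and the dynamic step C(k, l) enters the attraction factor C_cc); the default values of L_red and n are assumptions, and a lower quality score is again taken to mean better quality.

import numpy as np

def dynamic_step(L_red, n, k, l):
    """C(k, l) = L_red / n**(k + l - 1): the chemotaxis step shrinks as the
    number of reproduction (k) and elimination-dispersal (l) events grows."""
    return L_red / n ** (k + l - 1)

def attraction_update(theta_i, theta_b, J_i, J_min, L_red=0.1, n=2.0, k=1, l=1):
    """If the candidate's score J_i is worse than the best score J_min, pull the
    second deep network parameter toward the first high-quality parameter."""
    if J_i > J_min:                          # lower score = better quality assumed
        C_cc = dynamic_step(L_red, n, k, l)  # attraction factor built from C(k, l)
        delta = theta_i - theta_b            # change relative to the high-quality parameter
        return theta_b + C_cc * delta        # lands between theta_b and theta_i when C_cc < 1
    return theta_i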
In addition, the haze image sharpening system of the present invention includes a processor and a memory, the memory being used to store program instructions and the processor being used to perform the steps of any of the above methods according to the program instructions stored in the memory. Specifically, the haze image sharpening method described above can be executed by the haze image sharpening system, which outputs the final clarified image.
Furthermore, the computer-readable storage medium of the invention stores a computer program which, when executed by a processor, carries out the steps of any of the methods described above. Specifically, the above method can be stored as a program and copies of the program can be made. A computer-readable storage medium here may be, but is not limited to, a tangible device that can hold and store instructions for use by an instruction execution device, for example an electronic, magnetic, optical, electromagnetic or semiconductor memory device, or any suitable combination of the foregoing, including combinations of such devices.
As shown in fig. 2, the processing result of the haze image clarification method of the present invention is compared with the results of existing processing methods, where a is the original haze image, i.e. the haze image to be clarified, b, c and d are the clarification results of existing haze image methods, and e is the processing result of the haze image clarification method of the present invention, which is far superior to the existing haze image clarification methods.
It is to be understood that the foregoing examples, while indicating preferred embodiments of the invention, are given by way of illustration and description and are not to be construed as limiting the scope of the invention. It should be noted that those skilled in the art can freely combine the above technical features and make several changes and modifications without departing from the concept of the present invention, all of which fall within the protection scope of the present invention; accordingly, all equivalent changes and modifications made within the scope of the claims of the present invention shall be covered by the claims of the present invention.

Claims (7)

1. A haze image sharpening method is characterized by comprising the following steps:
s1, randomly selecting a plurality of first depth network parameters, performing deep learning haze removal on a haze image to obtain a plurality of first haze removal images, and performing blind image quality evaluation on the first haze removal images to obtain a plurality of first image quality scores;
s2, selecting a first high-quality deep network parameter from the plurality of first deep network parameters according to the first image quality score;
s3, performing a fast bacterial swarm optimization operation on the first high-quality deep network parameters to obtain a plurality of second deep network parameters;
s4, performing deep learning haze removal on the haze images through the second depth network parameters to obtain second haze removal images, and performing blind image quality evaluation on the second haze removal images to obtain second image quality scores;
s5, selecting a second high-quality deep network parameter from the plurality of second deep network parameters according to the second image quality score, and judging whether the second high-quality deep network parameter is optimal according to the second image quality score corresponding to the second high-quality deep network parameter, if so, executing a step S6, and if not, executing a step S6-1;
s6, outputting a second haze removal image corresponding to the second high-quality deep network parameter and ending the process;
s6-1, defining the second high-quality deep network parameter as a new first high-quality deep network parameter and executing the step S3;
the determining whether the second high-quality deep network parameter is optimal according to the second image quality score corresponding to the second high-quality deep network parameter includes:
and calculating a difference value between a second image quality score corresponding to the second high-quality deep network parameter and a first image quality score corresponding to the first high-quality deep network parameter, judging whether the difference value meets a first preset condition, and if so, judging that the second high-quality deep network parameter is the optimal deep network parameter.
2. The method for clearing haze images according to claim 1, wherein in the step S5, the determining whether the second high-quality depth network parameter is optimal according to the second image quality score corresponding to the second high-quality depth network parameter includes:
and calculating whether a second image quality score corresponding to the second high-quality deep network parameter meets a second preset condition, if so, judging that the second high-quality deep network parameter is an optimal deep network parameter.
3. The method for clearing the haze images according to claim 1, wherein the first depth network parameter and the second depth network parameter are column vector parameters respectively, and the column vector parameters respectively comprise m parameter values, wherein m is greater than or equal to 1.
4. The method for clearing haze images according to claim 1, wherein the first depth network parameter and the second depth network parameter are row vector parameters, and each row vector parameter comprises n parameter values, wherein n is greater than or equal to 1.
5. The haze image sharpening method according to claim 1, wherein in the step S1, the blind image quality evaluation of the first haze-removed images to obtain first image quality scores comprises:
blind image quality evaluation is carried out on the first haze removal images by adopting a DIIVINE algorithm so as to obtain first image quality scores;
in the step S4, the blind image quality evaluation of the second haze-removed images to obtain second image quality scores includes:
and blind image quality evaluation is carried out on the second haze removal images by adopting the DIIVINE algorithm so as to obtain second image quality scores.
6. The haze image sharpening system is characterized by comprising: a processor, a memory,
the memory, for storing program instructions,
the processor configured to perform the steps of the method of any one of claims 1-5 in accordance with program instructions stored in the memory.
7. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.
CN201810886690.3A 2018-08-06 2018-08-06 Haze image sharpening method and system and storable medium Active CN109325920B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810886690.3A CN109325920B (en) 2018-08-06 2018-08-06 Haze image sharpening method and system and storable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810886690.3A CN109325920B (en) 2018-08-06 2018-08-06 Haze image sharpening method and system and storable medium

Publications (2)

Publication Number Publication Date
CN109325920A CN109325920A (en) 2019-02-12
CN109325920B true CN109325920B (en) 2022-02-22

Family

ID=65263584

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810886690.3A Active CN109325920B (en) 2018-08-06 2018-08-06 Haze image sharpening method and system and storable medium

Country Status (1)

Country Link
CN (1) CN109325920B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110428371A (en) * 2019-07-03 2019-11-08 深圳大学 Image defogging method, system, storage medium and electronic equipment based on super-pixel segmentation
CN112241670B (en) * 2019-07-18 2024-03-01 杭州海康威视数字技术股份有限公司 Image processing method and device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104915933A (en) * 2015-06-01 2015-09-16 长安大学 Foggy day image enhancing method based on APSO-BP coupling algorithm

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9361670B2 (en) * 2014-09-04 2016-06-07 National Taipei University Of Technology Method and system for image haze removal based on hybrid dark channel prior
US10269098B2 (en) * 2016-11-01 2019-04-23 Chun Ming Tsang Systems and methods for removing haze in digital photos
US10127659B2 (en) * 2016-11-23 2018-11-13 General Electric Company Deep learning medical systems and methods for image acquisition
CN106910201B (en) * 2017-02-28 2020-03-27 江南大学 Method for processing image based on improved firework algorithm
CN107958465A (en) * 2017-10-23 2018-04-24 华南农业大学 A kind of single image to the fog method based on depth convolutional neural networks

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104915933A (en) * 2015-06-01 2015-09-16 长安大学 Foggy day image enhancing method based on APSO-BP coupling algorithm

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PSO-based Parameters Selection for the Bilateral Filter in Image Denoising; Chenyan Wang et al.; GECCO '17; 2017-07-19; pp. 442-448 *
Fast bacterial swarming algorithm based on particle swarm optimization (基于粒子群优化的快速细菌群游算法); Chu Ying et al.; Journal of Data Acquisition and Processing (数据采集与处理); 2010-07-31; vol. 25, no. 4; pp. 51-58 *

Also Published As

Publication number Publication date
CN109325920A (en) 2019-02-12

Similar Documents

Publication Publication Date Title
CN108986050B (en) Image and video enhancement method based on multi-branch convolutional neural network
CN109977774B (en) Rapid target detection method based on adaptive convolution
CN107886169B (en) Multi-scale convolution kernel method for generating confrontation network model based on text-image
CN104835145B (en) Foreground detection method based on adaptive Codebook background models
CN111861925A (en) Image rain removing method based on attention mechanism and gate control circulation unit
CN104796582B (en) Video image denoising and Enhancement Method and device based on random injection retinex
JP2008542911A (en) Image comparison by metric embedding
CN111950389B (en) Depth binary feature facial expression recognition method based on lightweight network
JP2020038666A (en) Method for generating data set for learning for detection of obstacle in autonomous driving circumstances and computing device, learning method, and learning device using the same
CN109325920B (en) Haze image sharpening method and system and storable medium
KR20180109658A (en) Apparatus and method for image processing
CN112287906B (en) Template matching tracking method and system based on depth feature fusion
CN110866872A (en) Pavement crack image preprocessing intelligent selection method and device and electronic equipment
CN112419191A (en) Image motion blur removing method based on convolution neural network
CN113159236A (en) Multi-focus image fusion method and device based on multi-scale transformation
CN111242176B (en) Method and device for processing computer vision task and electronic system
CN115660998A (en) Image defogging method based on deep learning and traditional priori knowledge fusion
CN111914938A (en) Image attribute classification and identification method based on full convolution two-branch network
CN111753671A (en) Crowd counting method for real scene
CN113177956B (en) Semantic segmentation method for unmanned aerial vehicle remote sensing image
CN117409083A (en) Cable terminal identification method and device based on infrared image and improved YOLOV5
CN114821174B (en) Content perception-based transmission line aerial image data cleaning method
CN116631190A (en) Intelligent traffic monitoring system and method thereof
WO2011086594A1 (en) Image processing apparatus and method therefor
CN115631108A (en) RGBD-based image defogging method and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant