CN110211085B - Image fusion quality evaluation method and system - Google Patents

Info

Publication number
CN110211085B
CN110211085B (granted from application CN201810168825.2A)
Authority
CN
China
Prior art keywords
energy
image
pixel point
fused image
difference
Prior art date
Legal status
Active
Application number
CN201810168825.2A
Other languages
Chinese (zh)
Other versions
CN110211085A (en)
Inventor
胡事民
朱哲
卢嘉铭
王敏轩
张松海
拉尔夫·马丁
刘涵涛
王巨宏
郑宇飞
Current Assignee
Tsinghua University
Shenzhen Tencent Computer Systems Co Ltd
Original Assignee
Tsinghua University
Shenzhen Tencent Computer Systems Co Ltd
Priority date
Filing date
Publication date
Application filed by Tsinghua University and Shenzhen Tencent Computer Systems Co Ltd
Priority to CN201810168825.2A
Publication of CN110211085A
Application granted
Publication of CN110211085B
Status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The invention provides an image fusion quality evaluation method and system. The method comprises the following steps: acquiring the difference between the gray values of the pixel points at each corresponding position of the fused image and the image before fusion; acquiring the defect degree of each pixel point in the fused image according to these gray-value differences; and acquiring a fusion quality evaluation result of the fused image according to the defect degrees of all pixel points in the fused image. The system comprises: a gray difference module for acquiring the difference between the gray values of each corresponding pixel point before and after fusion; a defect acquisition module for acquiring the defect degree of each pixel point from those differences; and a quality evaluation module for obtaining the fusion quality evaluation result from the defect degrees. The image fusion quality evaluation method and system provided by the invention objectively and quantitatively reflect the difference between the fused image and the expected image fusion result, and obtain an image fusion quality evaluation result close to the impression of the human eye.

Description

Image fusion quality evaluation method and system
Technical Field
The invention relates to the technical field of image processing, in particular to an image fusion quality evaluation method and system.
Background
In the Internet era, images have become a primary carrier of information, and image editing has accordingly become an increasingly important technology. Among image editing technologies, image fusion is widely used, with important applications in object insertion and image stitching.
In recent years, image fusion has been extensively studied and many different image fusion algorithms have emerged, each suited to particular scenarios. In the recently emerging field of virtual reality, the stitching and fusion of images is a key technology, and the fusion quality directly affects the viewing experience of the user. Since different image fusion algorithms suit different application scenes, automatically evaluating the image fusion effect has become an urgent problem.
In image processing, image quality evaluation has been studied intensively. Traditional image quality evaluation methods fall into two categories: full-reference methods and no-reference methods. A full-reference method is one in which a reference image is provided and the evaluation result is obtained by comparing the generated image against that reference. In image fusion, no reference for the fused result can be obtained, so full-reference methods are not suitable for evaluating fusion quality. A no-reference method requires no reference image; however, conventional no-reference methods mainly consider properties of the image itself, such as noise, detail resolution, and the degree of structure retention. Because they do not analyze image fusion specifically, they too are unsuitable for evaluating image fusion. A method dedicated to evaluating image fusion quality is therefore needed.
Disclosure of Invention
In order to overcome the inability of the prior art to evaluate image fusion quality, the invention provides an image fusion quality evaluation method and system.
According to a first aspect of the present invention, there is provided an image fusion quality evaluation method, including:
S1, acquiring the difference between the gray values of the pixel points at each corresponding position of the fused image and the image before fusion;
S2, acquiring the defect degree of each pixel point in the fused image according to the difference of the gray values of the pixel points at each corresponding position;
S3, acquiring a fusion quality evaluation result of the fused image according to the defect degrees of all pixel points in the fused image;
wherein the defect degree is the degree of difference between a pixel point in the fused image and the expected image fusion result for that pixel point.
Preferably, the step S1 is preceded by:
converting the fused image and the image before fusion into grayscale images.
Preferably, the step S2 further includes:
s21, taking the absolute value of the difference between the gray values of the pixels at each corresponding position as the energy value of the pixel at each corresponding position in the first energy map to obtain the first energy map;
s22, obtaining the average energy of the first energy map according to the energy value of each pixel point in the first energy map;
s23, determining the defect degree of each pixel point in the fused image according to the energy value of each pixel point in the first energy map and the average energy.
Preferably, the step S22 specifically includes:
carrying out binarization on the first energy map to obtain a second energy map, and counting the number of non-zero pixel points in the second energy map;
acquiring the sum of energy values of each pixel point in the first energy diagram, and taking the sum of the energy values as the energy sum of the first energy diagram;
and acquiring the average energy according to the number of non-zero pixel points in the second energy diagram and the energy sum.
Preferably, the step S23 specifically includes:
for each pixel point in the first energy map, acquiring a first difference value between the energy value of the pixel point and the average energy;
when the first difference is larger than zero, taking the first difference as the defect degree of a pixel point in the fused image at the corresponding position of the pixel point; and when the first difference is not more than zero, taking zero as the defect degree of the pixel point in the fused image at the corresponding position of the pixel point.
Preferably, the step S3 specifically includes:
and acquiring the square sum of the flaw degrees of all the pixel points in the fused image, and taking the square sum as the fusion quality evaluation result of the fused image.
According to a second aspect of the present invention, there is provided an image fusion quality evaluation system including:
the gray difference module is used for acquiring the difference between the gray values of the pixel points at each corresponding position of the fused image and the image before fusion;
the defect acquisition module is used for acquiring the defect degree of each pixel point in the fused image according to the gray value difference of the pixel point at each corresponding position;
the quality evaluation module is used for obtaining a fusion quality evaluation result of the fused image according to the defect degrees of all pixel points in the fused image;
and the defect degree is the difference degree between the pixel point in the fused image and the expected image fusion result of the pixel point.
Preferably, the flaw acquisition module includes:
the energy map submodule is used for taking the absolute value of the difference between the gray values of the pixels at each corresponding position as the energy value of the pixels at each corresponding position in the first energy map to obtain the first energy map;
the average energy submodule is used for acquiring the average energy of the first energy map according to the energy value of each pixel point in the first energy map;
and the flaw calculation submodule is used for determining the flaw degree of each pixel point in the fused image according to the energy value of each pixel point in the first energy diagram and the average energy.
According to a third aspect of the present invention, there is provided a computer program product comprising a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform the above method.
According to a fourth aspect of the invention, there is provided a non-transitory computer readable storage medium storing computer instructions which cause the computer to perform the above method.
According to the image fusion quality evaluation method and system provided by the invention, the overall difference between the fused image and the expected image fusion result is obtained from the degree of difference between each pixel point of the fused image and the expected result, and the image fusion quality is evaluated from that overall difference. The difference between the fused image and the expected fusion result is thus reflected objectively and quantitatively, and an image fusion quality evaluation result close to the impression of the human eye is obtained.
Drawings
FIG. 1 is a flowchart of an image fusion quality evaluation method according to an embodiment of the present invention;
fig. 2 is a functional block diagram of an image fusion quality evaluation system according to an embodiment of the present invention.
Detailed Description
The following detailed description of embodiments of the present invention is provided in connection with the accompanying drawings and examples. The following examples are intended to illustrate the invention but are not intended to limit the scope of the invention.
Fig. 1 is a flowchart of an image fusion quality evaluation method according to an embodiment of the present invention. As shown in fig. 1, the image fusion quality evaluation method includes: step S1, acquiring the difference between the gray values of the pixel points at each corresponding position of the fused image and the image before fusion; step S2, acquiring the defect degree of each pixel point in the fused image according to the difference of the gray values of the pixel points at each corresponding position; step S3, acquiring a fusion quality evaluation result of the fused image according to the defect degrees of all pixel points in the fused image; wherein the defect degree is the degree of difference between a pixel point in the fused image and the expected image fusion result for that pixel point.
The image fusion quality evaluation method provided by the embodiment of the invention is applicable to, but not limited to, the following scenario: image A is fused into image B to obtain a fused image B'; image B is the image before fusion, and image B' is the image after fusion.
When the images are fused, the expected image fusion result is the image before fusion. The closer the fused image is to the image before fusion, the better the image fusion is. Therefore, the smaller the difference between the fused image and the image before fusion is, the higher the image fusion quality is; the greater the difference between the fused image and the image before fusion, the lower the image fusion quality.
In the actual image fusion process, owing to factors such as non-ideal boundary conditions, the fused image may exhibit unintended fusion effects such as blurring and color bleeding; the affected pixels in the fused image are called defects (flaws). Pixels exhibiting such unintended fusion effects differ very noticeably from the image before fusion.
The basic idea of the image fusion quality evaluation method provided by the embodiment of the invention is as follows: the image fusion quality is evaluated by acquiring the flaws in the actually obtained fused image B' and based on the flaws.
Specifically, in step S1, the gray values of the pixel points at each corresponding position of the fused image and the image before fusion are subtracted to obtain the difference between the gray values of the pixel points at each corresponding position.
Image B is fused with image A to obtain image B'; the difference between the gray values of the pixel points at each corresponding position of image B' and image B reflects the difference between the fused image and the image before fusion at each position.
Where the fused image differs little from the image before fusion, the difference between the gray values of the pixel points at that position is close to zero.
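As an illustrative sketch (not taken from the patent text), the per-pixel subtraction of step S1 can be written as follows; the helper name `gray_difference` and the 2x2 sample images are hypothetical:

```python
import numpy as np

def gray_difference(before, after):
    """Signed per-pixel difference between the fused image and the
    image before fusion (hypothetical helper for step S1)."""
    # Cast to a signed type so negative differences are preserved.
    return after.astype(np.int16) - before.astype(np.int16)

before = np.array([[100, 100], [100, 100]], dtype=np.uint8)  # image B
after = np.array([[100, 130], [100, 100]], dtype=np.uint8)   # fused image B'
diff = gray_difference(before, after)  # non-zero only where B' deviates
```

At well-fused positions the difference stays near zero, matching the observation above.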
In step S2, from the difference between the gray values of the pixel points at each corresponding position of the fused image and the image before fusion, the degree of difference between each pixel point and its expected image fusion result can be obtained.
The difference degree between the pixel point and the expected image fusion result of the pixel point is the defect degree of the pixel point.
The larger the defect degree of a certain pixel point is, the larger the difference between the expected image fusion result of the pixel point and the pixel point is, and the lower the image fusion quality of the pixel point is.
In step S3, after the defect degree of each pixel point of the fused image has been obtained through step S2, the overall situation of the defects in the fused image can be determined from those defect degrees. This overall situation reflects the overall difference between the fused image and the expected image fusion result, from which the fusion quality evaluation result of the fused image is obtained.
By obtaining the degree of difference between each pixel point of the fused image and the expected image fusion result, the embodiment of the invention obtains the overall difference between the fused image and the expected result and evaluates the image fusion quality accordingly. The difference between the fused image and the expected fusion result is thus reflected objectively and quantitatively, and an image fusion quality evaluation result close to the impression of the human eye is obtained.
Based on the foregoing embodiment, as an alternative embodiment, step S1 is preceded by: and converting the fused image and the image before fusion into a gray image.
The image fusion includes fusion of color images and fusion of grayscale images.
When evaluating the fusion quality of grayscale images, the difference between the gray values of the pixel points at each corresponding position of the fused image and the image before fusion can be obtained directly through step S1.
As an optional embodiment, when evaluating the fusion quality of color images, before obtaining the gray-value differences, the fused image and the image before fusion are converted into grayscale images using the same graying method; the gray values of the pixel points at each corresponding position of the two grayscale images are then subtracted to obtain the difference between the gray values of the pixel points at each corresponding position.
By converting the fused image and the image before fusion into grayscale images, the embodiment of the invention can also obtain a quality evaluation result for color fused images, objectively and quantitatively reflecting the difference between the fused image and the expected fusion result and yielding an evaluation close to the impression of the human eye.
Since a color image comprises several components, most commonly RGB components, in another optional embodiment the component values of the pixel points at each corresponding position of the fused image and the image before fusion are subtracted to obtain a difference image, and the difference image is then grayed to obtain the difference between the gray values of the pixel points at each corresponding position.
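The patent does not prescribe a particular graying formula. One common choice, shown here only as an illustrative assumption, is the ITU-R BT.601 luma weighting; whatever formula is chosen must be applied to both images:

```python
def to_gray(r, g, b):
    """BT.601 luma weighting (an assumed choice of graying method;
    the patent only requires the same method for both images)."""
    return round(0.299 * r + 0.587 * g + 0.114 * b)

gray = to_gray(200, 100, 50)  # one pixel of a color image
```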
Based on the above embodiment, step S2 further includes: step S21, taking the absolute value of the difference of the gray values of the pixels at each corresponding position as the energy value of the pixels at each corresponding position in the first energy map to obtain the first energy map; step S22, acquiring the average energy of the first energy map according to the energy value of each pixel point in the first energy map; and step S23, determining the defect degree of each pixel point in the fused image according to the energy value and the average energy of each pixel point in the first energy map.
Specifically, the step of obtaining the defect degree of each pixel point in the fused image through the step S2 further includes the following steps.
In step S21, after the difference between the gray values of the pixel points at each corresponding position of the fused image and the image before fusion has been obtained in step S1, the absolute value of each difference is taken as the value, i.e. the energy value, of the pixel point at the corresponding position in the first energy map, thereby obtaining the first energy map.
The energy value of each pixel point in the first energy map reflects the degree of difference between the fused image and the image before fusion at the corresponding position: the larger that difference, the larger the energy value. Ideally, the energy values of all pixel points in the first energy map are the same.
The more severe the unintended fusion effects, such as blurring and color bleeding, at a pixel in the fused image, the larger the energy value of the corresponding pixel point in the first energy map. The energy values corresponding to defects in the fused image are therefore markedly higher than those of the other pixel points, and a suitable energy threshold can be chosen to single out the pixel points with conspicuously large energy values.
In step S22, a suitable energy threshold is determined from the energy values of all pixel points in the first energy map; this threshold is referred to as the average energy of the first energy map.
Step S23, after obtaining the average energy of the first energy map through step S22, for each pixel point in the first energy map, determining the defect degree of the pixel point at the corresponding position of the pixel point in the fused image according to the energy value and the average energy of the pixel point.
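A minimal sketch of step S21 (array names are hypothetical, as above): the first energy map is simply the absolute gray-value difference at each corresponding position:

```python
import numpy as np

before = np.array([[100, 100], [100, 100]], dtype=np.uint8)  # image before fusion
after = np.array([[100, 130], [90, 100]], dtype=np.uint8)    # fused image

# Step S21: energy value = |gray difference| at each corresponding position.
first_energy = np.abs(after.astype(np.int16) - before.astype(np.int16))
```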
Based on the foregoing embodiment, as a preferred embodiment, step S22 specifically includes: binarizing the first energy map to obtain a second energy map, and counting the number of non-zero pixel points in the second energy map; acquiring the sum of energy values of each pixel point in the first energy diagram, and taking the sum of the energy values as the energy sum of the first energy diagram; and obtaining the average energy according to the number and the energy sum of the non-zero pixel points in the second energy diagram.
As a preferred embodiment, the average energy is determined by the following steps.
And binarizing the first energy map to obtain a second energy map. Preferably, the first energy map is binarized by using a maximum inter-class variance method, that is, the Otsu method, to obtain a second energy map. By binarizing the first energy map, defects and non-defects in the fused image can be well distinguished.
The maximum inter-class variance method, commonly known as Otsu's method (OTSU), is an adaptive threshold-determination technique. It divides an image into a background part and an object part according to the image's gray-level characteristics. The larger the inter-class variance between background and object, the larger the difference between the two parts of the image; when part of the object is mistaken for background, or part of the background for object, that difference shrinks. Choosing the partition with the largest inter-class variance therefore minimizes the probability of misclassification.
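A compact sketch of Otsu's method as described above, written from its textbook definition (the patent only names the technique, so this implementation is illustrative):

```python
import numpy as np

def otsu_threshold(values):
    """Return the threshold t (0-255) maximizing the between-class
    variance of an 8-bit image, per Otsu's method."""
    hist = np.bincount(np.asarray(values, dtype=np.uint8).ravel(),
                       minlength=256).astype(float)
    total = hist.sum()
    sum_all = float(np.dot(np.arange(256), hist))
    w0 = sum0 = 0.0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0 += hist[t]            # pixels in class 0 (values <= t)
        if w0 == 0:
            continue
        w1 = total - w0          # pixels in class 1 (values > t)
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0, m1 = sum0 / w0, (sum_all - sum0) / w1
        var = w0 * w1 * (m0 - m1) ** 2   # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Binarize the first energy map: defect candidates lie above the threshold.
energy = np.array([[5, 5, 200], [5, 210, 5]], dtype=np.uint8)
t = otsu_threshold(energy)
second_energy = (energy > t).astype(np.uint8)
```

In practice a library routine such as OpenCV's Otsu thresholding would serve the same purpose.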
And after the second energy graph is obtained, counting the number of non-zero pixel points in the second energy graph. The number of non-zero pixel points in the second energy map is the number of defects in the fused image.
And adding the energy values of all the pixel points in the first energy diagram to obtain the energy sum of the first energy diagram.
And obtaining the average energy according to the energy sum of the first energy diagram and the number of non-zero pixel points in the second energy diagram.
Preferably, the average energy is

E_avg = α · E_h / (A_h + δ)

where A_h is the number of non-zero pixel points in the second energy map; E_h is the energy sum of the first energy map; α is a constant, generally taken as 2; and δ is a very small constant, generally 10^-8, which prevents the denominator from being zero in special cases.
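Under the average-energy formula α·E_h/(A_h + δ) above, with hypothetical array names `first_energy` and `second_energy` for the two energy maps, the computation can be sketched as:

```python
import numpy as np

def average_energy(first_energy, second_energy, alpha=2.0, delta=1e-8):
    """alpha * E_h / (A_h + delta): E_h is the energy sum of the first
    map, A_h the number of non-zero pixels in the binarized second map."""
    a_h = np.count_nonzero(second_energy)
    e_h = float(np.sum(first_energy))
    return alpha * e_h / (a_h + delta)

first_energy = np.array([[0, 4], [0, 6]])
second_energy = np.array([[0, 1], [0, 1]])
avg = average_energy(first_energy, second_energy)  # 2 * 10 / (2 + delta)
```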
Based on the foregoing embodiment, as a preferred embodiment, step S23 specifically includes: for each pixel point in the first energy map, acquiring a first difference value between the energy value of the pixel point and the average energy; when the first difference is larger than zero, the first difference is used as the defect degree of the pixel point in the fused image at the corresponding position of the pixel point; and when the first difference is not more than zero, taking the zero as the defect degree of the pixel point in the fused image at the corresponding position of the pixel point.
As a preferred embodiment, the defect degree of the pixel point at the corresponding position of the pixel point in the fused image is determined according to the difference between the energy value of the pixel point and the average energy.
The defect degree B (x) of the pixel points in the fused image is
Figure BDA0001585232500000092
Wherein, x represents the energy value of the pixel point at the corresponding position of the pixel point in the first energy diagram.
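The piecewise definition of B(x) amounts to a clamped difference; a one-line sketch (names hypothetical):

```python
def defect_degree(x, avg_energy):
    """B(x) = x - avg_energy when positive, else 0, per the definition above."""
    return max(x - avg_energy, 0.0)
```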
By quantifying the defects in the fused image relative to the average energy of the first energy map, the image fusion quality evaluation method and system can objectively and quantitatively reflect the difference between the fused image and the expected image fusion result, yielding an evaluation close to the impression of the human eye.
Based on the foregoing embodiment, as a preferred embodiment, step S3 specifically includes: and acquiring the square sum of the flaw degrees of all the pixel points in the fused image, and taking the square sum as the fusion quality evaluation result of the fused image.
As a preferred embodiment, after the defect degree of each pixel point in the fused image is obtained, the sum of squares of the defect degrees of all pixel points in the fused image is calculated, and the sum of squares is used as the fusion quality evaluation result of the fused image.
V = ∑ B(x)²

where the sum runs over all pixel points in the fused image and V represents the fusion quality evaluation result of the fused image.
The sum of squares of the defect degrees of all pixels in the fused image is the sum of squared positive deviations of the pixel energy values from the average energy of the first energy map. It reflects how far the gray-value differences between the fused image and the image before fusion deviate from their typical level, and hence the degree of difference between the fused image and the expected image fusion result; it can therefore serve as the fusion quality evaluation result of the fused image.
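Combining the pieces, the evaluation score V is the sum of squared defect degrees over all pixel points; a smaller V indicates higher fusion quality. This sketch assumes the first energy map and the average energy are already available (names hypothetical):

```python
import numpy as np

def fusion_quality(first_energy, avg_energy):
    """V = sum over all pixels of B(x)^2, with B(x) = max(x - avg_energy, 0)."""
    b = np.maximum(np.asarray(first_energy, dtype=float) - avg_energy, 0.0)
    return float(np.sum(b ** 2))

first_energy = np.array([[12, 8], [10, 15]])
v = fusion_quality(first_energy, 10.0)  # B = [[2,0],[0,5]] -> V = 4 + 25
```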
The embodiment of the invention takes the sum of the squares of the flaw degrees of all the pixel points in the fused image as the fusion quality evaluation result of the fused image, accurately reflects the difference degree between the fused image and the expected image fusion result, and obtains the image fusion quality evaluation result close to the human eye impression.
Fig. 2 is a functional block diagram of an image fusion quality evaluation system according to an embodiment of the present invention. As shown in fig. 2, based on the above-described embodiment, an image fusion quality evaluation system includes: a gray difference module 201, configured to obtain the difference between the gray values of the pixel points at each corresponding position of the fused image and the image before fusion; a defect obtaining module 202, configured to obtain the defect degree of each pixel point in the fused image according to those differences; and a quality evaluation module 203, configured to obtain a fusion quality evaluation result of the fused image according to the defect degrees of all pixel points in the fused image; wherein the defect degree is the degree of difference between a pixel point in the fused image and its expected image fusion result.
Specifically, the gray difference module 201 is electrically connected to the defect acquisition module 202, and transmits an electrical signal; the flaw acquisition module 202 is electrically connected with the quality evaluation module 203 and transmits an electrical signal.
The gray difference module 201 obtains the difference between the gray values of the pixel points at each corresponding position of the fused image and the image before fusion, and transmits these differences to the defect obtaining module 202.
The defect obtaining module 202 receives these gray-value differences, obtains the defect degree of each pixel point in the fused image from them, and transmits the defect degrees to the quality evaluation module 203.
The quality evaluation module 203 obtains a fusion quality evaluation result of the fused image according to the defect degrees of all the pixel points in the fused image.
The image fusion quality evaluation system provided by the invention is used for executing the image fusion quality evaluation method provided by the invention. The specific method and flow by which each module of the system implements its function are detailed in the embodiments of the image fusion quality evaluation method above, and are not described again here.
By obtaining the degree of difference between each pixel point of the fused image and the expected image fusion result, the embodiment of the invention obtains the overall difference between the fused image and the expected result and evaluates the image fusion quality accordingly. The difference between the fused image and the expected fusion result is thus reflected objectively and quantitatively, and an image fusion quality evaluation result close to the impression of the human eye is obtained.
Based on the above embodiment, the defect acquisition module 202 includes: the energy map submodule is used for taking the absolute value of the difference between the gray values of the pixels at each corresponding position as the energy value of the pixels at each corresponding position in the first energy map to obtain the first energy map; the average energy submodule is used for acquiring the average energy of the first energy map according to the energy value of each pixel point in the first energy map; and the flaw calculation submodule is used for determining the flaw degree of each pixel point in the fused image according to the energy value and the average energy of each pixel point in the first energy map.
Specifically, the energy map sub-module is electrically connected with the average energy sub-module and transmits an electric signal; the average energy submodule is electrically connected with the flaw calculation submodule and transmits an electric signal.
After the energy map submodule obtains the first energy map, the first energy map is transmitted to the average energy submodule.
After the average energy sub-module obtains the average energy of the first energy map, the average energy is transmitted to the flaw calculation sub-module.
And the defect calculation submodule determines the defect degree of each pixel point in the fused image according to the energy value and the average energy of each pixel point in the first energy map.
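The average-energy step in particular divides the energy sum by the number of non-zero pixels of the binarized second energy map, not by the total pixel count. A hedged sketch of this sub-module (assuming NumPy; the names are illustrative, not from the patent):

```python
import numpy as np

def average_energy(first_energy_map):
    """Average energy of the first energy map, as described above:
    binarize into a second energy map, count its non-zero pixels,
    and divide the energy sum by that count."""
    second_energy_map = (first_energy_map != 0).astype(np.uint8)  # binarization
    nonzero_count = int(np.count_nonzero(second_energy_map))     # non-zero pixel count
    energy_sum = float(first_energy_map.sum())                   # energy sum of the first map
    # Guard the all-zero case (identical images) to avoid division by zero.
    return energy_sum / nonzero_count if nonzero_count else 0.0
```

Averaging over only the non-zero pixels keeps large flat regions where the images agree from diluting the average and masking localized fusion defects.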
The image fusion quality evaluation system provided by the invention is used for executing the image fusion quality evaluation method provided by the invention. The detailed method and flow by which each sub-module included in the flaw acquisition module implements its corresponding function are described in the above method embodiments, and are not described herein again.
Based on the above embodiments, the present embodiment discloses a computer program product. The computer program product comprises a computer program stored on a non-transitory computer-readable storage medium; the computer program comprises program instructions which, when executed by a computer, cause the computer to perform the methods provided by the above method embodiments, for example the image fusion quality evaluation method and the method for acquiring the flaw degree.
Based on the foregoing embodiments, the present embodiment discloses a non-transitory computer-readable storage medium storing computer instructions that cause a computer to perform the methods provided by the foregoing method embodiments, for example the image fusion quality evaluation method and the method for acquiring the flaw degree.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be implemented by hardware driven by program instructions. The program may be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments. The aforementioned storage media include media capable of storing program code, such as ROM, RAM, magnetic disks, or optical disks.
The above-described embodiments of the image fusion quality evaluation system and the like are merely illustrative, and units described as separate components may or may not be physically separate, and components described as units may or may not be physical units, that is, may be located in one place, or may be distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods of the various embodiments or some parts of the embodiments.
Finally, the above-mentioned embodiments of the present invention are merely preferred embodiments, and are not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (7)

1. An image fusion quality evaluation method is characterized by comprising the following steps:
S1, acquiring the difference between the gray values of the pixels at each corresponding position of the fused image and the image before fusion;
S2, acquiring the defect degree of each pixel point in the fused image according to the difference of the gray values of the pixel points at each corresponding position;
S3, acquiring a fusion quality evaluation result of the fused image according to the defect degrees of all pixel points in the fused image;
the flaw degree is the difference degree between a pixel point in the fused image and an expected image fusion result of the pixel point;
wherein the step S2 further includes:
S21, taking the absolute value of the difference between the gray values of the pixels at each corresponding position as the energy value of the pixel at each corresponding position in the first energy map to obtain the first energy map;
S22, obtaining the average energy of the first energy map according to the energy value of each pixel point in the first energy map;
S23, determining the defect degree of each pixel point in the fused image according to the energy value of each pixel point in the first energy map and the average energy;
the step S22 specifically includes:
carrying out binarization on the first energy map to obtain a second energy map, and counting the number of non-zero pixel points in the second energy map;
acquiring the sum of the energy values of each pixel point in the first energy map, and taking the sum of the energy values as the energy sum of the first energy map;
and acquiring the average energy according to the number of non-zero pixel points in the second energy map and the energy sum.
2. The image fusion quality evaluation method according to claim 1, further comprising, before step S1:
and converting the fused image and the image before fusion into a gray image.
3. The image fusion quality evaluation method according to claim 1, wherein the step S23 specifically includes:
for each pixel point in the first energy map, acquiring a first difference value between the energy value of the pixel point and the average energy;
when the first difference is larger than zero, taking the first difference as the defect degree of a pixel point in the fused image at the corresponding position of the pixel point; and when the first difference is not more than zero, taking zero as the defect degree of the pixel point in the fused image at the corresponding position of the pixel point.
4. The image fusion quality evaluation method according to any one of claims 1 to 3, wherein the step S3 specifically includes:
and acquiring the square sum of the flaw degrees of all the pixel points in the fused image, and taking the square sum as the fusion quality evaluation result of the fused image.
5. An image fusion quality evaluation system characterized by comprising:
the gray difference module is used for acquiring the difference between the gray values of the pixels at each corresponding position of the fused image and the image before fusion;
the defect acquisition module is used for acquiring the defect degree of each pixel point in the fused image according to the gray value difference of the pixel point at each corresponding position;
wherein, the acquiring the defect degree of each pixel point in the fused image comprises:
taking the absolute value of the difference between the gray values of the pixels at each corresponding position as the energy value of the pixel at each corresponding position in the first energy map to obtain the first energy map;
acquiring the average energy of the first energy map according to the energy value of each pixel point in the first energy map;
determining the defect degree of each pixel point in the fused image according to the energy value of each pixel point in the first energy map and the average energy;
the quality evaluation module is used for obtaining a fusion quality evaluation result of the fused image according to the defect degrees of all pixel points in the fused image;
the flaw degree is the difference degree between a pixel point in the fused image and an expected image fusion result of the pixel point;
the obtaining the average energy of the first energy map according to the energy value of each pixel point in the first energy map specifically includes:
carrying out binarization on the first energy map to obtain a second energy map, and counting the number of non-zero pixel points in the second energy map;
acquiring the sum of the energy values of each pixel point in the first energy map, and taking the sum of the energy values as the energy sum of the first energy map;
and acquiring the average energy according to the number of non-zero pixel points in the second energy map and the energy sum.
6. The image fusion quality evaluation system of claim 5, wherein the flaw acquisition module comprises:
the energy map submodule is used for taking the absolute value of the difference between the gray values of the pixels at each corresponding position as the energy value of the pixels at each corresponding position in the first energy map to obtain the first energy map;
the average energy submodule is used for acquiring the average energy of the first energy map according to the energy value of each pixel point in the first energy map;
and the flaw calculation submodule is used for determining the flaw degree of each pixel point in the fused image according to the energy value of each pixel point in the first energy map and the average energy.
7. A non-transitory computer-readable storage medium storing computer instructions that cause a computer to perform the method of any one of claims 1 to 4.
CN201810168825.2A 2018-02-28 2018-02-28 Image fusion quality evaluation method and system Active CN110211085B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810168825.2A CN110211085B (en) 2018-02-28 2018-02-28 Image fusion quality evaluation method and system

Publications (2)

Publication Number Publication Date
CN110211085A CN110211085A (en) 2019-09-06
CN110211085B true CN110211085B (en) 2021-04-27

Family

ID=67778767

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810168825.2A Active CN110211085B (en) 2018-02-28 2018-02-28 Image fusion quality evaluation method and system

Country Status (1)

Country Link
CN (1) CN110211085B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113674232A (en) * 2021-08-12 2021-11-19 Oppo广东移动通信有限公司 Image noise estimation method and device, electronic equipment and storage medium

Citations (6)

Publication number Priority date Publication date Assignee Title
CN1604139A (en) * 2004-10-28 2005-04-06 上海交通大学 Method for constructing image fusion estimation system
CN101115131A (en) * 2006-07-28 2008-01-30 南京理工大学 Pixel space relativity based image syncretizing effect real-time estimating method and apparatus
CN101334893A (en) * 2008-08-01 2008-12-31 天津大学 Fused image quality integrated evaluating method based on fuzzy neural network
CN102682439A (en) * 2012-01-15 2012-09-19 河南科技大学 Medical image fusion method based on multidirectional empirical mode decomposition
CN103854265A (en) * 2012-12-03 2014-06-11 西安元朔科技有限公司 Novel multi-focus image fusion technology
CN105447837A (en) * 2016-02-04 2016-03-30 重庆邮电大学 Multi-mode brain image fusion method based on adaptive cloud model


Non-Patent Citations (2)

Title
Infrared and Low-Light-Level Image Fusion Quality Evaluation Based on the Grey Correlation Analysis; Weipeng Zhang; Research Journal of Applied Sciences, Engineering and Technology; 2013-05-25; pp. 2902-2907 *
Objective Image Fusion Quality Evaluation Using Structural Similarity; ZHENG Youzhi et al.; Tsinghua Science and Technology; 2009-12-31; pp. 703-709 *


Similar Documents

Publication Publication Date Title
US10169655B2 (en) Detection of logos in a sequence of video frames
US9619897B2 (en) Correction of blotches in component images
US20200311981A1 (en) Image processing method, image processing apparatus, image processing system, and learnt model manufacturing method
KR20180109665A (en) A method and apparatus of image processing for object detection
KR100485594B1 (en) A method for removing noise in image and a system thereof
CN109284673B (en) Object tracking method and device, electronic equipment and storage medium
CN109829859B (en) Image processing method and terminal equipment
JP2006067585A (en) Method and apparatus for specifying position of caption in digital image and extracting thereof
CN111696064B (en) Image processing method, device, electronic equipment and computer readable medium
CN110708568B (en) Video content mutation detection method and device
CN111031359B (en) Video playing method and device, electronic equipment and computer readable storage medium
CN116189079A (en) Abnormality detection method and device for monitoring equipment
US8442348B2 (en) Image noise reduction for digital images using Gaussian blurring
CN113762220B (en) Object recognition method, electronic device, and computer-readable storage medium
CN110889817A (en) Image fusion quality evaluation method and device
CN110211085B (en) Image fusion quality evaluation method and system
US7646892B2 (en) Image inspecting apparatus, image inspecting method, control program and computer-readable storage medium
CN113628192B (en) Image blur detection method, apparatus, device, storage medium, and program product
CN116612355A (en) Training method and device for face fake recognition model, face recognition method and device
US9589331B2 (en) Method and apparatus for determining a detection of a defective object in an image sequence as a misdetection
CN114821596A (en) Text recognition method and device, electronic equipment and medium
CN111161211B (en) Image detection method and device
KR102556350B1 (en) Method and Apparatus for Calculating Ratio of Lesion Area
CN110942420B (en) Method and device for eliminating image captions
CN108447107B (en) Method and apparatus for generating video

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant