CN113793291A - Image fusion method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN113793291A
Authority
CN
China
Prior art keywords
image
pixel point
saturation
fusion weight
fusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111200087.3A
Other languages
Chinese (zh)
Other versions
CN113793291B (en)
Inventor
张鹤
丁玲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN202111200087.3A priority Critical patent/CN113793291B/en
Publication of CN113793291A publication Critical patent/CN113793291A/en
Application granted granted Critical
Publication of CN113793291B publication Critical patent/CN113793291B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Processing Of Color Television Signals (AREA)
  • Color Television Image Signal Generators (AREA)
  • Image Processing (AREA)

Abstract

The embodiment of the invention provides an image fusion method and apparatus, an electronic device, and a storage medium. The method includes: acquiring an image to be processed; performing white balance correction on the image to be processed with a first white balance parameter to obtain a first image, and with a second white balance parameter to obtain a second image; extracting the saturation feature of the first image as a first saturation feature, and the saturation feature of the second image as a second saturation feature; determining a first fusion weight corresponding to the first image based on the first saturation feature and the relationship between image white balance and saturation, and a second fusion weight corresponding to the second image based on the second saturation feature and the same relationship; and performing image fusion on the first image and the second image based on the first and second fusion weights to obtain a fused image. This solves the color cast problem of images shot in complex illumination environments.

Description

Image fusion method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image fusion method and apparatus, an electronic device, and a storage medium.
Background
Shooting scenes are complex and varied, and several light sources, such as sunlight, car lights, incandescent lamps, fluorescent lamps, and moonlight, may be present in a scene at the same time. Images shot in such scenes are affected by the different light sources and exhibit color cast: some areas may appear yellowish or bluish, distorting the visual effect of the image.
Current approaches to the color cast problem usually apply global white balance correction to the image with a fixed color temperature value. When an image shot under complex light sources is corrected globally, only some areas are freed of color cast, while the problem persists in the remaining areas.
For example, when the image is corrected for a high color temperature, the low color temperature parts of the image appear yellowish; when it is corrected for a low color temperature, the high color temperature parts appear bluish. If global white balance correction uses a single color temperature, color deviation remains in both the low and the high color temperature parts. The current approach therefore cannot adequately solve the color cast problem of images shot in complex illumination environments.
Disclosure of Invention
The embodiment of the invention aims to provide an image fusion method, an image fusion device, electronic equipment and a storage medium, so as to solve the color cast problem of an image shot in a complex illumination environment. The specific technical scheme is as follows:
in a first aspect, an embodiment of the present invention provides an image fusion method, where the method includes:
acquiring an image to be processed;
performing white balance correction on the image to be processed by adopting a first white balance parameter to obtain a first image, and performing white balance correction on the image to be processed by adopting a second white balance parameter to obtain a second image; wherein the color temperature of the first image is higher than the color temperature of the second image;
extracting the saturation characteristic of the first image to serve as a first saturation characteristic, and extracting the saturation characteristic of the second image to serve as a second saturation characteristic;
determining a first fusion weight corresponding to the first image based on the first saturation characteristic and the relation between the white balance and the saturation of the image, and determining a second fusion weight corresponding to the second image based on the second saturation characteristic and the same relation; the fusion weight indicates how close the corresponding pixel point in the image is to its actual color;
and performing image fusion on the first image and the second image based on the first fusion weight and the second fusion weight to obtain a fused image.
Optionally, the step of extracting the saturation feature of the first image as a first saturation feature, and extracting the saturation feature of the second image as a second saturation feature includes:
calculating the difference value between the maximum value and the minimum value in R, G, B values of each pixel point in the first image as a first saturation characteristic corresponding to the pixel point;
and calculating the difference value between the maximum value and the minimum value in R, G, B values of each pixel point in the second image as a second saturation characteristic corresponding to the pixel point.
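The max-min extraction in the two steps above can be sketched in Python with NumPy (a minimal illustration; the function name and the H x W x 3 array layout are assumptions, not fixed by the patent):

```python
import numpy as np

def saturation_feature(image):
    """Per-pixel saturation feature: max(R, G, B) - min(R, G, B).

    image: H x W x 3 array of R, G, B values. A gray or white pixel
    (R == G == B) yields 0; a vivid pixel yields a large value.
    """
    return image.max(axis=2) - image.min(axis=2)
```

The same function serves for both images, producing the first and second saturation features respectively.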
Optionally, the step of determining a first fusion weight corresponding to the first image based on the first saturation feature and the relationship between the white balance and the saturation of the image, and determining a second fusion weight corresponding to the second image based on the second saturation feature and the relationship between the white balance and the saturation of the image includes:
for each pixel point in the first image, determining a first reliability corresponding to the pixel point according to the first saturation characteristic corresponding to the pixel point and a preset correspondence between saturation characteristics and reliabilities; the reliability represents how trustworthy the R, G, B values of the corresponding pixel point in the image are;
aiming at each pixel point in the second image, determining a second credibility corresponding to the pixel point according to a second saturation characteristic of the pixel point and the corresponding relation;
determining a first fusion weight corresponding to each pixel point in the first image and a second fusion weight corresponding to each pixel point in the second image based on the first confidence level and the second confidence level.
Optionally, the step of determining a first fusion weight corresponding to each pixel point in the first image and a second fusion weight corresponding to each pixel point in the second image based on the first confidence level and the second confidence level includes:
calculating the ratio between the first credibility corresponding to the pixel point and the first target value as the first fusion weight corresponding to the pixel point aiming at each pixel point in the first image; the first target value is the sum of a first credibility corresponding to the pixel point and a second credibility corresponding to the pixel point with the same position as the pixel point in the second image;
calculating, for each pixel point in the second image, the ratio between the second reliability corresponding to the pixel point and a second target value, as the second fusion weight corresponding to the pixel point; the second target value is the sum of the second reliability corresponding to the pixel point and the first reliability corresponding to the pixel point at the same position in the first image; alternatively,
and determining the difference value between 1 and the first fusion weight corresponding to the pixel point with the same position as the pixel point in the first image as the second fusion weight corresponding to the pixel point aiming at each pixel point in the second image.
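Both weight formulations above can be sketched as follows (an illustrative NumPy version; the small `eps` guarding against two zero reliabilities is an added assumption the patent does not discuss):

```python
import numpy as np

def fusion_weights(cred1, cred2, eps=1e-8):
    """Normalize per-pixel reliabilities into fusion weights.

    cred1, cred2: H x W reliability maps for the first and second
    images. Returns (w1, w2) with w1 + w2 == 1 at every pixel.
    """
    total = cred1 + cred2 + eps      # the first/second target values
    w1 = cred1 / total               # ratio form for the first image
    w2 = 1.0 - w1                    # the patent's alternative: 1 - w1
    return w1, w2
```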
Optionally, the corresponding relationship is a relationship curve between a preset saturation feature and a reliability, and the reliability and the saturation feature are in a negative correlation relationship;
the step of determining, for each pixel point in the first image, a first reliability corresponding to the pixel point according to a first saturation characteristic corresponding to the pixel point and a correspondence between a preset saturation characteristic and a reliability includes:
aiming at each pixel point in the first image, determining a reliability coordinate value corresponding to the first saturation characteristic in the relation curve according to the first saturation characteristic corresponding to the pixel point, and determining the reliability coordinate value as a first reliability corresponding to the pixel point;
the step of determining, for each pixel point in the second image, a second reliability corresponding to the pixel point according to the second saturation characteristic of the pixel point and the correspondence relationship includes:
and determining the credibility coordinate value corresponding to the second saturation characteristic in the relation curve according to the second saturation characteristic corresponding to the pixel point for each pixel point in the second image, and determining the credibility coordinate value as the second credibility corresponding to the pixel point.
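A piecewise-linear relation curve with the required negative correlation could look like the following sketch; the knot values are invented for illustration, since the patent only requires that reliability decrease as the saturation feature grows:

```python
import numpy as np

# Illustrative preset curve: reliability falls as saturation rises.
CURVE_SAT = np.array([0.0, 32.0, 96.0, 255.0])
CURVE_CRED = np.array([1.0, 0.8, 0.3, 0.05])

def credibility(sat_feature):
    """Look up the reliability coordinate value for each saturation
    feature on the preset relation curve (piecewise-linear here)."""
    return np.interp(sat_feature, CURVE_SAT, CURVE_CRED)
```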
Optionally, the step of performing image fusion on the first image and the second image based on the first fusion weight and the second fusion weight to obtain a fused image includes:
and carrying out weighted summation on R, G, B values of the pixel points with the same positions in the first image and the second image according to the corresponding first fusion weight and the second fusion weight, and taking the weighted summation as the R, G, B value of the pixel point at the position after fusion to obtain the fused image.
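The weighted summation can be sketched as follows (assuming two aligned H x W x 3 images and H x W weight maps):

```python
import numpy as np

def fuse_images(img1, img2, w1, w2):
    """Per-pixel weighted sum of the R, G, B values of two aligned
    images; the same weight is applied to all three channels."""
    return w1[..., None] * img1 + w2[..., None] * img2
```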
In a second aspect, an embodiment of the present invention provides an image fusion apparatus, where the apparatus includes:
the image acquisition module is used for acquiring an image to be processed;
the white balance correction module is used for carrying out white balance correction on the image to be processed by adopting a first white balance parameter to obtain a first image and carrying out white balance correction on the image to be processed by adopting a second white balance parameter to obtain a second image; wherein the color temperature of the first image is higher than the color temperature of the second image;
the saturation characteristic extraction module is used for extracting the saturation characteristic of the first image to serve as a first saturation characteristic and extracting the saturation characteristic of the second image to serve as a second saturation characteristic;
a fusion weight determining module, configured to determine a first fusion weight corresponding to the first image based on the first saturation feature and the relationship between image white balance and saturation, and a second fusion weight corresponding to the second image based on the second saturation feature and the same relationship; the fusion weight indicates how close the corresponding pixel point in the image is to its actual color;
and the image fusion module is used for carrying out image fusion on the first image and the second image based on the first fusion weight and the second fusion weight to obtain a fused image.
Optionally, the saturation feature extraction module includes:
a first saturation feature extraction unit, configured to calculate, for each pixel point in the first image, a difference between a maximum value and a minimum value in R, G, B values of the pixel point, where the difference is used as a first saturation feature corresponding to the pixel point;
and the second saturation feature extraction unit is used for calculating a difference value between the maximum value and the minimum value in R, G, B values of each pixel point in the second image as a second saturation feature corresponding to the pixel point.
Optionally, the fusion weight determining module includes:
the first reliability determining unit is configured to determine, for each pixel point in the first image, a first reliability corresponding to the pixel point according to the first saturation characteristic corresponding to the pixel point and the preset correspondence between saturation characteristics and reliabilities; the reliability represents how trustworthy the R, G, B values of the corresponding pixel point in the image are;
the second reliability determining unit is used for determining a second reliability corresponding to each pixel point in the second image according to a second saturation characteristic of the pixel point and the corresponding relation;
a fusion weight determination unit, configured to determine, based on the first confidence level and the second confidence level, a first fusion weight corresponding to each pixel point in the first image and a second fusion weight corresponding to each pixel point in the second image.
Optionally, the fusion weight determining unit includes:
the first fusion weight subunit is configured to calculate, for each pixel point in the first image, a ratio between a first reliability corresponding to the pixel point and a first target value, and use the ratio as a first fusion weight corresponding to the pixel point; the first target value is the sum of a first credibility corresponding to the pixel point and a second credibility corresponding to the pixel point with the same position as the pixel point in the second image;
a second fusion weight subunit, configured to calculate, for each pixel point in the second image, the ratio between the second reliability corresponding to the pixel point and a second target value, as the second fusion weight corresponding to the pixel point; the second target value is the sum of the second reliability corresponding to the pixel point and the first reliability corresponding to the pixel point at the same position in the first image; alternatively,
the fusion weight determining unit is configured to determine, for each pixel point in the second image, the difference between 1 and the first fusion weight corresponding to the pixel point at the same position in the first image, as the second fusion weight corresponding to the pixel point.
Optionally, the corresponding relationship is a relationship curve between a preset saturation feature and a reliability, the reliability and the saturation feature are in a negative correlation relationship, and the first reliability determining unit includes:
a first reliability determining subunit, configured to determine, for each pixel point in the first image, a reliability coordinate value corresponding to a first saturation feature in the relationship curve according to the first saturation feature corresponding to the pixel point, and determine the reliability coordinate value as a first reliability corresponding to the pixel point;
the second reliability determination unit includes:
and the second credibility determining subunit is used for determining a credibility coordinate value corresponding to the second saturation feature in the relation curve according to the second saturation feature corresponding to the pixel point for each pixel point in the second image, and determining the credibility coordinate value as the second credibility corresponding to the pixel point.
Optionally, the image fusion module includes:
and the image fusion unit is used for weighting and summing R, G, B values of the pixel points with the same positions in the first image and the second image according to the corresponding first fusion weight and the second fusion weight, and the weighted sum is used as the R, G, B value of the pixel point at the position after fusion, so that the fused image is obtained.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor, the communication interface, and the memory communicate with each other through the communication bus;
a memory for storing a computer program;
a processor adapted to perform the method steps of any of the above first aspects when executing a program stored in the memory.
In a fourth aspect, the present invention provides a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the method steps of any one of the above first aspects.
The embodiment of the invention has the following beneficial effects:
in the scheme provided by the embodiment of the invention, the electronic device can acquire an image to be processed; perform white balance correction on it with a first white balance parameter to obtain a first image, and with a second white balance parameter to obtain a second image, where the color temperature of the first image is higher than that of the second image. The electronic device can also extract the saturation feature of the first image as a first saturation feature, and that of the second image as a second saturation feature. Based on the first saturation feature and the relationship between image white balance and saturation, it can determine a first fusion weight for the first image, and based on the second saturation feature and the same relationship, a second fusion weight for the second image; a fusion weight indicates how close the corresponding pixel point in an image is to its actual color. Based on the two fusion weights, the electronic device can fuse the first and second images to obtain a fused image. Image saturation expresses the vividness of image color: the higher the saturation, the more vivid the image; the lower the saturation, the closer the image is to achromatic, so saturation can measure how close the image is to "white". Because image white balance is related to saturation, the saturation feature can be used to gauge the white balance of the image, and from the saturation features the fusion weight corresponding to each image can be obtained.
Thus, during image fusion, pixel points closer to the actual color in the two images receive larger fusion weights, so the fused color is closer to the actual color. The fused image therefore has more realistic color, the color cast problem of images shot in complex illumination environments is solved, and the display effect of the image is improved.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of an image fusion method according to an embodiment of the present invention;
FIG. 2 is a flow chart of the solution in the embodiment shown in FIG. 1;
FIG. 3(a) is a schematic diagram of the first image in step S102 in the embodiment shown in FIG. 1;
FIG. 3(b) is a schematic diagram of the second image in step S102 in the embodiment shown in FIG. 1;
FIG. 4 is a specific flowchart based on step S104 in the embodiment shown in FIG. 1;
FIG. 5 is a schematic illustration of one implementation of the embodiment shown in FIG. 1;
FIG. 6 is a flow chart of a two-color temperature fusion algorithm in the embodiment shown in FIG. 2;
FIG. 7 is a schematic diagram of the fused image in step S105 in the embodiment shown in FIG. 1;
FIG. 8 is a schematic view of a labeling map of the image of FIG. 7;
fig. 9 is a schematic structural diagram of an image fusion apparatus according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
In order to solve the color cast problem of an image shot in a complex illumination environment and improve the display effect of the image, embodiments of the present invention provide an image fusion method, an image fusion device, an electronic device, a computer-readable storage medium, and a computer program product.
The image fusion method provided by the embodiment of the present invention can be applied to any electronic device that needs to correct image color cast and can perform processing such as white balance parameter adjustment on an image, for example a computer, a camera, or another processing device; no specific limitation is made here. For clarity of description, such a device is hereinafter simply called the electronic device.
As shown in fig. 1, an image fusion method includes:
s101, acquiring an image to be processed;
s102, performing white balance correction on the image to be processed by adopting a first white balance parameter to obtain a first image, and performing white balance correction on the image to be processed by adopting a second white balance parameter to obtain a second image;
wherein a color temperature of the first image is higher than a color temperature of the second image.
S103, extracting the saturation feature of the first image to serve as a first saturation feature, and extracting the saturation feature of the second image to serve as a second saturation feature;
s104, determining a first fusion weight corresponding to the first image based on the first saturation characteristic and the relation between the white balance and the saturation of the image, and determining a second fusion weight corresponding to the second image based on the second saturation characteristic and the relation between the white balance and the saturation of the image;
the fusion weight indicates how close the corresponding pixel point in the image is to its actual color.
And S105, performing image fusion on the first image and the second image based on the first fusion weight and the second fusion weight to obtain a fused image.
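The flow of steps S103 to S105 can be condensed into one sketch (the reciprocal reliability mapping below is an assumed stand-in for the preset negative-correlation curve, which the patent does not fix to a formula):

```python
import numpy as np

def fuse_dual_white_balance(img_high_ct, img_low_ct, eps=1e-8):
    """One-pass sketch of steps S103-S105: extract saturation features,
    map them to reliabilities, normalize into fusion weights, and fuse.

    img_high_ct, img_low_ct: the two white-balance-corrected images
    (H x W x 3 float arrays of R, G, B values).
    """
    def sat(img):                        # S103: max - min per pixel
        return img.max(axis=2) - img.min(axis=2)

    def cred(s):
        # Any decreasing mapping satisfies the patent's negative
        # correlation; this reciprocal form is an assumption.
        return 1.0 / (1.0 + s)

    c1, c2 = cred(sat(img_high_ct)), cred(sat(img_low_ct))
    w1 = c1 / (c1 + c2 + eps)            # S104: normalized fusion weights
    w2 = 1.0 - w1
    # S105: per-pixel weighted sum of R, G, B values
    return w1[..., None] * img_high_ct + w2[..., None] * img_low_ct
```

Fusing an image with itself returns the image unchanged, and a pixel that is strongly saturated (likely color-cast) in one input is dominated by the other input's value.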
Therefore, in the scheme provided by the embodiment of the invention, the electronic device can acquire an image to be processed; perform white balance correction on it with a first white balance parameter to obtain a first image, and with a second white balance parameter to obtain a second image, where the color temperature of the first image is higher than that of the second image. The electronic device can also extract the saturation feature of the first image as a first saturation feature, and that of the second image as a second saturation feature. Based on the first saturation feature and the relationship between image white balance and saturation, it can determine a first fusion weight for the first image, and based on the second saturation feature and the same relationship, a second fusion weight for the second image; a fusion weight indicates how close the corresponding pixel point in an image is to its actual color. Based on the two fusion weights, the electronic device can fuse the first and second images to obtain a fused image. Image saturation expresses the vividness of image color: the higher the saturation, the more vivid the image; the lower the saturation, the closer the image is to achromatic, so saturation can measure how close the image is to "white". Because image white balance is related to saturation, the saturation feature can be used to gauge the white balance of the image, and from the saturation features the fusion weight corresponding to each image can be obtained.
Thus, during image fusion, pixel points closer to the actual color in the two images receive larger fusion weights, so the fused color is closer to the actual color. The fused image therefore has more realistic color, the color cast problem of images shot in complex illumination environments is solved, and the display effect of the image is improved.
To record a scene at a given moment, a camera can be used to shoot it. However, because shooting scenes are complex and varied, several light sources may be present in the scene at the same time, so the captured image is affected by the different light sources and exhibits color cast; in particular, certain areas appear yellowish or bluish when the image is displayed.
For example, in an image captured in a scene lit by both an incandescent lamp and a fluorescent lamp, the part illuminated by the incandescent lamp looks yellowish when displayed because incandescent light has a low color temperature, while the part illuminated by the fluorescent lamp looks bluish because fluorescent light has a high color temperature. Such an image, shot in a complex illumination environment, has a color cast problem, needs processing, and can serve as the image to be processed.
In step S101, the electronic device may acquire an image to be processed, which may be an image captured by a camera in real time, an image stored in advance by the electronic device, or the like, and is not limited herein.
After the electronic device obtains the image to be processed, the electronic device may perform white balance correction on the image to be processed by using the first white balance parameter to obtain a first image, and perform white balance correction on the image to be processed by using the second white balance parameter to obtain a second image, that is, execute step S102.
Since the image to be processed is an image captured in a complex light source environment, there are generally two kinds of color cast regions in the image to be processed, namely a region with blue color irradiated by a light source with a high color temperature and a region with yellow color irradiated by a light source with a low color temperature. The electronic device can select two color temperature parameters to perform white balance correction on the image to be processed, wherein the higher color temperature parameter is used for correcting the blue region of the color, and the lower color temperature parameter is used for correcting the yellow region of the color.
And performing white balance correction on the image to be processed by using a higher color temperature parameter, so that the color of the blue area can be corrected to be closer to a real color, and a first image is obtained. And performing white balance correction on the image to be processed by using a lower color temperature parameter, so that the yellow region can be corrected to be closer to a real color, and a second image is obtained, namely, the color temperature of the first image is higher than that of the second image.
In an embodiment, as shown in fig. 2, a source code stream captured by a camera may pass through two image signal processing systems, which respectively perform warm-light white balance correction (i.e., low color temperature white balance correction) and cold-light white balance correction (i.e., high color temperature white balance correction) on the image to be processed in the source code stream. The white balance parameters adopted by the two image signal processing systems are different, while the other parameters may be the same.
For example, the left half area light source of the image to be processed is an incandescent lamp, the color display is yellowish, the right half area light source is a fluorescent lamp, the color display is bluish, the first image of the image to be processed after high color temperature white balance correction is shown in fig. 3(a), and the second image of the image to be processed after low color temperature white balance correction is shown in fig. 3 (b).
In fig. 3(a) and 3(b), the region 301 and the region 303 correspond to the yellowish left region in the image to be processed, and the region 302 and the region 304 correspond to the bluish right region. After the above-mentioned high color temperature white balance correction, the region 302 is closer to the actual color than the region 304; after the above-mentioned low color temperature white balance correction, the region 303 is closer to the actual color than the region 301.
As shown in fig. 2, after the first image and the second image are obtained, the electronic device may perform two-color-temperature fusion on the first image and the second image and output the result, obtaining a fused image whose color is closer to the actual color, thereby solving the color cast problem.
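As an illustration of the dual-correction step above, the sketch below applies two different sets of per-channel white-balance gains to the same image, producing a first (high color temperature) and a second (low color temperature) image. The gain values are hypothetical, chosen only to show the direction of each correction; they are not parameters from this embodiment.

```python
import numpy as np

def apply_white_balance(image, gains):
    """Scale each RGB channel by its white-balance gain and clip to [0, 255]."""
    corrected = image.astype(np.float64) * np.asarray(gains, dtype=np.float64)
    return np.clip(corrected, 0.0, 255.0)

# A toy 2x2 gray "image to be processed".
raw = np.full((2, 2, 3), 128.0)

# Hypothetical gains: the high-color-temperature correction boosts red to
# neutralize bluish regions; the low-color-temperature correction boosts
# blue to neutralize yellowish regions.
first_image = apply_white_balance(raw, (1.4, 1.0, 0.8))   # high color temperature
second_image = apply_white_balance(raw, (0.8, 1.0, 1.4))  # low color temperature
```

Running both corrections on the same source stream yields the pair of images that the fusion stage consumes.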
Specifically, the electronic device may extract the saturation feature of the first image as the first saturation feature and extract the saturation feature of the second image as the second saturation feature, that is, perform step S103 described above.
Image saturation represents how vivid the colors of an image are: the higher the saturation, the more vivid the colors; the lower the saturation, the more achromatic the image appears. Image saturation can therefore be used to measure how close the image is to "white". During image shooting, the color cast problem is most likely to occur in regions whose actual color is white, so the closer a white region in the image is to true white (that is, the lower its saturation), the smaller the color cast of the image and the closer the image is to its true colors. Image saturation can thus be used to represent the degree to which an image approaches its true colors. Accordingly, the electronic device may extract a first saturation feature of the first image and a second saturation feature of the second image.
The color of each pixel point in an image is determined by the values of the three primary colors R (Red), G (Green) and B (Blue), each ranging from 0 to 255; different combinations of R, G, B values form different colors. Since the saturation of an image is related to the three primary color values, as an embodiment, the electronic device may determine the first saturation feature of the first image according to the R, G, B values of each pixel point in the first image, and determine the second saturation feature of the second image according to the R, G, B values of each pixel point in the second image.
Further, in step S104, the electronic device may determine a first fusion weight corresponding to the first image based on the first saturation characteristic and the relationship between the white balance and the saturation of the image, and may determine a second fusion weight corresponding to the second image based on the second saturation characteristic and the relationship between the white balance and the saturation of the image.
The relationship between the white balance and the saturation of an image is that the image saturation can characterize how close the image is to its true colors. Based on this relationship, the electronic device may determine the first fusion weight corresponding to the first image from the first saturation feature, and determine the second fusion weight corresponding to the second image from the second saturation feature. The fusion weight identifies the degree to which the corresponding pixel point in the image is close to the actual color.
After determining the first fusion weight and the second fusion weight, the electronic device may execute step S105, that is, perform image fusion on the first image and the second image based on the first fusion weight and the second fusion weight to obtain a fused image. As an implementation manner, the higher the saturation of a pixel point, the lower the degree to which it approaches the actual color, so its fusion weight may be lower and the contribution of its R, G, B values during image fusion smaller; conversely, the lower the saturation of a pixel point, the higher the degree to which it approaches the actual color, so its fusion weight may be larger and the contribution of its R, G, B values during image fusion larger.
Therefore, by performing image fusion with the scheme provided by the embodiment of the present invention, the electronic device can perform white balance processing on the image to be processed to obtain two images with different white balance, extract saturation features from the two images, and calculate their fusion weights, where the image closer to the actual color receives the larger fusion weight. Finally, the two images are fused according to the corresponding fusion weights to obtain a fused image whose color is closer to the actual color. The image fusion scheme provided by the embodiment of the present invention solves the color cast problem of images shot under complex illumination environments and improves the display effect of the images. Meanwhile, the method does not depend on extra hardware, so it has a wide application range, a low deployment threshold, and is easy to popularize.
As an implementation manner of the embodiment of the present invention, the step of extracting the saturation feature of the first image as the first saturation feature and extracting the saturation feature of the second image as the second saturation feature may include:
calculating the difference value between the maximum value and the minimum value in R, G, B values of each pixel point in the first image as a first saturation characteristic corresponding to the pixel point; and calculating the difference value between the maximum value and the minimum value in R, G, B values of each pixel point in the second image as a second saturation characteristic corresponding to the pixel point.
The electronic device may calculate the saturation feature of each pixel point in the first image and the second image. Specifically, for each pixel point of an image, its color is determined by its R, G, B values, each ranging from 0 to 255, and its saturation feature is related to these three primary color values. In one embodiment, the saturation feature of each pixel point of an image can be calculated by using the following formula:
S(x) = (max_{c∈{r,g,b}} I_c(x) − min_{c∈{r,g,b}} I_c(x)) / max_{c∈{r,g,b}} I_c(x)

wherein S(x) is the saturation feature of pixel point x, max_{c∈{r,g,b}} I_c(x) is the maximum value among the R, G, B values of pixel point x, and min_{c∈{r,g,b}} I_c(x) is the minimum value among the R, G, B values of pixel point x. The above formula includes a division operation, and in regions of the image with extremely low brightness the calculation result is usually unstable, i.e., the value fluctuates. Therefore, in this embodiment, the following formula may be adopted instead to calculate the saturation feature of each pixel point:

S(x) = max_{c∈{r,g,b}} I_c(x) − min_{c∈{r,g,b}} I_c(x)
that is to say, for each pixel point in the image, the difference between the maximum value and the minimum value in the R, G, B values of the pixel point can be calculated, and the difference is used as the saturation feature corresponding to the pixel point, and the value range of the saturation feature is 0 to 255.
For example, if the R, G, B values of a certain pixel point of the first image are 200, 255 and 50, respectively, the maximum value is 255 and the minimum value is 50, so according to the above formula the saturation feature of the pixel point is 255 − 50 = 205. For another example, if the R, G, B values of a certain pixel point of the second image are 128, 128 and 128, respectively, the maximum value is 128 and the minimum value is 128, so the saturation feature of the pixel point is 128 − 128 = 0.
As can be seen, in this embodiment, for each pixel point in the first image and the second image, the electronic device may calculate the difference between the maximum value and the minimum value among the R, G, B values of the pixel point as the saturation feature corresponding to the pixel point, thereby obtaining the first saturation feature of the first image and the second saturation feature of the second image. The resulting saturation features represent the degree to which each pixel point is close to the real color, so that the fused image obtained by subsequent image fusion can be as close to the real colors as possible, further improving the display effect of the image.
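The max-minus-min feature above can be sketched in a few lines; this is a minimal illustration of the formula, not this embodiment's implementation.

```python
import numpy as np

def saturation_feature(image):
    """Per-pixel saturation feature: max(R, G, B) - min(R, G, B).

    This is the division-free variant the text prefers, which stays
    stable in very dark regions.
    """
    image = np.asarray(image, dtype=np.int32)
    return image.max(axis=-1) - image.min(axis=-1)

# The two examples from the text: (200, 255, 50) and (128, 128, 128).
pixels = np.array([[[200, 255, 50], [128, 128, 128]]])
feats = saturation_feature(pixels)  # saturation features 205 and 0
```

The feature map has the same height and width as the input image, with one value per pixel in the range 0 to 255.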
As an embodiment of the present invention, as shown in fig. 4, the step of determining a first fusion weight corresponding to the first image based on the first saturation characteristic and the relationship between the white balance and the saturation of the image, and determining a second fusion weight corresponding to the second image based on the second saturation characteristic and the relationship between the white balance and the saturation of the image may include:
s401, aiming at each pixel point in the first image, determining a first credibility corresponding to the pixel point according to a first saturation characteristic corresponding to the pixel point and a corresponding relation between a preset saturation characteristic and a credibility;
s402, aiming at each pixel point in the second image, determining a second credibility corresponding to the pixel point according to a second saturation characteristic of the pixel point and the corresponding relation;
the credibility is used to represent the credibility of the R, G, B value of the corresponding pixel in the image, that is, the difference between the color of the pixel and the actual color. Since the lower the saturation of a pixel point, the higher the degree of the pixel point approaching to the actual color, the higher the confidence level of the pixel point should be. Conversely, the higher the saturation of a pixel point, the lower the degree that the pixel point is close to the actual color, so the lower the confidence level of the pixel point should be.
Based on this rule, the correspondence between the saturation feature and the confidence can be established in advance. Specifically, the value of the saturation feature is quantized to an integer between 0 and 255 and can therefore be enumerated, so the correspondence between the saturation feature and the confidence may be preset. The correspondence may be recorded by using a table, for example, as shown in the following table:

Serial number | Saturation feature | Confidence
1 | 0 | X0
2 | 1 | X1
… | … | …
256 | 255 | X255
Then, if the value of the saturation feature of a certain pixel point of the first image is 125, the electronic device may determine that the first reliability corresponding to the pixel point is X125 according to the correspondence between the saturation feature and the reliability recorded in the above table. If the value of the saturation feature of a certain pixel point of the second image is 255, it may be determined that the second reliability corresponding to the pixel point is X255.
S403, determining a first fusion weight corresponding to each pixel point in the first image and a second fusion weight corresponding to each pixel point in the second image based on the first confidence level and the second confidence level.
After determining the first confidence corresponding to each pixel point in the first image and the second confidence corresponding to each pixel point in the second image, the electronic device can determine the first fusion weight corresponding to each pixel point in the first image and the second fusion weight corresponding to each pixel point in the second image based on the first confidence and the second confidence.
Since the reliability is used for representing the credibility of the R, G, B value of the corresponding pixel point in the image, that is, the higher the reliability is, the more accurate the R, G, B value of the pixel point is, the first fusion weight and the first reliability are in a positive correlation relationship, and the second fusion weight and the second reliability are in a positive correlation relationship. That is, for each pixel point, if the first reliability of the pixel point is greater than the second reliability, the corresponding first fusion weight is greater than the second fusion weight; correspondingly, if the first credibility of the pixel point is smaller than the second credibility, the corresponding first fusion weight is smaller than the second fusion weight.
Therefore, based on the relationship, the electronic device may determine a first fusion weight corresponding to each pixel point in the first image and a second fusion weight corresponding to each pixel point in the second image based on the first confidence level and the second confidence level in any manner capable of representing the relationship. For clarity of the text, examples will be presented later.
There is no restriction on the execution order of step S401 and step S402: it is reasonable to execute step S401 first, to execute step S402 first, or to execute step S401 and step S402 simultaneously.
As can be seen, in this embodiment, for each pixel point in the first image and the second image, the electronic device determines the reliability corresponding to the pixel point based on the saturation feature corresponding to the pixel point and the corresponding relationship between the preset saturation feature and the reliability. Further, a first fusion weight corresponding to each pixel point in the first image and a second fusion weight corresponding to each pixel point in the second image are determined based on the first confidence level and the second confidence level. Therefore, the corresponding reliability of each pixel point can be accurately determined based on the corresponding relation between the preset saturation characteristic and the reliability, the fusion weight is further accurately determined, and the fused image obtained by subsequent image fusion can be close to the real color as far as possible.
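The table or curve lookup described above can be sketched as a 256-entry lookup table. The endpoint values here (confidence 100 at saturation 0, confidence 20 at saturation 255) follow the example curve the text gives for fig. 5, but the linear shape in between is an assumption made for this sketch; the X0…X255 entries of the actual embodiment are not specified.

```python
import numpy as np

# 256-entry lookup table mapping saturation feature (0..255) to confidence.
# Endpoints follow the text's fig. 5 example; the linear interpolation
# between them is an assumption of this sketch.
CONFIDENCE_LUT = np.linspace(100.0, 20.0, 256)

def confidence(sat_feature):
    """Look up the confidence for each saturation-feature value (0..255)."""
    return CONFIDENCE_LUT[np.asarray(sat_feature, dtype=np.intp)]
```

Because `sat_feature` may be a full feature map, one indexing operation maps every pixel's saturation feature to its confidence at once.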
As an implementation manner of the embodiment of the present invention, the step of determining a first fusion weight corresponding to each pixel point in the first image and a second fusion weight corresponding to each pixel point in the second image based on the first reliability and the second reliability may include:
calculating the ratio between the first credibility corresponding to the pixel point and the first target value as the first fusion weight corresponding to the pixel point aiming at each pixel point in the first image; calculating the ratio between the second reliability corresponding to the pixel point and the second target value as a second fusion weight corresponding to the pixel point aiming at each pixel point in the second image; or, for each pixel point in the second image, determining a difference value between 1 and a first fusion weight corresponding to a pixel point in the first image, which is at the same position as the pixel point, as a second fusion weight corresponding to the pixel point.
In an embodiment, after determining a first reliability corresponding to each pixel point of the first image and a second reliability corresponding to each pixel point of the second image, the electronic device may perform normalization processing on the first reliability and the second reliability to obtain a first fusion weight corresponding to each pixel point in the first image and a second fusion weight corresponding to each pixel point in the second image.
The following formula can be used to calculate a first fusion weight corresponding to each pixel point in the first image and a second fusion weight corresponding to each pixel point in the second image:
W0(x) = C0(x) / (C0(x) + C1(x)),  W1(x) = C1(x) / (C0(x) + C1(x))

wherein W0(x) is the first fusion weight corresponding to each pixel point in the first image, W1(x) is the second fusion weight corresponding to each pixel point in the second image, C0(x) is the first confidence of each pixel point in the first image, and C1(x) is the second confidence of each pixel point in the second image.
That is to say, for each pixel point in the first image, the electronic device may calculate a sum of a first reliability corresponding to the pixel point and a second reliability corresponding to a pixel point in the second image having the same position as the pixel point, as a first target value, and further calculate a ratio between the first reliability corresponding to the pixel point and the first target value, as a first fusion weight corresponding to the pixel point.
For each pixel point in the second image, in an embodiment, the electronic device may calculate, according to the above formula, a first fusion weight corresponding to a pixel point in the first image, where the pixel point is located at the same position as the pixel point. Since the sum of the first fusion weight corresponding to each pixel point in the first image and the second fusion weight corresponding to the pixel point with the same pixel point position in the second image is 1, for each pixel point in the second image, the difference between 1 and the first fusion weight corresponding to the pixel point with the same pixel point position in the first image can be determined as the second fusion weight corresponding to the pixel point.
In another embodiment, for each pixel point in the second image, the electronic device may also calculate a sum of the second reliability corresponding to the pixel point and the first reliability corresponding to the pixel point in the first image at the same position as the pixel point, as a second target value, and further calculate a ratio between the second reliability corresponding to the pixel point and the second target value, as a second fusion weight corresponding to the pixel point, which is reasonable. That is, the second fusion weight can be calculated using the following formula:
W1(x) = C1(x) / (C1(x) + C0(x))

wherein W1(x) is the second fusion weight corresponding to each pixel point in the second image, C1(x) is the second confidence of each pixel point in the second image, and C0(x) is the first confidence of each pixel point in the first image.
For example, suppose that in the first image the first confidence corresponding to pixel point A is 128, and in the second image the second confidence corresponding to pixel point B, which has the same position as pixel point A, is 72. Then the first target value is 128 + 72 = 200, and the first fusion weight of pixel point A is the ratio of the first confidence 128 to the first target value 200, i.e., 128/200 = 0.64. The second fusion weight of pixel point B is the difference between 1 and the first fusion weight, i.e., 1 − 0.64 = 0.36. It can also be obtained by the following calculation: the second target value is 72 + 128 = 200, and the second fusion weight of pixel point B is the ratio of the second confidence 72 to the second target value 200, i.e., 72/200 = 0.36. Correspondingly, the first fusion weight corresponding to pixel point A is 0.64.
As can be seen, in this embodiment, the electronic device may determine, based on the first confidence level and the second confidence level, a first fusion weight corresponding to each pixel point in the first image and a second fusion weight corresponding to each pixel point in the second image. In this way, the first and second fusion weights may be determined quickly and accurately.
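The normalization just described can be sketched as follows; it reproduces the worked example with confidences 128 and 72, and works elementwise on whole confidence maps.

```python
import numpy as np

def fusion_weights(conf_first, conf_second):
    """Normalize two confidence maps into fusion weights.

    W0 = C0 / (C0 + C1) and W1 = 1 - W0, matching the normalization
    described above.
    """
    c0 = np.asarray(conf_first, dtype=np.float64)
    c1 = np.asarray(conf_second, dtype=np.float64)
    w0 = c0 / (c0 + c1)
    return w0, 1.0 - w0

# The worked example from the text: confidences 128 (pixel A) and 72 (pixel B).
w0, w1 = fusion_weights(128, 72)  # w0 = 0.64, w1 = 0.36
```

Computing W1 as 1 − W0 saves a division and guarantees the two weights sum to exactly 1 at every pixel.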
As an implementation manner of the embodiment of the present invention, the correspondence may be a relationship curve between a preset saturation characteristic and a reliability, where the reliability and the saturation characteristic are in a negative correlation relationship.
For example, the relationship curve may be as shown in fig. 5, where the abscissa is the value of the saturation feature, ranging from 0 to 255, and the ordinate is the confidence, which may be set, for example, to range from 20 to 100. The confidence and the saturation feature are in a negative correlation: when the saturation feature is 0 the corresponding confidence is 100, and when the saturation feature is 255 the corresponding confidence is 20.
Correspondingly, the step of determining the first reliability corresponding to each pixel point in the first image according to the first saturation characteristic corresponding to the pixel point and the corresponding relationship between the preset saturation characteristic and the reliability may include: and aiming at each pixel point in the first image, determining a reliability coordinate value corresponding to the first saturation characteristic in the relation curve according to the first saturation characteristic corresponding to the pixel point, and determining the reliability coordinate value as a first reliability corresponding to the pixel point.
The step of determining, for each pixel point in the second image, a second reliability corresponding to the pixel point according to the second saturation characteristic of the pixel point and the correspondence relationship may include: and determining the credibility coordinate value corresponding to the second saturation characteristic in the relation curve according to the second saturation characteristic corresponding to the pixel point for each pixel point in the second image, and determining the credibility coordinate value as the second credibility corresponding to the pixel point.
On the preset relationship curve between the saturation feature and the confidence, it can be seen that when the value of the saturation feature of a pixel point is small, the corresponding confidence stays at a high level, and as the value of the saturation feature gradually increases, the corresponding confidence decreases. That is, a pixel point with a low saturation feature value has a higher confidence and a higher fusion weight, so its R, G, B values contribute more to the fused image during the subsequent image fusion, making the color of the fused image closer to the real color.
Therefore, in this embodiment, through a preset relationship curve between the saturation feature and the confidence level, the electronic device can quickly and accurately determine the confidence level corresponding to the saturation feature according to the saturation feature corresponding to the pixel point, so as to improve the image fusion speed.
As an implementation manner of the embodiment of the present invention, the step of performing image fusion on the first image and the second image based on the first fusion weight and the second fusion weight to obtain a fused image may include:
and carrying out weighted summation on R, G, B values of the pixel points with the same positions in the first image and the second image according to the corresponding first fusion weight and the second fusion weight, and taking the weighted summation as the R, G, B value of the pixel point at the position after fusion to obtain the fused image.
For each pixel point with the same position in the first image and the second image, the electronic device may multiply R, G, B values of the pixel point in the first image by the corresponding first fusion weight, respectively, to obtain a first product. And multiplying the R, G, B values of the pixel points with the same position in the second image by the corresponding second fusion weights respectively to obtain a second product. And R, G, B values in the first product and the second product can be respectively added to be used as R, G, B values of the pixel point at the position after fusion, so that the fused image can be obtained.
That is, the electronic device may calculate R, G, B values for each pixel point in the fused image using the following formula:
J_c(x) = W0(x) · I0_c(x) + W1(x) · I1_c(x), c ∈ {r, g, b}

wherein J_c(x) is the R, G, B value of pixel point x in the fused image, I0_c(x) is the R, G, B value of pixel point x in the first image, W0(x) is the first fusion weight of pixel point x in the first image, I1_c(x) is the R, G, B value of the pixel point in the second image at the same position as pixel point x, and W1(x) is the second fusion weight of that pixel point in the second image.
For example, the R, G, B values of the pixel point a in the first image are 255, 255 and 0, respectively, and the corresponding first fusion weight is 0.4; the R, G, B values of the pixel point b in the second image, which is located at the same position as pixel point a, are 245, 235 and 10, respectively, and the corresponding second fusion weight is 0.6. Then, in the above manner, the R value of the fused pixel point is 255 × 0.4 + 245 × 0.6 = 249, the G value is 255 × 0.4 + 235 × 0.6 = 243, and the B value is 0 × 0.4 + 10 × 0.6 = 6. Through the above calculation, the R, G, B values of the pixel point at that position after fusion are 249, 243 and 6, respectively.
It can be seen that, in this embodiment, for the pixel points with the same position in the first image and the second image, the electronic device performs weighted summation according to the R, G, B value thereof and the corresponding fusion weight to obtain the R, G, B value of the position after fusion, so as to obtain the fused image, and obtain the fused image closer to the true color, thereby well solving the color cast problem.
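The weighted summation above can be sketched as a per-pixel blend in which the weight map is broadcast over the three color channels; the example reproduces the pixel arithmetic from the text.

```python
import numpy as np

def fuse(first_image, second_image, w_first):
    """Per-pixel weighted fusion: J = W0 * I0 + (1 - W0) * I1."""
    # Add a channel axis so an H x W weight map broadcasts over R, G, B.
    w = np.asarray(w_first, dtype=np.float64)[..., np.newaxis]
    fused = (w * np.asarray(first_image, dtype=np.float64)
             + (1.0 - w) * np.asarray(second_image, dtype=np.float64))
    return np.rint(fused).astype(np.uint8)

# The example from the text: (255, 255, 0) at weight 0.4 fused with
# (245, 235, 10) at weight 0.6 gives (249, 243, 6).
a = np.array([[[255, 255, 0]]])
b = np.array([[[245, 235, 10]]])
fused = fuse(a, b, [[0.4]])
```

Rounding to the nearest integer and casting back to 8-bit keeps the fused result in a standard displayable image format.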
As an implementation manner of the embodiment of the present invention, the saturation feature, the confidence level, and the fusion weight may be represented in the form of an image, as shown in fig. 6, after white balance processing, the electronic device obtains two images with different white balances, and the two-color-temperature fusion algorithm may include a color feature extraction 601, a feature mapping 602, a weight fusion 603, and a fusion output 604. The color feature extraction 601 is to extract a first saturation feature of the first image and a second saturation feature of the second image, the feature mapping 602 is to obtain a first confidence level and a second confidence level according to the first saturation feature and the second saturation feature, the weight fusion 603 is to perform normalization processing on the first confidence level and the second confidence level to obtain a first fusion weight and a second fusion weight, and the fusion output 604 is to perform weighted fusion on the first image and the second image according to the first fusion weight and the second fusion weight to obtain a fused image. Through the steps, the two images with different white balance are fused into one image, and the color of the fused image is closer to the actual color.
As an implementation manner of the embodiment of the present invention, after obtaining the first fusion weight and the second fusion weight, the electronic device may further use a small-radius gaussian blur for the first fusion weight and the second fusion weight to locally smooth the first fusion weight and the second fusion weight, so that the color of the image obtained after the fusion is more natural.
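A minimal sketch of that smoothing step, using a small separable Gaussian kernel built by hand; the radius and sigma values are illustrative, not parameters from this embodiment.

```python
import numpy as np

def smooth_weights(weights, radius=1, sigma=1.0):
    """Locally smooth a 2-D fusion-weight map with a small-radius Gaussian blur."""
    xs = np.arange(-radius, radius + 1, dtype=np.float64)
    kernel = np.exp(-xs ** 2 / (2.0 * sigma ** 2))
    kernel /= kernel.sum()  # normalize so total weight is preserved
    # Separable blur: edge-pad, filter rows, then filter columns.
    padded = np.pad(np.asarray(weights, dtype=np.float64), radius, mode="edge")
    rows = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="valid"), 1, padded)
    return np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="valid"), 0, rows)

# A hard 0/1 weight edge is softened into a gradual transition,
# which avoids abrupt color jumps in the fused image.
w = np.zeros((5, 5))
w[:, 3:] = 1.0
smoothed = smooth_weights(w)
```

In practice a library blur routine would serve the same purpose; the point is that each weight map is smoothed before the weighted summation so neighboring pixels blend gradually.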
The image obtained by fusing the first image and the second image through the above steps is shown in fig. 7, and the image to be processed is shown in fig. 8. The region 701 in fig. 7 and the region 801 in fig. 8 correspond to the same position. It can be seen that, compared with the region 801 of the image to be processed, the color cast in the region 701 of the fused image is reduced and the color is closer to the actual color. The image fusion method of the embodiment of the present invention solves the color cast problem of images shot under complex illumination environments and improves the display effect of the images.
Corresponding to the image fusion method, an embodiment of the present invention further provides an image fusion device, and an image fusion device provided in an embodiment of the present invention is described below.
As shown in fig. 9, an image fusion apparatus, the apparatus comprising:
an image obtaining module 901, configured to obtain an image to be processed;
a white balance correction module 902, configured to perform white balance correction on the image to be processed by using a first white balance parameter to obtain a first image, and perform white balance correction on the image to be processed by using a second white balance parameter to obtain a second image;
wherein the color temperature of the first image is higher than the color temperature of the second image;
a saturation feature extraction module 903, configured to extract a saturation feature of the first image as a first saturation feature, and extract a saturation feature of the second image as a second saturation feature;
a fusion weight determining module 904, configured to determine a first fusion weight corresponding to the first image based on the first saturation feature and a relationship between image white balance and saturation, and determine a second fusion weight corresponding to the second image based on the second saturation feature and a relationship between image white balance and saturation;
the fusion weight is used for identifying the degree that the corresponding pixel point in the image is close to the actual color;
an image fusion module 905, configured to perform image fusion on the first image and the second image based on the first fusion weight and the second fusion weight, so as to obtain a fused image.
Therefore, in the scheme provided by the embodiment of the invention, the electronic device can acquire an image to be processed; perform white balance correction on the image to be processed with a first white balance parameter to obtain a first image, and with a second white balance parameter to obtain a second image, wherein the color temperature of the first image is higher than that of the second image. The electronic device can also extract the saturation feature of the first image as a first saturation feature and the saturation feature of the second image as a second saturation feature. Based on the first saturation feature and the relationship between image white balance and saturation, the electronic device can determine a first fusion weight corresponding to the first image, and based on the second saturation feature and the same relationship, determine a second fusion weight corresponding to the second image; the fusion weight indicates the degree to which the corresponding pixel point in the image approaches the actual color. Based on the first fusion weight and the second fusion weight, the electronic device can fuse the first image and the second image to obtain a fused image. Image saturation expresses the vividness of the image colors: the higher the saturation, the more vivid the colors; the lower the saturation, the closer the image is to achromatic, so saturation can be used to measure how close the image is to "white". Because image white balance is related to saturation, the white balance of an image can be measured with its saturation feature, and the fusion weight corresponding to the image can be obtained from that feature.
Therefore, in the process of image fusion, the pixel points in the two images whose colors are closer to the actual colors receive larger fusion weights, so the fused colors are closer to the actual colors. The color of the fused image is thus more realistic, which solves the color cast problem of images shot in a complex illumination environment and improves the display effect of the image.
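The overall flow above can be sketched end to end as follows. This is a minimal illustration, not the patented implementation: it uses a simple linear stand-in (255 minus the saturation feature) for the negatively correlated credibility curve, which the patent leaves as a configurable correspondence, and an `eps` guard that the patent does not mention:

```python
import numpy as np

def saturation(img):
    # Per-pixel saturation feature: max(R, G, B) - min(R, G, B).
    i = img.astype(np.int16)
    return (i.max(axis=2) - i.min(axis=2)).astype(np.float64)

def fuse_pair(img1, img2, eps=1e-6):
    # Lower saturation feature -> higher credibility -> larger fusion weight.
    c1 = 255.0 - saturation(img1)   # hypothetical linear credibility stand-in
    c2 = 255.0 - saturation(img2)
    w1 = c1 / (c1 + c2 + eps)       # first fusion weight per pixel
    w2 = 1.0 - w1                   # second fusion weight is its complement
    out = img1 * w1[..., None] + img2 * w2[..., None]
    return np.clip(np.rint(out), 0, 255).astype(np.uint8)

img1 = np.array([[[100, 100, 100]]], dtype=np.uint8)  # near-gray variant
img2 = np.array([[[200, 50, 50]]], dtype=np.uint8)    # color-cast variant
fused = fuse_pair(img1, img2)
```

In this toy case the near-gray variant gets the larger weight, so the fused pixel is pulled toward it, mirroring the intuition that lower saturation indicates a result closer to "white".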
As an implementation manner of the embodiment of the present invention, the saturation feature extraction module 903 includes:
a first saturation feature extraction unit, configured to calculate, for each pixel point in the first image, a difference between a maximum value and a minimum value in R, G, B values of the pixel point, where the difference is used as a first saturation feature corresponding to the pixel point;
and the second saturation feature extraction unit is used for calculating a difference value between the maximum value and the minimum value in R, G, B values of each pixel point in the second image as a second saturation feature corresponding to the pixel point.
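The saturation feature computed by both extraction units — the difference between the maximum and minimum of a pixel's R, G, B values — can be sketched with NumPy as follows (a minimal illustration; the widening cast avoids uint8 underflow in the subtraction):

```python
import numpy as np

def saturation_feature(image):
    # Per-pixel saturation feature: max(R, G, B) - min(R, G, B).
    img = image.astype(np.int16)  # widen so the subtraction cannot underflow
    return (img.max(axis=2) - img.min(axis=2)).astype(np.uint8)

pixels = np.array([[[200, 120, 40], [90, 90, 90]]], dtype=np.uint8)
feat = saturation_feature(pixels)  # vivid pixel -> 160, gray pixel -> 0
```

Applying this once to the first image and once to the second image yields the first and second saturation features respectively.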
As an implementation manner of the embodiment of the present invention, the fusion weight determining module 904 includes:
the first reliability determining unit is used for determining the first reliability corresponding to each pixel point in the first image according to the first saturation characteristic corresponding to the pixel point and the corresponding relation between the preset saturation characteristic and the reliability; the credibility is used for representing the credibility of R, G, B values of corresponding pixel points in the image;
the second reliability determining unit is used for determining a second reliability corresponding to each pixel point in the second image according to a second saturation characteristic of the pixel point and the corresponding relation;
a fusion weight determination unit, configured to determine, based on the first confidence level and the second confidence level, a first fusion weight corresponding to each pixel point in the first image and a second fusion weight corresponding to each pixel point in the second image.
As an embodiment of the present invention, the fusion weight determining unit includes:
the first fusion weight subunit is configured to calculate, for each pixel point in the first image, a ratio between a first reliability corresponding to the pixel point and a first target value, and use the ratio as a first fusion weight corresponding to the pixel point; the first target value is the sum of a first credibility corresponding to the pixel point and a second credibility corresponding to the pixel point with the same position as the pixel point in the second image;
a second fusion weight subunit, configured to calculate, for each pixel point in the second image, a ratio between a second reliability corresponding to the pixel point and a second target value, which is used as a second fusion weight corresponding to the pixel point; the second target value is the sum of the second credibility corresponding to the pixel point and the first credibility corresponding to the pixel point in the first image at the same position as the pixel point; or, alternatively,
and the fusion weight determining unit is used for determining a difference value between 1 and a first fusion weight corresponding to a pixel point in the first image, which is at the same position as the pixel point, as a second fusion weight corresponding to the pixel point, aiming at each pixel point in the second image.
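Both variants of the fusion weight computation — the ratio of each credibility to their sum, or one minus the other image's weight — reduce to the same normalization. A hedged sketch (the `eps` guard against a zero denominator is an assumption; the patent does not specify that case):

```python
import numpy as np

def fusion_weights(cred1, cred2, eps=1e-6):
    # First weight: cred1 / (cred1 + cred2); second weight: its complement.
    c1 = cred1.astype(np.float64)
    c2 = cred2.astype(np.float64)
    w1 = c1 / (c1 + c2 + eps)
    return w1, 1.0 - w1

c1 = np.array([[0.8, 0.5]])  # hypothetical first credibilities
c2 = np.array([[0.2, 0.5]])  # hypothetical second credibilities
w1, w2 = fusion_weights(c1, c2)
```

Note that computing the second weight as `1.0 - w1` is equivalent to the ratio form `c2 / (c1 + c2)`, which is why the two claimed alternatives yield the same result.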
As an embodiment of the present invention, the correspondence relationship is a relationship curve between a preset saturation characteristic and a reliability, and the reliability and the saturation characteristic have a negative correlation, and the first reliability determining unit includes:
a first reliability determining subunit, configured to determine, for each pixel point in the first image, a reliability coordinate value corresponding to a first saturation feature in the relationship curve according to the first saturation feature corresponding to the pixel point, and determine the reliability coordinate value as a first reliability corresponding to the pixel point;
the second reliability determination unit includes:
and the second credibility determining subunit is used for determining a credibility coordinate value corresponding to the second saturation feature in the relation curve according to the second saturation feature corresponding to the pixel point for each pixel point in the second image, and determining the credibility coordinate value as the second credibility corresponding to the pixel point.
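The credibility lookup on the relation curve can be sketched as interpolation on a monotonically decreasing curve. The curve below is hypothetical: the patent only requires a negative correlation between saturation feature and credibility, not this particular piecewise-linear shape:

```python
import numpy as np

# Hypothetical relation curve: saturation feature (0..255) -> credibility (1..0).
curve_x = np.array([0.0, 64.0, 128.0, 255.0])
curve_y = np.array([1.0, 0.75, 0.5, 0.0])

def credibility(sat_feature):
    # Look up the credibility coordinate value for each saturation feature.
    return np.interp(sat_feature.astype(np.float64), curve_x, curve_y)

feat = np.array([[0, 128]])
cred = credibility(feat)  # low feature -> high credibility, and vice versa
```

The same lookup serves both subunits: applied to the first saturation feature it yields the first credibility, and applied to the second it yields the second credibility.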
As an implementation manner of the embodiment of the present invention, the image fusion module 905 includes:
and the image fusion unit is used for weighting and summing R, G, B values of the pixel points with the same positions in the first image and the second image according to the corresponding first fusion weight and the second fusion weight, and the weighted sum is used as the R, G, B value of the pixel point at the position after fusion, so that the fused image is obtained.
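The weighted summation performed by the image fusion unit can be sketched as follows; this is an illustrative implementation under the assumption of 8-bit RGB images and per-pixel scalar weights:

```python
import numpy as np

def fuse(img1, img2, w1, w2):
    # Weighted sum of the R, G, B values of co-located pixels in the two images.
    f1 = img1.astype(np.float64) * w1[..., None]  # broadcast weight over channels
    f2 = img2.astype(np.float64) * w2[..., None]
    return np.clip(np.rint(f1 + f2), 0, 255).astype(np.uint8)

a = np.full((1, 1, 3), 200, dtype=np.uint8)
b = np.full((1, 1, 3), 100, dtype=np.uint8)
w1 = np.array([[0.75]])          # hypothetical first fusion weight
fused = fuse(a, b, w1, 1.0 - w1)  # 0.75 * 200 + 0.25 * 100 per channel
```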
The embodiment of the present invention further provides an electronic device, as shown in fig. 10, comprising a processor 1001, a communication interface 1002, a memory 1003 and a communication bus 1004, wherein the processor 1001, the communication interface 1002 and the memory 1003 communicate with each other through the communication bus 1004.
a memory 1003 for storing a computer program;
the processor 1001 is configured to implement the method steps according to any of the embodiments when executing the program stored in the memory 1003.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), for example at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In a further embodiment of the present invention, a computer-readable storage medium is further provided, in which a computer program is stored, which, when being executed by a processor, implements the steps of the method of any of the above embodiments.
In a further embodiment provided by the present invention, there is also provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method steps of any of the above embodiments.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the invention are produced, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via wired (e.g., coaxial cable, fiber optic, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, that incorporates one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the apparatus, the electronic device, the computer-readable storage medium, and the computer program product embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiments.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (14)

1. An image fusion method, characterized in that the method comprises:
acquiring an image to be processed;
performing white balance correction on the image to be processed by adopting a first white balance parameter to obtain a first image, and performing white balance correction on the image to be processed by adopting a second white balance parameter to obtain a second image; wherein the color temperature of the first image is higher than the color temperature of the second image;
extracting the saturation characteristic of the first image to serve as a first saturation characteristic, and extracting the saturation characteristic of the second image to serve as a second saturation characteristic;
determining a first fusion weight corresponding to the first image based on the first saturation characteristic and the relation between the white balance and the saturation of the image, and determining a second fusion weight corresponding to the second image based on the second saturation characteristic and the relation between the white balance and the saturation of the image; the fusion weight is used for identifying the degree that the corresponding pixel point in the image is close to the actual color;
and performing image fusion on the first image and the second image based on the first fusion weight and the second fusion weight to obtain a fused image.
2. The method of claim 1, wherein the step of extracting the saturation feature of the first image as a first saturation feature and extracting the saturation feature of the second image as a second saturation feature comprises:
calculating the difference value between the maximum value and the minimum value in R, G, B values of each pixel point in the first image as a first saturation characteristic corresponding to the pixel point;
and calculating the difference value between the maximum value and the minimum value in R, G, B values of each pixel point in the second image as a second saturation characteristic corresponding to the pixel point.
3. The method of claim 1, wherein the step of determining a first fusion weight corresponding to the first image based on the first saturation characteristic and the relationship between image white balance and saturation, and determining a second fusion weight corresponding to the second image based on the second saturation characteristic and the relationship between image white balance and saturation comprises:
aiming at each pixel point in the first image, determining a first credibility corresponding to the pixel point according to a first saturation characteristic corresponding to the pixel point and a corresponding relation between a preset saturation characteristic and a credibility; the credibility is used for representing the credibility of R, G, B values of corresponding pixel points in the image;
aiming at each pixel point in the second image, determining a second credibility corresponding to the pixel point according to a second saturation characteristic of the pixel point and the corresponding relation;
determining a first fusion weight corresponding to each pixel point in the first image and a second fusion weight corresponding to each pixel point in the second image based on the first confidence level and the second confidence level.
4. The method of claim 3, wherein the step of determining a first blending weight for each pixel point in the first image and a second blending weight for each pixel point in the second image based on the first confidence level and the second confidence level comprises:
calculating the ratio between the first credibility corresponding to the pixel point and the first target value as the first fusion weight corresponding to the pixel point aiming at each pixel point in the first image; the first target value is the sum of a first credibility corresponding to the pixel point and a second credibility corresponding to the pixel point with the same position as the pixel point in the second image;
calculating the ratio between the second reliability corresponding to the pixel point and the second target value as a second fusion weight corresponding to the pixel point, for each pixel point in the second image; the second target value is the sum of the second credibility corresponding to the pixel point and the first credibility corresponding to the pixel point in the first image at the same position as the pixel point; or, alternatively,
and determining the difference value between 1 and the first fusion weight corresponding to the pixel point with the same position as the pixel point in the first image as the second fusion weight corresponding to the pixel point aiming at each pixel point in the second image.
5. The method according to claim 3, wherein the corresponding relation is a relation curve between a preset saturation characteristic and a reliability, and the reliability and the saturation characteristic are in a negative correlation relationship;
the step of determining, for each pixel point in the first image, a first reliability corresponding to the pixel point according to a first saturation characteristic corresponding to the pixel point and a correspondence between a preset saturation characteristic and a reliability includes:
aiming at each pixel point in the first image, determining a reliability coordinate value corresponding to the first saturation characteristic in the relation curve according to the first saturation characteristic corresponding to the pixel point, and determining the reliability coordinate value as a first reliability corresponding to the pixel point;
the step of determining, for each pixel point in the second image, a second reliability corresponding to the pixel point according to the second saturation characteristic of the pixel point and the correspondence relationship includes:
and determining the credibility coordinate value corresponding to the second saturation characteristic in the relation curve according to the second saturation characteristic corresponding to the pixel point for each pixel point in the second image, and determining the credibility coordinate value as the second credibility corresponding to the pixel point.
6. The method according to any one of claims 1 to 5, wherein the step of performing image fusion on the first image and the second image based on the first fusion weight and the second fusion weight to obtain a fused image comprises:
and carrying out weighted summation on R, G, B values of the pixel points with the same positions in the first image and the second image according to the corresponding first fusion weight and the second fusion weight, and taking the weighted summation as the R, G, B value of the pixel point at the position after fusion to obtain the fused image.
7. An image fusion apparatus, characterized in that the apparatus comprises:
the image acquisition module is used for acquiring an image to be processed;
the white balance correction module is used for carrying out white balance correction on the image to be processed by adopting a first white balance parameter to obtain a first image and carrying out white balance correction on the image to be processed by adopting a second white balance parameter to obtain a second image; wherein the color temperature of the first image is higher than the color temperature of the second image;
the saturation characteristic extraction module is used for extracting the saturation characteristic of the first image to serve as a first saturation characteristic and extracting the saturation characteristic of the second image to serve as a second saturation characteristic;
a fusion weight determining module, configured to determine a first fusion weight corresponding to the first image based on the first saturation feature and a relationship between image white balance and saturation, and determine a second fusion weight corresponding to the second image based on the second saturation feature and a relationship between image white balance and saturation; the fusion weight is used for identifying the degree that the corresponding pixel point in the image is close to the actual color;
and the image fusion module is used for carrying out image fusion on the first image and the second image based on the first fusion weight and the second fusion weight to obtain a fused image.
8. The apparatus of claim 7, wherein the saturation feature extraction module comprises:
a first saturation feature extraction unit, configured to calculate, for each pixel point in the first image, a difference between a maximum value and a minimum value in R, G, B values of the pixel point, where the difference is used as a first saturation feature corresponding to the pixel point;
and the second saturation feature extraction unit is used for calculating a difference value between the maximum value and the minimum value in R, G, B values of each pixel point in the second image as a second saturation feature corresponding to the pixel point.
9. The apparatus of claim 7, wherein the fusion weight determination module comprises:
the first reliability determining unit is used for determining the first reliability corresponding to each pixel point in the first image according to the first saturation characteristic corresponding to the pixel point and the corresponding relation between the preset saturation characteristic and the reliability; the credibility is used for representing the credibility of R, G, B values of corresponding pixel points in the image;
the second reliability determining unit is used for determining a second reliability corresponding to each pixel point in the second image according to a second saturation characteristic of the pixel point and the corresponding relation;
a fusion weight determination unit, configured to determine, based on the first confidence level and the second confidence level, a first fusion weight corresponding to each pixel point in the first image and a second fusion weight corresponding to each pixel point in the second image.
10. The apparatus according to claim 9, wherein the fusion weight determining unit comprises:
the first fusion weight subunit is configured to calculate, for each pixel point in the first image, a ratio between a first reliability corresponding to the pixel point and a first target value, and use the ratio as a first fusion weight corresponding to the pixel point; the first target value is the sum of a first credibility corresponding to the pixel point and a second credibility corresponding to the pixel point with the same position as the pixel point in the second image;
a second fusion weight subunit, configured to calculate, for each pixel point in the second image, a ratio between a second reliability corresponding to the pixel point and a second target value, which is used as a second fusion weight corresponding to the pixel point; the second target value is the sum of the second credibility corresponding to the pixel point and the first credibility corresponding to the pixel point in the first image at the same position as the pixel point; or, alternatively,
and the fusion weight determining unit is used for determining a difference value between 1 and a first fusion weight corresponding to a pixel point in the first image, which is at the same position as the pixel point, as a second fusion weight corresponding to the pixel point, aiming at each pixel point in the second image.
11. The apparatus according to claim 9, wherein the correspondence is a relationship curve between a preset saturation characteristic and a reliability, and the reliability is in a negative correlation with the saturation characteristic, and the first reliability determining unit includes:
a first reliability determining subunit, configured to determine, for each pixel point in the first image, a reliability coordinate value corresponding to a first saturation feature in the relationship curve according to the first saturation feature corresponding to the pixel point, and determine the reliability coordinate value as a first reliability corresponding to the pixel point;
the second reliability determination unit includes:
and the second credibility determining subunit is used for determining a credibility coordinate value corresponding to the second saturation feature in the relation curve according to the second saturation feature corresponding to the pixel point for each pixel point in the second image, and determining the credibility coordinate value as the second credibility corresponding to the pixel point.
12. The apparatus according to any one of claims 7-11, wherein the image fusion module comprises:
and the image fusion unit is used for weighting and summing R, G, B values of the pixel points with the same positions in the first image and the second image according to the corresponding first fusion weight and the second fusion weight, and the weighted sum is used as the R, G, B value of the pixel point at the position after fusion, so that the fused image is obtained.
13. An electronic device, characterized by comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any of claims 1-6 when executing a program stored in the memory.
14. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, which computer program, when being executed by a processor, carries out the method steps of any one of claims 1 to 6.
CN202111200087.3A 2021-10-14 2021-10-14 Image fusion method and device, electronic equipment and storage medium Active CN113793291B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111200087.3A CN113793291B (en) 2021-10-14 2021-10-14 Image fusion method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111200087.3A CN113793291B (en) 2021-10-14 2021-10-14 Image fusion method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113793291A true CN113793291A (en) 2021-12-14
CN113793291B CN113793291B (en) 2023-08-08

Family

ID=79185023

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111200087.3A Active CN113793291B (en) 2021-10-14 2021-10-14 Image fusion method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113793291B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114374830A (en) * 2022-01-06 2022-04-19 杭州海康威视数字技术股份有限公司 Image white balance method, electronic device and computer readable storage medium

Citations (6)

Publication number Priority date Publication date Assignee Title
US20160080626A1 (en) * 2014-09-16 2016-03-17 Google Technology Holdings LLC Computational Camera Using Fusion of Image Sensors
CN107343188A (en) * 2017-06-16 2017-11-10 广东欧珀移动通信有限公司 image processing method, device and terminal
US20190297246A1 (en) * 2018-03-20 2019-09-26 Kabushiki Kaisha Toshiba Image processing device, drive supporting system, and image processing method
CN110830779A (en) * 2015-08-28 2020-02-21 杭州海康威视数字技术股份有限公司 Image signal processing method and system
CN111047530A (en) * 2019-11-29 2020-04-21 大连海事大学 Underwater image color correction and contrast enhancement method based on multi-feature fusion
CN111754440A (en) * 2020-06-29 2020-10-09 苏州科达科技股份有限公司 License plate image enhancement method, system, equipment and storage medium

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
US20160080626A1 (en) * 2014-09-16 2016-03-17 Google Technology Holdings LLC Computational Camera Using Fusion of Image Sensors
CN110830779A (en) * 2015-08-28 2020-02-21 杭州海康威视数字技术股份有限公司 Image signal processing method and system
CN107343188A (en) * 2017-06-16 2017-11-10 广东欧珀移动通信有限公司 image processing method, device and terminal
US20190297246A1 (en) * 2018-03-20 2019-09-26 Kabushiki Kaisha Toshiba Image processing device, drive supporting system, and image processing method
CN111047530A (en) * 2019-11-29 2020-04-21 大连海事大学 Underwater image color correction and contrast enhancement method based on multi-feature fusion
CN111754440A (en) * 2020-06-29 2020-10-09 苏州科达科技股份有限公司 License plate image enhancement method, system, equipment and storage medium

Non-Patent Citations (2)

Title
JUN-YAN HUO et al., "Robust automatic white balance algorithm using gray color points in images", IEEE Transactions on Consumer Electronics, vol. 52, no. 2 *
HU Yujuan et al., "Underwater color image enhancement method based on image fusion", Journal of Hefei University of Technology (Natural Science Edition), vol. 36, no. 8 *

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN114374830A (en) * 2022-01-06 2022-04-19 杭州海康威视数字技术股份有限公司 Image white balance method, electronic device and computer readable storage medium
CN114374830B (en) * 2022-01-06 2024-03-08 杭州海康威视数字技术股份有限公司 Image white balance method, electronic device and computer readable storage medium

Also Published As

Publication number Publication date
CN113793291B (en) 2023-08-08

Similar Documents

Publication Publication Date Title
EP3542347B1 (en) Fast fourier color constancy
JP4754227B2 (en) Auto white balance device and white balance adjustment method
CN108234971B (en) White balance parameter determines method, white balance adjustment method and device, storage medium, terminal
KR102294622B1 (en) White balance processing method, electronic device and computer readable storage medium
WO2018149253A1 (en) Image processing method and device
CN113393540B (en) Method and device for determining color edge pixel points in image and computer equipment
CN112218065B (en) Image white balance method, system, terminal device and storage medium
WO2022142570A1 (en) Image fusion method and apparatus, image processing device, and binocular system
CN113792827B (en) Target object recognition method, electronic device, and computer-readable storage medium
WO2020142871A1 (en) White balance processing method and device for image
CN113793291B (en) Image fusion method and device, electronic equipment and storage medium
CN113329217B (en) Automatic white balance parameter processing method and device, and computer readable storage medium
CN111587573B (en) Image processing method and device and computer storage medium
CN112070682A (en) Method and device for compensating image brightness
WO2019137396A1 (en) Image processing method and device
CN114286000B (en) Image color processing method and device and electronic equipment
CN112995633B (en) Image white balance processing method and device, electronic equipment and storage medium
CN111711809B (en) Image processing method and device, electronic device and storage medium
CN113473101A (en) Color correction method and device, electronic equipment and storage medium
CN113766206A (en) White balance adjusting method, device and storage medium
CN116419076B (en) Image processing method and device, electronic equipment and chip
CN112348905B (en) Color recognition method and device, terminal equipment and storage medium
KR20110079310A (en) Auto white balance image processing method
CN117676349A (en) Correction method, device, equipment and medium for white balance gain
CN117640906A (en) Projection light source correction method, device, equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant