CN110363731B - Image fusion method and device and electronic equipment - Google Patents

Image fusion method and device and electronic equipment

Info

Publication number
CN110363731B
CN110363731B (application CN201810315298.3A)
Authority
CN
China
Prior art keywords
fused
image
gray value
pixel point
visible light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810315298.3A
Other languages
Chinese (zh)
Other versions
CN110363731A (en)
Inventor
杨伟
郭海训
吕金刚
颜昌杰
Current Assignee
Hangzhou Hikmicro Sensing Technology Co Ltd
Original Assignee
Hangzhou Hikmicro Sensing Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Hikmicro Sensing Technology Co Ltd filed Critical Hangzhou Hikmicro Sensing Technology Co Ltd
Priority to CN201810315298.3A priority Critical patent/CN110363731B/en
Publication of CN110363731A publication Critical patent/CN110363731A/en
Application granted granted Critical
Publication of CN110363731B publication Critical patent/CN110363731B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
      • G06 - COMPUTING; CALCULATING OR COUNTING
        • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T 5/00 - Image enhancement or restoration
            • G06T 5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
          • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
            • G06T 2207/10 - Image acquisition modality
              • G06T 2207/10016 - Video; Image sequence
              • G06T 2207/10024 - Color image
              • G06T 2207/10048 - Infrared image
            • G06T 2207/20 - Special algorithmic details
              • G06T 2207/20212 - Image combination
                • G06T 2207/20221 - Image fusion; Image merging
            • G06T 2207/30 - Subject of image; Context of image processing
              • G06T 2207/30232 - Surveillance

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

An embodiment of the invention provides an image fusion method, an image fusion apparatus and an electronic device. The method comprises the following steps: acquiring an infrared image to be fused and a visible light image to be fused; determining, according to the brightness distribution of the infrared image to be fused and of the visible light image to be fused, a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused; and determining the gray value of each pixel point in the fused image according to the first fusion weight, the second fusion weight, the gray values of the pixel points in the visible light image to be fused, and the gray values of the pixel points in the infrared image to be fused. The scheme thus fuses the visible light image to be fused with the infrared image to be fused, yielding a fused image that is less affected by illumination and weather conditions and has richer details, which facilitates subsequent processing of the fused image.

Description

Image fusion method and device and electronic equipment
Technical Field
The present invention relates to the field of image processing, and in particular, to an image fusion method and apparatus, and an electronic device.
Background
Video surveillance is ubiquitous in daily life, for example in shopping malls, supermarkets, on roads, and in residential communities. Images of a monitored scene can be acquired by video surveillance equipment in order to observe the state of the scene, or the acquired images can be processed further.
Most video surveillance equipment uses visible light image acquisition devices. Under good illumination, such as in the daytime, the visible light images they acquire are rich in detail and clear. Under poor illumination, such as at night, however, their imaging quality drops markedly: they are sensitive to illumination changes and adapt poorly to the environment. For example, halos are noticeable under strong light, and under low visibility, such as in fog, the acquired visible light images are blurred.
Compared with a visible light image, an infrared image acquired by an infrared image acquisition device has fewer details and lower resolution, but it is less affected by weather and illumination conditions and can image clearly under low-visibility conditions such as night and fog.
If the visible light image and the infrared image could be fused, an image less affected by illumination and weather conditions and with richer details could be obtained. How to fuse the visible light image and the infrared image is therefore a problem to be solved urgently.
Disclosure of Invention
The embodiment of the invention aims to provide an image fusion method, an image fusion device and electronic equipment, which are used for carrying out image fusion on a visible light image to be fused and an infrared image to be fused to obtain a fused image which is less influenced by illumination and weather conditions and has richer details. The specific technical scheme is as follows:
in a first aspect, an embodiment of the present invention provides an image fusion method, where the method includes:
acquiring an infrared image to be fused and a visible light image to be fused;
determining a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the brightness distribution conditions of the infrared image to be fused and the visible light image to be fused;
and determining the gray value of the pixel point in the fused image according to the first fusion weight, the second fusion weight, the gray value of the pixel point in the visible light image to be fused and the gray value of the pixel point in the infrared image to be fused.
Optionally, the step of determining a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the brightness distribution of the infrared image to be fused and the visible light image to be fused includes:
determining a first brightness distribution coefficient of the infrared image to be fused based on the gray value of the pixel point in the infrared image to be fused;
determining a second brightness distribution coefficient of the visible light image to be fused based on the gray value of the pixel point in the visible light image to be fused;
and calculating a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the first brightness distribution coefficient and the second brightness distribution coefficient.
Optionally, the step of determining a first brightness distribution coefficient of the to-be-fused infrared image based on the gray value of the pixel point in the to-be-fused infrared image includes:
determining a first number of pixel points of which the gray values are smaller than a first preset gray value in the infrared image to be fused;
determining a second number of pixel points of which the gray values are not smaller than the first preset gray value in the infrared image to be fused;
determining a first brightness distribution coefficient of the infrared image to be fused according to the first quantity and the second quantity;
the step of determining a second brightness distribution coefficient of the to-be-fused visible light image based on the gray value of the pixel point in the to-be-fused visible light image includes:
determining a third number of pixel points with gray values smaller than a second preset gray value in the visible light image to be fused;
determining a fourth number of pixel points of which the gray values are not less than the second preset gray value in the visible light image to be fused;
and determining a second brightness distribution coefficient of the visible light image to be fused according to the third quantity and the fourth quantity.
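The counting steps above can be sketched directly in Python; the threshold value of 128 in the example is an illustrative assumption, since the patent only speaks of a "preset gray value":

```python
def count_split(image, threshold):
    """Count pixels whose gray value is below / not below a preset gray value.

    `image` is a 2-D list of gray values in [0, 255]. For the infrared image
    this yields the first number N1 and second number N2; for the visible
    light image, the third number N3 and fourth number N4.
    """
    below = sum(1 for row in image for v in row if v < threshold)
    not_below = sum(len(row) for row in image) - below
    return below, not_below

# Example: a 2x2 infrared image with an assumed preset gray value of 128.
ir_image = [[10, 200], [90, 130]]
n1_count, n2_count = count_split(ir_image, 128)
```

Note that a pixel exactly equal to the preset gray value counts as "not smaller", matching the wording of the steps above.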
Optionally, the step of determining a first brightness distribution coefficient of the to-be-fused infrared image according to the first number and the second number includes:
using the formula shown in Figure BDA0001623570910000031 to determine a first brightness distribution coefficient of the infrared image to be fused, wherein N1 is the first number, N2 is the second number, and n1 is the first brightness distribution coefficient;
the step of determining a second brightness distribution coefficient of the visible light image to be fused according to the third number and the fourth number includes:
using the formula shown in Figure BDA0001623570910000032 to determine a second brightness distribution coefficient of the visible light image to be fused, wherein N3 is the third number, N4 is the fourth number, and n2 is the second brightness distribution coefficient.
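The patent publishes these coefficient formulas only as images, so their exact form is not recoverable here. Purely as an illustration, one plausible ratio form, assumed below, is the fraction of pixels darker than the preset gray value:

```python
def brightness_coefficient(count_below, count_not_below):
    """Brightness distribution coefficient from the two pixel counts.

    Assumed ratio form n = below / (below + not_below); the patent's actual
    formula is published as an image, so this is only a sketch.
    """
    total = count_below + count_not_below
    return count_below / total if total else 0.0
```

Under this assumption the coefficient lies in [0, 1] and grows as the image gets darker, for both the infrared and the visible light image.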
Optionally, the step of calculating a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the first brightness distribution coefficient and the second brightness distribution coefficient includes:
using the formula shown in Figure BDA0001623570910000033 to calculate a first fusion weight corresponding to the infrared image to be fused;
using the formula shown in Figure BDA0001623570910000034 to calculate a second fusion weight corresponding to the visible light image to be fused, or calculating the second fusion weight corresponding to the visible light image to be fused by using the formula p2 = 1 - p1;
wherein n1 is the first brightness distribution coefficient, n2 is the second brightness distribution coefficient, p1 is the first fusion weight, and p2 is the second fusion weight.
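The weight formulas are likewise published as images. The normalized form sketched below is an assumption, chosen only because it is the simplest form consistent with the stated alternative p2 = 1 - p1:

```python
def fusion_weights(n1, n2):
    """Map the two brightness distribution coefficients to fusion weights.

    Assumed normalized form (not the patent's published formula, which is
    an image): p1 = n1 / (n1 + n2), with p2 = 1 - p1 as stated in the text.
    """
    total = n1 + n2
    p1 = n1 / total if total else 0.5  # fall back to equal weights
    return p1, 1.0 - p1
```

Whatever the actual published formula is, the alternative p2 = 1 - p1 guarantees that the two weights always sum to one.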
Optionally, the step of determining the gray value of the pixel point in the fused image according to the first fusion weight, the second fusion weight, the gray value of the pixel point in the visible light image to be fused, and the gray value of the pixel point in the infrared image to be fused includes:
calculating a first gray value contribution value of the pixel point in the infrared image to be fused corresponding to the pixel point in the fused image according to the first fusion weight and the gray value of the pixel point in the infrared image to be fused;
calculating a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image according to the second fusion weight and the gray value of the pixel point in the visible light image to be fused;
and determining the gray value of the pixel point in the fusion image based on the sum of the first gray value contribution value and the second gray value contribution value.
Optionally, the step of calculating, according to the first fusion weight and the gray value of the pixel point in the infrared image to be fused, a first gray value contribution value of the pixel point in the infrared image to be fused, which corresponds to the pixel point in the fusion image, includes:
calculating a first gray value contribution value of the pixel point in the infrared image to be fused corresponding to the pixel point in the fused image by using the formula A = p1 × IR(i, j), wherein p1 is the first fusion weight, IR(i, j) is the gray value of the pixel point (i, j) in the infrared image to be fused, and A is the first gray value contribution value;
the step of calculating a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image according to the second fusion weight and the gray value of the pixel point in the visible light image to be fused comprises the following steps:
and calculating a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image by using the formula B = p2 × IV(i, j), wherein p2 is the second fusion weight, IV(i, j) is the gray value of the pixel point (i, j) in the visible light image to be fused, and B is the second gray value contribution value.
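For a single pixel position (i, j), the two contribution values follow directly from the formulas A = p1 × IR(i, j) and B = p2 × IV(i, j):

```python
def gray_contributions(p1, p2, ir_val, iv_val):
    """Contribution values A = p1 * IR(i, j) and B = p2 * IV(i, j)
    of one infrared and one visible light gray value."""
    return p1 * ir_val, p2 * iv_val
```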
Optionally, the step of determining the gray value of the pixel point in the fused image based on the sum of the first gray value contribution value and the second gray value contribution value includes:
using the formula shown in Figure BDA0001623570910000041 to determine the gray value of a pixel point in the fused image, wherein I(i, j) is the gray value of the pixel point (i, j) in the fused image;
or, alternatively,
using the formula shown in Figure BDA0001623570910000042 to determine the gray value of a pixel point in the fused image, wherein I(i, j) is the gray value of the pixel point (i, j) in the fused image;
or, alternatively,
rounding the value of p1 × IR(i, j) + p2 × IV(i, j) to the nearest integer, and determining the rounded value as the gray value of the pixel point (i, j) in the fused image.
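The first two combination formulas are published as images; given that the explicit third option is round-to-nearest, floor and ceiling are plausible readings of the other two, and are assumed in this sketch:

```python
import math

def fuse_pixel(p1, p2, ir_val, iv_val, mode="round"):
    """Combine one infrared and one visible light gray value.

    Maps the sum p1 * IR(i, j) + p2 * IV(i, j) to an integer gray value.
    "floor" and "ceil" are assumed readings of the two formulas published
    as images; "round" is the explicitly stated third option.
    """
    s = p1 * ir_val + p2 * iv_val
    if mode == "floor":
        return math.floor(s)
    if mode == "ceil":
        return math.ceil(s)
    return int(round(s))
```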
Optionally, the method further includes:
and carrying out target detection processing on the fused image to obtain a target detection result.
In a second aspect, an embodiment of the present invention further provides an image fusion apparatus, where the apparatus includes:
the image acquisition module is used for acquiring an infrared image to be fused and a visible light image to be fused;
the weight determining module is used for determining a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the brightness distribution conditions of the infrared image to be fused and the visible light image to be fused;
and the gray value determining module is used for determining the gray value of the pixel point in the fused image according to the first fusion weight, the second fusion weight, the gray value of the pixel point in the visible light image to be fused and the gray value of the pixel point in the infrared image to be fused.
Optionally, the weight determining module includes:
the first brightness distribution coefficient determining submodule is used for determining a first brightness distribution coefficient of the infrared image to be fused based on the gray value of the pixel point in the infrared image to be fused;
the second brightness distribution coefficient determining submodule is used for determining a second brightness distribution coefficient of the visible light image to be fused based on the gray value of the pixel point in the visible light image to be fused;
and the weight determining submodule is used for calculating a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the first brightness distribution coefficient and the second brightness distribution coefficient.
Optionally, the first luminance distribution coefficient determining sub-module includes:
the first quantity determining unit is used for determining the first quantity of pixel points with the gray values smaller than a first preset gray value in the infrared image to be fused;
the second quantity determining unit is used for determining a second quantity of pixel points of which the gray values are not smaller than the first preset gray value in the infrared image to be fused;
the first brightness distribution coefficient determining unit is used for determining a first brightness distribution coefficient of the infrared image to be fused according to the first quantity and the second quantity;
the second luminance distribution coefficient determination sub-module includes:
the third quantity determining unit is used for determining the third quantity of the pixel points with the gray values smaller than a second preset gray value in the visible light image to be fused;
the fourth quantity determining unit is used for determining the fourth quantity of the pixel points of which the gray values are not less than the second preset gray value in the visible light image to be fused;
and the second brightness distribution coefficient determining unit is used for determining a second brightness distribution coefficient of the visible light image to be fused according to the third quantity and the fourth quantity.
Optionally, the first luminance distribution coefficient determining unit includes:
a first brightness distribution coefficient determining subunit, configured to determine a first brightness distribution coefficient of the infrared image to be fused by using the formula shown in Figure BDA0001623570910000061, wherein N1 is the first number, N2 is the second number, and n1 is the first brightness distribution coefficient;
the second luminance distribution coefficient determining unit includes:
a second brightness distribution coefficient determining subunit, configured to determine a second brightness distribution coefficient of the visible light image to be fused by using the formula shown in Figure BDA0001623570910000062, wherein N3 is the third number, N4 is the fourth number, and n2 is the second brightness distribution coefficient.
Optionally, the weight determining sub-module includes:
a first fusion weight calculation unit, configured to calculate a first fusion weight corresponding to the infrared image to be fused by using the formula shown in Figure BDA0001623570910000063;
a second fusion weight calculation unit, configured to calculate a second fusion weight corresponding to the visible light image to be fused by using the formula shown in Figure BDA0001623570910000064, or to calculate the second fusion weight corresponding to the visible light image to be fused by using the formula p2 = 1 - p1;
wherein n1 is the first brightness distribution coefficient, n2 is the second brightness distribution coefficient, p1 is the first fusion weight, and p2 is the second fusion weight.
Optionally, the gray value determining module includes:
a first gray value contribution value determining submodule, configured to calculate, according to the first fusion weight and a gray value of a pixel point in the to-be-fused infrared image, a first gray value contribution value of the pixel point in the to-be-fused infrared image, which corresponds to the pixel point in the to-be-fused image;
the second gray value contribution value determining submodule is used for calculating a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image according to the second fusion weight and the gray value of the pixel point in the visible light image to be fused;
and the gray value determining submodule is used for determining the gray value of the pixel point in the fusion image based on the sum of the first gray value contribution value and the second gray value contribution value.
Optionally, the first gray value contribution value determining sub-module includes:
a first gray value contribution value determining unit, configured to calculate a first gray value contribution value of the pixel point in the infrared image to be fused corresponding to the pixel point in the fused image by using the formula A = p1 × IR(i, j), wherein p1 is the first fusion weight, IR(i, j) is the gray value of the pixel point (i, j) in the infrared image to be fused, and A is the first gray value contribution value;
the second gray value contribution value determination submodule includes:
and a second gray value contribution value determining unit, configured to calculate a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image by using the formula B = p2 × IV(i, j), wherein p2 is the second fusion weight, IV(i, j) is the gray value of the pixel point (i, j) in the visible light image to be fused, and B is the second gray value contribution value.
Optionally, the gray value determining sub-module includes:
a gray value determining unit, configured to determine the gray value of a pixel point in the fused image by using the formula shown in Figure BDA0001623570910000071, wherein I(i, j) is the gray value of the pixel point (i, j) in the fused image;
or, alternatively,
configured to determine the gray value of a pixel point in the fused image by using the formula shown in Figure BDA0001623570910000072, wherein I(i, j) is the gray value of the pixel point (i, j) in the fused image;
or, alternatively,
configured to round the value of p1 × IR(i, j) + p2 × IV(i, j) to the nearest integer, and to determine the rounded value as the gray value of the pixel point (i, j) in the fused image.
Optionally, the apparatus further comprises:
and the target detection module is used for carrying out target detection processing on the fused image to obtain a target detection result.
In a third aspect, an embodiment of the present invention further provides an electronic device, including a processor, a memory, and a communication bus, where the processor and the memory complete communication with each other through the communication bus;
a memory for storing a computer program;
and the processor is used for realizing any image fusion method step when executing the program stored in the memory.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements any of the above-mentioned image fusion method steps.
According to the scheme provided by the embodiment of the invention, the infrared image to be fused and the visible light image to be fused are first acquired. A first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused are then determined according to the brightness distribution of the two images. Finally, the gray value of each pixel point in the fused image is determined according to the first fusion weight, the second fusion weight, the gray values of the pixel points in the visible light image to be fused, and the gray values of the pixel points in the infrared image to be fused. With this scheme, the visible light image to be fused and the infrared image to be fused can be fused into a fused image that is less affected by illumination and weather conditions and has richer details, which facilitates subsequent processing of the fused image.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from them without creative effort.
Fig. 1 is a flowchart of an image fusion method according to an embodiment of the present invention;
FIG. 2 is a detailed flowchart of step S102 in the embodiment shown in FIG. 1;
FIG. 3 is a detailed flowchart of step S201 in the embodiment shown in FIG. 2;
FIG. 4 is a detailed flowchart of step S202 in the embodiment shown in FIG. 2;
FIG. 5 is a detailed flowchart of step S103 in the embodiment shown in FIG. 1;
fig. 6 is a schematic structural diagram of an image fusion apparatus according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort fall within the protection scope of the present invention.
In order to fuse an infrared image to be fused and a visible light image to be fused and obtain a fused image which is less affected by illumination and weather conditions and has richer details, the embodiment of the invention provides an image fusion method, an image fusion device, electronic equipment and a computer-readable storage medium.
First, an image fusion method provided by an embodiment of the present invention is described below.
The image fusion method provided by the embodiment of the present invention may be applied to any electronic device that needs to perform image fusion processing (hereinafter referred to simply as an electronic device), for example an image processing device, an image acquisition device, a computer, a tablet computer, or a mobile phone; no specific limitation is imposed here.
As shown in fig. 1, an image fusion method includes:
s101, acquiring an infrared image to be fused and a visible light image to be fused;
s102, determining a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the brightness distribution conditions of the infrared image to be fused and the visible light image to be fused;
s103, determining the gray value of the pixel point in the fusion image according to the first fusion weight, the second fusion weight, the gray value of the pixel point in the visible light image to be fused and the gray value of the pixel point in the infrared image to be fused.
In the scheme provided by the embodiment of the invention, the electronic device may therefore first acquire the infrared image to be fused and the visible light image to be fused, then determine a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the brightness distribution of the two images, and finally determine the gray value of each pixel point in the fused image according to the first fusion weight, the second fusion weight, the gray values of the pixel points in the visible light image to be fused, and the gray values of the pixel points in the infrared image to be fused. With this scheme, a fused image that is less affected by illumination and weather conditions and has richer details can be obtained, which facilitates subsequent processing.
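Steps S101 through S103 can be sketched end to end on small 2-D gray value arrays. Since the patent publishes its coefficient and weight formulas only as images, the ratio forms and the thresholds of 128 below are illustrative assumptions, chosen to be consistent with the stated alternative p2 = 1 - p1:

```python
def fuse_images(ir_img, iv_img, ir_threshold=128, iv_threshold=128):
    """End-to-end sketch of S101-S103 under assumed formula forms."""
    def coeff(img, thr):
        # Assumed brightness distribution coefficient: fraction of pixels
        # darker than the preset gray value (the patent's formula is an image).
        total = sum(len(row) for row in img)
        below = sum(1 for row in img for v in row if v < thr)
        return below / total if total else 0.0

    n1, n2 = coeff(ir_img, ir_threshold), coeff(iv_img, iv_threshold)
    p1 = n1 / (n1 + n2) if (n1 + n2) else 0.5  # first fusion weight
    p2 = 1.0 - p1                              # second fusion weight
    # Gray value of each pixel in the fused image (round-to-nearest option).
    return [[int(round(p1 * a + p2 * b)) for a, b in zip(r1, r2)]
            for r1, r2 in zip(ir_img, iv_img)]
```

The function takes the infrared and visible light images as equally sized 2-D lists and returns the fused image in the same layout.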
In the step S101, the electronic device may acquire an infrared image to be fused and a visible light image to be fused, where the infrared image to be fused is an infrared image acquired by the infrared image acquisition device, and the visible light image to be fused is a visible light image acquired by the visible light image acquisition device. It can be understood that the image scene areas corresponding to the infrared image to be fused and the visible light image to be fused are the same, and the shooting angle and the shooting time are also the same, so that the fused image obtained by fusion can accurately reflect the scene condition of the scene area at the time, and the accuracy of the fused image is ensured.
In one embodiment, the infrared image capturing device and the visible image capturing device may be mounted in the same position with the lens optical axes co-directional and parallel. The infrared image acquisition equipment and the visible light image acquisition equipment synchronously output acquired images. The field angle ranges of the infrared image acquisition equipment and the visible light image acquisition equipment can be registered according to the image resolution, so that the image scene areas acquired by the infrared image acquisition equipment and the visible light image acquisition equipment are the same. As for the registration manner of the field angle range, a common registration manner of the field angle range of the image capturing device may be adopted, which is not specifically limited and described herein.
In order to ensure that the infrared image acquisition equipment and the visible light image acquisition equipment are installed at the same position, the infrared image acquisition equipment and the visible light image acquisition equipment can adopt an integrally designed double-window structure or a separately designed independent structure.
After the infrared image to be fused and the visible light image to be fused are obtained, in step S102, the electronic device may determine a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the brightness distribution of the infrared image to be fused and the visible light image to be fused.
Each pixel point in the infrared image to be fused corresponds to a gray value in the range [0, 255], and each pixel point in the visible light image to be fused likewise corresponds to a gray value in the range [0, 255]. After the electronic device acquires the infrared image to be fused and the visible light image to be fused, it can determine the brightness distribution of the two images, and then determine a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused.
Next, in step S103, the electronic device may determine the gray values of the pixels in the fused image according to the first fusion weight, the second fusion weight, the gray values of the pixels in the visible light image to be fused, and the gray values of the pixels in the infrared image to be fused, and after the gray values of all the pixels in the fused image are determined, the fused image obtained by fusing the infrared image to be fused and the visible light image to be fused is obtained. For clarity of the scheme and clarity of the layout, a specific manner of determining the gray values of the pixels in the fused image will be described later.
The visible light image to be fused can be a color image or a black-and-white image, and the determining process of the gray value of the pixel point in the fused image is not influenced. And if the visible light image to be fused is a color image, the gray value of the pixel point in the visible light image to be fused is the gray value of the pixel point in the gray image corresponding to the visible light image to be fused.
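Where the visible light image to be fused is a color image, the gray image it corresponds to can be obtained by any standard color-to-gray conversion. As an illustrative sketch, the common BT.601 luma weighting (an assumption here — the patent does not fix which conversion is used) looks like:

```python
def rgb_to_gray(r, g, b):
    # BT.601 luma weighting -- a common choice, assumed for illustration;
    # the patent does not specify the color-to-gray conversion.
    return round(0.299 * r + 0.587 * g + 0.114 * b)

print(rgb_to_gray(255, 255, 255))  # -> 255 (white stays at the top of [0, 255])
print(rgb_to_gray(0, 0, 0))        # -> 0
```

Any other conversion that yields a gray value in [0, 255] per pixel point works equally well for the fusion steps below.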
As an implementation manner of the embodiment of the present invention, as shown in fig. 2, the step of determining the first fusion weight corresponding to the to-be-fused infrared image and the second fusion weight corresponding to the to-be-fused visible light image according to the luminance distribution conditions of the to-be-fused infrared image and the to-be-fused visible light image may include:
s201, determining a first brightness distribution coefficient of the infrared image to be fused based on the gray value of the pixel point in the infrared image to be fused;
after the electronic equipment acquires the infrared image to be fused, the gray value of each pixel point in the infrared image to be fused can be acquired, and then the first brightness distribution coefficient of the infrared image to be fused can be determined.
In one embodiment, the electronic device may determine a first luminance distribution coefficient of the infrared image to be fused according to a luminance distribution histogram of the infrared image to be fused. The luminance distribution histogram is a two-dimensional graph with the gray value as the abscissa and the number of the pixel points as the ordinate. And the meaning of the coordinate points (x, y) in the brightness distribution histogram of the infrared image to be fused is that the number of the pixel points with the gray value of x in the infrared image to be fused is y. Of course, the electronic device may also determine the gray values of the pixel points in the infrared image to be fused in other manners, for example, the gray values of each pixel point in the infrared image to be fused may be recorded one by one, and the like, which is not specifically limited herein.
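The luminance distribution histogram described above can be sketched in a few lines; the pure-Python list-of-rows image representation and all names here are illustrative assumptions, not the patent's:

```python
def luminance_histogram(image):
    """For each gray value x in [0, 255], count the pixel points whose
    gray value is x -- i.e. the (x, y) coordinate points of the
    luminance distribution histogram."""
    counts = [0] * 256
    for row in image:                 # image: list of rows of gray values
        for gray in row:
            counts[gray] += 1
    return counts

# Toy 2x3 "infrared image": two pixel points have gray value 130
ir_image = [[10, 200, 130],
            [40, 130, 250]]
hist = luminance_histogram(ir_image)
print(hist[130])  # -> 2
```

The same function applies unchanged to the visible light image in step S202.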
Therefore, the electronic device can determine the gray value, namely the brightness distribution condition, of each pixel point in the infrared image to be fused according to the brightness distribution histogram of the infrared image to be fused, and further determine a first brightness distribution coefficient, wherein the first brightness distribution coefficient represents the brightness distribution condition of the infrared image to be fused.
S202, determining a second brightness distribution coefficient of the visible light image to be fused based on the gray value of the pixel point in the visible light image to be fused;
similarly, after the electronic device obtains the visible light image to be fused, the gray value of each pixel point in the visible light image to be fused can be obtained, and then, the second brightness distribution coefficient of the visible light image to be fused can be determined.
In one embodiment, the electronic device may determine the second luminance distribution coefficient of the visible light image to be fused according to a luminance distribution histogram of the visible light image to be fused. The luminance distribution histogram is a two-dimensional graph with the gray value as the abscissa and the number of the pixel points as the ordinate. The meaning of the coordinate point (x, y) in the luminance distribution histogram of the visible light image to be fused is that the number of the pixel points with the gray scale value of x in the visible light image to be fused is y. Of course, the electronic device may also determine the gray values of the pixels in the visible light image to be fused in other manners, for example, the gray values of each pixel in the visible light image to be fused may be recorded one by one, and the like, which is not specifically limited herein.
Therefore, the electronic device can determine the gray value, namely the brightness distribution condition, of each pixel point in the visible light image to be fused according to the brightness distribution histogram of the visible light image to be fused, and further can determine a second brightness distribution coefficient, wherein the second brightness distribution coefficient represents the brightness distribution condition of the visible light image to be fused.
It should be noted that the execution sequence of the above steps S201 and S202 is not limited: step S201 may be executed first and then step S202, or step S202 first and then step S201, or steps S201 and S202 may be executed simultaneously. All of these are reasonable and have no influence on the subsequent calculation of the first fusion weight corresponding to the infrared image to be fused and the second fusion weight corresponding to the visible light image to be fused.
S203, calculating a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the first brightness distribution coefficient and the second brightness distribution coefficient.
After determining the first brightness distribution coefficient and the second brightness distribution coefficient, the electronic device may calculate a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused.
The first fusion weight represents the proportion of the gray value of the pixel point in the infrared image to be fused in the gray value of the corresponding pixel point in the fusion image, and the second fusion weight represents the proportion of the gray value of the pixel point in the visible image to be fused in the gray value of the corresponding pixel point in the fusion image.
Therefore, in this embodiment, the electronic device represents the brightness distribution conditions of the infrared image to be fused and the visible light image to be fused by the first brightness distribution coefficient of the infrared image to be fused and the second brightness distribution coefficient of the visible light image to be fused, and determines, according to the first brightness distribution coefficient and the second brightness distribution coefficient, the first fusion weight corresponding to the infrared image to be fused and the second fusion weight corresponding to the visible light image to be fused. In this way, the two fusion weights can be determined quickly, which facilitates the subsequent determination of the gray values of the pixel points of the fused image.
As an implementation manner of the embodiment of the present invention, as shown in fig. 3, the step of determining the first luminance distribution coefficient of the to-be-fused infrared image based on the gray-level value of the pixel point in the to-be-fused infrared image may include:
s301, determining a first number of pixel points of which the gray values are smaller than a first preset gray value in the infrared image to be fused;
the first preset gray value can be a middle value of the image gray value range, so that the first brightness distribution coefficient is ensured to be moderate. For example, the image gray-scale value range is [0, 255], then the first preset gray-scale value may be 128. Of course, the values may be 125, 127, 129, 130, etc., and are not particularly limited as long as they are close to the intermediate value of the image gradation value range.
For example, the first preset gray value is 128, and the number of pixel points in the to-be-fused infrared image with a gray value smaller than 128 is 525, then the first number is 525.
S302, determining a second number of pixel points of which the gray values are not less than the first preset gray value in the infrared image to be fused;
The electronic device may also determine the second number of the pixel points whose gray values are not less than the first preset gray value in the to-be-fused infrared image, that is, the number of the pixel points whose gray values are equal to or greater than the first preset gray value. For example, if the first preset gray value is 128 and the number of pixel points in the to-be-fused infrared image whose gray value is not less than 128 is 571, then the second number is 571.
It should be noted that the execution sequence of the above steps S301 and S302 is not limited: step S301 may be executed first and then step S302, or step S302 first and then step S301, or steps S301 and S302 may be executed simultaneously. All of these are reasonable and have no influence on the subsequent calculation of the first luminance distribution coefficient of the infrared image to be fused.
S303, determining a first brightness distribution coefficient of the infrared image to be fused according to the first quantity and the second quantity.
After the electronic device determines the first quantity and the second quantity, a first brightness distribution coefficient of the infrared image to be fused can be determined according to the first quantity and the second quantity. For example, a ratio of the first number to the second number may be determined as a first luminance distribution coefficient of the infrared image to be fused.
Correspondingly, as shown in fig. 4, the step of determining the second luminance distribution coefficient of the to-be-fused visible light image based on the gray-scale value of the pixel point in the to-be-fused visible light image may include:
s401, determining a third number of pixel points with gray values smaller than a second preset gray value in the visible light image to be fused;
It is reasonable that the second preset gray value may be the same as or different from the first preset gray value. The second preset gray value may also be a middle value of the image gray value range, which ensures that the second luminance distribution coefficient is relatively moderate. For example, if the image gray value range is [0, 255], then the second preset gray value may be 128. Of course, values such as 126, 127, 129 or 131 are also possible, as long as they are close to the middle value of the image gray value range, which is not specifically limited herein.
For example, the second preset gray value is 127, the number of the pixel points with the gray value smaller than 127 in the visible light image to be fused is 704, and then the third number is 704.
S402, determining a fourth number of pixel points of which the gray values are not less than the second preset gray value in the visible light image to be fused;
The electronic device may also determine the fourth number of the pixel points whose gray values are not less than the second preset gray value in the to-be-fused visible light image, that is, the number of the pixel points whose gray values are equal to or greater than the second preset gray value. For example, if the second preset gray value is 127 and the number of the pixel points in the to-be-fused visible light image whose gray values are not less than 127 is 392, then the fourth number is 392.
It should be noted that the execution sequence of the above steps S401 and S402 is not limited: step S401 may be executed first and then step S402, or step S402 first and then step S401, or steps S401 and S402 may be executed simultaneously. All of these are reasonable and have no influence on the subsequent calculation of the second luminance distribution coefficient of the visible light image to be fused.
And S403, determining a second brightness distribution coefficient of the visible light image to be fused according to the third quantity and the fourth quantity.
After the electronic device determines the third number and the fourth number, a second brightness distribution coefficient of the visible light image to be fused can be determined according to the third number and the fourth number. For example, a ratio of the third number to the fourth number may be determined as the second luminance distribution coefficient of the visible light image to be fused.
Therefore, in the embodiment, the electronic device determines the first quantity and the second quantity according to the relation between the gray value in the infrared image to be fused and the first preset gray value, and further determines the first brightness distribution coefficient; and determining a third quantity and a fourth quantity according to the relation between the gray value in the visible light image to be fused and a second preset gray value, and further determining a second brightness distribution coefficient. The first brightness distribution coefficient and the second brightness distribution coefficient determined by the electronic device are respectively based on the gray values of each pixel point in the infrared image to be fused and the visible light image to be fused, so that the gray values of the pixel points in the subsequently obtained fused image can contain more information of the infrared image to be fused and the visible light image to be fused, and the details are richer.
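The counting in steps S301/S302 and S401/S402 can be sketched as follows (pure-Python sketch with assumed names; the patent leaves the exact coefficient formula to the later embodiments):

```python
def split_counts(image, preset_gray=128):
    """Return (number of pixel points with gray value < preset_gray,
    number of pixel points with gray value >= preset_gray)."""
    below = sum(1 for row in image for g in row if g < preset_gray)
    total = sum(len(row) for row in image)
    return below, total - below

# Toy 2x2 infrared image: two gray values below 128, two not below
ir_image = [[100, 140],
            [ 90, 200]]
first_number, second_number = split_counts(ir_image)
print(first_number, second_number)  # -> 2 2
```

Calling the same function on the visible light image (possibly with a different preset gray value) yields the third and fourth numbers.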
As an implementation manner of the embodiment of the present invention, the step of determining the first luminance distribution coefficient of the to-be-fused infrared image according to the first number and the second number may include:
using the formula

n1 = min(N1, N2) / max(N1, N2)

to determine the first brightness distribution coefficient of the infrared image to be fused, wherein N1 is the first number, N2 is the second number, and n1 is the first brightness distribution coefficient.
In one embodiment, the electronic device may utilize the formula

n1 = min(N1, N2) / max(N1, N2)

to determine the first brightness distribution coefficient of the infrared image to be fused, where min(N1, N2) represents the smaller of N1 and N2, and max(N1, N2) represents the larger of N1 and N2.
For example, if the first number N1 is 525 and the second number N2 is 571, the first luminance distribution coefficient n1 is 525/571 ≈ 0.9194. The value of n1 is only exemplarily reserved to four digits after the decimal point; when the value of n1 is an infinite decimal, the number of decimal places reserved for the value of n1 may be determined according to factors such as the actual image processing accuracy, and is not specifically limited herein.
Similarly, the step of determining the second luminance distribution coefficient of the visible light image to be fused according to the third number and the fourth number may include:
using the formula

n2 = min(N3, N4) / max(N3, N4)

to determine the second brightness distribution coefficient of the visible light image to be fused, wherein N3 is the third number, N4 is the fourth number, and n2 is the second brightness distribution coefficient.
In one embodiment, the electronic device may utilize the formula

n2 = min(N3, N4) / max(N3, N4)

to determine the second brightness distribution coefficient of the visible light image to be fused, where min(N3, N4) represents the smaller of N3 and N4, and max(N3, N4) represents the larger of N3 and N4.
Illustratively, if the third number N3 is 704 and the fourth number N4 is 392, then the second luminance distribution coefficient n2 is 392/704 ≈ 0.5568. The value of n2 is only exemplarily reserved to four digits after the decimal point; when the value of n2 is an infinite decimal, the number of decimal places reserved for the value of n2 may be determined according to factors such as the actual image processing accuracy, and is not specifically limited herein.
It can be seen that, in the present embodiment, the electronic device can determine the first luminance distribution coefficient according to the formula n1 = min(N1, N2) / max(N1, N2), and determine the second luminance distribution coefficient according to the formula n2 = min(N3, N4) / max(N3, N4). The first luminance distribution coefficient and the second luminance distribution coefficient can thus be determined quickly.
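Using the min/max formulas above and the patent's own example counts, the two coefficients can be computed as follows (sketch; the function name is ours):

```python
def luminance_distribution_coefficient(count_below, count_not_below):
    # n = min(...) / max(...), so the coefficient always lies in (0, 1]:
    # 1 means the two halves of the histogram are perfectly balanced.
    return min(count_below, count_not_below) / max(count_below, count_not_below)

n1 = luminance_distribution_coefficient(525, 571)  # infrared: N1=525, N2=571
n2 = luminance_distribution_coefficient(704, 392)  # visible:  N3=704, N4=392
print(round(n1, 4), round(n2, 4))  # -> 0.9194 0.5568
```

The min/max ratio makes the coefficient symmetric: it measures how balanced the dark and bright pixel counts are, regardless of which side dominates.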
As an implementation manner of the embodiment of the present invention, the step of calculating the first fusion weight corresponding to the infrared image to be fused and the second fusion weight corresponding to the visible light image to be fused according to the first brightness distribution coefficient and the second brightness distribution coefficient may include:
using the formula

p1 = n1 / (n1 + n2)

to calculate the first fusion weight corresponding to the infrared image to be fused; and using the formula

p2 = n2 / (n1 + n2)

to calculate the second fusion weight corresponding to the visible light image to be fused, or calculating the second fusion weight corresponding to the visible light image to be fused by using the formula p2 = 1 - p1.

Wherein n1 is the first luminance distribution coefficient, n2 is the second luminance distribution coefficient, p1 is the first fusion weight, and p2 is the second fusion weight.
After the first brightness distribution coefficient and the second brightness distribution coefficient are determined, the electronic device can use the formula

p1 = n1 / (n1 + n2)

to calculate the first fusion weight p1 corresponding to the infrared image to be fused. For example, if n1 is 0.9194 and n2 is 0.5568, the first fusion weight p1 is 0.9194 / (0.9194 + 0.5568) ≈ 0.6228. The value of p1 is only exemplarily reserved to four digits after the decimal point; when the value of p1 is an infinite decimal, the number of decimal places reserved for the value of p1 may be determined according to factors such as the actual image processing accuracy, and is not specifically limited herein.
In an embodiment, the electronic device may also determine the second fusion weight corresponding to the visible light image to be fused according to the first brightness distribution coefficient and the second brightness distribution coefficient; specifically, the electronic device may calculate the second fusion weight according to the formula

p2 = n2 / (n1 + n2)

For example, if n1 is 0.9194 and n2 is 0.5568, the second fusion weight p2 is 0.5568 / (0.9194 + 0.5568) ≈ 0.3772. The value of p2 is only exemplarily reserved to four digits after the decimal point; when the value of p2 is an infinite decimal, the number of decimal places reserved for the value of p2 may be determined according to factors such as the actual image processing accuracy, and is not specifically limited herein.
In another embodiment, after determining the first fusion weight p1, the electronic device may calculate the second fusion weight corresponding to the visible light image to be fused by using the formula p2 = 1 - p1. For example, if p1 is 0.47, then p2 = 1 - 0.47 = 0.53.
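The weight calculation can be sketched as below; the normalized form p1 = n1 / (n1 + n2) is our reading of the embodiment (an assumption), chosen so that it stays consistent with the alternative p2 = 1 - p1 stated by the patent:

```python
def fusion_weights(n1, n2):
    # Assumed normalized form: p1 = n1 / (n1 + n2); then p2 = 1 - p1
    # equals n2 / (n1 + n2), so the two weights always sum to 1.
    p1 = n1 / (n1 + n2)
    return p1, 1.0 - p1

p1, p2 = fusion_weights(0.9194, 0.5568)
print(round(p1, 4), round(p2, 4))  # -> 0.6228 0.3772
```

Because the weights sum to 1, the fused gray value computed later is a convex combination of the two source gray values and stays within [0, 255] up to rounding.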
It can be seen that, in this embodiment, the electronic device can calculate, according to p1 = n1 / (n1 + n2) and p2 = n2 / (n1 + n2), or p2 = 1 - p1, the first fusion weight corresponding to the infrared image to be fused and the second fusion weight corresponding to the visible light image to be fused. The gray value of a pixel point in the fused image obtained by using the first fusion weight and the second fusion weight comprises the information in both the infrared image to be fused and the visible light image to be fused, so that the fused image can more completely and accurately reflect the real situation of the image scene.
As an implementation manner of the embodiment of the present invention, as shown in fig. 5, the step of determining the gray value of the pixel point in the fused image according to the first fusion weight, the second fusion weight, the gray value of the pixel point in the visible light image to be fused, and the gray value of the pixel point in the infrared image to be fused may include:
s501, calculating a first gray value contribution value of the pixel point in the infrared image to be fused corresponding to the pixel point in the fused image according to the first fusion weight and the gray value of the pixel point in the infrared image to be fused;
Since the first fusion weight represents the proportion of the gray value of a pixel point in the infrared image to be fused within the gray value of the corresponding pixel point in the fused image, the electronic device can calculate, according to the first fusion weight and the gray value of the pixel point in the infrared image to be fused, the share that this gray value contributes to the gray value of the corresponding pixel point in the fused image, namely the first gray value contribution value.
For example, the product of the gray value of the pixel point in the infrared image to be fused and the first fusion weight may be used as the first gray value contribution value of the pixel point, or the product of the product and a preset scaling factor may be used as the first gray value contribution value. The preset proportionality coefficient can be determined according to factors such as the weather condition of the collected visible light image to be fused and the like.
S502, calculating a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image according to the second fusion weight and the gray value of the pixel point in the visible light image to be fused;
Since the second fusion weight represents the proportion of the gray value of a pixel point in the visible light image to be fused within the gray value of the corresponding pixel point in the fused image, the electronic device can calculate, according to the second fusion weight and the gray value of the pixel point in the visible light image to be fused, the share that this gray value contributes to the gray value of the corresponding pixel point in the fused image, namely the second gray value contribution value.
For example, the product of the gray value of the pixel point in the visible light image to be fused and the second fusion weight may be used as the second gray value contribution value of the pixel point, or the product of the product and a preset scaling factor may be used as the second gray value contribution value. The preset proportionality coefficient can be determined according to factors such as the weather condition of the collected visible light image to be fused and the like.
It should be noted that the execution sequence of the steps S501 and S502 is not limited: step S501 may be executed first and then step S502, or step S502 first and then step S501, or steps S501 and S502 may be executed simultaneously.
S503, determining the gray value of the pixel point in the fusion image based on the sum of the first gray value contribution value and the second gray value contribution value.
After the electronic device determines the first gray value contribution value and the second gray value contribution value, the gray value of the pixel point in the fused image can be determined according to the sum of the first gray value contribution value and the second gray value contribution value. For clarity of the scheme and clarity of the layout, a specific manner of determining the gray values of the pixels in the fused image will be described later.
Therefore, in this embodiment, the electronic device may determine the gray value of the pixel point in the fusion image according to the first gray value contribution value and the second gray value contribution value, and the gray value of the pixel point in the fusion image obtained by fusion reflects information in the infrared image to be fused and the visible light image to be fused, so that the information included in the fusion image is complete and accurate.
As an implementation manner of the embodiment of the present invention, the step of calculating, according to the first fusion weight and the gray value of the pixel point in the to-be-fused infrared image, a first gray value contribution value of the pixel point in the to-be-fused infrared image, which corresponds to the pixel point in the to-be-fused infrared image, may include:
and calculating, by using the formula A = p1 × IR(i, j), the first gray value contribution value of a pixel point in the infrared image to be fused corresponding to the pixel point in the fused image.
Wherein p1 is a first fusion weight, IR (i, j) is a gray value of a pixel point (i, j) in the infrared image to be fused, and a is a first gray value contribution value. According to the formula, the electronic equipment can calculate and obtain a first gray value contribution value of each pixel point in the infrared image to be fused, which corresponds to the pixel point in the fused image.
For example, if the first fusion weight p1 is 0.47 and the gray value of the pixel point (18, 45) in the to-be-fused infrared image is 206, the first gray value contribution value corresponding to the pixel point (18, 45) in the to-be-fused infrared image is 0.47 × 206 = 96.82.
Correspondingly, the step of calculating a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image according to the second fusion weight and the gray value of the pixel point in the visible light image to be fused may include:
and calculating, by using the formula B = p2 × IV(i, j), the second gray value contribution value of a pixel point in the visible light image to be fused corresponding to the pixel point in the fused image.
Wherein p2 is the second fusion weight, IV (i, j) is the gray value of the pixel point (i, j) in the visible light image to be fused, and B is the second gray value contribution value. According to the formula, the electronic equipment can calculate and obtain a second gray value contribution value of each pixel point in the visible light image to be fused, which corresponds to the pixel point in the fused image.
For example, if the second fusion weight p2 is 0.53 and the gray value of the pixel point (18, 45) in the visible light image to be fused is 87, the second gray value contribution value corresponding to the pixel point (18, 45) in the visible light image to be fused is 0.53 × 87 = 46.11.
It can be seen that, in this embodiment, the electronic device may respectively determine, according to A = p1 × IR(i, j) and B = p2 × IV(i, j), the first gray value contribution value corresponding to each pixel point in the infrared image to be fused and the second gray value contribution value corresponding to each pixel point in the visible light image to be fused, so as to facilitate the subsequent calculation of the gray values of the pixel points in the fused image from the first and second gray value contribution values.
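The formulas A = p1 × IR(i, j) and B = p2 × IV(i, j) translate directly into code (sketch with assumed names, reproducing the patent's example values):

```python
def gray_contributions(p1, p2, ir_gray, iv_gray):
    A = p1 * ir_gray  # first gray value contribution (infrared share)
    B = p2 * iv_gray  # second gray value contribution (visible share)
    return A, B

# The patent's example: p1=0.47, p2=0.53, IR(18,45)=206, IV(18,45)=87
A, B = gray_contributions(0.47, 0.53, 206, 87)
print(round(A, 2), round(B, 2))  # -> 96.82 46.11
```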
As an implementation manner of the embodiment of the present invention, the step of determining the gray value of the pixel point in the fused image based on the sum of the first gray value contribution value and the second gray value contribution value may include:
using the formula

I(i, j) = ⌊p1 × IR(i, j) + p2 × IV(i, j)⌋

to determine the gray value of a pixel point in the fused image, wherein I(i, j) is the gray value of the pixel point (i, j) in the fused image;

or,

using the formula

I(i, j) = ⌈p1 × IR(i, j) + p2 × IV(i, j)⌉

to determine the gray value of a pixel point in the fused image, wherein I(i, j) is the gray value of the pixel point (i, j) in the fused image;

or,

rounding the value of p1 × IR(i, j) + p2 × IV(i, j) by rounding rules, and determining the value obtained by rounding as the gray value of the pixel point (i, j) in the fused image.
In the first embodiment, the determination manner of the gray value of the pixel point (i, j) in the fused image may be: using the formula

I(i, j) = ⌊p1 × IR(i, j) + p2 × IV(i, j)⌋

to determine the gray value of the pixel point (i, j) in the fused image, wherein ⌊t⌋ indicates rounding the value of t down. For example, if t is 3.25, then ⌊t⌋ is 3.
Since the gray value of a pixel point in an image is an integer in [0, 255], the electronic device may sum the first gray value contribution value corresponding to the pixel point (i, j) in the infrared image to be fused and the second gray value contribution value corresponding to the pixel point (i, j) in the visible light image to be fused, round the sum down to the nearest integer, and use the rounded-down value as the gray value of the pixel point (i, j) in the fused image.
For example, if the first gray value contribution value corresponding to the pixel point (18, 45) in the infrared image to be fused is 96.82 and the second gray value contribution value corresponding to the pixel point (18, 45) in the visible light image to be fused is 46.11, then the gray value of the pixel point (18, 45) in the fused image is ⌊96.82 + 46.11⌋ = ⌊142.93⌋ = 142.
According to the method, the electronic equipment can calculate the gray value of each pixel point in the fused image, and the fused image obtained after the image fusion of the infrared image to be fused and the visible light image to be fused is obtained.
In a second embodiment, the determination manner of the gray value of the pixel point (i, j) in the fused image may be: using the formula

I(i, j) = ⌈p1 × IR(i, j) + p2 × IV(i, j)⌉

to determine the gray value of the pixel point (i, j) in the fused image, wherein ⌈t⌉ indicates rounding the value of t up. For example, if t is 3.25, then ⌈t⌉ is 4.
Since the gray value of a pixel point in an image is an integer in [0, 255], the electronic device may sum the first gray value contribution value corresponding to the pixel point (i, j) in the infrared image to be fused and the second gray value contribution value corresponding to the pixel point (i, j) in the visible light image to be fused, round the sum up to the nearest integer, and use the rounded-up value as the gray value of the pixel point (i, j) in the fused image.
For example, if the first gray value contribution value corresponding to the pixel point (18, 45) in the infrared image to be fused is 96.82 and the second gray value contribution value corresponding to the pixel point (18, 45) in the visible light image to be fused is 46.11, then the gray value of the pixel point (18, 45) in the fused image is ⌈96.82 + 46.11⌉ = ⌈142.93⌉ = 143.
When the gray value of each pixel point in the fused image is calculated in this manner, a value larger than 255 may be obtained. If the gray value of the pixel point (i, j) in the fused image determined by the formula I(i, j) = ⌈p1 × IR(i, j) + p2 × IV(i, j)⌉ is larger than 255, the electronic device may set the gray value of the pixel point (i, j) to 255, so as to ensure that the gray values of the pixel points in the fused image all fall within the normal range of image gray values.
In this way, the electronic device can calculate the gray value of each pixel point in the fused image, thereby obtaining the fused image that results from fusing the infrared image to be fused with the visible light image to be fused.
In a third embodiment, the gray value of the pixel point (i, j) in the fused image may be determined as follows: round the value of p1 × IR(i, j) + p2 × IV(i, j) to the nearest integer, and determine the rounded value as the gray value of the pixel point (i, j) in the fused image.
Since the gray value of a pixel point in an image is an integer in [0, 255], the electronic device may sum the first gray value contribution value corresponding to the pixel point (i, j) in the infrared image to be fused and the second gray value contribution value corresponding to the pixel point (i, j) in the visible light image to be fused, round the sum to the nearest integer, and determine the rounded value as the gray value of the pixel point (i, j) in the fused image.
For example, if the first gray value contribution value corresponding to the pixel point (1127, 829) in the infrared image to be fused is 57.55 and the second gray value contribution value corresponding to the pixel point (1127, 829) in the visible light image to be fused is 125.63, then the gray value of the pixel point (1127, 829) in the fused image is 57.55 + 125.63 = 183.18 rounded to the nearest integer, which is 183.
If the first gray value contribution value corresponding to the pixel point (1127, 829) in the infrared image to be fused is 57.95 and the second gray value contribution value corresponding to the pixel point (1127, 829) in the visible light image to be fused is 125.63, then the gray value of the pixel point (1127, 829) in the fused image is 57.95 + 125.63 = 183.58 rounded to the nearest integer, which is 184.
A value greater than 255 may likewise be obtained when the gray value of a pixel point in the fused image is calculated in this way. If the gray value determined for the pixel point (i, j) in the fused image is greater than 255, the electronic device may set the gray value of the pixel point (i, j) to 255, so that the gray values of the pixel points in the fused image stay within the gray value range of a normal image.
In this way, the electronic device can calculate the gray value of each pixel point in the fused image, thereby obtaining the fused image that results from fusing the infrared image to be fused with the visible light image to be fused.
Therefore, in this embodiment, the electronic device may determine the gray value of each pixel point in the fused image in any one of the three manners. Because each gray value in the fused image is determined from the gray values of the corresponding pixel points in the infrared image to be fused and the visible light image to be fused, the fused image contains information from both images, its details are richer, and it is well suited for subsequent further processing.
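The three determination manners described above can be illustrated with a short Python sketch (not part of the original disclosure; the function name and argument layout are illustrative, while the clamping behavior follows the text's rule that gray values above 255 are set to 255):

```python
import math

def fuse_gray(a, b, mode="round"):
    """Combine a first gray value contribution value a and a second
    gray value contribution value b into a fused gray value, clamped
    to the normal pixel range [0, 255]."""
    s = a + b
    if mode == "floor":    # first manner: round the sum down
        v = math.floor(s)
    elif mode == "ceil":   # second manner: round the sum up
        v = math.ceil(s)
    else:                  # third manner: round to the nearest integer
        v = round(s)       # note: Python rounds exact .5 ties to even
    return min(v, 255)     # values above 255 are set to 255

# From the examples in the text:
# 96.82 + 46.11 = 142.93 -> 142 (round down) or 143 (round up)
# 57.55 + 125.63 = 183.18 -> 183; 57.95 + 125.63 = 183.58 -> 184
```

Applying the chosen mode uniformly to every pixel position produces the complete fused image.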
As an implementation manner of the embodiment of the present invention, the method may further include: and carrying out target detection processing on the fused image to obtain a target detection result.
After the fused image is obtained, the electronic device can perform target detection processing on it according to actual needs and obtain the required target detection result. For example, the electronic device may input the fused image into a deep learning network model for target detection, such as RCNN or Fast-RCNN; the model is not specifically limited herein.
For example, if the infrared image to be fused and the visible light image to be fused are monitoring images in an intelligent traffic system and the fused image is used for determining license plate numbers, license plate detection processing can be performed on the fused image; the license plates in the fused image are the detection targets, and the recognized license plate numbers are the target detection result.

For another example, if the infrared image to be fused and the visible light image to be fused are monitoring images in a monitoring system and the fused image is used for determining whether suspicious people exist in the monitored scene, face detection processing may be performed on the fused image; the faces in the fused image are the detection targets, and the face recognition result is the target detection result.

For another example, if the infrared image to be fused and the visible light image to be fused are captured images in a robot arm grasping system and the fused image is used for determining the target object to be grasped and its position, target object detection processing may be performed on the fused image; the target object in the fused image is the detection target, and the detected position of the target object is the target detection result.
Therefore, in this embodiment, after the electronic device obtains the fusion image, the electronic device may further perform target detection processing on the fusion image to obtain a required target detection result.
Corresponding to the image fusion method, the embodiment of the invention also provides an image fusion device. The following describes an image fusion apparatus provided in an embodiment of the present invention.
As shown in fig. 6, an image fusion apparatus, the apparatus comprising:
the image acquisition module 610 is used for acquiring an infrared image to be fused and a visible light image to be fused;
the weight determining module 620 is configured to determine a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to brightness distribution conditions of the infrared image to be fused and the visible light image to be fused;
a gray value determining module 630, configured to determine a gray value of a pixel point in the fused image according to the first fusion weight, the second fusion weight, the gray value of the pixel point in the visible light image to be fused, and the gray value of the pixel point in the infrared image to be fused.
Therefore, in the scheme provided by the embodiment of the invention, the electronic device may first obtain the infrared image to be fused and the visible light image to be fused, then determine a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the brightness distribution conditions of the infrared image to be fused and the visible light image to be fused, and finally determine the gray value of the pixel point in the fusion image according to the first fusion weight, the second fusion weight, the gray value of the pixel point in the visible light image to be fused and the gray value of the pixel point in the infrared image to be fused. By adopting the scheme, the visible light image to be fused and the infrared image to be fused can be fused, so that the fused image which is slightly influenced by illumination and weather conditions and has richer details is obtained, and the fused image can be further processed conveniently in the follow-up process.
As an implementation manner of the embodiment of the present invention, the weight determining module 620 may include:
a first brightness distribution coefficient determining submodule (not shown in fig. 6) configured to determine a first brightness distribution coefficient of the to-be-fused infrared image based on a gray value of a pixel point in the to-be-fused infrared image;
a second brightness distribution coefficient determining submodule (not shown in fig. 6) configured to determine a second brightness distribution coefficient of the to-be-fused visible light image based on a gray value of a pixel point in the to-be-fused visible light image;
and a weight determining submodule (not shown in fig. 6) configured to calculate a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the first brightness distribution coefficient and the second brightness distribution coefficient.
As an implementation manner of the embodiment of the present invention, the first luminance distribution coefficient determining sub-module may include:
a first number determining unit (not shown in fig. 6) configured to determine a first number of pixel points in the to-be-fused infrared image, where a gray value of the to-be-fused infrared image is smaller than a first preset gray value;
a second number determining unit (not shown in fig. 6) configured to determine a second number of pixel points in the to-be-fused infrared image, where the gray value is not less than the first preset gray value;
a first brightness distribution coefficient determining unit (not shown in fig. 6) configured to determine a first brightness distribution coefficient of the to-be-fused infrared image according to the first number and the second number;
the second luminance distribution coefficient determination sub-module may include:
a third number determining unit (not shown in fig. 6) configured to determine a third number of pixel points in the to-be-fused visible light image, where the gray value of the pixel points is smaller than a second preset gray value;
a fourth number determining unit (not shown in fig. 6) configured to determine a fourth number of pixel points in the to-be-fused visible light image, where the gray value is not less than the second preset gray value;
and a second brightness distribution coefficient determining unit (not shown in fig. 6) configured to determine a second brightness distribution coefficient of the visible light image to be fused according to the third number and the fourth number.
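The four counting units above all perform the same basic operation: splitting an image's pixels by a preset gray value. A minimal sketch (illustrative Python, not from the original; the image is represented as a flat list of gray values, and the function name is an assumption):

```python
def count_by_threshold(gray_values, preset_gray):
    """Count pixels whose gray value is below the preset gray value,
    and pixels whose gray value is not less than it."""
    below = sum(1 for g in gray_values if g < preset_gray)
    return below, len(gray_values) - below

# Applied to the infrared image with the first preset gray value, this
# yields (N1, N2); applied to the visible light image with the second
# preset gray value, it yields (N3, N4).
```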
As an implementation manner of the embodiment of the present invention, the first luminance distribution coefficient determining unit may include:
a first luminance distribution coefficient determining subunit (not shown in fig. 6) configured to determine a first brightness distribution coefficient of the infrared image to be fused from the first number and the second number, wherein N1 is the first number, N2 is the second number, and n1 is the first brightness distribution coefficient;
the second luminance distribution coefficient determining unit may include:
a second luminance distribution coefficient determining subunit (not shown in fig. 6) configured to determine a second brightness distribution coefficient of the visible light image to be fused from the third number and the fourth number, wherein N3 is the third number, N4 is the fourth number, and n2 is the second brightness distribution coefficient.
As an implementation manner of the embodiment of the present invention, the weight determining submodule may include:
a first fusion weight calculation unit (not shown in fig. 6) configured to calculate, from the first brightness distribution coefficient and the second brightness distribution coefficient, a first fusion weight corresponding to the infrared image to be fused;
a second fusion weight calculation unit (not shown in fig. 6) configured to calculate a second fusion weight corresponding to the visible light image to be fused, or to calculate the second fusion weight corresponding to the visible light image to be fused by using the formula p2 = 1 - p1;
wherein n1 is the first luminance distribution coefficient, n2 is the second luminance distribution coefficient, and p1 is the first fusion weight; p2 is the second fusion weight.
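Since the patent's explicit formulas for p1 and p2 survive here only as unreproduced images, the following Python sketch uses an assumed simple normalization for p1; only the relation p2 = 1 - p1 is taken directly from the text:

```python
def fusion_weights(n1, n2):
    """Derive fusion weights from the two brightness distribution
    coefficients. p1 = n1 / (n1 + n2) is an assumed stand-in for the
    patent's (unreproduced) formula; p2 = 1 - p1 is stated in the text."""
    p1 = n1 / (n1 + n2)
    p2 = 1.0 - p1
    return p1, p2
```

Whatever the exact formula, the two weights sum to 1, so each fused gray value is a weighted combination of the corresponding infrared and visible light gray values.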
As an implementation manner of the embodiment of the present invention, the gray value determining module 630 may include:
a first gray value contribution value determining submodule (not shown in fig. 6) configured to calculate, according to the first fusion weight and a gray value of a pixel point in the infrared image to be fused, a first gray value contribution value of the pixel point in the infrared image to be fused, which corresponds to the pixel point in the fusion image;
a second gray value contribution value determining submodule (not shown in fig. 6) configured to calculate, according to the second fusion weight and the gray value of the pixel point in the visible light image to be fused, a second gray value contribution value of the pixel point in the visible light image to be fused, which corresponds to the pixel point in the fused image;
a gray value determination submodule (not shown in fig. 6) configured to determine a gray value of a pixel point in the fused image based on a sum of the first gray value contribution value and the second gray value contribution value.
As an implementation manner of the embodiment of the present invention, the determining sub-module of the first gray value contribution value may include:
a first gray value contribution value determining unit (not shown in fig. 6) configured to calculate a first gray value contribution value of a pixel point in the infrared image to be fused corresponding to a pixel point in the fused image by using the formula A = p1 × IR(i, j), wherein p1 is the first fusion weight, IR(i, j) is the gray value of the pixel point (i, j) in the infrared image to be fused, and A is the first gray value contribution value;
the second gray value contribution value determination sub-module may include:
a second gray value contribution value determining unit (not shown in fig. 6) configured to calculate a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image by using the formula B = p2 × IV(i, j), wherein p2 is the second fusion weight, IV(i, j) is the gray value of the pixel point (i, j) in the visible light image to be fused, and B is the second gray value contribution value.
As an implementation manner of the embodiment of the present invention, the gray value determining sub-module may include:
a gray value determining unit (not shown in fig. 6) configured to determine the gray value of a pixel point in the fused image by using the formula I(i, j) = ⌊p1 × IR(i, j) + p2 × IV(i, j)⌋ (rounding the sum down), wherein I(i, j) is the gray value of the pixel point (i, j) in the fused image;
or, alternatively,
configured to determine the gray value of a pixel point in the fused image by using the formula I(i, j) = ⌈p1 × IR(i, j) + p2 × IV(i, j)⌉ (rounding the sum up), wherein I(i, j) is the gray value of the pixel point (i, j) in the fused image;
or, alternatively,
configured to round the value of p1 × IR(i, j) + p2 × IV(i, j) to the nearest integer and determine the rounded value as the gray value of the pixel point (i, j) in the fused image.
As an implementation manner of the embodiment of the present invention, the apparatus may further include:
and a target detection module (not shown in fig. 6) configured to perform target detection processing on the fused image to obtain a target detection result.
An embodiment of the present invention further provides an electronic device, as shown in fig. 7, including a processor 701, a communication interface 702, a memory 703 and a communication bus 704, where the processor 701, the communication interface 702, and the memory 703 complete mutual communication through the communication bus 704,
a memory 703 for storing a computer program;
the processor 701 is configured to implement the following steps when executing the program stored in the memory 703:
acquiring an infrared image to be fused and a visible light image to be fused;
determining a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the brightness distribution conditions of the infrared image to be fused and the visible light image to be fused;
and determining the gray value of the pixel point in the fused image according to the first fusion weight, the second fusion weight, the gray value of the pixel point in the visible light image to be fused and the gray value of the pixel point in the infrared image to be fused.
Therefore, in the scheme provided by the embodiment of the invention, the electronic device may first obtain the infrared image to be fused and the visible light image to be fused, then determine a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the brightness distribution conditions of the infrared image to be fused and the visible light image to be fused, and finally determine the gray value of the pixel point in the fusion image according to the first fusion weight, the second fusion weight, the gray value of the pixel point in the visible light image to be fused and the gray value of the pixel point in the infrared image to be fused. By adopting the scheme, the visible light image to be fused and the infrared image to be fused can be fused, so that the fused image which is slightly influenced by illumination and weather conditions and has richer details is obtained, and the fused image can be further processed conveniently in the follow-up process.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The Memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The Processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
The step of determining the first fusion weight corresponding to the infrared image to be fused and the second fusion weight corresponding to the visible light image to be fused according to the brightness distribution conditions of the infrared image to be fused and the visible light image to be fused may include:
determining a first brightness distribution coefficient of the infrared image to be fused based on the gray value of the pixel point in the infrared image to be fused;
determining a second brightness distribution coefficient of the visible light image to be fused based on the gray value of the pixel point in the visible light image to be fused;
and calculating a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the first brightness distribution coefficient and the second brightness distribution coefficient.
The step of determining the first brightness distribution coefficient of the to-be-fused infrared image based on the gray value of the pixel point in the to-be-fused infrared image may include:
determining a first number of pixel points of which the gray values are smaller than a first preset gray value in the infrared image to be fused;
determining a second number of pixel points of which the gray values are not smaller than the first preset gray value in the infrared image to be fused;
determining a first brightness distribution coefficient of the infrared image to be fused according to the first quantity and the second quantity;
the step of determining the second brightness distribution coefficient of the to-be-fused visible light image based on the gray value of the pixel point in the to-be-fused visible light image may include:
determining a third number of pixel points with gray values smaller than a second preset gray value in the visible light image to be fused;
determining a fourth number of pixel points of which the gray values are not less than the second preset gray value in the visible light image to be fused;
and determining a second brightness distribution coefficient of the visible light image to be fused according to the third quantity and the fourth quantity.
The step of determining the first brightness distribution coefficient of the to-be-fused infrared image according to the first number and the second number may include:
determining, from the first number and the second number, a first brightness distribution coefficient of the infrared image to be fused, wherein N1 is the first number, N2 is the second number, and n1 is the first brightness distribution coefficient;
the step of determining a second brightness distribution coefficient of the visible light image to be fused according to the third number and the fourth number includes:
determining, from the third number and the fourth number, a second brightness distribution coefficient of the visible light image to be fused, wherein N3 is the third number, N4 is the fourth number, and n2 is the second brightness distribution coefficient.
The step of calculating a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the first brightness distribution coefficient and the second brightness distribution coefficient may include:
calculating, from the first brightness distribution coefficient and the second brightness distribution coefficient, a first fusion weight corresponding to the infrared image to be fused;
calculating a second fusion weight corresponding to the visible light image to be fused, or calculating the second fusion weight corresponding to the visible light image to be fused by using the formula p2 = 1 - p1;
wherein n1 is the first luminance distribution coefficient, n2 is the second luminance distribution coefficient, and p1 is the first fusion weight; p2 is the second fusion weight.
The step of determining the gray value of the pixel point in the fused image according to the first fusion weight, the second fusion weight, the gray value of the pixel point in the visible light image to be fused, and the gray value of the pixel point in the infrared image to be fused may include:
calculating a first gray value contribution value of the pixel point in the infrared image to be fused corresponding to the pixel point in the fused image according to the first fusion weight and the gray value of the pixel point in the infrared image to be fused;
calculating a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image according to the second fusion weight and the gray value of the pixel point in the visible light image to be fused;
and determining the gray value of the pixel point in the fusion image based on the sum of the first gray value contribution value and the second gray value contribution value.
The step of calculating a first gray value contribution value of the pixel point in the infrared image to be fused corresponding to the pixel point in the fused image according to the first fusion weight and the gray value of the pixel point in the infrared image to be fused may include:
calculating a first gray value contribution value of a pixel point in the infrared image to be fused corresponding to a pixel point in the fused image by using the formula A = p1 × IR(i, j), wherein p1 is the first fusion weight, IR(i, j) is the gray value of the pixel point (i, j) in the infrared image to be fused, and A is the first gray value contribution value;
the step of calculating a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image according to the second fusion weight and the gray value of the pixel point in the visible light image to be fused may include:
and calculating a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image by using the formula B = p2 × IV(i, j), wherein p2 is the second fusion weight, IV(i, j) is the gray value of the pixel point (i, j) in the visible light image to be fused, and B is the second gray value contribution value.
The determining the gray value of the pixel point in the fused image based on the sum of the first gray value contribution value and the second gray value contribution value may include:
determining the gray value of a pixel point in the fused image by using the formula I(i, j) = ⌊p1 × IR(i, j) + p2 × IV(i, j)⌋ (rounding the sum down), wherein I(i, j) is the gray value of the pixel point (i, j) in the fused image;
or, alternatively,
determining the gray value of a pixel point in the fused image by using the formula I(i, j) = ⌈p1 × IR(i, j) + p2 × IV(i, j)⌉ (rounding the sum up), wherein I(i, j) is the gray value of the pixel point (i, j) in the fused image;
or, alternatively,
rounding the value of p1 × IR(i, j) + p2 × IV(i, j) to the nearest integer, and determining the rounded value as the gray value of the pixel point (i, j) in the fused image.
The method may further include:
and carrying out target detection processing on the fused image to obtain a target detection result.
An embodiment of the present invention further provides a computer-readable storage medium, in which a computer program is stored, and when executed by a processor, the computer program implements the following steps:
acquiring an infrared image to be fused and a visible light image to be fused;
determining a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the brightness distribution conditions of the infrared image to be fused and the visible light image to be fused;
and determining the gray value of the pixel point in the fused image according to the first fusion weight, the second fusion weight, the gray value of the pixel point in the visible light image to be fused and the gray value of the pixel point in the infrared image to be fused.
Therefore, in the scheme provided by the embodiment of the invention, when the computer program is executed by the processor, the infrared image to be fused and the visible light image to be fused are firstly obtained, then the first fusion weight corresponding to the infrared image to be fused and the second fusion weight corresponding to the visible light image to be fused are determined according to the brightness distribution condition of the infrared image to be fused and the visible light image to be fused, and finally the gray value of the pixel point in the fusion image is determined according to the first fusion weight, the second fusion weight, the gray value of the pixel point in the visible light image to be fused and the gray value of the pixel point in the infrared image to be fused. By adopting the scheme, the visible light image to be fused and the infrared image to be fused can be fused, so that the fused image which is slightly influenced by illumination and weather conditions and has richer details is obtained, and the fused image can be further processed conveniently in the follow-up process.
The step of determining the first fusion weight corresponding to the infrared image to be fused and the second fusion weight corresponding to the visible light image to be fused according to the brightness distribution conditions of the infrared image to be fused and the visible light image to be fused may include:
determining a first brightness distribution coefficient of the infrared image to be fused based on the gray value of the pixel point in the infrared image to be fused;
determining a second brightness distribution coefficient of the visible light image to be fused based on the gray value of the pixel point in the visible light image to be fused;
and calculating a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the first brightness distribution coefficient and the second brightness distribution coefficient.
The step of determining the first brightness distribution coefficient of the to-be-fused infrared image based on the gray value of the pixel point in the to-be-fused infrared image may include:
determining a first number of pixel points of which the gray values are smaller than a first preset gray value in the infrared image to be fused;
determining a second number of pixel points of which the gray values are not smaller than the first preset gray value in the infrared image to be fused;
determining a first brightness distribution coefficient of the infrared image to be fused according to the first quantity and the second quantity;
the step of determining the second brightness distribution coefficient of the to-be-fused visible light image based on the gray value of the pixel point in the to-be-fused visible light image may include:
determining a third number of pixel points with gray values smaller than a second preset gray value in the visible light image to be fused;
determining a fourth number of pixel points of which the gray values are not less than the second preset gray value in the visible light image to be fused;
and determining a second brightness distribution coefficient of the visible light image to be fused according to the third quantity and the fourth quantity.
The step of determining the first brightness distribution coefficient of the to-be-fused infrared image according to the first number and the second number may include:
determining, from the first number and the second number, a first brightness distribution coefficient of the infrared image to be fused, wherein N1 is the first number, N2 is the second number, and n1 is the first brightness distribution coefficient;
the step of determining a second brightness distribution coefficient of the visible light image to be fused according to the third number and the fourth number includes:
determining, from the third number and the fourth number, a second brightness distribution coefficient of the visible light image to be fused, wherein N3 is the third number, N4 is the fourth number, and n2 is the second brightness distribution coefficient.
The step of calculating a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the first brightness distribution coefficient and the second brightness distribution coefficient may include:
calculating, from the first brightness distribution coefficient and the second brightness distribution coefficient, a first fusion weight corresponding to the infrared image to be fused;
using formulas
Figure BDA0001623570910000334
Calculating a second fusion weight corresponding to the visible light image to be fused, or calculating the second fusion weight corresponding to the visible light image to be fused by using a formula p 2-1-p 1;
wherein n1 is the first luminance distribution coefficient, n2 is the second luminance distribution coefficient, and p1 is the first fusion weight; p2 is the second fusion weight.
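A minimal sketch of the weight step follows. The two weight formulas appear only as embedded images, so the normalization used here is an assumption; it is, however, consistent with the stated alternative p2 = 1 - p1, since the two weights must sum to one.

```python
def fusion_weights(n1, n2):
    """Turn the two brightness distribution coefficients into fusion
    weights.  The normalization is an assumption (the patent's formulas
    are embedded images), but it guarantees p1 + p2 = 1, matching the
    alternative computation p2 = 1 - p1 given in the text."""
    p1 = n1 / (n1 + n2)
    p2 = 1.0 - p1  # equivalently n2 / (n1 + n2)
    return p1, p2
```

With n1 = 0.75 and n2 = 0.25, for instance, the infrared image would receive weight p1 = 0.75 and the visible light image p2 = 0.25.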
The step of determining the gray value of the pixel point in the fused image according to the first fusion weight, the second fusion weight, the gray value of the pixel point in the visible light image to be fused, and the gray value of the pixel point in the infrared image to be fused may include:
calculating a first gray value contribution value of the pixel point in the infrared image to be fused corresponding to the pixel point in the fused image according to the first fusion weight and the gray value of the pixel point in the infrared image to be fused;
calculating a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image according to the second fusion weight and the gray value of the pixel point in the visible light image to be fused;
and determining the gray value of the pixel point in the fusion image based on the sum of the first gray value contribution value and the second gray value contribution value.
The step of calculating a first gray value contribution value of the pixel point in the infrared image to be fused corresponding to the pixel point in the fused image according to the first fusion weight and the gray value of the pixel point in the infrared image to be fused may include:
calculating a first gray value contribution value of a pixel point in the infrared image to be fused corresponding to a pixel point in the fused image by using the formula A = p1 × IR(i, j), wherein p1 is the first fusion weight, IR(i, j) is the gray value of the pixel point (i, j) in the infrared image to be fused, and A is the first gray value contribution value;
the step of calculating a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image according to the second fusion weight and the gray value of the pixel point in the visible light image to be fused may include:
calculating a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image by using the formula B = p2 × IV(i, j), wherein p2 is the second fusion weight, IV(i, j) is the gray value of the pixel point (i, j) in the visible light image to be fused, and B is the second gray value contribution value.
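The two contribution formulas, A = p1 × IR(i, j) and B = p2 × IV(i, j), are stated explicitly in the text and can be sketched per pixel:

```python
def gray_value_contributions(ir_gray, iv_gray, p1, p2):
    """First and second gray value contribution values for one pixel:
    A = p1 * IR(i, j) and B = p2 * IV(i, j)."""
    a = p1 * ir_gray  # contribution of the infrared pixel
    b = p2 * iv_gray  # contribution of the visible light pixel
    return a, b

a, b = gray_value_contributions(100, 200, 0.25, 0.75)  # a = 25.0, b = 150.0
```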
The step of determining the gray value of the pixel point in the fused image based on the sum of the first gray value contribution value and the second gray value contribution value may include:
using the formula
Figure BDA0001623570910000341
determining the gray value of a pixel point in the fused image, wherein I(i, j) is the gray value of the pixel point (i, j) in the fused image;
or, alternatively,
using the formula
Figure BDA0001623570910000351
determining the gray value of a pixel point in the fused image, wherein I(i, j) is the gray value of the pixel point (i, j) in the fused image;
or, alternatively,
rounding the value of p1 × IR(i, j) + p2 × IV(i, j) according to rounding rules, and determining the rounded value as the gray value of the pixel point (i, j) in the fused image.
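The three alternatives above can be sketched per pixel. The first two formulas are shown only as images, so reading them as floor and ceiling of the weighted sum is an assumption; the third alternative (rounding rules) is explicit, and is implemented here as round-half-up, the conventional reading of that rule.

```python
import math

def fused_gray_value(ir_gray, iv_gray, p1, p2, mode="round"):
    """I(i, j) from the weighted sum p1*IR(i, j) + p2*IV(i, j).
    'floor' and 'ceil' are assumptions standing in for the two formulas
    shown only as images; 'round' implements the explicit rounding-rule
    alternative as round-half-up."""
    s = p1 * ir_gray + p2 * iv_gray
    if mode == "floor":
        return math.floor(s)
    if mode == "ceil":
        return math.ceil(s)
    return math.floor(s + 0.5)  # round half up

fused_gray_value(101, 200, 0.4, 0.6)  # 0.4*101 + 0.6*200 = 160.4 -> 160
```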
The method may further include:
performing target detection processing on the fused image to obtain a target detection result.
It should be noted that, for the above-mentioned apparatus, electronic device and computer-readable storage medium embodiments, since they are basically similar to the method embodiments, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiments.
It is further noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (16)

1. An image fusion method, characterized in that the method comprises:
acquiring an infrared image to be fused and a visible light image to be fused;
determining a first brightness distribution coefficient of the infrared image to be fused according to a first quantity and a second quantity, wherein the first quantity is the quantity of pixel points with the gray values smaller than a first preset gray value in the infrared image to be fused, and the second quantity is the quantity of pixel points with the gray values not smaller than the first preset gray value in the infrared image to be fused;
determining a second brightness distribution coefficient of the visible light image to be fused according to a third quantity and a fourth quantity, wherein the third quantity is the quantity of pixel points of which the gray values are smaller than a second preset gray value in the visible light image to be fused, and the fourth quantity is the quantity of pixel points of which the gray values are not smaller than the second preset gray value in the visible light image to be fused;
calculating a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the first brightness distribution coefficient and the second brightness distribution coefficient, wherein the first fusion weight represents the proportion of the gray value of the pixel point in the infrared image to be fused in the gray value of the corresponding pixel point in the fusion image, and the second fusion weight represents the proportion of the gray value of the pixel point in the visible light image to be fused in the gray value of the corresponding pixel point in the fusion image;
and determining the gray value of the pixel point in the fused image according to the first fusion weight, the second fusion weight, the gray value of the pixel point in the visible light image to be fused and the gray value of the pixel point in the infrared image to be fused.
2. The method according to claim 1, wherein the step of determining the first luminance distribution coefficient of the infrared image to be fused according to the first number and the second number comprises:
using the formula
Figure FDA0003078380770000011
determining a first brightness distribution coefficient of the infrared image to be fused, wherein N1 is the first number, N2 is the second number, and n1 is the first brightness distribution coefficient;
the step of determining the second brightness distribution coefficient of the visible light image to be fused according to the third number and the fourth number includes:
using the formula
Figure FDA0003078380770000021
determining a second brightness distribution coefficient of the visible light image to be fused, wherein N3 is the third number, N4 is the fourth number, and n2 is the second brightness distribution coefficient.
3. The method according to claim 1, wherein the step of calculating a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the first brightness distribution coefficient and the second brightness distribution coefficient comprises:
using the formula
Figure FDA0003078380770000022
calculating a first fusion weight corresponding to the infrared image to be fused;
using the formula
Figure FDA0003078380770000023
calculating a second fusion weight corresponding to the visible light image to be fused, or calculating the second fusion weight corresponding to the visible light image to be fused by using the formula p2 = 1 - p1;
wherein n1 is the first brightness distribution coefficient, n2 is the second brightness distribution coefficient, p1 is the first fusion weight, and p2 is the second fusion weight.
4. The method of claim 1, wherein the step of determining the gray-level values of the pixels in the fused image according to the first fusion weight, the second fusion weight, the gray-level values of the pixels in the visible-light image to be fused, and the gray-level values of the pixels in the infrared image to be fused comprises:
calculating a first gray value contribution value of the pixel point in the infrared image to be fused corresponding to the pixel point in the fused image according to the first fusion weight and the gray value of the pixel point in the infrared image to be fused;
calculating a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image according to the second fusion weight and the gray value of the pixel point in the visible light image to be fused;
and determining the gray value of the pixel point in the fusion image based on the sum of the first gray value contribution value and the second gray value contribution value.
5. The method according to claim 4, wherein the step of calculating a first gray value contribution value of a pixel point in the infrared image to be fused corresponding to a pixel point in the fused image according to the first fusion weight and the gray value of the pixel point in the infrared image to be fused comprises:
calculating a first gray value contribution value of a pixel point in the infrared image to be fused corresponding to a pixel point in the fused image by using the formula A = p1 × IR(i, j), wherein p1 is the first fusion weight, IR(i, j) is the gray value of the pixel point (i, j) in the infrared image to be fused, and A is the first gray value contribution value;
the step of calculating a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image according to the second fusion weight and the gray value of the pixel point in the visible light image to be fused comprises the following steps:
calculating a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image by using the formula B = p2 × IV(i, j), wherein p2 is the second fusion weight, IV(i, j) is the gray value of the pixel point (i, j) in the visible light image to be fused, and B is the second gray value contribution value.
6. The method of claim 5, wherein the step of determining the gray value of the pixel point in the fused image based on the sum of the first gray value contribution value and the second gray value contribution value comprises:
using the formula
Figure FDA0003078380770000031
determining the gray value of a pixel point in the fused image, wherein I(i, j) is the gray value of the pixel point (i, j) in the fused image;
or, alternatively,
using the formula
Figure FDA0003078380770000032
determining the gray value of a pixel point in the fused image, wherein I(i, j) is the gray value of the pixel point (i, j) in the fused image;
or, alternatively,
rounding the value of p1 × IR(i, j) + p2 × IV(i, j) according to rounding rules, and determining the rounded value as the gray value of the pixel point (i, j) in the fused image.
7. The method of any one of claims 1-6, further comprising:
and carrying out target detection processing on the fused image to obtain a target detection result.
8. An image fusion apparatus, characterized in that the apparatus comprises:
an image acquisition module, configured to acquire an infrared image to be fused and a visible light image to be fused;
a first brightness distribution coefficient determining unit, configured to determine a first brightness distribution coefficient of the infrared image to be fused according to a first number and a second number, wherein the first number is the number of pixel points of which the gray value is smaller than a first preset gray value in the infrared image to be fused, and the second number is the number of pixel points of which the gray value is not smaller than the first preset gray value in the infrared image to be fused;
a second brightness distribution coefficient determining unit, configured to determine a second brightness distribution coefficient of the visible light image to be fused according to a third number and a fourth number, wherein the third number is the number of pixel points of which the gray value is smaller than a second preset gray value in the visible light image to be fused, and the fourth number is the number of pixel points of which the gray value is not smaller than the second preset gray value in the visible light image to be fused;
a weight determination submodule, configured to calculate a first fusion weight corresponding to the infrared image to be fused and a second fusion weight corresponding to the visible light image to be fused according to the first brightness distribution coefficient and the second brightness distribution coefficient, wherein the first fusion weight represents the proportion of the gray value of a pixel point in the infrared image to be fused in the gray value of the corresponding pixel point in the fused image, and the second fusion weight represents the proportion of the gray value of a pixel point in the visible light image to be fused in the gray value of the corresponding pixel point in the fused image;
and a gray value determining module, configured to determine the gray value of a pixel point in the fused image according to the first fusion weight, the second fusion weight, the gray value of the pixel point in the visible light image to be fused, and the gray value of the pixel point in the infrared image to be fused.
9. The apparatus of claim 8, wherein the first luminance distribution coefficient determining unit comprises:
a first brightness distribution coefficient determining subunit for using the formula
Figure FDA0003078380770000041
to determine a first brightness distribution coefficient of the infrared image to be fused, wherein N1 is the first number, N2 is the second number, and n1 is the first brightness distribution coefficient;
the second luminance distribution coefficient determining unit includes:
a second brightness distribution coefficient determining subunit for using the formula
Figure FDA0003078380770000051
to determine a second brightness distribution coefficient of the visible light image to be fused, wherein N3 is the third number, N4 is the fourth number, and n2 is the second brightness distribution coefficient.
10. The apparatus of claim 8, wherein the weight determination submodule comprises:
a first fusion weight calculation unit for using the formula
Figure FDA0003078380770000052
Calculating a first fusion weight corresponding to the infrared image to be fused;
a second fusion weight calculation unit for using the formula
Figure FDA0003078380770000053
calculating a second fusion weight corresponding to the visible light image to be fused, or calculating the second fusion weight corresponding to the visible light image to be fused by using the formula p2 = 1 - p1;
wherein n1 is the first brightness distribution coefficient, n2 is the second brightness distribution coefficient, p1 is the first fusion weight, and p2 is the second fusion weight.
11. The apparatus of claim 8, wherein the gray value determination module comprises:
a first gray value contribution value determining submodule, configured to calculate, according to the first fusion weight and a gray value of a pixel point in the to-be-fused infrared image, a first gray value contribution value of the pixel point in the to-be-fused infrared image, which corresponds to the pixel point in the to-be-fused image;
the second gray value contribution value determining submodule is used for calculating a second gray value contribution value of the pixel point in the visible light image to be fused corresponding to the pixel point in the fused image according to the second fusion weight and the gray value of the pixel point in the visible light image to be fused;
and the gray value determining submodule is used for determining the gray value of the pixel point in the fusion image based on the sum of the first gray value contribution value and the second gray value contribution value.
12. The apparatus of claim 11, wherein the first gray value contribution value determination submodule comprises:
a first gray value contribution value determining unit, configured to calculate, by using the formula A = p1 × IR(i, j), a first gray value contribution value of a pixel point in the infrared image to be fused corresponding to a pixel point in the fused image, wherein p1 is the first fusion weight, IR(i, j) is the gray value of the pixel point (i, j) in the infrared image to be fused, and A is the first gray value contribution value;
the second gray value contribution value determination submodule includes:
a second gray value contribution value determining unit, configured to calculate a second gray value contribution value of a pixel point in the visible light image to be fused corresponding to a pixel point in the fused image by using the formula B = p2 × IV(i, j), wherein p2 is the second fusion weight, IV(i, j) is the gray value of the pixel point (i, j) in the visible light image to be fused, and B is the second gray value contribution value.
13. The apparatus of claim 12, wherein the gray value determination sub-module comprises:
a gray value determining unit for using a formula
Figure FDA0003078380770000061
determining the gray value of a pixel point in the fused image, wherein I(i, j) is the gray value of the pixel point (i, j) in the fused image;
or, alternatively,
for using the formula
Figure FDA0003078380770000062
determining the gray value of a pixel point in the fused image, wherein I(i, j) is the gray value of the pixel point (i, j) in the fused image;
or, alternatively,
for rounding the value of p1 × IR(i, j) + p2 × IV(i, j), and determining the rounded value as the gray value of the pixel point (i, j) in the fused image.
14. The apparatus of any one of claims 8-13, further comprising:
and the target detection module is used for carrying out target detection processing on the fused image to obtain a target detection result.
15. An electronic device, characterized by comprising a processor, a memory and a communication bus, wherein the processor and the memory communicate with each other through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any one of claims 1 to 7 when executing the program stored in the memory.
16. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, which computer program, when being executed by a processor, carries out the method steps of any one of claims 1 to 7.
CN201810315298.3A 2018-04-10 2018-04-10 Image fusion method and device and electronic equipment Active CN110363731B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810315298.3A CN110363731B (en) 2018-04-10 2018-04-10 Image fusion method and device and electronic equipment


Publications (2)

Publication Number Publication Date
CN110363731A CN110363731A (en) 2019-10-22
CN110363731B true CN110363731B (en) 2021-09-03

Family

ID=68212877

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810315298.3A Active CN110363731B (en) 2018-04-10 2018-04-10 Image fusion method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN110363731B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111104917A (en) * 2019-12-24 2020-05-05 杭州魔点科技有限公司 Face-based living body detection method and device, electronic equipment and medium
CN113516593B (en) * 2020-04-10 2024-02-27 杭州海康威视数字技术股份有限公司 Human eye image detection and restoration method
CN112053392A (en) * 2020-09-17 2020-12-08 南昌航空大学 Rapid registration and fusion method for infrared and visible light images
CN112767298B (en) * 2021-03-16 2023-06-13 杭州海康威视数字技术股份有限公司 Fusion method and device of visible light image and infrared image

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7333270B1 (en) * 2005-06-10 2008-02-19 Omnitech Partners Dual band night vision device
CN101547309A (en) * 2008-03-25 2009-09-30 索尼株式会社 Image processing apparatus, image processing method, and program
CN105069768A (en) * 2015-08-05 2015-11-18 武汉高德红外股份有限公司 Visible-light image and infrared image fusion processing system and fusion method
CN105719263A (en) * 2016-01-22 2016-06-29 昆明理工大学 Visible light and infrared image fusion algorithm based on NSCT domain bottom layer visual features
CN106204509A (en) * 2016-07-07 2016-12-07 西安电子科技大学 Based on region characteristic infrared and visible light image fusion method
WO2017059774A1 (en) * 2015-10-09 2017-04-13 Zhejiang Dahua Technology Co., Ltd. Methods and systems for fusion display of thermal infrared and visible image
CN106600572A (en) * 2016-12-12 2017-04-26 长春理工大学 Adaptive low-illumination visible image and infrared image fusion method
KR20170093490A (en) * 2016-02-05 2017-08-16 에스엘 주식회사 Monitoring system for vehicle
CN107292860A (en) * 2017-07-26 2017-10-24 武汉鸿瑞达信息技术有限公司 A kind of method and device of image procossing
CN107481214A (en) * 2017-08-29 2017-12-15 北京华易明新科技有限公司 A kind of twilight image and infrared image fusion method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3129954A4 (en) * 2014-04-07 2017-10-18 BAE SYSTEMS Information and Electronic Systems Integration Inc. Contrast based image fusion


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Infrared and visible image fusion via gradient transfer and total variation minimization;Jiayi Ma等;《Preprint submitted to Information Fusion》;20151115;第1-23页 *
红外与可见光图像融合算法研究 (Research on infrared and visible light image fusion algorithms); Zhou Yuren (周渝人); China Doctoral Dissertations Full-text Database (Information Science and Technology); 20140915 (No. 09); pp. I138-31 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200706

Address after: 311501 building A1, No. 299, Qiushi Road, Tonglu Economic Development Zone, Tonglu County, Hangzhou City, Zhejiang Province

Applicant after: Hangzhou Haikang Micro Shadow Sensing Technology Co.,Ltd.

Address before: Hangzhou City, Zhejiang province 310051 Binjiang District Qianmo Road No. 555

Applicant before: Hangzhou Hikvision Digital Technology Co.,Ltd.

GR01 Patent grant