CN116797505A - Image fusion method, electronic device and storage medium - Google Patents

Image fusion method, electronic device and storage medium

Info

Publication number
CN116797505A
CN116797505A (application CN202310651289.2A)
Authority
CN
China
Prior art keywords
image
fused
fusion
target
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310651289.2A
Other languages
Chinese (zh)
Inventor
洪铁鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Ck Technology Co ltd
Original Assignee
Chengdu Ck Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Ck Technology Co ltd filed Critical Chengdu Ck Technology Co ltd
Priority to CN202310651289.2A priority Critical patent/CN116797505A/en
Publication of CN116797505A publication Critical patent/CN116797505A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 - Image enhancement or restoration
    • G06T 5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20212 - Image combination
    • G06T 2207/20221 - Image fusion; Image merging


Abstract

The embodiment of the application discloses an image fusion method, an electronic device and a storage medium, wherein the method comprises the following steps: obtaining a target image sequence, wherein the target image sequence comprises a plurality of images to be fused, captured at different exposure levels of the same scene; selecting a well-exposed image to be fused from the target image sequence as a reference image, performing superpixel partitioning on the reference image, and, with the partitioning result of the reference image as the partitioning standard, performing superpixel partitioning on the other images to be fused in the target image sequence, to obtain a plurality of image blocks of each image to be fused; determining a fusion weight value of the pixel points of each image block according to index parameters of the pixel points of each image block; and, based on the fusion weight values of the pixel points of the image blocks at the same position in each image to be fused, fusing the image blocks at the same position to obtain a plurality of fused image blocks at different positions, and stitching the plurality of fused image blocks to obtain a target image.

Description

Image fusion method, electronic device and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image fusion method, an electronic device, and a storage medium.
Background
Multi-Exposure Fusion (MEF) is the process of taking three or more images of the same scene at different exposure levels, performing image processing operations in a transform domain or the spatial domain, and fusing them into a single high-definition image with rich color detail. In the related art, multi-exposure image fusion suffers, to a certain extent, from damaged image detail and discontinuous boundaries, so the quality of the fused image is low.
Disclosure of Invention
The embodiment of the application provides an image fusion method, an electronic device and a storage medium, which are used to solve the technical problem in the related art that the image quality of multi-exposure image fusion is low.
According to a first aspect of the present application, there is disclosed an image fusion method comprising:
obtaining a target image sequence, wherein the target image sequence comprises a plurality of images to be fused, captured at different exposure levels of the same scene;
selecting a well-exposed image to be fused from the target image sequence as a reference image, performing superpixel partitioning on the reference image, and, with the partitioning result of the reference image as the partitioning standard, performing superpixel partitioning on the other images to be fused in the target image sequence, to obtain a plurality of image blocks of each image to be fused;
determining a fusion weight value of the pixel points of each image block according to index parameters of the pixel points of each image block, wherein the index parameters comprise at least one of the following: contrast, saturation, and brightness;
based on the fusion weight values of the pixel points of the image blocks at the same position in each image to be fused, fusing the image blocks at the same position to obtain a plurality of fused image blocks at different positions, and stitching the plurality of fused image blocks to obtain a target image.
According to a second aspect of the present application, there is disclosed an image fusion apparatus comprising:
an acquisition module, used for acquiring a target image sequence, wherein the target image sequence comprises a plurality of images to be fused, captured at different exposure levels of the same scene;
a dividing module, used for selecting a well-exposed image to be fused from the target image sequence as a reference image, performing superpixel partitioning on the reference image, and, with the partitioning result of the reference image as the partitioning standard, performing superpixel partitioning on the other images to be fused in the target image sequence, to obtain a plurality of image blocks of each image to be fused;
a determining module, used for determining a fusion weight value of the pixel points of each image block according to index parameters of the pixel points of each image block, wherein the index parameters comprise at least one of the following: contrast, saturation, and brightness;
a fusion module, used for fusing the image blocks at the same position based on the fusion weight values of the pixel points of the image blocks at the same position in each image to be fused, to obtain a plurality of fused image blocks at different positions, and stitching the plurality of fused image blocks to obtain a target image.
According to a third aspect of the present application, an electronic device is disclosed comprising a memory, a processor and a computer program stored on the memory, the processor executing the computer program to implement the image fusion method as in the first aspect.
According to a fourth aspect of the present application, there is disclosed a computer readable storage medium having stored thereon a computer program/instruction which, when executed by a processor, implements the image fusion method as in the first aspect.
According to a fifth aspect of the present application, a computer program product is disclosed, comprising a computer program/instructions which, when executed by a processor, implement the image fusion method as in the first aspect.
In the embodiment of the application, a target image sequence is acquired, wherein the target image sequence comprises a plurality of images to be fused, captured at different exposure levels of the same scene; a well-exposed image to be fused is selected from the target image sequence as a reference image, superpixel partitioning is performed on the reference image, and, with the partitioning result of the reference image as the partitioning standard, superpixel partitioning is performed on the other images to be fused in the target image sequence, to obtain a plurality of image blocks of each image to be fused; a fusion weight value of the pixel points of each image block is determined according to index parameters of the pixel points of each image block, wherein the index parameters comprise at least one of the following: contrast, saturation, and brightness; and, based on the fusion weight values of the pixel points of the image blocks at the same position in each image to be fused, the image blocks at the same position are fused to obtain a plurality of fused image blocks at different positions, and the plurality of fused image blocks are stitched to obtain a target image.
Compared with the related art, in which the images to be fused are segmented uniformly, in the embodiment of the application each image to be fused can be segmented based on superpixels, obtaining for each image to be fused a plurality of image blocks that contain the same objects, and image fusion is performed in units of these image blocks.
Drawings
FIG. 1 is an exemplary diagram of images to be fused at different exposure levels in the same scene provided by an embodiment of the present application;
FIG. 2 is an exemplary diagram of weighted images of images to be fused at different exposure levels provided by an embodiment of the present application;
FIG. 3 is an exemplary diagram of image fusion based on pyramid policies provided by an embodiment of the present application;
FIG. 4 is an exemplary diagram of a fused image obtained by multi-exposure fusion in the related art;
FIG. 5 is a flowchart of an image fusion method according to an embodiment of the present application;
FIG. 6 is an exemplary diagram of superpixel partitioning of an image to be fused provided by an embodiment of the present application;
FIG. 7 is a flowchart of another image fusion method according to an embodiment of the present application;
FIG. 8 is a diagram illustrating an example of a process for filling a blank weight image of an image block to obtain a target weight image according to an embodiment of the present application;
FIG. 9 is a schematic structural diagram of an image fusion apparatus according to an embodiment of the present application;
FIG. 10 is a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order that the above-recited objects, features and advantages of the present application will become more readily apparent, a more particular description of the application will be rendered by reference to the appended drawings and appended detailed description.
It should be noted that, for simplicity of description, the method embodiments are shown as a series of acts, but it should be understood by those skilled in the art that the embodiments are not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred embodiments, and that the acts are not necessarily required by the embodiments of the application.
In recent years, artificial-intelligence-based research in computer vision, deep learning, machine learning, image processing, image recognition and related areas has advanced significantly. Artificial intelligence (AI) is an emerging science and technology that studies and develops theories, methods, techniques and application systems for simulating and extending human intelligence. It is a comprehensive discipline involving chips, big data, cloud computing, the internet of things, distributed storage, deep learning, machine learning, neural networks and other technical fields. Computer vision, an important branch of artificial intelligence, studies how machines "see" the world; computer vision technologies generally include face recognition, liveness detection, fingerprint recognition and anti-counterfeiting verification, biometric feature recognition, face detection, pedestrian detection, object detection, pedestrian recognition, image processing, image recognition, image semantic understanding, image retrieval, character recognition, video processing, video content recognition, behavior recognition, three-dimensional reconstruction, virtual reality, augmented reality, simultaneous localization and mapping (SLAM), computational photography, robot navigation and positioning, and so on. With the research and progress of artificial intelligence technology, its applications have expanded into many fields, such as security, city management, traffic management, building management, park management, face-based access, face-based attendance, logistics management, warehouse management, robots, intelligent marketing, computational photography, mobile phone imaging, cloud services, smart homes, wearable devices, unmanned and autonomous driving, intelligent medical care, face payment, face unlocking, fingerprint unlocking, identity verification, smart screens, smart televisions, cameras, the mobile internet, live streaming, beautification, makeup, medical cosmetology, intelligent temperature measurement, and so on.
Taking multi-exposure fusion in the field of image processing as an example: multi-exposure fusion uses the available information of multiple images, preprocesses the complementary information of each image, and fuses them under a certain criterion to obtain an output image with a markedly improved effect. In the related art, an image to be fused is divided uniformly into a plurality of image blocks, and multi-exposure fusion is performed in units of image blocks; however, edge effects of the blocks cause problems: blocks that are too small can produce edge discontinuities during fusion, and blocks that are too large can lose detail during fusion, so the image quality of multi-exposure fusion is poor.
In order to solve the technical problems, the embodiment of the application provides an image fusion method, electronic equipment and a storage medium. For easy understanding, the application scenario of the embodiments of the present application and some of the concepts involved will be first described below.
Multi-Exposure Fusion (MEF): the process of taking three or more images of the same scene at different exposure levels, performing image processing operations in a transform domain or the spatial domain, and fusing them into a single high-definition image with rich color detail. For example, fig. 1 includes four images of the same scene at different exposure levels: the images to be fused P1, P2, P3 and P4.
The exposure level of each object (person or thing) in an image can be roughly classified as overexposed, well exposed, or underexposed. Taking fig. 1 as an example: overexposure means the object is not a brightness-saturated region in the real scene but becomes saturated in the captured image; e.g., the window in region 11 of the image to be fused P1 is not brightness-saturated in the real scene, yet its brightness in the captured image is 255. Good exposure means subjectively appropriate brightness, moderate color saturation and high contrast; e.g., region 12 in the image to be fused P2 and region 13 in the image to be fused P3. Underexposure means the object is not a dead-black region in the real scene but is captured at zero brightness; e.g., the person in region 14 of the image to be fused P4 is not at zero brightness in the real scene, yet appears at zero brightness in the captured image.
Pyramid fusion strategy: decompose each input image with a Laplacian pyramid and decompose the weight image of each input image with a Gaussian pyramid; based on the multi-scale weight images obtained from the Gaussian pyramid decomposition, fuse the input-image levels of the corresponding scales obtained from the Laplacian pyramid decomposition; finally, perform Laplacian pyramid reconstruction to obtain the final fusion result.
For example, take the images to be fused P1, P2, P3 and P4 shown in fig. 1. First, as shown in fig. 2, compute the weight image W1 of the image to be fused P1, the weight image W2 of P2, the weight image W3 of P3, and the weight image W4 of P4.
Then, as shown in fig. 3, decompose each image to be fused P1, P2, P3, P4 with a Laplacian pyramid, obtaining a set of decomposed images at different scales for each image; e.g., P1 decomposes into L1{P1}, L2{P1}, …, LM{P1}. Decompose each weight image W1, W2, W3, W4 with a Gaussian pyramid, obtaining a set of decomposed images at different scales for each weight image; e.g., W1 decomposes into G1{W1}, G2{W1}, …, GM{W1}. Here M is the number of scales of the pyramid decomposition, and M is an integer greater than 1.
Then, based on the Gaussian-pyramid levels of each scale, fuse the Laplacian-pyramid levels of the same scale, obtaining the fused pyramid image sequence R1, R2, …, RM, where:
R1 = L1{P1}·G1{W1} + L1{P2}·G1{W2} + L1{P3}·G1{W3} + L1{P4}·G1{W4},
R2 = L2{P1}·G2{W1} + L2{P2}·G2{W2} + L2{P3}·G2{W3} + L2{P4}·G2{W4},
…,
RM = LM{P1}·GM{W1} + LM{P2}·GM{W2} + LM{P3}·GM{W3} + LM{P4}·GM{W4}.
Finally, perform Laplacian pyramid reconstruction on R1, R2, …, RM to obtain the final fusion result.
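The strategy above can be sketched briefly in Python with OpenCV. This is a minimal illustration rather than the patent's implementation: it assumes float32 single-channel inputs in [0, 1], and the function names are ours.

```python
# A minimal sketch of the pyramid fusion strategy, assuming float32
# single-channel images in [0, 1]; names are illustrative, not the patent's.
import cv2
import numpy as np

def gaussian_pyramid(img, levels):
    pyr = [img]
    for _ in range(levels - 1):
        pyr.append(cv2.pyrDown(pyr[-1]))
    return pyr

def laplacian_pyramid(img, levels):
    gp = gaussian_pyramid(img, levels)
    lp = [gp[i] - cv2.pyrUp(gp[i + 1], dstsize=gp[i].shape[1::-1])
          for i in range(levels - 1)]
    lp.append(gp[-1])  # the coarsest level is kept as-is
    return lp

def fuse_pyramids(imgs, weights, levels=5):
    # Normalize the weights so they sum to 1 at every pixel.
    w_sum = np.sum(weights, axis=0) + 1e-12
    weights = [w / w_sum for w in weights]
    lps = [laplacian_pyramid(p, levels) for p in imgs]    # L_m{P_k}
    gps = [gaussian_pyramid(w, levels) for w in weights]  # G_m{W_k}
    # R_m = sum over k of L_m{P_k} * G_m{W_k}
    fused = [sum(lp[m] * gp[m] for lp, gp in zip(lps, gps))
             for m in range(levels)]
    # Laplacian reconstruction, collapsing from the coarsest level up.
    out = fused[-1]
    for m in range(levels - 2, -1, -1):
        out = cv2.pyrUp(out, dstsize=fused[m].shape[1::-1]) + fused[m]
    return np.clip(out, 0.0, 1.0)
```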
When multi-exposure image fusion is performed with the pyramid strategy, the transitions in the fused picture are natural; however, at the boundary between a high-weight region and a low-weight region, the smoothing performed during sampling lets high weights diffuse into the low-weight region, which produces the edge halo 41 in the image 40 shown in fig. 4.
Superpixel: a clustering over an image. Pixels that are similar in appearance are grouped together, and each such group is called a superpixel; superpixels provide a compact representation of the image data, and in subsequent processing the processing unit becomes a superpixel rather than a single pixel point.
An image fusion method provided by the embodiment of the application is described next.
Fig. 5 is a flowchart of an image fusion method according to an embodiment of the present application, as shown in fig. 5, the method may include the following steps: step 501, step 502, step 503 and step 504;
In step 501, a target image sequence is acquired, where the target image sequence includes a plurality of images to be fused at different exposure levels of the same scene.
In one example, the target image sequence may include 3 images to be fused with different exposure degrees, which are respectively an underexposed image to be fused, a well-exposed image to be fused, and an overexposed image to be fused.
In step 502, a well-exposed image to be fused is selected from the target image sequence as a reference image, superpixel partitioning is performed on the reference image, and, with the partitioning result of the reference image as the partitioning standard, superpixel partitioning is performed on the other images to be fused in the target image sequence, so as to obtain a plurality of image blocks of each image to be fused.
In the embodiment of the application, to ensure that the fusion result is correct when the subsequent image fusion is performed in units of image blocks, all the images to be fused adopt the same image-block division. Meanwhile, the details and textures of the image content are clearest in the well-exposed image to be fused; therefore, superpixel partitioning is performed with the well-exposed image to be fused as the reference image, and, with the partitioning result of the reference image as the partitioning standard, superpixel partitioning of the other images to be fused in the target image sequence gives a better partitioning effect.
In some embodiments of the present application, considering that overexposed and underexposed images have less picture detail and thus a lower gradient sum, while a well-exposed image has more picture detail and a larger gradient sum, the reference image may be screened from the target image sequence according to the gradient sums of the images. Accordingly, step 502 may include the following steps: calculating the gradient sum of each image to be fused in the target image sequence, where the gradient sum is the sum of the contrast of all pixel points in the image; and determining the image to be fused with the largest gradient sum in the target image sequence as the reference image.
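A minimal sketch of this selection step, assuming the per-pixel contrast is taken as the absolute Laplacian response on the Y (luma) channel (the patent does not fix a particular operator here):

```python
# A sketch of reference selection by gradient sum; "contrast" is assumed to be
# the absolute Laplacian response on the Y (luma) channel.
import cv2
import numpy as np

def gradient_sum(img_bgr):
    y = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2YUV)[:, :, 0].astype(np.float32)
    return float(np.abs(cv2.Laplacian(y, cv2.CV_32F)).sum())

def pick_reference(images):
    # The well-exposed image has the most picture detail and hence
    # the largest gradient sum.
    return max(range(len(images)), key=lambda i: gradient_sum(images[i]))
```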
In some embodiments of the application, the reference image can be superpixel-partitioned based on the simple linear iterative clustering (SLIC) algorithm; since the computational cost of SLIC is small and the superpixels it generates are compact and preserve contours well, the time taken to partition the images to be fused can be kept low while the partitioning remains accurate.
In one example, as shown in fig. 1, the target image sequence includes the images to be fused P1, P2, P3 and P4, and P3 is selected as the reference image according to the gradient sums of the images. As shown in fig. 6, superpixel partitioning is performed on P3 to obtain a plurality of image blocks of P3; each image block is a small region composed of adjacent pixel points with similar color, brightness, texture and other characteristics. Then P1 is partitioned following the partitioning of P3 into its image blocks, obtaining the image blocks of P1; P2 is partitioned in the same way, obtaining the image blocks of P2; and P4 is partitioned in the same way, obtaining the image blocks of P4.
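A sketch of this shared-partitioning step using scikit-image's SLIC implementation; the segment count and compactness below are illustrative values, not taken from the patent:

```python
# A sketch of the shared partitioning: run SLIC once on the reference image
# and reuse its label map for every image in the sequence.
import numpy as np
from skimage.segmentation import slic

def partition_sequence(images, ref_idx, n_segments=200):
    # Label map of the reference image; this is the partitioning standard.
    labels = slic(images[ref_idx], n_segments=n_segments,
                  compactness=10, start_label=0)
    # Image block k of any image is that image's pixels where labels == k.
    masks = [labels == k for k in range(labels.max() + 1)]
    return labels, masks
```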
In step 503, a fusion weight value of the pixel point of each image block is determined according to an index parameter of the pixel point of each image block, where the index parameter includes at least one of the following: contrast, saturation, and brightness.
In some embodiments of the present application, the contrast C of a pixel point in an image is calculated as follows: in general, the edges of objects in underexposed or overexposed areas are difficult to detect, so with the Laplace operator L = [0, -1, 0; -1, 4, -1; 0, -1, 0], the contrast C is calculated on the Y channel of the image, giving edge pixels a larger weight: C = |L * Y|, where * denotes convolution.
In some embodiments of the present application, the saturation S of a pixel point in an image is calculated as follows: a well-exposed pixel point captures color saturation well. In the RGB color space, the standard deviation across the R, G and B channels of each pixel point can be used as the measure S; in the YUV color space, S is calculated as: S = |U| + |V| + 1.
In some embodiments of the present application, the brightness E of a pixel point in an image is calculated as follows: following the usual idea that the brightness of a well-exposed pixel tends, with high probability, toward 0.5, E is calculated as E = exp(-(Y - μ)^2 / (2δ^2)), where Y is the gray value of the pixel point in the image, and μ and δ are manually set values.
In some embodiments of the present application, in order to improve the signal-to-noise ratio while preserving the detail information of dark areas, the index parameters may further include a quality metric, where the quality metric is positively correlated with the gray value of the pixel point and is used to balance the brightness of the image during image fusion.
In the embodiment of the application, when the fusion weight value of each pixel point in an image block is calculated, the terms of the pixel point's index parameters are multiplied together to obtain the fusion weight value of the pixel point.
For example, if the index parameters include contrast, saturation and brightness, then the fusion weight value of each pixel in the image block = the contrast of the pixel × the saturation of the pixel × the brightness of the pixel.
For example, if the index parameters include contrast, saturation, brightness and the quality metric, then the fusion weight value of each pixel in the image block = the contrast of the pixel × the saturation of the pixel × the brightness of the pixel × the quality metric of the pixel.
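A sketch of the per-pixel weight product under the YUV formulas above; centering U and V around zero and the default values of mu and delta are our assumptions:

```python
# A sketch of the per-pixel fusion weight as the product of the index
# parameters, using the YUV formulas above.
import cv2
import numpy as np

def fusion_weight(img_bgr, mu=0.5, delta=0.2):
    yuv = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2YUV).astype(np.float32) / 255.0
    y = yuv[..., 0]
    u, v = yuv[..., 1] - 0.5, yuv[..., 2] - 0.5        # center chroma at zero
    c = np.abs(cv2.Laplacian(y, cv2.CV_32F))           # contrast: C = |L * Y|
    s = np.abs(u) + np.abs(v) + 1.0                    # saturation: S = |U| + |V| + 1
    e = np.exp(-((y - mu) ** 2) / (2.0 * delta ** 2))  # brightness term E
    return c * s * e
```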
In step 504, based on the fusion weight values of the pixel points of the image blocks at the same position in each image to be fused, the image blocks at the same position are fused to obtain a plurality of fused image blocks at different positions, and the plurality of fused image blocks are stitched to obtain a target image.
In some embodiments of the present application, to increase the fusion speed, fusion may be performed on image blocks at their original size.
In one example, first, a target image sequence is acquired; e.g., the target image sequence includes the images to be fused F1, F2 and F3.
Then, superpixel partitioning is performed on F1, F2 and F3, obtaining the image blocks F11, F12 and F13 of F1, the image blocks F21, F22 and F23 of F2, and the image blocks F31, F32 and F33 of F3; here the image blocks F11, F21 and F31 occupy the same position in their respective images to be fused, denoted A1; the image blocks F12, F22 and F32 occupy the same position, denoted A2; and the image blocks F13, F23 and F33 occupy the same position, denoted A3.
Next, compute the fusion weight values W11, W12 and W13 of the pixel points of the image blocks F11, F12 and F13, respectively; the fusion weight values W21, W22 and W23 of the pixel points of F21, F22 and F23; and the fusion weight values W31, W32 and W33 of the pixel points of F31, F32 and F33.
Then, based on the fusion weight values W11, W21 and W31 at position A1, fuse the image blocks F11, F21 and F31 at position A1 to obtain the fused image block B1 at position A1; based on W12, W22 and W32 at position A2, fuse F12, F22 and F32 to obtain the fused image block B2 at position A2; and based on W13, W23 and W33 at position A3, fuse F13, F23 and F33 to obtain the fused image block B3 at position A3.
Finally, according to the positions of A1, A2 and A3 in the image to be fused, stitch the fused image blocks B1, B2 and B3 to obtain the target image.
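The original-size fusion in this example reduces to a per-position weighted average. A compact sketch with illustrative names, assuming a shared superpixel label map identifies the co-located blocks:

```python
# A sketch of original-size block fusion: for each block position, normalize
# the co-located weights across exposures and take the per-pixel weighted sum;
# stitching falls out of indexing by the shared label map.
import numpy as np

def fuse_blocks(images, weights, labels):
    # images: list of float32 (H, W[, 3]) arrays; weights: list of (H, W)
    # arrays; labels: superpixel label map shared by every image.
    out = np.zeros_like(images[0], dtype=np.float32)
    for k in range(labels.max() + 1):
        mask = labels == k
        w = np.stack([wt[mask] for wt in weights])     # (N, n_pixels)
        w /= w.sum(axis=0, keepdims=True) + 1e-12      # normalize per pixel
        for img, wk in zip(images, w):
            wk = wk[:, None] if img.ndim == 3 else wk  # broadcast over channels
            out[mask] += wk * img[mask]
    return out
```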
In some embodiments of the present application, in order to ensure that the transitions in the fusion result are natural, a pyramid strategy may be adopted to decompose the image blocks at multiple scales, and fusion is performed based on the multi-scale image blocks.
As can be seen from the above embodiment, in this embodiment a target image sequence is acquired, wherein the target image sequence comprises a plurality of images to be fused, captured at different exposure levels of the same scene; a well-exposed image to be fused is selected from the target image sequence as a reference image, superpixel partitioning is performed on the reference image, and, with the partitioning result of the reference image as the partitioning standard, superpixel partitioning is performed on the other images to be fused in the target image sequence, to obtain a plurality of image blocks of each image to be fused; a fusion weight value of the pixel points of each image block is determined according to index parameters of the pixel points of each image block, wherein the index parameters comprise at least one of the following: contrast, saturation, and brightness; and, based on the fusion weight values of the pixel points of the image blocks at the same position in each image to be fused, the image blocks at the same position are fused to obtain a plurality of fused image blocks at different positions, and the plurality of fused image blocks are stitched to obtain a target image.
Compared with the related art, in which the images to be fused are segmented uniformly, in the embodiment of the application each image to be fused can be segmented based on superpixels, obtaining for each image to be fused a plurality of image blocks that contain the same objects, and image fusion is performed in units of these image blocks.
Fig. 7 is a flowchart of another image fusion method according to an embodiment of the present application. In this method, fusion weight values are filled into the area adjacent to the edge of the weight region where each image block is located, and image fusion is performed based on the filled weight images, which avoids the edge-halo problem of multi-exposure fusion in the related art. As shown in fig. 7, the method may include the following steps: step 701, step 702, step 703, step 704, step 705 and step 706;
In step 701, a target image sequence is acquired, where the target image sequence includes a plurality of images to be fused at different exposure levels of the same scene.
In step 702, a well-exposed image to be fused is selected from the target image sequence as a reference image, superpixel partitioning is performed on the reference image, and, with the partitioning result of the reference image as the partitioning standard, superpixel partitioning is performed on the other images to be fused in the target image sequence, so as to obtain a plurality of image blocks of each image to be fused.
In step 703, a fusion weight value of the pixel point of each image block is determined according to an index parameter of the pixel point of each image block, where the index parameter includes at least one of the following: contrast, saturation, and brightness.
The contents of step 701, step 702 and step 703 in the embodiment of the present application are similar to those of step 501, step 502 and step 503 in the embodiment shown in fig. 5, and are not described herein again.
In step 704, for each image block, a blank weight image is created for the image block, wherein the size of the blank weight image is the same as the size of the image to be fused.
In the embodiment of the application, in order to avoid the edge-halo problem during multi-exposure fusion, the area adjacent to the edge of the weight region where each image block is located can be filled with fusion weight values. Before filling, that adjacent area must be determined; to determine it, a blank weight image can be created for each image block, the weight region of the image block is filled in first, and the area surrounding the weight region of the image block is the adjacent edge area. The fusion weight value at every pixel point position in the blank weight image is initially zero.
In step 705, for each image block, according to the position of the image block in the image to be fused, filling the fusion weight value of the pixel point of the image block into the corresponding position in the blank weight image, and performing image interpolation processing on other areas except for the filled position in the blank weight image based on the fusion weight value of the pixel point of the image block until each pixel point in the other areas is filled with the fusion weight value, so as to obtain a target weight image corresponding to the image block; the difference value between the fusion weight value of the pixel point of the image block in the target weight image and the fusion weight value of the pixel point in other areas is smaller than a threshold value.
In the embodiment of the application, after a blank weight image is created for each image block, the fusion weight values of the pixel points of the image block are filled into the corresponding positions in the blank weight image according to the position of the image block in the image to be fused; that is, the weight region of the image block within the blank weight image is determined, together with the fusion weight values in that region. Then, taking the edge of the weight region where the image block is located as the starting point, image interpolation is performed on the regions other than that weight region, based on the fusion weight values of the image block.
In the embodiment of the application, since the difference between the fusion weight values filled into the areas other than the weight region of the image block and the fusion weight values of the image block itself is small (for example, as shown in fig. 8, after those areas are filled, the pixel values of all pixel points in the resulting target weight image are close to one another), the weight-diffusion problem during pyramid reconstruction is resolved at fusion time, and edge halos in the fused picture are avoided.
In some embodiments of the present application, considering that nearest-neighbor interpolation is simple to implement, computationally cheap, and gives a good interpolation effect when filling fusion weight values, step 705 may include the following step: performing nearest-neighbor interpolation on the areas of the blank weight image other than the filled positions, based on the fusion weight values of the pixel points of the image block, until every pixel point in those areas is filled with a fusion weight value, thereby obtaining the target weight image corresponding to the image block.
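One convenient way to realize this nearest-neighbor fill is a Euclidean distance transform that also returns indices, as sketched below; this is one possible implementation, not one prescribed by the patent:

```python
# A sketch of building one block's target weight image: paste the block's
# fusion weights into a blank (all-zero) weight image, then give every empty
# pixel the value of its nearest filled pixel. distance_transform_edt with
# return_indices performs the nearest-neighbor fill in one call.
import numpy as np
from scipy.ndimage import distance_transform_edt

def target_weight_image(shape, block_mask, block_weights):
    w = np.zeros(shape, dtype=np.float32)  # blank weight image
    w[block_mask] = block_weights          # fill the block's weight region
    # For each empty pixel, indices of the nearest filled pixel.
    _, (iy, ix) = distance_transform_edt(~block_mask, return_indices=True)
    return w[iy, ix]
```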
In step 706, according to the target weight images of the image blocks at the same position in each image to be fused, the image blocks at the same position are fused to obtain a plurality of fused image blocks at different positions, and the plurality of fused image blocks are stitched to obtain a target image.
In some embodiments of the present application, in order to ensure that the transitions in the fused picture are natural, the image blocks of the images to be fused may be fused using a pyramid strategy; accordingly, step 706 may include the following steps: step 7061 and step 7062;
in step 7061, laplacian pyramid decomposition is performed on each image block, and gaussian pyramid decomposition is performed on the target weight image of each image block.
In step 7062, according to the weight images of each scale obtained by the Gaussian pyramid decomposition, the image blocks of the corresponding scales obtained by the Laplacian pyramid decomposition are fused, so as to obtain a plurality of fused image blocks at different positions.
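Combining steps 7061 and 7062 with the earlier pyramid sketch gives, per block position, something like the following; crop_box is a hypothetical helper that cuts a block's bounding box out of an image:

```python
# A sketch combining steps 7061 and 7062 for one block position, reusing
# fuse_pyramids from the earlier pyramid sketch.
def fuse_position(images, target_weight_images, box, levels=5):
    blocks = [crop_box(img, box) for img in images]        # co-located blocks
    zs = [crop_box(z, box) for z in target_weight_images]  # their target weights
    # Gaussian pyramids of the weights and Laplacian pyramids of the blocks
    # are built and blended inside fuse_pyramids.
    return fuse_pyramids(blocks, zs, levels)
```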
In one example, first, a target image sequence is acquired; e.g., the target image sequence includes the images to be fused Q1, Q2 and Q3.
Then, superpixel partitioning is performed on Q1, Q2 and Q3, obtaining the image blocks Q11, Q12 and Q13 of Q1, the image blocks Q21, Q22 and Q23 of Q2, and the image blocks Q31, Q32 and Q33 of Q3; here the image blocks Q11, Q21 and Q31 occupy the same position in their respective images to be fused, denoted K1; Q12, Q22 and Q32 occupy the same position, denoted K2; and Q13, Q23 and Q33 occupy the same position, denoted K3.
Next, compute the fusion weight values S11, S12 and S13 of the pixel points of the image blocks Q11, Q12 and Q13, respectively; the fusion weight values S21, S22 and S23 of the pixel points of Q21, Q22 and Q23; and the fusion weight values S31, S32 and S33 of the pixel points of Q31, Q32 and Q33.
Then, from the fusion weight values S11 of the pixel points of Q11, compute the target weight image Z11 of Q11; from S12, compute the target weight image Z12 of Q12; from S13, compute the target weight image Z13 of Q13; and by the same computation obtain the target weight images Z21 of Q21, Z22 of Q22, Z23 of Q23, Z31 of Q31, Z32 of Q32 and Z33 of Q33.
Next, apply Gaussian pyramid decomposition to the target weight images Z11, Z21 and Z31 at position K1, and Laplacian pyramid decomposition to the image blocks Q11, Q21 and Q31 at position K1; according to the multi-scale images obtained from the Gaussian pyramid decomposition at position K1, fuse the image blocks of the corresponding scales obtained from the Laplacian pyramid decomposition at position K1, obtaining the fused image block at position K1; similarly, obtain the fused image blocks at positions K2 and K3.
Finally, according to the positions of K1, K2 and K3 in the image to be fused, stitch the fused image blocks at positions K1, K2 and K3 to obtain the target image.
Therefore, in the embodiment of the application, the images to be fused can be segmented in a non-uniform, superpixel-based manner, and fusion weight values are filled into the area adjacent to the edge of the weight region of each image block, which avoids the edge halo caused by weight diffusion during multi-exposure fusion, avoids blocking artifacts, and improves the image quality of multi-exposure fusion.
Fig. 9 is a schematic structural diagram of an image fusion apparatus according to an embodiment of the present application. As shown in fig. 9, the image fusion apparatus 900 may include: an obtaining module 901, a dividing module 902, a determining module 903 and a fusion module 904;
an obtaining module 901, configured to obtain a target image sequence, where the target image sequence includes a plurality of images to be fused, captured at different exposure levels of the same scene;
a dividing module 902, configured to select a well-exposed image to be fused from the target image sequence as a reference image, perform superpixel partitioning on the reference image, and, with the partitioning result of the reference image as the partitioning standard, perform superpixel partitioning on the other images to be fused in the target image sequence, so as to obtain a plurality of image blocks of each image to be fused;
a determining module 903, configured to determine a fusion weight value of the pixel points of each image block according to index parameters of the pixel points of each image block, where the index parameters include at least one of the following: contrast, saturation, and brightness;
a fusion module 904, configured to fuse the image blocks at the same position based on the fusion weight values of the pixel points of the image blocks at the same position in each image to be fused, to obtain a plurality of fused image blocks at different positions, and to stitch the plurality of fused image blocks to obtain a target image.
As can be seen from the above embodiment, in this embodiment a target image sequence is acquired, wherein the target image sequence comprises a plurality of images to be fused, captured at different exposure levels of the same scene; a well-exposed image to be fused is selected from the target image sequence as a reference image, superpixel partitioning is performed on the reference image, and, with the partitioning result of the reference image as the partitioning standard, superpixel partitioning is performed on the other images to be fused in the target image sequence, to obtain a plurality of image blocks of each image to be fused; a fusion weight value of the pixel points of each image block is determined according to index parameters of the pixel points of each image block, wherein the index parameters comprise at least one of the following: contrast, saturation, and brightness; and, based on the fusion weight values of the pixel points of the image blocks at the same position in each image to be fused, the image blocks at the same position are fused to obtain a plurality of fused image blocks at different positions, and the plurality of fused image blocks are stitched to obtain a target image.
Compared with the related art, in which the images to be fused are segmented uniformly, in the embodiment of the application each image to be fused can be segmented based on superpixels, obtaining for each image to be fused a plurality of image blocks that contain the same objects, and image fusion is performed in units of these image blocks.
Optionally, as an embodiment, the fusion module 904 may include:
a creating sub-module, configured to create, for each of the image blocks, a blank weight image for the image block, where a size of the blank weight image is the same as a size of the image to be fused;
the filling sub-module is used for filling fusion weight values of pixel points of the image blocks to corresponding positions in the blank weight image according to the positions of the image blocks in the image to be fused, and performing image interpolation processing on other areas except the filled positions in the blank weight image based on the fusion weight values of the pixel points of the image blocks until all the pixel points in the other areas are filled with the fusion weight values to obtain a target weight image corresponding to the image blocks; the difference value between the fusion weight value of the pixel point of the image block in the target weight image and the fusion weight value of the pixel point in the other region is smaller than a threshold value;
and the fusion sub-module is used for fusing the image blocks at the same position according to the target weight images of the image blocks at the same position in each image to be fused to obtain a plurality of fused image blocks at different positions.
Optionally, as an embodiment, the filling sub-module may include:
and the image interpolation unit is used for carrying out nearest neighbor interpolation processing on other areas except the filled positions in the blank weight image based on the fusion weight values of the pixel points of the image block until all the pixel points in the other areas are filled with the fusion weight values, so as to obtain a target weight image corresponding to the image block.
Alternatively, as an embodiment, the fusion sub-module may include:
the image decomposition unit is used for carrying out Laplacian pyramid decomposition on each image block and carrying out Gaussian pyramid decomposition on the target weight image of each image block;
and the multi-scale fusion unit is used for fusing the image blocks with the corresponding scales obtained by decomposing the Laplacian pyramid according to the weight images with the scales obtained by decomposing the Gaussian pyramid to obtain a plurality of fused image blocks with different positions.
Optionally, as an embodiment, the dividing module 902 may include:
the computing sub-module is used for computing the gradient sum of each image to be fused in the target image sequence, wherein the gradient sum is the sum of the contrast of all pixel points in the image;
and the determining submodule is used for determining the image to be fused with the largest gradient sum in the target image sequence as the reference image.
Optionally, as an embodiment, the index parameters further include a quality metric, wherein the quality metric is positively correlated with the gray value of the pixel point, and the quality metric is used to balance the brightness of the image during image fusion.
Optionally, as an embodiment, the dividing module 902 may include:
and the blocking sub-module is used for carrying out super-pixel blocking on the reference image based on a simple linear iterative clustering algorithm.
Any one step and specific operation in any one step in the embodiment of the image fusion method provided by the application can be completed by corresponding modules in the image fusion device. The respective operational procedures performed by the respective modules in the image fusion apparatus refer to the respective operational procedures described in the embodiments of the image fusion method.
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments for relevant points.
Fig. 10 is a block diagram of an electronic device provided in an embodiment of the present application. The electronic device includes a processing component 1022, which in turn includes one or more processors, and memory resources, represented by a memory 1032, for storing instructions executable by the processing component 1022, such as application programs. The application programs stored in the memory 1032 may include one or more modules, each corresponding to a set of instructions. Further, the processing component 1022 is configured to execute the instructions to perform the methods described above.
The electronic device may also include a power component 1026 configured to perform power management of the electronic device, a wired or wireless network interface 1050 configured to connect the electronic device to a network, and an input/output (I/O) interface 1058. The electronic device may operate based on an operating system stored in the memory 1032, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
According to yet another embodiment of the present application, there is also provided a computer-readable storage medium having stored thereon a computer program/instruction which, when executed by a processor, implements the steps of the image fusion method according to any of the embodiments described above.
According to a further embodiment of the present application, there is also provided a computer program product comprising a computer program/instruction which, when executed by a processor, implements the steps of the image fusion method according to any of the embodiments described above.
In this specification, each embodiment is described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the identical or similar parts the embodiments may refer to one another.
It will be apparent to those skilled in the art that embodiments of the present application may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the application may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the scope of the embodiments of the application.
Finally, it is further noted that relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between such entities or actions. Moreover, the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article or terminal device that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article or terminal device. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article or terminal device that comprises the element.
The image fusion method, electronic device and storage medium provided by the present application have been described above with specific examples to illustrate the principles and implementation of the application; the description of the above embodiments is only intended to help understand the method of the present application and its core idea. Meanwhile, a person skilled in the art may make changes to the specific embodiments and the application scope in accordance with the ideas of the present application. In summary, the contents of this specification should not be construed as limiting the present application.

Claims (10)

1. A method of image fusion, the method comprising:
obtaining a target image sequence, wherein the target image sequence comprises a plurality of images to be fused, captured at different exposure levels of the same scene;
selecting a well-exposed image to be fused from the target image sequence as a reference image, performing superpixel partitioning on the reference image, and, with the partitioning result of the reference image as the partitioning standard, performing superpixel partitioning on the other images to be fused in the target image sequence, to obtain a plurality of image blocks of each image to be fused;
determining a fusion weight value of the pixel points of each image block according to index parameters of the pixel points of each image block, wherein the index parameters comprise at least one of the following: contrast, saturation, and brightness;
based on the fusion weight values of the pixel points of the image blocks at the same position in each image to be fused, fusing the image blocks at the same position to obtain a plurality of fused image blocks at different positions, and stitching the plurality of fused image blocks to obtain a target image.
2. The method according to claim 1, wherein the fusing the image blocks at the same position based on the fusion weight value of the pixel points of the image blocks at the same position in each image to be fused to obtain a plurality of fused image blocks at different positions includes:
for each image block, creating a blank weight image for the image block, wherein the size of the blank weight image is the same as the size of the image to be fused;
for each image block, filling fusion weight values of pixel points of the image block to corresponding positions in the blank weight image according to the positions of the image blocks in the image to be fused, and performing image interpolation processing on other areas except the filled positions in the blank weight image based on the fusion weight values of the pixel points of the image block until all the pixel points in the other areas are filled with the fusion weight values to obtain a target weight image corresponding to the image block; the difference value between the fusion weight value of the pixel point of the image block in the target weight image and the fusion weight value of the pixel point in the other region is smaller than a threshold value;
And fusing the image blocks at the same position according to the target weight image of the image blocks at the same position in each image to be fused to obtain a plurality of fused image blocks at different positions.
3. The method according to claim 2, wherein the performing image interpolation processing on other areas except for the filled positions in the blank weight image based on the fusion weight values of the pixels of the image block until each pixel in the other areas is filled with the fusion weight values, to obtain the target weight image corresponding to the image block, includes:
and carrying out nearest neighbor interpolation processing on other areas except for the filled positions in the blank weight image based on the fusion weight values of the pixel points of the image block until all the pixel points in the other areas are filled with the fusion weight values, so as to obtain a target weight image corresponding to the image block.
4. The method according to claim 2, wherein the fusing the image blocks at the same position according to the target weight image of the image blocks at the same position in each image to be fused to obtain a plurality of fused image blocks at different positions includes:
Carrying out Laplacian pyramid decomposition on each image block, and carrying out Gaussian pyramid decomposition on the target weight image of each image block;
and fusing the image blocks with the corresponding scales obtained by decomposing the Laplacian pyramid according to the weighted images with the scales obtained by decomposing the Gaussian pyramid to obtain a plurality of fused image blocks with different positions.
5. The method according to claim 1, wherein selecting one well-exposed image to be fused from the target image sequence as a reference image comprises:
calculating the gradient sum of each image to be fused in the target image sequence, wherein the gradient sum is the sum of the contrast of all pixel points in the image;
and determining the image to be fused with the largest gradient sum in the target image sequence as the reference image.
6. The method of claim 1, wherein the index parameters further comprise a quality metric, wherein the quality metric is positively correlated with the gray value of the pixel point, and the quality metric is used to balance the brightness of the image during image fusion.
7. The method of claim 1, wherein said performing superpixel partitioning on the reference image comprises:
performing superpixel partitioning on the reference image based on a simple linear iterative clustering algorithm.
8. An electronic device comprising a memory, a processor and a computer program stored on the memory, characterized in that the processor executes the computer program to implement the method of any one of claims 1-7.
9. A computer readable storage medium having stored thereon a computer program/instruction, which when executed by a processor, implements the method of any of claims 1-7.
10. A computer program product comprising computer programs/instructions which, when executed by a processor, implement the method of any of claims 1-7.
CN202310651289.2A 2023-06-02 2023-06-02 Image fusion method, electronic device and storage medium Pending CN116797505A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310651289.2A CN116797505A (en) 2023-06-02 2023-06-02 Image fusion method, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310651289.2A CN116797505A (en) 2023-06-02 2023-06-02 Image fusion method, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN116797505A 2023-09-22

Family

ID=88044593

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310651289.2A Pending CN116797505A (en) 2023-06-02 2023-06-02 Image fusion method, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN116797505A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination