CN115439386A - Image fusion method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN115439386A
CN115439386A
Authority
CN
China
Prior art keywords: target, image, pixel group, target image, pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211084167.1A
Other languages
Chinese (zh)
Inventor
赵志杰
王宇飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202211084167.1A
Publication of CN115439386A
Priority to PCT/CN2023/117064 (published as WO2024051697A1)
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 5/73: Deblurring; Sharpening
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10004: Still image; Photographic image
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20212: Image combination
    • G06T 2207/20221: Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The application discloses an image fusion method and device, electronic equipment and a storage medium, and belongs to the technical field of shooting. The image fusion method comprises the following steps: acquiring N frames of first images, wherein each frame of first image comprises M pixels, and N and M are positive integers; dividing a first target image into P first pixel groups, wherein the first target image is any one of N frames of first images, and P is a positive integer; acquiring optical parameters of a target pixel group in the P first pixel groups; determining at least one frame of second target image in the N frames of first images according to the optical parameters of the target pixel group, wherein the second target image comprises a second pixel group matched with the optical parameters of the target pixel group, the second target image is an image in the N frames of first images, the target pixel group comprises Q pixels, and Q is less than M; and carrying out fusion processing on the target pixel group and the second pixel group.

Description

Image fusion method and device, electronic equipment and storage medium
Technical Field
The application belongs to the technical field of shooting, and particularly relates to an image fusion method and device, electronic equipment and a storage medium.
Background
In the related art, the brightness of a captured picture is positively correlated with the amount of light reaching the sensor. When shooting in an environment with complex illumination, a bright area is overexposed if the exposure time is too long, while a dark area is underexposed if the exposure time is too short, so the quality of an image captured in a single-exposure imaging mode is difficult to guarantee.
In order to ensure the image quality of the captured image, a final synthesized image is usually obtained by acquiring multiple frames of short-exposure images and performing image fusion on each entire frame. Because whole frames are fused indiscriminately, motion blur exists in the final synthesized image.
Disclosure of Invention
An object of the embodiments of the present application is to provide an image fusion method, an image fusion apparatus, an electronic device, and a storage medium, which avoid blur in the fused image caused by fusing some pixels with other pixels whose optical parameters do not match, and improve the definition of the finally synthesized image.
In a first aspect, an embodiment of the present application provides an image fusion method, including: acquiring N frames of first images, wherein each frame of first image comprises M pixels, and N and M are positive integers; dividing a first target image into P first pixel groups, wherein the first target image is any one of N frames of first images, and P is a positive integer; acquiring optical parameters of a target pixel group in the P first pixel groups, wherein the target pixel group is any one of the P first pixel groups; determining at least one frame of second target image in the N frames of first images according to the optical parameters of the target pixel group, wherein the second target image comprises a second pixel group matched with the optical parameters of the target pixel group, the second target image is an image in the N frames of first images, the target pixel group comprises Q pixels, and Q is less than M; and carrying out fusion processing on the target pixel group and the second pixel group, wherein the position of the target pixel group in the first target image corresponds to the position of the second pixel group in the second target image.
In a second aspect, an embodiment of the present application provides an image fusion apparatus, including: the acquisition module is used for acquiring N frames of first images, each frame of first image comprises M pixels, and N and M are positive integers; the grouping module is used for dividing the first target image into P first pixel groups, the first target image is any one of N frames of first images, and P is a positive integer; the device comprises an acquisition module, a processing module and a control module, wherein the acquisition module is used for acquiring optical parameters of a target pixel group in P first pixel groups, and the target pixel group is any one of the P first pixel groups; the determining module is used for determining at least one frame of second target image in the N frames of first images according to the optical parameters of the target pixel group, the second target image comprises a second pixel group matched with the optical parameters of the target pixel group, the second target image is the image in the N frames of first images, and the target pixel group comprises Q pixels; and the fusion module is used for fusing the target pixel group and the second pixel group, and the position of the target pixel group in the first target image corresponds to the position of the second pixel group in the second target image.
In a third aspect, embodiments of the present application provide an electronic device, including a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the steps of the method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product, stored on a storage medium, for execution by at least one processor to implement a method as in the first aspect.
In the embodiment of the application, when the electronic device shoots a shooting object in a moving state, it collects multiple frames of first images including the shooting object, divides a first target image among the N frames of first images into multiple first pixel groups, and obtains target pixel groups among them. Through the optical parameters of a target pixel group, a second pixel group in a second target image that can be fused with it is found; because the second pixel group is located according to the optical parameters of the target pixel group, the optical parameters of the fused second pixel group and the target pixel group are guaranteed to match. After each target pixel group is fused with its corresponding second pixel group, the results are spliced together, yielding an image with motion blur eliminated. In the embodiment of the application, pixel groups are the unit of image fusion and each target pixel group is fused with a matched second pixel group, which avoids blur in the fused image caused by fusing some pixels with other pixels whose optical parameters do not match, and improves the definition of the finally synthesized image.
According to the embodiment of the application, the pixel group is the unit of image fusion, so only the optical parameters of the whole pixel group are calculated, and each pixel in the pixel group does not need to be processed independently. Because the pixels within a target pixel group are strongly correlated, using the pixel group as the unit of image fusion reduces the calculation amount of the electronic device while preserving the precision of the image fusion processing. The method and the device thus reduce the data processing required for image fusion while ensuring the definition of the fused image.
Drawings
Fig. 1 shows a schematic flowchart of an image fusion method provided in an embodiment of the present application;
Fig. 2 shows a variation curve of luminous flux values provided by an embodiment of the present application;
Fig. 3 shows one of the schematic diagrams of a pixel arrangement of a target pixel group according to an embodiment of the present application;
Fig. 4 shows a second schematic diagram of a pixel arrangement of a target pixel group according to an embodiment of the present application;
Fig. 5 shows a third schematic diagram of a pixel arrangement of a target pixel group according to an embodiment of the present application;
Fig. 6 shows one of the schematic diagrams of a first pixel group in a first target image according to an embodiment of the present application;
Fig. 7 shows a second schematic diagram of a first pixel group in a first target image according to an embodiment of the present application;
Fig. 8 shows a schematic block diagram of an image fusion apparatus provided in an embodiment of the present application;
Fig. 9 shows a block diagram of an electronic device according to an embodiment of the present application;
Fig. 10 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments that a person of ordinary skill in the art can derive from the embodiments given herein fall within the protection scope of the present application.
The terms "first", "second" and the like in the description and claims of the present application are used to distinguish similar objects and do not describe a particular order or sequence. It should be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can be practiced in sequences other than those illustrated or described herein. Objects distinguished by "first", "second", etc. are generally of one type, and their number is not limited; for example, there may be one or more than one first object. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates that the objects before and after it are in an "or" relationship.
The image fusion method, the image fusion apparatus, the electronic device, and the storage medium provided in the embodiments of the present application are described in detail below with reference to fig. 1 to fig. 10 through specific embodiments and application scenarios thereof.
In some embodiments of the present application, an image fusion method is provided. Fig. 1 shows a schematic flowchart of an image fusion method provided in an embodiment of the present application. As shown in fig. 1, the image fusion method includes:
102, acquiring N frames of first images, wherein each frame of first image comprises M pixels, and N and M are positive integers;
in the process that the electronic equipment carries out multi-section exposure shooting at the preset frame rate, the electronic equipment can acquire N frames of first images, and pixels in the N frames of first images are acquired through the same sensor, so that the number of the pixels in each first image is the same, and the pixels all comprise M pixels.
Step 104, dividing a first target image into P first pixel groups, wherein the first target image is any one of N frames of first images, and P is a positive integer;
the first target image is an image used for fusing the images in the N frames of first images, any one of the N frames of first images can be selected as the first target image, and in the subsequent processing process, the first target image is used as an initial image to be fused with other images.
All pixels of the first target image are divided into P first pixel groups. The number of pixels in each first pixel group must be smaller than the number of pixels in the first target image, and each first pixel group includes at least one pixel.
It should be noted that the number of pixels in each of the P first pixel groups may be the same or different. And the arrangement form of the pixels in the plurality of first pixel groups may be the same or different.
Illustratively, when the number of pixels in each first pixel group is determined to be 12, the arrangement of a first pixel group may be 2 × 6, 3 × 4, or 1 × 12.
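This grouping step can be sketched as follows (the NumPy representation, function name, and uniform group shape are assumptions for illustration; the patent allows groups of differing sizes and arrangements):

```python
import numpy as np

def divide_into_pixel_groups(image, group_h, group_w):
    """Divide an H x W image into non-overlapping first pixel groups
    of shape (group_h, group_w). Each group holds fewer pixels than
    the whole image, as the method requires."""
    h, w = image.shape[:2]
    return [image[r:r + group_h, c:c + group_w]
            for r in range(0, h - group_h + 1, group_h)
            for c in range(0, w - group_w + 1, group_w)]

# A 6 x 8 first target image divided into 2 x 2 groups gives P = 12 groups.
first_target_image = np.arange(48).reshape(6, 8)
groups = divide_into_pixel_groups(first_target_image, 2, 2)
```

Each element of `groups` is a view into the original image, so the later per-group fusion can write results back in place if desired.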
Step 106, acquiring optical parameters of a target pixel group in the P first pixel groups, wherein the target pixel group is any one of the P first pixel groups;
the target pixel group is any one of the P first pixel groups, and in the process of fusing the first target image, each of the P first pixel groups may be used as the target pixel group, and a part of the P first pixel groups may also be used as the target pixel group.
Step 108, determining at least one frame of second target image in the N frames of first images according to the optical parameters of the target pixel group, wherein the second target image comprises a second pixel group matched with the optical parameters of the target pixel group, the second target image is the image in the N frames of first images, the target pixel group comprises Q pixels, and Q is less than M;
the second target image is filtered from the N frames of the first image according to the optical parameters of the target pixel set. It should be noted that the second target image is a different image from the first target image, and the number of the second target images is at most N-1 frames, that is, the second target image may be all the images except the first target image in the N frames of the first image.
The second pixel group in the second target image matches the optical parameters of the target pixel group. The two are determined to match when the difference between the optical parameter of the second pixel group and the optical parameter of the target pixel group is smaller than a preset difference.
The position of the second pixel group in the second target image corresponds to the position of the target pixel group in the first target image. The first target image and the second target image correspond to the same pixel matrix, and the target pixel group occupies the same position in that matrix as the second pixel group.
And 110, carrying out fusion processing on the target pixel group and the second pixel group, wherein the position of the target pixel group in the first target image corresponds to the position of the second pixel group in the second target image.
In the embodiment of the application, the electronic device collects multiple frames of first images, divides a first target image in the multiple frames of first images into P first pixel groups, and searches for a second target image in the multiple frames of first images according to a target pixel group in the P first pixel groups. And fusing the target pixel group according to the corresponding second pixel group in the second target image. And fusing all the target pixel groups in the P first pixel groups to obtain a fused image. The image obtained through the process fusion can avoid the problem of motion blur caused by the motion of the shooting object in the shooting process.
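The per-group search-and-fuse flow described above can be sketched as follows. This is a minimal illustration assuming grayscale NumPy frames, averaging as the fusion operation, and hypothetical names such as `optical_parameter` and `PRESET_DIFF`; the patent does not prescribe these:

```python
import numpy as np

PRESET_DIFF = 0.1  # hypothetical matching threshold

def optical_parameter(group):
    """Stand-in for the luminous-flux computation; here simply the
    mean pixel value of the group."""
    return float(np.mean(group))

def fuse_frames(frames, first_idx=0, group=2):
    """For every target pixel group of the first target image, find
    the second pixel groups at the same position in the other frames
    whose optical parameter matches, and fuse them by averaging."""
    first = frames[first_idx].astype(np.float64)
    fused = first.copy()
    h, w = first.shape
    for top in range(0, h, group):
        for left in range(0, w, group):
            sl = (slice(top, top + group), slice(left, left + group))
            target = first[sl]
            ref = optical_parameter(target)
            candidates = [target]
            for i, frame in enumerate(frames):
                if i == first_idx:
                    continue
                third = frame[sl].astype(np.float64)  # third pixel group
                if abs(optical_parameter(third) - ref) < PRESET_DIFF:
                    candidates.append(third)  # matched second pixel group
            fused[sl] = np.mean(candidates, axis=0)
    return fused
```

A group whose optical parameter drifts (for example because the subject moved through it) is simply excluded from that position's fusion, which is how motion blur is avoided.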
Specifically, in the process of searching for a second target image in multiple frames of first images according to the target pixel group, a third pixel group in other images except for the first target image in the multiple frames of first images needs to be determined according to the target pixel group. And acquiring the optical parameters of the target pixel group and the optical parameters of the plurality of third pixel groups, and determining an image corresponding to a second pixel group matched with the optical parameters of the target pixel group in the plurality of third pixel groups as a second target image.
Illustratively, determining whether the optical parameter of a third pixel group matches the optical parameter of the target pixel group includes: calculating the difference between the optical parameters of the third pixel group and the target pixel group, and determining the third pixel groups whose difference is smaller than a preset difference as second pixel groups matched with the target pixel group.
The optical parameter includes any one of: light flux value, number of photoelectrons.
In some possible embodiments, the optical parameter includes only one of a luminous flux value or a number of photoelectrons. In a case where it is detected that a difference between the light flux values of the third pixel group and the target pixel group is smaller than a preset difference, or a difference between the numbers of photoelectrons of the third pixel group and the target pixel group is smaller than a preset difference, the third pixel group is determined as a second pixel group matching the target pixel group.
In some possible embodiments, the optical parameters include a luminous flux value and a number of photoelectrons. And in the case where it is detected that the difference between the luminous flux values of the third pixel group and the target pixel group is smaller than a preset difference, and in the case where the difference between the numbers of photoelectrons of the third pixel group and the target pixel group is smaller than the preset difference, determining that the third pixel group is a second pixel group matching the target pixel group. It is also possible to determine the third pixel group as the second pixel group matching the target pixel group in a case where it is detected that a difference between the light flux values of the third pixel group and the target pixel group is smaller than a preset difference, or a difference between the numbers of photoelectrons of the third pixel group and the target pixel group is smaller than a preset difference.
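Both matching variants can be sketched in one helper (the dictionary keys, threshold value, and `mode` switch are illustrative assumptions):

```python
def groups_match(target, third, preset_diff=0.1, mode="and"):
    """Decide whether a third pixel group matches the target pixel
    group. Each group is a dict with a luminous-flux value and a
    photoelectron count; key names, the threshold, and the 'mode'
    switch are illustrative assumptions, not from the patent."""
    flux_ok = abs(target["flux"] - third["flux"]) < preset_diff
    count_ok = abs(target["photoelectrons"]
                   - third["photoelectrons"]) < preset_diff
    # "and": both parameters must match; "or": either one suffices.
    return (flux_ok and count_ok) if mode == "and" else (flux_ok or count_ok)
```

The `"and"` mode corresponds to requiring both differences below the preset difference; the `"or"` mode corresponds to accepting either.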
The luminous flux value can be obtained by calculation. The number of photoelectrons is the number of pixels in the pixel group that are in a lit state, and it can be collected and counted by the sensor.
Fig. 2 shows a variation curve of the luminous flux value provided by an embodiment of the application. As shown in Fig. 2, the electronic device acquires 1000 frames of the third image, and the luminous flux value of the target pixel group changes significantly twice across those frames: the first change has a magnitude of 0.6 and the second a magnitude of 1. With a preset variation of 0.8, the moment corresponding to the second change is determined as the moment of abrupt change in the luminous flux value, i.e., the first moment.
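The abrupt-change detection of the Fig. 2 example could be sketched as follows (the frame-to-frame differencing and all names here are assumptions; the text only specifies comparing change magnitudes against a preset variation of 0.8):

```python
def first_abrupt_change(flux_values, preset_variation=0.8):
    """Return the index of the first frame at which the luminous-flux
    change exceeds the preset variation, or None if no abrupt change
    occurs. A change of 0.6 is ignored; a change of 1.0 is flagged."""
    for i in range(1, len(flux_values)):
        if abs(flux_values[i] - flux_values[i - 1]) > preset_variation:
            return i
    return None

# Mirrors Fig. 2: two changes (0.6, then 1.0); only the second exceeds 0.8.
moment = first_abrupt_change([2.0, 2.0, 2.6, 2.6, 3.6])
```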
In the embodiment of the application, when the electronic device shoots a shooting object in a motion state, it collects multiple frames of first images including the shooting object, divides a first target image among the N frames of first images into multiple first pixel groups, and obtains target pixel groups among them. Through the optical parameters of a target pixel group, a second pixel group in a second target image that can be fused with it is found; because the second pixel group is located according to the optical parameters of the target pixel group, the optical parameters of the fused second pixel group and the target pixel group are guaranteed to match. After each target pixel group is fused with its corresponding second pixel group, the results are spliced together, yielding an image with motion blur eliminated. In the embodiment of the application, pixel groups are the unit of image fusion and each target pixel group is fused with a matched second pixel group, which avoids blur in the fused image caused by fusing some pixels with other pixels whose optical parameters do not match, and improves the definition of the finally synthesized image.
According to the embodiment of the application, the pixel group is the unit of image fusion, so only the optical parameters of the whole pixel group are calculated, and each pixel in the pixel group does not need to be processed independently. Because the pixels within a target pixel group are strongly correlated, using the pixel group as the unit of image fusion reduces the calculation amount of the electronic device while preserving the precision of the image fusion processing. The method and the device thus reduce the data processing required for image fusion while ensuring the definition of the fused image.
In some embodiments of the present application, obtaining optical parameters of a target pixel group of the P first pixel groups comprises: determining X target pixels in the target pixel group, wherein X is a positive integer and is not more than Q; and determining the optical parameters of the target pixel group according to the optical parameters of the X target pixels.
In the embodiment of the present application, the X target pixels are some of the pixels in the target pixel group. The number of target pixels is at least 1 and at most Q. After the X target pixels are determined, the optical parameters of the target pixel group are determined from the optical parameters corresponding to those X target pixels.
It should be noted that the selection rule of the X target pixels may be set in advance before the electronic device leaves the factory.
If the number of target pixels is 1, the optical parameters of the target pixels are directly used as the optical parameters of the target pixel group. Illustratively, when the number of target pixels is 1 and the light flux value of the target pixel is 1.2, the light flux value of the target pixel group where the target pixel is located is determined to be 1.2.
When there are multiple target pixels, the optical parameters of the target pixel group are determined by calculation over the optical parameters of the multiple target pixels. Illustratively, when the number of target pixels is 2, one target pixel has a light flux value of 0.4, and the other has a light flux value of 0.6, the light flux value of the target pixel group is determined to be 0.5 by averaging the two. Alternatively, other weighted calculations may be performed on the optical parameters of the multiple target pixels to determine the optical parameters of the target pixel group.
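The single-pixel and averaged cases above reduce to a small helper (a sketch; the optional `weights` argument illustrates the "other weighted calculation methods" the text mentions):

```python
def group_flux(target_pixel_fluxes, weights=None):
    """Derive a target pixel group's luminous-flux value from its X
    target pixels: a single value is used directly, multiple values
    are averaged, and optional weights allow a weighted combination."""
    if weights is None:
        return sum(target_pixel_fluxes) / len(target_pixel_fluxes)
    total = sum(w * f for w, f in zip(weights, target_pixel_fluxes))
    return total / sum(weights)
```

With one target pixel of flux 1.2, the group takes 1.2 directly; with two pixels of 0.4 and 0.6, the group value is their mean, 0.5, as in the examples above.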
In the case where the optical parameter is the number of photoelectrons, the number of lighting target pixels in the target pixel group is counted.
Fig. 3 shows one of the pixel arrangement diagrams of the target pixel group provided in the embodiment of the present application, as shown in fig. 3, the target pixel group 300 is a 2 × 2 pixel group, and a single target pixel 302 at the upper left corner in the 2 × 2 pixel group is selected as a light flux value calculation basis of the target pixel group 300, so that there is no need to calculate light flux values of other pixels, and thus three-quarters of the calculation amount is reduced.
Fig. 4 shows a second pixel arrangement diagram of the target pixel group provided in the embodiment of the present application, as shown in fig. 4, the target pixel group 400 is a 2 × 2 pixel group, and two target pixels 402 on the diagonal line in the 2 × 2 pixel group are selected as the basis for calculating the light flux value of the target pixel group 400. The average value of the light flux values of the above two target pixels 402 is calculated and is taken as the light flux value of the target pixel group 400, thereby reducing the amount of calculation.
Fig. 5 shows a third schematic diagram of pixel arrangement of a target pixel group according to the embodiment of the present application, as shown in fig. 5, the target pixel group 500 is a 2 × 2 pixel group, and two target pixels 502 in a first row of the 2 × 2 pixel group are selected as a basis for calculating a luminous flux value of the target pixel group 500. The average value of the light flux values of the above two target pixels 502 is calculated and is taken as the light flux value of the target pixel group 500, thereby reducing the amount of calculation.
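The three selection patterns of Figs. 3 to 5 can be expressed as coordinate masks over a 2 × 2 group (the mask names and the NumPy representation are illustrative assumptions):

```python
import numpy as np

# Target-pixel selections for a 2 x 2 pixel group, mirroring Figs. 3-5:
# the single top-left pixel, the main diagonal, and the first row.
PATTERNS = {
    "top_left": [(0, 0)],
    "diagonal": [(0, 0), (1, 1)],
    "first_row": [(0, 0), (0, 1)],
}

def group_flux_from_pattern(group, pattern):
    """Average the luminous-flux values of only the selected target
    pixels, so the remaining pixels of the group need not be computed."""
    return float(np.mean([group[r, c] for r, c in PATTERNS[pattern]]))

group = np.array([[0.4, 0.8],
                  [0.2, 0.6]])
```

With the `"top_left"` pattern only one of the four luminous flux values is evaluated, which is the three-quarters reduction in computation described for Fig. 3.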
In the embodiment of the application, after the first image is divided into the plurality of target pixel groups, part of pixels in the target pixel groups are used as the target pixels, and the optical parameters of the corresponding target pixel groups are determined according to the target pixels, so that the optical parameters of all pixels in the target pixel groups do not need to be calculated, and the calculation amount for determining the optical parameters corresponding to the target pixel groups is reduced.
In some embodiments of the present application, determining X target pixels in a target pixel group comprises: and determining X target pixels in the target pixel group according to the corresponding image content of the target pixel group in the first target image.
In this embodiment of the application, when selecting the X target pixels in a first pixel group, the number and positions of the target pixels may be chosen according to the image content corresponding to the pixels in that group. When the image content corresponding to the first pixel group is rich, a larger number of target pixels is selected to ensure the accuracy of the optical parameters determined for the group, and the target pixels are chosen at the positions the image content occupies within the first pixel group.
Specifically, after the electronic device finishes dividing the first pixel group, the electronic device determines image content corresponding to the target pixel group through an image recognition algorithm, and selects different target pixels for the target pixel group based on different image content.
Illustratively, image content corresponding to the target pixel group is acquired, and the number of the target pixels in the target pixel group is determined according to the content amount of the image content corresponding to the target pixel group.
The image content corresponding to the target pixel group is the target image feature within the image area that the target pixel group occupies in the first target image; the target image feature can be determined by performing image recognition on that image area. The content amount of the image content is the proportion of the image area covered by the identified target image feature: illustratively, if the image area contains N pixels in total and the target image feature covers n of them, the content amount is n/N.
Specifically, when the content amount of the image content corresponding to the target pixel group is large, a larger number of target pixels is selected from the target pixel group; when the content amount is small, a smaller number of target pixels is selected from the target pixel group.
Illustratively, the image content corresponding to the target pixel group is obtained, and the target pixel in the target pixel group is selected according to the position information of the image content corresponding to the target pixel group.
The position information of the image content is the position of the identified target image feature within the image area. Illustratively, if the image area is an M × N pixel matrix and the target image feature occupies an m × n sub-matrix within it, the pixels in that m × n sub-matrix are taken as the target pixels.
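Under these definitions, content-based selection can be sketched as follows (the 0/1 feature mask standing in for the output of image recognition is an assumption; the patent does not specify the recognition algorithm):

```python
def select_target_pixels(feature_mask):
    """Given a 0/1 mask marking where the recognised target image
    feature lies inside a pixel group's image area, return the feature
    pixels as the target pixels together with the content amount n/N."""
    coords = [(r, c)
              for r, row in enumerate(feature_mask)
              for c, flag in enumerate(row) if flag]
    total = sum(len(row) for row in feature_mask)
    return coords, len(coords) / total

# A 2 x 2 feature inside a 2 x 3 image area: 4 target pixels, n/N = 4/6.
coords, content_amount = select_target_pixels([[1, 1, 0],
                                               [1, 1, 0]])
```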
In the embodiment of the application, the target pixel in the first pixel group is selected in a self-adaptive manner according to the image content through the electronic device, so that the accuracy of the optical parameters of the first pixel group obtained through determination is improved while the calculation amount of the optical parameters is reduced.
In some embodiments of the present application, dividing the first target image into P first pixel groups includes: acquiring a first resolution of a first target image; determining the number of preset pixels in each first pixel group according to the first resolution; the first target image is divided into P first pixel groups according to the preset pixel number.
In the embodiment of the application, before dividing a first target image into P first pixel groups, a first resolution of the first target image needs to be determined, a preset number of pixels of the first pixel group is determined according to the first resolution, the first target image is divided into P first pixel groups according to the preset number of pixels, and the number of pixels in the first pixel groups obtained by the division in the above manner is equal.
In the embodiment of the application, the electronic device can automatically determine the number of pixels in the first pixel group according to the resolution of the first target image, divide the first target image according to the number of pixels in the first pixel group, can finish the division of the first target image without manual operation of a user, and has equal number of pixels in each first pixel group obtained through division.
In some embodiments of the present application, determining the preset number of pixels in each of the first pixel groups according to the first resolution includes: determining the number of the preset pixels as a first value under the condition that the first resolution is greater than the preset resolution; determining the number of the preset pixels as a second value under the condition that the first resolution is less than or equal to the preset resolution; wherein the first value is greater than the second value.
In the embodiment of the present application, a larger number of pixels is set for the first pixel group when the resolution of the first target image is higher, and a smaller number is set when the resolution is lower.
The higher the resolution of the first target image, the stronger the correlation between adjacent pixels in the first target image, so a larger number of pixels can be set for the first pixel group, reducing the calculation amount of the electronic device without affecting the accuracy of the determined optical parameters of the first pixel group. The lower the resolution of the first target image, the weaker the correlation between adjacent pixels, so a smaller number of pixels is set for the first pixel group in order to ensure the accuracy of its optical parameters.
Fig. 6 shows one schematic diagram of the first pixel group in the first target image provided in the embodiment of the present application. As shown in Fig. 6, when the resolution of the first target image 600 is high, the preset number of pixels of the first pixel group 602 in the first target image 600 may optionally be set to 9, arranged as 3 × 3.
In the embodiment of the application, the electronic device can select different preset pixel numbers for the first pixel group according to the comparison between the first resolution of the first target image and the preset resolution: the number of pixels in the first pixel group is large when the first resolution is high and small when the first resolution is low. The electronic device can thus set the pixel number of the first pixel group automatically and divide the first target image according to the set pixel number.
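The resolution-threshold rule and the equal-sized division can be sketched together. The threshold value and the second (smaller) group side are assumptions; the 3 × 3 first value matches the Fig. 6 example:

```python
import numpy as np

PRESET_RESOLUTION = 1920 * 1080   # assumed threshold; the patent leaves the value open
FIRST_VALUE = 3                   # 3 x 3 = 9 pixels, matching the Fig. 6 example
SECOND_VALUE = 2                  # 2 x 2 = 4 pixels, an assumed smaller second value

def preset_group_side(width, height):
    """First value when the resolution exceeds the preset resolution,
    second value otherwise (first value > second value)."""
    return FIRST_VALUE if width * height > PRESET_RESOLUTION else SECOND_VALUE

def divide_into_groups(image, side):
    """Split an H x W first target image into equal side x side first
    pixel groups (H and W assumed divisible by side for simplicity,
    so every group has the same preset pixel number)."""
    h, w = image.shape[:2]
    return [image[r:r + side, c:c + side]
            for r in range(0, h, side)
            for c in range(0, w, side)]
```

No user interaction is needed: the group side follows from the resolution, and the division follows from the group side.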
In some embodiments of the present application, dividing the first target image into P first pixel groups includes: identifying image content of a first target image; dividing a first target image into P first pixel groups according to the image content; the P first pixel groups comprise pixel groups with different pixel numbers.
In the embodiment of the application, in the process of dividing the first target image into the first pixel groups, the electronic device can adaptively divide the first target image according to the identified image content.
Specifically, after acquiring the first target image, the electronic device determines the image content in the first target image by means of image recognition and divides the first target image according to the recognized image content, so that the first pixel groups in regions of the first target image with more image content contain fewer pixels, while the first pixel groups in regions with less image content contain more pixels.
The image content is a target image feature in the first target image, and the target image feature can be determined by performing image recognition on the first target image. A region with more image content is a region where the target image feature is located, and may be a subject region of the first target image; the region other than the region where the target image feature is located is a region with less image content, and may be a background region of the first target image. Fig. 7 illustrates a second schematic diagram of the first pixel group in the first target image according to the embodiment of the present application. As shown in Fig. 7, a first area 702 of the first target image 700 is the background area of the first target image 700, and a photographic subject 706 is displayed in a second area 704 of the first target image 700. The electronic device sets the first pixel groups 708 located in the first region 702 as 5 × 5 pixel groups and the first pixel groups 708 located in the second region 704 as 2 × 2 pixel groups.
In the embodiment of the application, the electronic device can identify, by means of image recognition, a subject area with more image content and a background area with less image content in the first target image. It sets a smaller number of pixels for the first pixel groups in the subject area, so that no image content is lost after the first pixel groups in the shooting subject area are fused, and sets a larger number of pixels for the first pixel groups in the background area, thereby reducing the calculation amount of the electronic device.
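The content-adaptive sizing of Fig. 7 can be sketched as a per-position rule. Representing the recognized subject region as a bounding box is an assumption; the 2 × 2 and 5 × 5 sides are the patent's example values:

```python
def group_side(row, col, subject_box, subj_side=2, bg_side=5):
    """Side length of the first pixel group covering pixel (row, col):
    2 x 2 inside the recognized subject region and 5 x 5 in the
    background, the example sizes of Fig. 7. subject_box is
    (top, left, bottom, right), an assumed representation of the
    region where the target image feature was recognized."""
    top, left, bottom, right = subject_box
    inside = top <= row < bottom and left <= col < right
    return subj_side if inside else bg_side
```

The resulting P first pixel groups therefore contain different numbers of pixels depending on where in the image they lie.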
In some embodiments of the present application, the optical parameter comprises any one of: light flux value, number of photoelectrons.
In the embodiment of the present application, the optical parameter may be either the light flux value or the number of photoelectrons. When a sudden change in the light flux value or in the number of photoelectrons is detected, the moment of the sudden change is taken as the first moment at which the optical parameter changes.
The number of photoelectrons is the number of pixels in the first pixel group that are in a lit state; it can be collected directly by the sensor, whereas the light flux value is obtained by calculation.
The light flux value is calculated as shown in equation (1):

[equation (1) appears as an image in the original filing]

where Φ is the luminous flux value, N_pf is the number of first images, T_pf is the acquisition time of a single exposure, t_i is the moment at which a pixel receives a photon, and q is the photon detection efficiency of the sensor.
In the embodiment of the application, setting the optical parameter to either the light flux value or the number of photoelectrons ensures that the first moment of the sudden change of the optical parameter is judged accurately, and simplifies the step of obtaining the optical parameter.
The optical parameters are not limited to the above-mentioned light flux values and the number of photoelectrons, and may be other optical parameters, and are not specifically limited herein.
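Since equation (1) survives only as its list of variables, the following sketch assumes a plausible reading built from those variables: one first-photon arrival time t_i per first image, averaged as instantaneous flux estimates 1/(q·t_i). The exact formula is an assumption, not the patent's equation; the photoelectron count follows the lit-state definition above directly:

```python
def photoelectron_count(lit_states):
    """Number of pixels in the pixel group that are in a lit state,
    as read directly from the sensor."""
    return sum(bool(s) for s in lit_states)

def luminous_flux(arrival_times, q, t_exposure):
    """Per-group flux estimate from one photon arrival time t_i per
    first image, each bounded by the single-exposure time T_pf.
    The averaged 1/(q * t_i) form is an assumed reconstruction of
    equation (1), not the patent's exact formula."""
    assert all(0 < t <= t_exposure for t in arrival_times)
    n_pf = len(arrival_times)
    return sum(1.0 / t for t in arrival_times) / (q * n_pf)
```

Either quantity can then serve as the optical parameter whose sudden change is detected in the following steps.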
In some embodiments of the present application, determining at least one second target image in the N frames of first images according to the optical parameters of the target pixel group includes: acquiring the optical parameter variation of third pixel groups in third target images adjacent in acquisition time, where a third target image is an image other than the first target image in the N frames of first images, and the position of the third pixel group in the third target image corresponds to the position of the first pixel group in the first target image; determining the acquisition time at which the variation of the optical parameter is larger than a preset variation as the target time; and determining at least one second target image in the N frames of first images according to the target time, where the acquisition time of the second target image is between the acquisition time of the first target image and the target time.
In the embodiment of the application, after the N frames of first images are obtained through shooting, the acquisition time of each frame of first image is recorded. The images other than the first target image in the N frames of first images are taken as third target images. The third target images are sorted by acquisition time, and the third pixel group in each frame of third target image is determined according to the position of the target pixel group in the first target image. A difference is calculated between the optical parameters of the third pixel groups in every two third target images adjacent in acquisition time, yielding the optical parameter variation corresponding to each acquisition time. When the optical parameter variation is determined to be larger than the preset variation, the corresponding acquisition time is taken as the target time. The target time is the moment at which the optical parameter of the third pixel group changes abruptly during the acquisition of the multiple frames of first images.
And when the acquisition time of the first target image is before the target time, determining a third target image with the acquisition time after the acquisition time of the first target image and before the target time as a second target image.
For example, in the process of acquiring the N frames of first images, the first image acquired earliest may be used as the first target image and the subsequently acquired first images as third target images. The target time can then be determined during acquisition: each time a frame of third target image is acquired, the optical parameter variation between its third pixel group and the third pixel group of the third target image at the previous acquisition time is calculated and compared with the preset variation. The target time can thus be determined during shooting and the second target image determined based on it, which improves the efficiency of determining the second pixel group that can be fused with the target pixel group and the overall efficiency of image fusion.
When the acquisition time of the first target image is after the target time, a third target image whose acquisition time is after the target time and before the acquisition time of the first target image is determined as a second target image.
It should be noted that the target time is the moment at which the optical parameter of the third pixel group changes abruptly in the process of acquiring the N frames of first images. When the acquisition time of the first target image is before the target time, the images before the sudden change of the optical parameter are determined as second target images, ensuring that the target pixel group in the first target image is fused with second pixel groups whose optical parameters match. When the acquisition time of the first target image is after the target time, the images after the sudden change of the optical parameter are determined as second target images for the same reason.
In the embodiment of the application, by calculating the difference between the optical parameters of third pixel groups adjacent in acquisition time, the optical parameter variation corresponding to that acquisition time can be determined; obtaining this variation for every two adjacent third target images yields the optical parameter variation corresponding to each acquisition time. When the variation exceeds the preset variation, a sudden optical change is determined to occur at that acquisition time. The third pixel groups in third target images acquired between the target time and the acquisition time of the first target image match the target pixel group, so all first images between the target time and the acquisition time of the first target image are used as second target images, ensuring that the image region of the target pixel group is not blurred in the image fused from the target pixel group and the second pixel groups of the second target images.
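The sudden-change detection and frame selection described above can be sketched as follows; treating the optical parameter as a scalar per frame and excluding the endpoints of the time interval are simplifying assumptions:

```python
def find_target_time(times, params, preset_variation):
    """Target time: the first acquisition time whose optical-parameter
    change from the previous adjacent frame exceeds the preset
    variation, or None if no sudden change occurs."""
    for i in range(1, len(params)):
        if abs(params[i] - params[i - 1]) > preset_variation:
            return times[i]
    return None

def second_target_indices(times, first_idx, target_time):
    """Indices of frames acquired strictly between the acquisition time
    of the first target image and the target time, on whichever side
    of the sudden change the first target image lies."""
    lo, hi = sorted((times[first_idx], target_time))
    return [i for i, t in enumerate(times) if lo < t < hi]
```

The same comparison can be run incrementally as each frame arrives, which is what allows the target time to be found during shooting.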
In some embodiments of the present application, after the target pixel group and the second pixel group are fused, the method includes: generating a fourth target image according to a plurality of third pixel groups, where a third pixel group is a pixel group obtained by fusing a target pixel group with its second pixel group, and the number of third pixel groups is the same as the number of first pixel groups. In the embodiment of the application, each first pixel group is in turn determined as the target pixel group, and the second target image corresponding to each target pixel group is obtained. Each target pixel group is fused with the corresponding second pixel group in the second target image to obtain a third pixel group. Repeating this fusion step for each target pixel group yields a plurality of third pixel groups, which are stitched together according to the positions of their corresponding first pixel groups in the first target image to generate the fourth target image.
In the embodiment of the application, by taking the pixel group as the unit of fusion processing, the electronic device can ensure that each part of the first target image receives targeted fusion processing, avoiding partially unclear areas in the fourth target image obtained after fusion, without having to process each pixel separately, thereby reducing the calculation amount of the electronic device and improving the image output efficiency.
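The fuse-then-stitch step can be sketched as follows. Averaging is an assumed fusion rule (the patent only specifies "fusion processing"), and indexing the third pixel groups by their top-left origin is an assumed layout:

```python
import numpy as np

def fuse_groups(target_group, second_groups):
    """Third pixel group: fuse a target pixel group with its matched
    second pixel groups. Averaging the co-located blocks is an assumed
    fusion rule for illustration."""
    stack = [np.asarray(target_group, dtype=float)]
    stack += [np.asarray(g, dtype=float) for g in second_groups]
    return np.mean(stack, axis=0)

def assemble_fourth_image(shape, fused_groups, side):
    """Stitch the third pixel groups back at the positions their first
    pixel groups occupied in the first target image, yielding the
    fourth target image."""
    out = np.zeros(shape)
    for (r, c), block in fused_groups.items():
        out[r:r + side, c:c + side] = block
    return out
```

Because stitching reuses the first pixel groups' original positions, the fourth target image has the same dimensions as the first target image.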
According to the image fusion method provided by the embodiment of the application, the execution subject may be an image fusion apparatus. The embodiment of the present application takes an image fusion apparatus executing the image fusion method as an example to describe the image fusion apparatus provided in the embodiment of the present application.
In some embodiments of the present application, an image fusion apparatus is provided. Fig. 8 shows a schematic block diagram of an image fusion apparatus 800 provided in an embodiment of the present application, and as shown in fig. 8, the image fusion apparatus 800 includes:
an acquisition module 802, configured to acquire N frames of first images, where each frame of first image includes M pixels, where N and M are positive integers;
a grouping module 804, configured to divide the first target image into P first pixel groups, where the first target image is any one of N frames of first images, and P is a positive integer;
an obtaining module 806, configured to obtain optical parameters of a target pixel group in the P first pixel groups, where the target pixel group is any one of the P first pixel groups;
a determining module 808, configured to determine at least one second target image in the N frames of first images according to the optical parameters of the target pixel group, where the second target image includes a second pixel group matched with the optical parameters of the target pixel group, the second target image is an image in the N frames of first images, and the target pixel group includes Q pixels;
and a fusion module 810, configured to perform fusion processing on the target pixel group and the second pixel group, where a position of the target pixel group in the first target image corresponds to a position of the second pixel group in the second target image.
In the embodiment of the application, when the electronic equipment shoots a shooting object in a motion state, multiple frames of first images including the shooting object are collected. A first target image among the N frames of first images is divided into multiple first pixel groups, and a target pixel group among them is obtained. Through the optical parameters of the target pixel group, a second pixel group in a second target image that can be fused with the target pixel group can be found; because the second pixel group is found according to the optical parameters of the target pixel group, the optical parameters of the fused second pixel group and target pixel group are guaranteed to match. After each target pixel group is fused with its corresponding second pixel group, the results are stitched together, so an image with the motion blur eliminated can be obtained. In the embodiment of the application, with the pixel group as the unit of image fusion, each target pixel group is fused with its matched second pixel group, which avoids blurring caused by fusing some pixels with other pixels whose optical parameters do not match, and improves the definition of the finally synthesized image.
According to the embodiment of the application, the pixel group is used as a unit for image fusion, only the optical parameters of the whole pixel group are calculated, and each pixel in the pixel group does not need to be fused independently. Because the relevance of the pixels in the target pixel group is strong, the pixel group is used as the unit of image fusion, the calculation amount of the electronic equipment can be reduced, and meanwhile, the precision of the image fusion processing can be ensured. The method and the device realize that the data processing amount required by image fusion is reduced while the definition of the fused image is ensured.
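Putting the pieces together, the claimed flow — divide into groups, estimate a per-group optical parameter, locate the sudden change, fuse the matched frames, stitch — might be sketched end-to-end on grayscale frames. Using mean block intensity as the stand-in optical parameter and averaging as the fusion rule are both assumptions:

```python
import numpy as np

def fuse_sequence(frames, side, preset_variation, first_idx=0):
    """Fuse N first images group by group. For each side x side group,
    the co-located blocks of the frames acquired before the first
    sudden change of that group's parameter are averaged with the
    first target image's block."""
    first = frames[first_idx]
    out = np.zeros_like(first, dtype=float)
    h, w = first.shape
    for r in range(0, h, side):
        for c in range(0, w, side):
            blocks = [f[r:r + side, c:c + side].astype(float) for f in frames]
            params = [b.mean() for b in blocks]   # stand-in optical parameter
            # index of the sudden change, or len(frames) if none occurs
            change = next((i for i in range(1, len(params))
                           if abs(params[i] - params[i - 1]) > preset_variation),
                          len(frames))
            matched = blocks[first_idx:change]
            out[r:r + side, c:c + side] = np.mean(matched, axis=0)
    return out
```

Each group decides independently how many frames it fuses, which is exactly what keeps a group whose content changed mid-sequence from being blurred by the post-change frames.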
In some embodiments of the present application, the determining module 808 is further configured to determine X target pixels in the target pixel group, where X is a positive integer and X ≦ Q;
the determining module 808 is further configured to determine the optical parameters of the target pixel group according to the optical parameters of the X target pixels.
In the embodiment of the application, after the first image is divided into the plurality of target pixel groups, part of pixels in the target pixel groups are used as the target pixels, and the optical parameters of the corresponding target pixel groups are determined according to the target pixels, so that the optical parameters of all pixels in the target pixel groups do not need to be calculated, and the calculation amount for determining the optical parameters corresponding to the target pixel groups is reduced.
In some embodiments of the present application, the determining module 808 is further configured to determine X target pixels in the target pixel group according to the corresponding image content of the target pixel group in the first target image.
In the embodiment of the application, the target pixel in the first pixel group is selected in a self-adaptive manner according to the image content through the electronic device, so that the accuracy of the optical parameters of the first pixel group obtained through determination is improved while the calculation amount of the optical parameters is reduced.
In some embodiments of the present application, the obtaining module 806 is further configured to obtain a first resolution of the first target image;
a determining module 808, further configured to determine a preset number of pixels in each first pixel group according to the first resolution;
the grouping module 804 is further configured to divide the first target image into P first pixel groups according to a preset number of pixels.
In the embodiment of the application, the electronic device can automatically determine the number of pixels in the first pixel group according to the resolution of the first target image, divide the first target image according to the number of pixels in the first pixel group, can finish the division of the first target image without manual operation of a user, and has equal number of pixels in each first pixel group obtained through division.
In some embodiments of the present application, the determining module 808 is further configured to determine the preset number of pixels to be a first value if the first resolution is greater than the preset resolution;
the determining module 808 is further configured to determine the number of the preset pixels as a second value when the first resolution is less than or equal to the preset resolution;
wherein the first value is greater than the second value.
In the embodiment of the application, the electronic device can select different preset pixel numbers for the first pixel group according to the comparison between the first resolution of the first target image and the preset resolution: the number of pixels in the first pixel group is large when the first resolution is high and small when the first resolution is low. The electronic device can thus set the pixel number of the first pixel group automatically and divide the first target image according to the set pixel number.
In some embodiments of the present application, the image fusion apparatus 800 further includes:
the identification module is used for identifying the image content of the first target image;
a grouping module 804, further configured to divide the first target image into P first pixel groups according to image content;
the P first pixel groups comprise pixel groups with different pixel numbers.
In the embodiment of the application, the electronic device can identify, by means of image recognition, a subject area with more image content and a background area with less image content in the first target image. It sets a smaller number of pixels for the first pixel groups in the subject area, so that no image content is lost after the first pixel groups in the shooting subject area are fused, and sets a larger number of pixels for the first pixel groups in the background area, thereby reducing the calculation amount of the electronic device.
In some embodiments of the present application, the optical parameter comprises any one of: light flux value, number of photoelectrons.
In the embodiment of the application, the optical parameter is set to be any one of the light flux value and the number of the photoelectrons, so that the accuracy of judgment of the first moment of sudden change of the optical parameter is ensured, and the step of obtaining the optical parameter is simplified.
In some embodiments of the present application, the obtaining module 806 is configured to obtain variation of an optical parameter of a third pixel group in a third target image adjacent to the acquisition time, where the third target image is an image other than the first target image in the N frames of the first image, and a position of the third pixel group in the third target image corresponds to a position of the first pixel group in the first target image; a determining module 808, configured to determine, as a target time, an acquisition time at which the optical parameter variation is greater than a preset variation; the determining module 808 is configured to determine, according to the target time, at least one second target image in the N frames of the first images, where an acquisition time of the second target image is between an acquisition time of the first target image and the target time.
In the embodiment of the application, by calculating the difference between the optical parameters of third pixel groups adjacent in acquisition time, the optical parameter variation corresponding to that acquisition time can be determined; obtaining this variation for every two adjacent third target images yields the optical parameter variation corresponding to each acquisition time. When the variation exceeds the preset variation, a sudden optical change is determined to occur at that acquisition time. The third pixel groups in third target images acquired between the target time and the acquisition time of the first target image match the target pixel group, so all first images between the target time and the acquisition time of the first target image are used as second target images, ensuring that the image region of the target pixel group is not blurred in the image fused from the target pixel group and the second pixel groups of the second target images.
In some embodiments of the present application, the image fusion apparatus 800 further includes:
and the generation module is used for generating a fourth target image according to a plurality of third pixel groups, the third pixel groups are pixel groups obtained by fusing the target pixel groups and the second pixel groups, and the number of the third pixel groups is the same as that of the first pixel groups.
In the embodiment of the application, by taking the pixel group as the unit of fusion processing, the electronic device can ensure that each part of the first target image receives targeted fusion processing, avoiding partially unclear regions in the fourth target image obtained after fusion, without having to process each pixel separately, thereby reducing the calculation amount of the electronic device and improving the image output efficiency.
The image fusion apparatus in the embodiment of the present application may be an electronic device, or may be a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. For example, the electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a Mobile Internet Device (MID), an Augmented Reality (AR)/Virtual Reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and may also be a server, a Network Attached Storage (NAS), a personal computer (PC), a Television (TV), a teller machine, a self-service machine, and the like, which is not specifically limited in the embodiments of the present application.
The image fusion apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiment of the present application.
The image fusion device provided in the embodiment of the present application can implement each process implemented by the above method embodiment, and is not described here again to avoid repetition.
Optionally, an embodiment of the present application further provides an electronic device, which includes the image fusion device in any of the embodiments described above, so that all the beneficial effects of the image fusion device in any of the embodiments are achieved, and redundant description is not repeated here.
Optionally, an electronic device is further provided in an embodiment of the present application, fig. 9 shows a block diagram of a structure of the electronic device according to the embodiment of the present application, and as shown in fig. 9, the electronic device 900 includes a processor 902, a memory 904, and a program or an instruction stored in the memory 904 and executable on the processor 902, and when the program or the instruction is executed by the processor 902, the process of the embodiment of the image fusion method is implemented, and the same technical effect can be achieved, and details are not repeated here to avoid repetition.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic device and the non-mobile electronic device described above.
Fig. 10 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1000 includes, but is not limited to: radio frequency unit 1001, network module 1002, audio output unit 1003, input unit 1004, sensor 1005, display unit 1006, user input unit 1007, interface unit 1008, memory 1009, processor 1010, and the like.
Those skilled in the art will appreciate that the electronic device 1000 may further comprise a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 1010 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in Fig. 10 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, combine some components, or arrange components differently, which is not described again here.
The processor 1010 is configured to acquire N frames of first images, where each frame of first image includes M pixels, and N and M are positive integers;
a processor 1010, configured to divide a first target image into P first pixel groups, where the first target image is any one of N frames of first images, and P is a positive integer;
a processor 1010, configured to obtain optical parameters of a target pixel group in the P first pixel groups, where the target pixel group is any one of the P first pixel groups;
a processor 1010, configured to determine at least one second target image in the N frames of first images according to the optical parameters of the target pixel group, where the second target image includes a second pixel group that matches the optical parameters of the target pixel group, the second target image is an image in the N frames of first images, and the target pixel group includes Q pixels;
and a processor 1010, configured to perform fusion processing on a target pixel group and a second pixel group, where a position of the target pixel group in the first target image corresponds to a position of the second pixel group in the second target image.
In the embodiment of the application, when the electronic equipment shoots a shooting object in a motion state, multiple frames of first images including the shooting object are collected. A first target image among the N frames of first images is divided into multiple first pixel groups, and a target pixel group among them is obtained. Through the optical parameters of the target pixel group, a second pixel group in a second target image that can be fused with the target pixel group can be found; because the second pixel group is found according to the optical parameters of the target pixel group, the optical parameters of the fused second pixel group and target pixel group are guaranteed to match. After each target pixel group is fused with its corresponding second pixel group, the results are stitched together, so an image with the motion blur eliminated can be obtained. In the embodiment of the application, with the pixel group as the unit of image fusion, each target pixel group is fused with its matched second pixel group, which avoids blurring caused by fusing some pixels with other pixels whose optical parameters do not match, reduces motion blur in the image, and improves the definition when shooting a moving object.
According to the embodiment of the application, the pixel group is used as the unit of image fusion: only the optical parameters of the whole pixel group are calculated, and each pixel in the pixel group does not need to be fused independently. Because the pixels within a target pixel group are strongly correlated, using the pixel group as the unit of image fusion reduces the computation load of the electronic device while preserving the precision of the image fusion processing. The definition of the fused image is thus ensured while the amount of data processing required for image fusion is reduced.
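The group-wise fusion described above can be sketched as follows (Python with NumPy). The group shape, the use of the group mean pixel value as the stand-in for the optical parameter, and the matching tolerance `tol` are all illustrative assumptions — the patent does not fix any of them:

```python
import numpy as np

def split_into_groups(img, gh, gw):
    """Split an H x W image into non-overlapping gh x gw pixel groups.

    Returns an array of shape (H//gh, W//gw, gh, gw). Hypothetical helper;
    the patent does not fix a concrete group shape.
    """
    H, W = img.shape
    return img.reshape(H // gh, gh, W // gw, gw).swapaxes(1, 2)

def group_optical_param(groups):
    """Mean pixel value of each group, used here as a proxy for the
    group-level optical parameter (luminous flux / photoelectron count)."""
    return groups.mean(axis=(2, 3))

def fuse_matching_groups(frames, target_idx, gh, gw, tol):
    """For each pixel group of the target frame, average it with the
    co-located groups of every other frame whose optical parameter is
    within `tol` of the target group's parameter."""
    tgt_groups = split_into_groups(frames[target_idx], gh, gw)
    tgt_param = group_optical_param(tgt_groups)
    acc = tgt_groups.astype(np.float64)
    count = np.ones(tgt_param.shape)
    for k, frame in enumerate(frames):
        if k == target_idx:
            continue
        grp = split_into_groups(frame, gh, gw)
        par = group_optical_param(grp)
        match = np.abs(par - tgt_param) <= tol   # per-group match mask
        acc[match] += grp[match]
        count[match] += 1
    fused = acc / count[..., None, None]
    # stitch the fused groups back into a full image
    Hg, Wg = fused.shape[:2]
    return fused.swapaxes(1, 2).reshape(Hg * gh, Wg * gw)
```

In this sketch, a group of the third frame whose mean departs from the target group's mean by more than `tol` is simply excluded from the average, which is the group-level analogue of refusing to fuse pixels whose optical parameters do not match.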
Further, the processor 1010 is configured to determine X target pixels in the target pixel group, where X is a positive integer and is not greater than Q;
a processor 1010 for determining optical parameters of the target pixel group from optical parameters of the X target pixels.
In the embodiment of the application, after the first image is divided into the plurality of target pixel groups, part of the pixels in each target pixel group are used as target pixels, and the optical parameters of the corresponding target pixel group are determined from these target pixels. The optical parameters of all pixels in the target pixel group therefore need not be calculated, which reduces the computation needed to determine the optical parameters corresponding to each target pixel group.
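The X-out-of-Q sampling estimate above can be sketched as follows; the uniform random choice of the X target pixels is an assumption, since the patent leaves the selection rule open (and claim 3 lets it depend on image content):

```python
import numpy as np

def estimate_group_param(group, x, seed=None):
    """Estimate a pixel group's optical parameter from X sampled target
    pixels instead of all Q pixels (X <= Q).

    Sketch only: uniform random sampling is assumed; the patent does not
    specify how the X target pixels are chosen.
    """
    rng = np.random.default_rng(seed)
    flat = group.ravel()
    idx = rng.choice(flat.size, size=min(x, flat.size), replace=False)
    return flat[idx].mean()
```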
Further, the processor 1010 is configured to determine X target pixels in the target pixel group according to the image content of the target pixel group corresponding to the first target image.
In the embodiment of the application, the electronic device adaptively selects the target pixels in the first pixel group according to the image content, which improves the accuracy of the determined optical parameters of the first pixel group while reducing the amount of computation required to obtain them.
Further, a processor 1010 configured to obtain a first resolution of the first target image;
a processor 1010 configured to determine a preset number of pixels in each first pixel group according to the first resolution;
the processor 1010 is configured to divide the first target image into P first pixel groups according to a preset number of pixels.
In the embodiment of the application, the electronic device can automatically determine the number of pixels in each first pixel group according to the resolution of the first target image and divide the first target image accordingly. The division therefore requires no manual operation by the user, and each first pixel group contains the same number of pixels.
Further, the processor 1010 is configured to determine the preset number of pixels to be a first value when the first resolution is greater than the preset resolution;
the processor 1010 is configured to determine the preset number of pixels to be a second value when the first resolution is less than or equal to the preset resolution;
wherein the first value is greater than the second value.
In the embodiment of the application, the electronic device can select different preset pixel counts for the first pixel groups according to the comparison between the preset resolution and the first resolution of the first target image: when the first resolution is high, each first pixel group contains more pixels; when the first resolution is low, each first pixel group contains fewer pixels. The electronic device thus sets the pixel count of the first pixel groups automatically and divides the first target image according to the set count.
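The resolution comparison above can be sketched as a simple threshold rule. The threshold and the two candidate counts are illustrative values; the patent only requires that the first value be greater than the second value:

```python
def preset_pixel_count(resolution, preset_resolution=1920 * 1080,
                       first_value=64, second_value=16):
    """Pick the per-group pixel count from the frame resolution.

    Sketch only: the 1080p threshold and the counts 64/16 are assumed
    example numbers, not values fixed by the patent.
    """
    assert first_value > second_value
    if resolution > preset_resolution:
        return first_value   # high resolution: larger groups, less work
    return second_value      # low resolution: smaller groups
```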
Further, a processor 1010 for identifying image content of the first target image;
a processor 1010 configured to divide a first target image into P first pixel groups according to image content;
the P first pixel groups comprise pixel groups with different pixel numbers.
In the embodiment of the application, the electronic device can use image recognition to distinguish a subject region with richer image content from a background region with less image content in the first target image. It sets a smaller pixel count for the first pixel groups in the subject region, so that image content is not lost after the first pixel groups in the subject region are fused, and sets a larger pixel count for the first pixel groups in the background region, further reducing the computation load of the electronic device.
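One way to sketch this content-dependent division is to use local variance as the detail measure. Variance as a proxy for "image content" is an assumption — the patent only states that the division follows the recognized image content:

```python
import numpy as np

def content_adaptive_group_size(tile, var_threshold=100.0,
                                small=4, large=16):
    """Assign a smaller group size to detail-rich (subject) tiles and a
    larger one to flat (background) tiles.

    Sketch only: the variance threshold and the sizes 4/16 are
    illustrative; the patent does not prescribe a detail measure.
    """
    return small if tile.var() > var_threshold else large
```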
Further, the optical parameter comprises any one of the following: a luminous flux value or a number of photoelectrons.
In the embodiment of the application, setting the optical parameter to either the luminous flux value or the number of photoelectrons ensures the accuracy of determining the first moment at which the optical parameter changes abruptly, and simplifies the step of acquiring the optical parameter.
Further, the processor 1010 is configured to acquire the optical parameter variation of the third pixel groups in third target images that are adjacent in acquisition time, where a third target image is an image other than the first target image among the N frames of first images, and the position of the third pixel group in the third target image corresponds to the position of the first pixel group in the first target image;
a processor 1010, configured to determine an acquisition time at which the optical parameter variation is greater than a preset variation as a target time;
the processor 1010 is configured to determine at least one second target image of the N frames of first images according to a target time, where an acquisition time of the second target image is between an acquisition time of the first target image and the target time.
In the embodiment of the application, the optical parameter variation corresponding to an acquisition time can be determined by calculating the difference between the optical parameters of the third pixel groups at adjacent acquisition times; obtaining this variation for the third pixel groups in every pair of adjacent third target images yields the variation corresponding to each acquisition time. When the variation exceeds the preset variation, an abrupt optical change is determined to occur at that acquisition time. The third pixel groups in the third target images acquired between that target time and the acquisition time of the first target image match the target pixel group, so all first images between the target time and the acquisition time of the first target image are used as second target images, which ensures that the image region of the target pixel group is not blurred in the image fused from the second pixel groups of the second target images and the target pixel group.
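The target-time selection above can be sketched as a scan over the per-frame optical parameters of one co-located pixel group. The one-directional scan (forward in acquisition time from the first target image) is an assumption; the patent only requires the second target images to lie between the acquisition time of the first target image and the target time:

```python
def select_second_target_frames(params, target_idx, preset_delta):
    """Given the group-level optical parameter of one co-located pixel
    group in each frame (ordered by acquisition time), find the first
    acquisition time after the target frame where the parameter jumps by
    more than `preset_delta`, and return the indices of the frames
    between the target frame and that jump — the second target images
    for this group."""
    selected = []
    for k in range(target_idx + 1, len(params)):
        if abs(params[k] - params[k - 1]) > preset_delta:
            break   # abrupt optical change: this is the target time
        selected.append(k)
    return selected
```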
Further, the processor 1010 is configured to generate a fourth target image according to a plurality of third pixel groups, where the third pixel groups are pixel groups obtained by fusing the target pixel group and the second pixel group, and the number of the third pixel groups is the same as that of the first pixel groups.
In the embodiment of the application, by taking the pixel group as the unit of fusion processing, the electronic device ensures that each part of the first target image receives targeted fusion processing, avoiding partially unclear regions in the fourth target image obtained after the fusion processing. Because each pixel does not need to be processed individually, the computation load of the electronic device is reduced and the image-output efficiency is improved.
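Assembling the fourth target image from the P fused pixel groups can be sketched as a simple stitch. The uniform group shape and row-major ordering are illustrative conventions, not requirements of the patent:

```python
import numpy as np

def assemble_fourth_image(fused_groups, grid_h, grid_w):
    """Stitch the P fused pixel groups (one per first pixel group, in
    row-major order) back into the fourth target image.

    Sketch only: assumes every group has the same shape; the patent also
    covers groups of differing pixel counts.
    """
    gh, gw = fused_groups[0].shape
    out = np.empty((grid_h * gh, grid_w * gw), dtype=fused_groups[0].dtype)
    for p, g in enumerate(fused_groups):
        i, j = divmod(p, grid_w)           # group position in the grid
        out[i * gh:(i + 1) * gh, j * gw:(j + 1) * gw] = g
    return out
```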
It should be understood that, in the embodiment of the present application, the input unit 1004 may include a graphics processing unit (GPU) 10041 and a microphone 10042; the graphics processing unit 10041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode. The display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1007 includes at least one of a touch panel 10071 and other input devices 10072. The touch panel 10071, also referred to as a touch screen, may include two parts: a touch detection device and a touch controller. Other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here.
The memory 1009 may be used to store software programs as well as various data. The memory 1009 may mainly include a first storage area for storing programs or instructions and a second storage area for storing data, where the first storage area may store an operating system and the application programs or instructions required for at least one function (such as a sound playing function or an image playing function). Further, the memory 1009 may include volatile memory or non-volatile memory, or both. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), a Static RAM (SRAM), a Dynamic RAM (DRAM), a Synchronous DRAM (SDRAM), a Double Data Rate SDRAM (DDR SDRAM), an Enhanced SDRAM (ESDRAM), a Synchronous Link DRAM (SLDRAM), or a Direct Rambus RAM (DRRAM). The memory 1009 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
Processor 1010 may include one or more processing units; optionally, the processor 1010 integrates an application processor, which primarily handles operations involving the operating system, user interface, and applications, and a modem processor, which primarily handles wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into processor 1010.
The embodiments of the present application further provide a readable storage medium, where a program or an instruction is stored, and when the program or the instruction is executed by a processor, the program or the instruction implements the processes of the foregoing method embodiments, and can achieve the same technical effects, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device of the above embodiment. Readable storage media include computer-readable storage media, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the embodiment of the image fusion method, and the same technical effect can be achieved.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip.
Embodiments of the present application provide a computer program product, where the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the processes of the foregoing image fusion method embodiments, and can achieve the same technical effects, and in order to avoid repetition, details are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed; the functions may be performed substantially simultaneously or in the reverse order, the methods described may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly also by hardware, though in many cases the former is the better implementation. Based on this understanding, the technical solutions of the present application may be embodied in the form of a computer software product stored in a storage medium (such as a ROM/RAM, magnetic disk, or optical disk) and including instructions for causing a terminal (such as a mobile phone, computer, server, or network device) to execute the methods of the embodiments of the present application.
While the embodiments of the present application have been described with reference to the accompanying drawings, the application is not limited to the specific embodiments described above, which are illustrative rather than restrictive; those of ordinary skill in the art, enlightened by the present application, may make many variations without departing from the purpose of the application and the scope protected by the claims, all of which fall within the protection of the application.

Claims (12)

1. An image fusion method, comprising:
acquiring N frames of first images, wherein each frame of first image comprises M pixels, and N and M are positive integers;
dividing a first target image into P first pixel groups, wherein the first target image is any one of the N first images, and P is a positive integer;
acquiring optical parameters of a target pixel group in the P first pixel groups, wherein the target pixel group is any one of the P first pixel groups;
determining at least one frame of second target image in the N frames of first images according to the optical parameters of the target pixel group, wherein the second target image comprises a second pixel group matched with the optical parameters of the target pixel group, the second target image is an image in the N frames of first images, the target pixel group comprises Q pixels, and Q is less than M;
and performing fusion processing on the target pixel group and the second pixel group, wherein the position of the target pixel group in the first target image corresponds to the position of the second pixel group in the second target image.
2. The image fusion method of claim 1, wherein the obtaining optical parameters of a target pixel set of the P first pixel sets comprises:
determining X target pixels in the target pixel group, wherein X is a positive integer and is not more than Q;
and determining the optical parameters of the target pixel group according to the optical parameters of the X target pixels.
3. The image fusion method of claim 2, wherein the determining X target pixels in the target pixel group comprises:
and determining the X target pixels in the target pixel group according to the corresponding image content of the target pixel group in the first target image.
4. The image fusion method according to any one of claims 1 to 3, wherein the dividing the first target image into P first pixel groups comprises:
identifying image content of the first target image;
dividing the first target image into P first pixel groups according to the image content;
the P first pixel groups comprise pixel groups with different pixel numbers.
5. The image fusion method according to any one of claims 1 to 3, wherein said determining at least one second target image of the N first images according to the optical parameters of the target pixel group comprises:
acquiring optical parameter variation of a third pixel group in a third target image adjacent to the acquisition time, wherein the third target image is other images except the first target image in the N frames of first images, and the position of the third pixel group in the third target image corresponds to the position of the first pixel group in the first target image;
determining the acquisition time when the optical parameter variation is larger than a preset variation as a target time;
and determining at least one frame of the second target image in the N frames of the first images according to the target time, wherein the acquisition time of the second target image is between the acquisition time of the first target image and the target time.
6. An image fusion apparatus, comprising:
the device comprises an acquisition module, a display module and a processing module, wherein the acquisition module is used for acquiring N frames of first images, each frame of the first images comprises M pixels, and both N and M are positive integers;
a grouping module, configured to divide a first target image into P first pixel groups, where the first target image is any one of the N first images, and P is a positive integer;
an obtaining module, configured to obtain optical parameters of a target pixel group in the P first pixel groups, where the target pixel group is any one of the P first pixel groups;
a determining module, configured to determine at least one second target image in the N frames of first images according to the optical parameters of the target pixel group, where the second target image includes a second pixel group matched with the optical parameters of the target pixel group, the second target image is an image in the N frames of first images, and the target pixel group includes Q pixels;
and the fusion module is used for carrying out fusion processing on the target pixel group and the second pixel group, wherein the position of the target pixel group in the first target image corresponds to the position of the second pixel group in the second target image.
7. The image fusion apparatus according to claim 6,
the determining module is further configured to determine X target pixels in the target pixel group, where X is a positive integer and is not greater than Q;
the determining module is further configured to determine an optical parameter of the target pixel group according to the optical parameters of the X target pixels.
8. The image fusion apparatus according to claim 7,
the determining module is further configured to determine the X target pixels in the target pixel group according to image content of the target pixel group corresponding to the first target image.
9. The image fusion device according to any one of claims 6 to 8, further comprising:
the identification module is used for identifying the image content of the first target image;
the grouping module is further used for dividing the first target image into P first pixel groups according to the image content;
the P first pixel groups comprise pixel groups with different pixel numbers.
10. The image fusion device according to any one of claims 6 to 8,
the acquiring module is further configured to acquire an optical parameter variation of a third pixel group in a third target image adjacent to the acquiring time, where the third target image is an image other than the first target image in the N frames of the first image, and a position of the third pixel group in the third target image corresponds to a position of the first pixel group in the first target image;
the determining module is further configured to determine the acquisition time at which the optical parameter variation is greater than a preset variation as a target time;
the determining module is configured to determine, according to the target time, at least one frame of the second target image in the N frames of the first images, where an acquisition time of the second target image is between an acquisition time of the first target image and the target time.
11. An electronic device, comprising:
a processor and a memory, the memory storing a program or instructions executable on the processor, the program or instructions when executed by the processor implementing the steps of the image fusion method of any one of claims 1 to 5.
12. A readable storage medium on which a program or instructions are stored, which program or instructions, when executed by a processor, carry out the steps of the image fusion method according to any one of claims 1 to 5.
CN202211084167.1A 2022-09-06 2022-09-06 Image fusion method and device, electronic equipment and storage medium Pending CN115439386A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211084167.1A CN115439386A (en) 2022-09-06 2022-09-06 Image fusion method and device, electronic equipment and storage medium
PCT/CN2023/117064 WO2024051697A1 (en) 2022-09-06 2023-09-05 Image fusion method and apparatus, electronic device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211084167.1A CN115439386A (en) 2022-09-06 2022-09-06 Image fusion method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115439386A true CN115439386A (en) 2022-12-06

Family

ID=84247466

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211084167.1A Pending CN115439386A (en) 2022-09-06 2022-09-06 Image fusion method and device, electronic equipment and storage medium

Country Status (2)

Country Link
CN (1) CN115439386A (en)
WO (1) WO2024051697A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024051697A1 (en) * 2022-09-06 2024-03-14 维沃移动通信有限公司 Image fusion method and apparatus, electronic device, and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10366478B2 (en) * 2016-05-18 2019-07-30 Interdigital Ce Patent Holdings Method and device for obtaining a HDR image by graph signal processing
CN110189285B (en) * 2019-05-28 2021-07-09 北京迈格威科技有限公司 Multi-frame image fusion method and device
CN111770243B (en) * 2020-08-04 2021-09-03 深圳市精锋医疗科技有限公司 Image processing method, device and storage medium for endoscope
CN114119423A (en) * 2021-12-08 2022-03-01 上海肇观电子科技有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN114511487A (en) * 2022-02-16 2022-05-17 展讯通信(上海)有限公司 Image fusion method and device, computer readable storage medium and terminal
CN115439386A (en) * 2022-09-06 2022-12-06 维沃移动通信有限公司 Image fusion method and device, electronic equipment and storage medium


Also Published As

Publication number Publication date
WO2024051697A1 (en) 2024-03-14

Similar Documents

Publication Publication Date Title
CN111835982B (en) Image acquisition method, image acquisition device, electronic device, and storage medium
CN108513069B (en) Image processing method, image processing device, storage medium and electronic equipment
CN112422798A (en) Photographing method and device, electronic equipment and storage medium
CN111951192A (en) Shot image processing method and shooting equipment
CN111028276A (en) Image alignment method and device, storage medium and electronic equipment
CN115439386A (en) Image fusion method and device, electronic equipment and storage medium
CN114390201A (en) Focusing method and device thereof
CN113628259A (en) Image registration processing method and device
WO2023001110A1 (en) Neural network training method and apparatus, and electronic device
CN108495038B (en) Image processing method, image processing device, storage medium and electronic equipment
CN112672056A (en) Image processing method and device
CN112367470B (en) Image processing method and device and electronic equipment
CN114143448B (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN114302026B (en) Noise reduction method, device, electronic equipment and readable storage medium
CN112492208B (en) Shooting method and electronic equipment
CN117097982B (en) Target detection method and system
CN115797160A (en) Image generation method and device
CN110620911B (en) Video stream processing method and device of camera and terminal equipment
CN117793513A (en) Video processing method and device
CN116320729A (en) Image processing method, device, electronic equipment and readable storage medium
CN114979479A (en) Shooting method and device thereof
CN116342992A (en) Image processing method and electronic device
CN116320740A (en) Shooting focusing method, shooting focusing device, electronic equipment and storage medium
CN117201941A (en) Image processing method, device, electronic equipment and readable storage medium
CN117750195A (en) Image processing method, device, readable storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination