WO2024051697A1 - Image fusion method and apparatus, electronic device and storage medium - Google Patents

Image fusion method and apparatus, electronic device and storage medium

Info

Publication number
WO2024051697A1
WO2024051697A1 PCT/CN2023/117064 CN2023117064W
Authority
WO
WIPO (PCT)
Prior art keywords
target
image
pixel group
pixels
target image
Prior art date
Application number
PCT/CN2023/117064
Other languages
English (en)
Chinese (zh)
Inventor
赵志杰
王宇飞
Original Assignee
维沃移动通信有限公司
Priority date
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Publication of WO2024051697A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T 5/73
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20212 Image combination
    • G06T 2207/20221 Image fusion; Image merging

Definitions

  • This application belongs to the field of photography technology, and specifically relates to an image fusion method, apparatus, electronic device, and storage medium.
  • The brightness of a photo is positively related to the amount of light entering the sensor.
  • If the exposure time is too long, bright areas will be overexposed; if it is too short, dark areas will be underexposed. As a result, it is often difficult to ensure image quality in images captured with single-exposure imaging.
  • The purpose of the embodiments of the present application is to provide an image fusion method, apparatus, electronic device, and storage medium that avoid blurring in the fused image caused by fusing some pixels with other pixels whose optical parameters do not match, and that improve the clarity of the final composite image.
  • In a first aspect, embodiments of the present application provide an image fusion method, including: collecting N frames of first images, where each frame of the first image includes M pixels, and N and M are both positive integers; dividing a first target image into P first pixel groups, where the first target image is any image in the N frames of first images and P is a positive integer; obtaining the optical parameters of a target pixel group among the P first pixel groups, where the target pixel group is any pixel group among the P first pixel groups; determining, according to the optical parameters of the target pixel group, at least one frame of second target image among the N frames of first images, where the second target image includes a second pixel group whose optical parameters match those of the target pixel group, the second target image is an image among the N frames of first images, the target pixel group includes Q pixels, and Q < M; and fusing the target pixel group with the second pixel group, where the position of the target pixel group in the first target image corresponds to the position of the second pixel group in the second target image.
  • In a second aspect, embodiments of the present application provide an image fusion device, including: a collection module, configured to collect N frames of first images, where each frame of the first image includes M pixels, and N and M are both positive integers; a grouping module, configured to divide a first target image into P first pixel groups, where the first target image is any image in the N frames of first images and P is a positive integer; an acquisition module, configured to obtain the optical parameters of a target pixel group among the P first pixel groups, where the target pixel group is any pixel group among the P first pixel groups; a determination module, configured to determine, according to the optical parameters of the target pixel group, at least one second target image among the N frames of first images, where the second target image includes a second pixel group whose optical parameters match those of the target pixel group, the second target image is an image among the N frames of first images, and the target pixel group includes Q pixels; and a fusion module, configured to fuse the target pixel group with the second pixel group, where the position of the target pixel group in the first target image corresponds to the position of the second pixel group in the second target image.
  • In a third aspect, embodiments of the present application provide an electronic device, including a processor and a memory. The memory stores a program or instructions that can be run on the processor, and when the program or instructions are executed by the processor, the steps of the method of the first aspect are implemented.
  • In a fourth aspect, embodiments of the present application provide a readable storage medium that stores a program or instructions; when the program or instructions are executed by a processor, the steps of the method of the first aspect are implemented.
  • In a fifth aspect, embodiments of the present application provide a chip. The chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run programs or instructions to implement the steps of the method of the first aspect.
  • In a sixth aspect, embodiments of the present application provide a computer program product. The program product is stored in a storage medium and is executed by at least one processor to implement the method of the first aspect.
  • In the embodiments of the present application, when the electronic device captures a subject in motion, it collects multiple frames of first images that include the subject, divides the first target image among the N frames of first images into a plurality of first pixel groups, and obtains a target pixel group among those first pixel groups.
  • Based on the optical parameters of the target pixel group, a second pixel group in the second target image that can be fused with the target pixel group is found. Because the second pixel group is located using the optical parameters of the target pixel group, the optical parameters of the second pixel group used for fusion are guaranteed to match those of the target pixel group.
  • The pixel group is used as the unit of image fusion, so that each target pixel group is fused with its matching second pixel group. This avoids blur in the fused image that would otherwise be caused by fusing some pixels with other pixels whose optical parameters do not match, and improves the clarity of the final composite image.
  • In addition, with the pixel group as the unit of image fusion, only the optical parameters of the entire pixel group need to be calculated; there is no need to match each pixel in the pixel group separately. Since the correlation between pixels within a target pixel group is strong, using the pixel group as the fusion unit reduces the computation performed by the electronic device while preserving the accuracy of the fusion processing. This ensures the clarity of the fused image while reducing the amount of data processing required for image fusion.
  • Figure 1 shows a schematic flow chart of the image fusion method provided by the embodiment of the present application
  • Figure 2 shows the luminous flux value change curve provided by the embodiment of the present application
  • Figure 3 shows the first schematic diagram of the pixel arrangement of the target pixel group provided by the embodiment of the present application
  • Figure 4 shows the second schematic diagram of the pixel arrangement of the target pixel group provided by the embodiment of the present application
  • Figure 5 shows the third schematic diagram of the pixel arrangement of the target pixel group provided by the embodiment of the present application.
  • Figure 6 shows the first schematic diagram of the first pixel group in the first target image provided by the embodiment of the present application
  • Figure 7 shows the second schematic diagram of the first pixel group in the first target image provided by the embodiment of the present application.
  • Figure 8 shows a schematic block diagram of an image fusion device provided by an embodiment of the present application.
  • Figure 9 shows a structural block diagram of an electronic device according to an embodiment of the present application.
  • FIG. 10 is a schematic diagram of the hardware structure of an electronic device implementing an embodiment of the present application.
  • The terms "first", "second", etc. in the description and claims of this application are used to distinguish similar objects and do not describe a specific order or sequence. It is to be understood that terms so used are interchangeable under appropriate circumstances, so that the embodiments of the present application can be practiced in orders other than those illustrated or described herein. Objects distinguished by "first", "second", etc. are usually of one type, and the number of such objects is not limited; for example, the first object can be one or multiple.
  • The term "and/or" in the description and claims indicates at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the related objects.
  • According to embodiments of the present application, an image fusion method is provided.
  • Figure 1 shows a schematic flowchart of the image fusion method provided by an embodiment of the present application. As shown in Figure 1, the image fusion method includes:
  • Step 102: Collect N frames of first images, where each frame of the first image includes M pixels, and N and M are both positive integers.
  • Specifically, the electronic device can collect N frames of first images. Since the pixels in the N frames of first images are all collected by the same sensor, the number of pixels in each first image is the same: each includes M pixels.
  • Step 104: Divide the first target image into P first pixel groups, where the first target image is any image in the N frames of first images and P is a positive integer.
  • The first target image is the image used for fusion processing among the N frames of first images; any image among the N frames may be selected as the first target image. The first target image is used as the initial image and is fused with the other images.
  • All pixels of the first target image are divided into P first pixel groups.
  • The number of pixels in each first pixel group must be smaller than the number of pixels in the first target image, and each first pixel group includes at least one pixel.
  • The number of pixels in each of the P first pixel groups may be the same or different, and the pixel arrangements of the first pixel groups may likewise be the same or different.
  • For example, the arrangement of a first pixel group may be 2 × 6, 3 × 4, or 1 × 12.
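  • As an illustration of this division step, a minimal sketch (the function name, the use of NumPy, and the fixed 3 × 4 group size are illustrative assumptions, not the patent's implementation) might look like:

```python
import numpy as np

def divide_into_pixel_groups(image, group_h, group_w):
    """Split an image into non-overlapping pixel groups of group_h x group_w.

    The patent allows groups of equal or unequal size; this sketch uses
    one fixed arrangement (e.g. 3 x 4) for simplicity.
    """
    h, w = image.shape[:2]
    groups = []
    for r in range(0, h, group_h):
        for c in range(0, w, group_w):
            groups.append(image[r:r + group_h, c:c + group_w])
    return groups

# A 6 x 12 image divided into 3 x 4 groups yields P = 6 first pixel groups.
img = np.arange(6 * 12).reshape(6, 12)
groups = divide_into_pixel_groups(img, 3, 4)
```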
  • Step 106: Obtain the optical parameters of the target pixel group among the P first pixel groups, where the target pixel group is any pixel group among the P first pixel groups.
  • The target pixel group is any pixel group among the P first pixel groups. In some embodiments, each of the P first pixel groups is used in turn as the target pixel group; in others, only some of the P first pixel groups are used as target pixel groups.
  • Step 108: Determine at least one frame of second target image among the N frames of first images based on the optical parameters of the target pixel group, where the second target image includes a second pixel group that matches the optical parameters of the target pixel group, the second target image is an image among the N frames of first images, and the target pixel group includes Q pixels with Q < M.
  • The second target image is filtered from the N frames of first images according to the optical parameters of the target pixel group. It should be noted that the second target image and the first target image are different images, and there are at most N-1 frames of second target images; that is, the second target images can be all images among the N frames of first images except the first target image.
  • The second pixel group in the second target image matches the optical parameters of the target pixel group.
  • For example, when the difference between the optical parameters of the second pixel group and those of the target pixel group is less than a preset difference, the optical parameters of the second pixel group are determined to match those of the target pixel group.
  • The position of the second pixel group in the second target image corresponds to the position of the target pixel group in the first target image: the pixel matrices corresponding to the first target image and the second target image are the same, and the position of the target pixel group in that matrix is the same as the position of the second pixel group in it.
  • Step 110: Perform fusion processing on the target pixel group and the second pixel group, where the position of the target pixel group in the first target image corresponds to the position of the second pixel group in the second target image.
  • In this way, the electronic device collects multiple frames of first images, divides the first target image among them into P first pixel groups, and searches the other frames of first images for second target images based on each target pixel group among the P first pixel groups. Each target pixel group is fused with the second pixel group in the corresponding second target image, and after all target pixel groups among the P first pixel groups have been fused, the fused image is obtained. An image obtained through this fusion process avoids the motion blur caused by movement of the subject during shooting.
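  • The overall flow of Steps 102 through 110 can be sketched as follows; every callable here is a hypothetical stand-in (the concrete matching and fusion rules are detailed later in the text):

```python
def fuse_images(first_images, target_index, match_fn, fuse_fn, divide_fn):
    """Hedged sketch of the overall fusion flow.

    divide_fn yields (position, pixel_group) pairs for the first target
    image; match_fn decides whether another frame holds a matching second
    pixel group at that position; fuse_fn fuses the target pixel group
    with its matching candidates.
    """
    first_target = first_images[target_index]
    fused_groups = []
    for pos, target_group in divide_fn(first_target):
        # Search the remaining frames for groups at the same position
        # whose optical parameters match the target group's.
        candidates = [
            frame for i, frame in enumerate(first_images)
            if i != target_index and match_fn(frame, pos, target_group)
        ]
        fused_groups.append((pos, fuse_fn(target_group, candidates)))
    return fused_groups
```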
  • In the process of searching for the second target image among the multiple frames of first images based on the target pixel group, it is first necessary to determine, based on the target pixel group, the third pixel groups in the frames of first images other than the first target image. The optical parameters of the target pixel group and of the plurality of third pixel groups are then obtained, and the image containing the third pixel group whose optical parameters match those of the target pixel group is determined as the second target image.
  • Specifically, determining whether the optical parameters of a third pixel group match those of the target pixel group includes: calculating the difference between the optical parameters of the third pixel group and those of the target pixel group, and determining the third pixel group whose difference is smaller than the preset difference as the second pixel group that matches the target pixel group.
  • Optical parameters include any of the following: luminous flux value, number of photoelectrons.
  • In some embodiments, the optical parameter includes only one of the luminous flux value or the number of photoelectrons. If the difference in that parameter between the third pixel group and the target pixel group is less than the preset difference, the third pixel group is determined to be the second pixel group matching the target pixel group.
  • In other embodiments, the optical parameters include both the luminous flux value and the number of photoelectrons. The third pixel group may be determined to be the matching second pixel group when both differences are less than the preset difference, or, alternatively, when either the luminous flux difference or the photoelectron-number difference is less than the preset difference.
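  • The matching test described in the bullets above can be sketched as follows (the threshold value and the tuple encoding of the two parameters are illustrative assumptions):

```python
def matches(target_params, third_params, preset_diff=0.1, require_both=True):
    """Decide whether a third pixel group matches the target pixel group.

    Each argument is a (luminous_flux, photoelectron_count) pair. With
    require_both=True both differences must fall under the preset
    difference; with require_both=False either one suffices, mirroring
    the two variants described in the text.
    """
    flux_ok = abs(target_params[0] - third_params[0]) < preset_diff
    count_ok = abs(target_params[1] - third_params[1]) < preset_diff
    return (flux_ok and count_ok) if require_both else (flux_ok or count_ok)
```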
  • The luminous flux value can be obtained through calculation, while the number of photoelectrons is the number of pixels in the lit state in the pixel group, which can be collected and counted directly by the sensor.
  • Figure 2 shows the luminous flux change curve provided by the embodiment of the present application.
  • Suppose the electronic device collected 1000 frames of the third image, and the luminous flux value of the target pixel group exhibited two obvious changes over those 1000 frames: the first change had a magnitude of 0.6 and the second a magnitude of 1.
  • The time corresponding to the second, larger change is determined as the sudden-change time of the luminous flux value, that is, the first time.
  • In the embodiments of the present application, when the electronic device captures a subject in motion, it collects multiple frames of first images that include the subject, divides the first target image among the N frames of first images into a plurality of first pixel groups, and obtains a target pixel group among those first pixel groups.
  • Based on the optical parameters of the target pixel group, a second pixel group in the second target image that can be fused with the target pixel group is found. Because the second pixel group is located using the optical parameters of the target pixel group, the optical parameters of the second pixel group used for fusion are guaranteed to match those of the target pixel group.
  • The pixel group is used as the unit of image fusion, so that each target pixel group is fused with its matching second pixel group. This avoids blur in the fused image that would otherwise be caused by fusing some pixels with other pixels whose optical parameters do not match, and improves the clarity of the final composite image.
  • In addition, with the pixel group as the unit of image fusion, only the optical parameters of the entire pixel group need to be calculated; there is no need to match each pixel in the pixel group separately. Since the correlation between pixels within a target pixel group is strong, using the pixel group as the fusion unit reduces the computation performed by the electronic device while preserving the accuracy of the fusion processing. This ensures the clarity of the fused image while reducing the amount of data processing required for image fusion.
  • In some embodiments, obtaining the optical parameters of the target pixel group among the P first pixel groups includes: determining X target pixels in the target pixel group, where X is a positive integer and X ≤ Q; and determining the optical parameters of the target pixel group based on the optical parameters of the X target pixels.
  • The X target pixels are a subset of the pixels in the target pixel group; the number of target pixels ranges from 1 to Q. The selection rules for the X target pixels can be set in advance before the electronic device leaves the factory.
  • When X = 1, the optical parameters of that target pixel are used directly as the optical parameters of the target pixel group. For example, if the number of target pixels is 1 and the luminous flux value of the target pixel is 1.2, the luminous flux value of the target pixel group where that pixel is located is determined to be 1.2.
  • When X > 1, the optical parameters of the multiple target pixels are combined to determine the optical parameters of the target pixel group where they are located. For example, when the number of target pixels is 2, with luminous flux values of 0.4 and 0.6, the luminous flux value of the target pixel group is determined to be 0.5 by averaging. Other weighted calculation methods may also be applied to the optical parameters of the multiple target pixels to determine the optical parameters of the target pixel group.
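  • The single-pixel and averaged cases above can be sketched in one helper (the optional weighting scheme is an assumption; the text says only that "other weighted calculation methods" may be used):

```python
def group_optical_parameter(target_pixel_values, weights=None):
    """Derive a pixel group's optical parameter from its X target pixels.

    One target pixel: use its value directly (e.g. flux 1.2 -> group 1.2).
    Several target pixels: plain mean by default (0.4 and 0.6 -> 0.5),
    or a weighted mean when weights are supplied.
    """
    if len(target_pixel_values) == 1:
        return target_pixel_values[0]
    if weights is None:
        return sum(target_pixel_values) / len(target_pixel_values)
    total = sum(w * v for w, v in zip(weights, target_pixel_values))
    return total / sum(weights)
```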
  • Alternatively, the optical parameter may be the number of photoelectrons.
  • Figure 3 shows the first schematic diagram of the pixel arrangement of the target pixel group provided by the embodiment of the present application. As shown in Figure 3, the target pixel group 300 is a 2 × 2 pixel group. A single target pixel 302 in the upper-left corner of the 2 × 2 pixel group is selected as the basis for calculating the luminous flux value of the target pixel group 300, so there is no need to calculate the luminous flux values of the other pixels, reducing the calculation amount by three quarters.
  • Figure 4 shows the second schematic diagram of the pixel arrangement of the target pixel group provided by the embodiment of the present application. As shown in Figure 4, the target pixel group 400 is a 2 × 2 pixel group. The two target pixels 402 on the diagonal of the 2 × 2 pixel group are selected as the basis for calculating the luminous flux value of the target pixel group 400: the average of the luminous flux values of the two target pixels 402 is calculated and used as the luminous flux value of the target pixel group 400, thereby reducing the amount of calculation.
  • Figure 5 shows the third schematic diagram of the pixel arrangement of the target pixel group provided by the embodiment of the present application. As shown in Figure 5, the target pixel group 500 is a 2 × 2 pixel group. The two target pixels 502 in the first row of the 2 × 2 pixel group are selected as the basis for calculating the luminous flux value of the target pixel group 500: the average of the luminous flux values of the two target pixels 502 is calculated and used as the luminous flux value of the target pixel group 500, thereby reducing the amount of calculation.
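  • The three sampling patterns of Figures 3 to 5 can be written as coordinate lists over a 2 × 2 group; the dictionary encoding and example flux values are illustrative assumptions:

```python
# (row, col) coordinates of the target pixels within a 2 x 2 pixel group.
PATTERNS = {
    "corner": [(0, 0)],             # Figure 3: single upper-left pixel
    "diagonal": [(0, 0), (1, 1)],   # Figure 4: two pixels on the diagonal
    "first_row": [(0, 0), (0, 1)],  # Figure 5: two pixels of the first row
}

def sample_group(group, pattern):
    """Average the optical parameter over the selected target pixels."""
    coords = PATTERNS[pattern]
    return sum(group[r][c] for r, c in coords) / len(coords)

# Example luminous flux values for a 2 x 2 target pixel group.
group = [[0.4, 0.8],
         [0.2, 0.6]]
```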
  • In this way, the optical parameters of the target pixel group are determined from the target pixels alone, eliminating the need to calculate the optical parameters of all pixels in the target pixel group and reducing the amount of calculation required to determine the optical parameters of the target pixel group.
  • In some embodiments, determining the X target pixels in the target pixel group includes: determining the X target pixels according to the image content corresponding to the target pixel group in the first target image.
  • In the process of selecting the X target pixels in the first pixel group, both the number of target pixels and the positions of the selected pixels are determined according to the image content. When the image content corresponding to the first pixel group is relatively rich, a larger number of target pixels is selected in order to ensure the accuracy of the optical parameters determined for that group, and the target pixels are selected at the positions corresponding to the image content within the group.
  • Specifically, the electronic device determines the image content corresponding to the target pixel group through an image recognition algorithm, and selects different target pixels for the target pixel group based on different image content. In one approach, the image content corresponding to the target pixel group is obtained, and the number of target pixels in the target pixel group is determined according to the amount of that image content.
  • The image content corresponding to the target pixel group is the target image feature within the image area of the target pixel group in the first target image; by performing image recognition on that image area, the target image feature can be determined. The amount of image content is the proportion of the identified target image feature in the image area. For example, if the image area has N pixels in total and the identified target image feature occupies n of them, the amount of image content is n/N.
  • When the amount of image content corresponding to the target pixel group is large, a larger number of target pixels is selected from the target pixel group; when it is small, a smaller number of target pixels is selected.
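  • A minimal sketch of this content-based choice (the threshold and the two candidate counts are assumptions; the text fixes only the n/N measure):

```python
def target_pixel_count(feature_pixels, total_pixels, q, rich_threshold=0.5):
    """Choose the number of target pixels X from the image-content measure.

    The measure is n/N: the proportion of recognized target-image-feature
    pixels (n) among the N pixels of the group's image area. Rich content
    selects more target pixels (up to Q); sparse content selects fewer.
    """
    proportion = feature_pixels / total_pixels
    return q if proportion >= rich_threshold else max(1, q // 4)
```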
  • Alternatively, the image content corresponding to the target pixel group is obtained, and the target pixels in the target pixel group are selected according to the position information of that image content. The position information of the image content is the position of the identified target image feature in the image area; for example, the image area is an M × N pixel matrix and the target image feature occupies an m × n pixel matrix within it.
  • In this way, the electronic device adaptively selects the target pixels in the first pixel group according to the image content, reducing the calculation amount of determining the optical parameters while improving the accuracy of the optical parameters determined for the first pixel group.
  • In some embodiments, dividing the first target image into P first pixel groups includes: obtaining a first resolution of the first target image; determining, according to the first resolution, a preset number of pixels in each first pixel group; and dividing the first target image into P first pixel groups according to the preset number of pixels.
  • Before dividing the first target image into P first pixel groups, it is necessary to determine the first resolution of the first target image, determine the preset number of pixels in each first pixel group according to the first resolution, and divide the first target image into P first pixel groups according to that preset number of pixels. The first pixel groups obtained in this way all contain the same number of pixels.
  • In this way, the electronic device can automatically determine the number of pixels in the first pixel group according to the resolution of the first target image and divide the first target image accordingly, without manual operation by the user; each first pixel group obtained by the division contains an equal number of pixels. This further reduces the calculation required by the electronic device and simplifies the fusion processing of the first target image.
  • Specifically, determining the preset number of pixels in each first pixel group according to the first resolution includes: when the first resolution is greater than a preset resolution, determining the preset number of pixels to be a first value; and when the first resolution is less than or equal to the preset resolution, determining the preset number of pixels to be a second value, where the first value is greater than the second value.
  • That is, when the resolution of the first target image is higher, the number of pixels in each first pixel group is set higher; when the resolution is lower, the number of pixels is set lower.
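  • A sketch of this resolution test (the preset resolution and the two values are illustrative; the text requires only that the first value exceed the second):

```python
def preset_pixel_count(first_resolution, preset_resolution=1920 * 1080,
                       first_value=9, second_value=4):
    """Pick the per-group pixel count from the first target image's resolution.

    Above the preset resolution the larger first value is used (e.g. a
    3 x 3 group as in Figure 6); otherwise the smaller second value
    (e.g. a 2 x 2 group).
    """
    if first_resolution > preset_resolution:
        return first_value
    return second_value
```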
  • Figure 6 shows the first schematic diagram of the first pixel group in the first target image provided by the embodiment of the present application. As shown in Figure 6, when the resolution of the first target image 600 is relatively high, the preset number of pixels in the first pixel group 602 in the first target image 600 can be set to 9, arranged as 3 × 3.
  • In this way, the electronic device can select a different preset number of pixels for the first pixel group based on the comparison between the preset resolution and the first resolution of the first target image: when the first resolution is higher, the number of pixels in the first pixel group is larger; when it is lower, the number of pixels is smaller. This enables the electronic device to set the number of pixels in the first pixel group automatically and divide the first target image according to the set number of pixels.
  • In some embodiments, dividing the first target image into P first pixel groups includes: identifying the image content of the first target image, and dividing the first target image into P first pixel groups according to the image content, where the P first pixel groups include pixel groups with different numbers of pixels.
  • In these embodiments, the electronic device adaptively divides the first target image according to the recognized image content: the first pixel groups in areas of the first target image with more image content contain fewer pixels, while those in areas with less image content contain more pixels.
  • The image content is the target image feature in the first target image; by performing image recognition on the first target image, the target image feature can be located. The area where the target image feature is located is the area with more image content and can be used as the subject area of the first target image, while the area outside it has less image content and can be used as the background area of the first target image.
  • Figure 7 shows the second schematic diagram of the first pixel group in the first target image provided by the embodiment of the present application. As shown in Figure 7, the first area 702 of the first target image 700 is the background area. The electronic device sets the first pixel groups 708 located in the first area 702 as 5 × 5 pixel groups, and sets the first pixel groups located in the second area 704 as 2 × 2 pixel groups.
  • In this way, the electronic device can identify, through image recognition, the subject area with more image content and the background area with less image content in the first target image. It sets a smaller number of pixels for the first pixel groups in the subject area, ensuring that image content is not lost after fusion, and a larger number of pixels for the first pixel groups in the background area, further reducing the calculation amount of the electronic device.
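  • The Figure 7 example reduces to a simple region rule; the boolean interface below is an assumed simplification of the image-recognition step:

```python
def group_size_for_region(is_subject_region):
    """Assign a pixel-group size to a region of the first target image.

    Subject regions (more image content) get small 2 x 2 groups so that
    fusion does not lose detail; background regions (less content) get
    large 5 x 5 groups to cut computation, as in Figure 7.
    """
    return (2, 2) if is_subject_region else (5, 5)
```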
  • In some embodiments, the optical parameters include any of the following: luminous flux value, number of photoelectrons. That is, either the luminous flux value or the number of photoelectrons may be selected as the optical parameter, and the moment of its sudden change is regarded as the first moment when the optical parameters change.
  • The number of photoelectrons is the number of pixels in the lit state in the first pixel group; it can be collected directly by the sensor, while the luminous flux value is obtained by calculation.
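  • Although the sensor reports the photoelectron number directly, counting "lit" pixels can be emulated for illustration (the zero threshold on raw values is an assumption):

```python
def photoelectron_count(pixel_group, lit_threshold=0):
    """Count the pixels in the lit state within a pixel group.

    The patent's sensor collects this count directly; this sketch
    emulates it by thresholding raw pixel values.
    """
    return sum(1 for row in pixel_group
               for value in row if value > lit_threshold)
```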
• In the calculation of the luminous flux value, N_pf is the number of first images, T_pf is the acquisition time of a single exposure, t_i is the moment at which a pixel receives a photon, and q is the photon detection efficiency of the sensor.
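A hedged sketch of the two optical parameters follows. The photoelectron count is, per the text, the number of pixels in the lit state in a pixel group; the flux estimator below (count normalised by frame count, exposure time, and detection efficiency q) is an assumed form consistent with the listed variables N_pf, T_pf, and q, since the exact formula is not reproduced in this passage:

```python
# Assumed forms of the two optical parameters; the flux formula is a sketch,
# not the patent's exact equation.
import numpy as np

def photoelectron_count(group: np.ndarray) -> int:
    """Number of lit (non-zero) pixels in the pixel group."""
    return int(np.count_nonzero(group))

def luminous_flux(n_electrons: int, n_frames: int, t_exposure: float, q: float) -> float:
    """Assumed estimate: detected photons per unit time, corrected for q."""
    return n_electrons / (n_frames * t_exposure * q)

group = np.array([[1, 0], [1, 1]])  # binary 2x2 pixel group
ne = photoelectron_count(group)     # 3 lit pixels
print(ne, luminous_flux(ne, n_frames=1, t_exposure=0.01, q=0.5))
```

As the text notes, the photoelectron count can be read directly from the sensor, while the flux value requires this extra calculation.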
• Setting the optical parameters to either the luminous flux value or the number of photoelectrons ensures that the first moment at which the optical parameters change abruptly can be judged accurately, while simplifying the steps of obtaining the optical parameters.
  • optical parameters are not limited to the above-mentioned luminous flux value and the number of photoelectrons, and can also be other optical parameters, which are not specifically limited here.
• Determining at least one frame of the second target image among the N frames of first images according to the optical parameters of the target pixel group includes: acquiring the optical parameter change amount of the third pixel group in third target images adjacent in acquisition time, where the third target images are the images in the N frames of first images other than the first target image, and the position of the third pixel group in the third target image corresponds to the position of the first pixel group in the first target image; determining the acquisition time at which the change amount of the optical parameter is greater than a preset change amount as the target time; and determining at least one frame of the second target image in the N frames of first images according to the target time, where the acquisition time of the second target image is between the acquisition time of the first target image and the target time.
• The acquisition time of each frame of the first image is recorded, and all images in the N frames of first images except the first target image are used as third target images.
• The third target images are sorted by acquisition time, and the third pixel group in each frame of the third target image is determined according to the position of the target pixel group in the first target image. A difference calculation is performed on the optical parameters of the third pixel groups in each pair of adjacent third target images to obtain the optical parameter change amount corresponding to that acquisition time. If the change amount is determined to be greater than the preset change amount, that acquisition time is used as the target time.
  • the target moment is the moment when the optical parameters of the third pixel group suddenly change during the process of collecting the first image of multiple frames.
• When the acquisition time of the first target image is before the target time, the third target images whose acquisition times are after the acquisition time of the first target image and before the target time are determined as the second target images.
• The first image with the earliest acquisition time can be used as the first target image, and the first images collected subsequently can be used as third target images. The target time can therefore be determined while the N frames of first images are still being collected: every time a frame of the third target image is collected, the optical parameter change amount is calculated between its third pixel group and the third pixel group of the third target image at the previous acquisition time, and compared with the preset change amount. The target time can thus be determined during shooting, and the second target image is determined based on it, which further improves the efficiency of determining the second pixel group to be fused with the target pixel group and thereby the overall efficiency of image fusion.
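The target-time detection described above can be sketched as follows. The frame values, the threshold, and the simple absolute-difference test are illustrative assumptions:

```python
# Detect the "target time": the first acquisition time at which the change in
# the third pixel group's optical parameter between adjacent frames exceeds a
# preset change amount.
def find_target_time(params, times, preset_change):
    """params[i] is the optical parameter of the third pixel group of the
    frame acquired at times[i]; frames are sorted by acquisition time."""
    for i in range(1, len(params)):
        if abs(params[i] - params[i - 1]) > preset_change:
            return times[i]  # acquisition time of the abrupt change
    return None

times = [0, 1, 2, 3, 4]
params = [10.0, 10.2, 10.1, 25.0, 25.3]  # sudden jump between t=2 and t=3
target = find_target_time(params, times, preset_change=5.0)
print(target)  # 3

# Second target images: frames acquired after the first target image (here the
# earliest frame, t=0) and before the target time.
second = [t for t in times[1:] if t < target]
print(second)  # [1, 2]
```

Because each new frame only needs to be compared against the previous one, the check can run during capture, as the passage notes.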
• When the acquisition time of the first target image is after the target time, the third target images whose acquisition times are after the target time and before the acquisition time of the first target image are determined as the second target images.
  • the target time is the time when the optical parameters of the third pixel group suddenly change during the acquisition of N frames of the first image.
• When the acquisition time of the first target image is before the target time, in order to ensure that the target pixel group in the first target image can be fused with a second pixel group whose optical parameters match, the images collected before the optical parameters change are determined as second target images.
• When the acquisition time of the first target image is after the target time, in order to ensure the same optical parameter match, the images collected after the optical parameters change are determined as second target images.
• By calculating the difference in the optical parameters of the third pixel groups at adjacent acquisition times, the optical parameter change amount corresponding to each acquisition time can be determined.
• If the change amount of the optical parameter exceeds the preset change amount, it is determined that a sudden change occurs at that acquisition time. The third pixel groups of the third target images collected between the target time and the acquisition time of the first target image all match the target pixel group, so all the first images between the target time and the acquisition time of the first target image are used as second target images. This ensures that, in the image obtained by fusing the second pixel groups of the second target images with the target pixel group, the image area of the target pixel group will not be blurred.
• The method includes: generating a fourth target image based on a plurality of third pixel groups, where a third pixel group is a pixel group obtained by fusing the target pixel group and the second pixel group, and the number of third pixel groups is the same as the number of first pixel groups.
  • each first pixel group is determined as a target pixel group, and then a second target image corresponding to each target pixel group is obtained.
  • the corresponding target pixel group is fused with the corresponding second pixel group in the second target image to obtain a third pixel group.
• The electronic device uses the pixel group as the unit of fusion processing, ensuring that each part of the first target image is fused in a targeted manner and avoiding partially unclear areas in the fourth target image obtained after fusion. There is also no need to process each pixel individually, which reduces the calculation amount of the electronic device and improves rendering efficiency.
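The per-group fusion and reassembly into the fourth target image can be sketched as follows. Simple averaging is an assumed fusion rule, and a uniform group size is used for brevity; the patent itself leaves the fusion operator unspecified and allows mixed group sizes:

```python
# Sketch: fuse each target pixel group of the first target image with the
# co-located second pixel groups of the second target images (here by
# averaging, an assumed rule), then reassemble the resulting third pixel
# groups into the fourth target image.
import numpy as np

def fuse_groups(first_img, second_imgs, gs):
    """first_img: HxW array; second_imgs: list of HxW arrays; gs: group size."""
    fourth = np.empty_like(first_img, dtype=float)
    h, w = first_img.shape
    for y in range(0, h, gs):
        for x in range(0, w, gs):
            stack = [first_img[y:y+gs, x:x+gs]] + \
                    [img[y:y+gs, x:x+gs] for img in second_imgs]
            fourth[y:y+gs, x:x+gs] = np.mean(stack, axis=0)  # third pixel group
    return fourth

a = np.full((4, 4), 2.0)          # first target image
b = np.full((4, 4), 4.0)          # one second target image
result = fuse_groups(a, [b], gs=2)
print(result[0, 0])  # 3.0
```

Because whole groups are fused at once, the number of fusion decisions is the number of groups, not the number of pixels.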
  • the execution subject may be an image fusion device.
• An image fusion device performing the image fusion method is taken as an example to illustrate the image fusion device provided by the embodiments of the present application.
  • an image fusion device is provided.
  • Figure 8 shows a schematic block diagram of the image fusion device 800 provided by the embodiment of the present application. As shown in Figure 8, the image fusion device 800 includes:
  • the collection module 802 is used to collect N frames of first images.
  • Each frame of the first image includes M pixels, and N and M are both positive integers;
  • the grouping module 804 is used to divide the first target image into P first pixel groups, where the first target image is any image in the N frames of first images, and P is a positive integer;
  • the acquisition module 806 is used to acquire the optical parameters of the target pixel group in the P first pixel groups, and the target pixel group is any pixel group in the P first pixel groups;
  • Determining module 808, configured to determine at least one frame of the second target image among the N frames of first images according to the optical parameters of the target pixel group, and the second target image includes a second pixel group that matches the optical parameters of the target pixel group.
  • the second target image is an image in the N-frame first image, and the target pixel group includes Q pixels;
  • the fusion module 810 is configured to perform fusion processing on the target pixel group and the second pixel group.
  • the position of the target pixel group in the first target image corresponds to the position of the second pixel group in the second target image.
• When the electronic device captures a subject in motion, multiple frames of the first image including the subject are collected, the first target image in the N frames of first images is divided into a plurality of first pixel groups, and a target pixel group among the plurality of first pixel groups is obtained.
• According to the optical parameters of the target pixel group, a second pixel group in the second target image that can be fused with the target pixel group can be found. Since the second pixel group is found based on the optical parameters of the target pixel group, it can be ensured that the optical parameters of the second pixel group used for fusion match those of the target pixel group.
• The pixel group is used as the unit of image fusion, so that each target pixel group is fused with its matching second pixel group. This avoids blur in the fused image caused by fusing pixels whose optical parameters do not match, improving the clarity of the final composite image.
  • the pixel group is used as the unit of image fusion, and only the optical parameters of the entire pixel group are calculated, without the need to fuse each pixel in the pixel group separately. Since the correlation between pixels in the target pixel group is strong, using the pixel group as the unit of image fusion can reduce the calculation amount of electronic equipment while ensuring the accuracy of image fusion processing. This method ensures the clarity of the fused image while also reducing the amount of data processing required for image fusion.
  • the determination module 808 is also used to determine X target pixels in the target pixel group, where X is a positive integer and X ⁇ Q;
• The determination module 808 is also used to determine the optical parameters of the target pixel group according to the optical parameters of the X target pixels.
• The optical parameters of the target pixel group are determined from the target pixels, so there is no need to calculate the optical parameters of all pixels in the target pixel group, which reduces the amount of calculation required to determine the optical parameters corresponding to the target pixel group.
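A sketch of estimating a group's optical parameter from only X of its Q pixels follows. Using the four corner pixels as the X target pixels is an illustrative assumption; the patent selects the target pixels based on the image content of the group:

```python
# Estimate a pixel group's optical parameter from X=4 sample pixels (the
# corners, an assumed choice) instead of all Q pixels.
import numpy as np

def group_parameter_from_targets(group: np.ndarray) -> float:
    """Average the X=4 corner pixels as the group's optical parameter."""
    targets = [group[0, 0], group[0, -1], group[-1, 0], group[-1, -1]]
    return float(np.mean(targets))

g = np.array([[2.0, 1.0, 4.0],
              [1.0, 1.0, 1.0],
              [6.0, 1.0, 8.0]])
print(group_parameter_from_targets(g))  # 5.0 (mean of 2, 4, 6, 8)
```

Only X values are read per group, so the cost of determining each group's optical parameter drops from Q operations to X.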
  • the determination module 808 is also used to determine X target pixels in the target pixel group according to the image content corresponding to the target pixel group in the first target image.
• The electronic device adaptively selects the target pixels in the first pixel group according to the image content, thereby reducing the calculation amount of determining the optical parameters and improving the accuracy of the determined optical parameters of the first pixel group.
  • the acquisition module 806 is also used to acquire the first resolution of the first target image
  • the determination module 808 is also used to determine the preset number of pixels in each first pixel group according to the first resolution
  • the grouping module 804 is also used to divide the first target image into P first pixel groups according to the preset number of pixels.
• The electronic device can automatically determine the number of pixels in each first pixel group according to the resolution of the first target image and divide the first target image accordingly, without manual operation by the user. The division can thus be completed with an equal number of pixels in each resulting first pixel group, further reducing the calculation amount required by the electronic device and simplifying the fusion processing of the first target image.
  • the determination module 808 is also used to determine the preset number of pixels to be a first value when the first resolution is greater than the preset resolution;
  • the determination module 808 is also configured to determine the preset number of pixels to be a second value when the first resolution is less than or equal to the preset resolution;
  • the first numerical value is greater than the second numerical value.
• The electronic device can select a different preset number of pixels for the first pixel groups based on the comparison between the preset resolution and the first resolution of the first target image: when the first resolution is higher, the number of pixels in each first pixel group can be larger, and when it is lower, the number can be smaller. This enables the electronic device to automatically set the number of pixels in the first pixel groups and divide the first target image accordingly.
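The resolution rule above can be sketched as follows. The threshold and the two values (25 for a 5×5 group, 4 for a 2×2 group) are illustrative assumptions:

```python
# Choose the preset number of pixels per first pixel group from the first
# target image's resolution: a first (larger) value above the preset
# resolution, a second (smaller) value at or below it.
def preset_pixel_count(resolution, preset_resolution=1920 * 1080,
                       first_value=25, second_value=4):
    """first_value (e.g. 5x5) if resolution > threshold, else second_value (2x2)."""
    return first_value if resolution > preset_resolution else second_value

print(preset_pixel_count(3840 * 2160))  # 25 (4K frame: larger groups)
print(preset_pixel_count(1280 * 720))   # 4  (720p frame: smaller groups)
```

Larger groups on high-resolution frames keep the number of groups, and hence the fusion workload, roughly bounded.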
  • the image fusion device 800 further includes:
  • a recognition module used to recognize the image content of the first target image
  • the grouping module 804 is also used to divide the first target image into P first pixel groups according to the image content;
  • the P first pixel groups include pixel groups with different numbers of pixels.
• Through image recognition, the electronic device can identify the main body area with more image content and the background area with less image content in the first target image. It sets a relatively small number of pixels for the first pixel groups in the main body area, ensuring that those groups do not lose image content after fusion, and sets a larger number of pixels for the first pixel groups in the background area to further reduce the calculation amount of the electronic device.
  • the optical parameters include any of the following: luminous flux value, number of photoelectrons.
• Setting the optical parameters to either the luminous flux value or the number of photoelectrons ensures that the first moment at which the optical parameters change abruptly can be judged accurately, while simplifying the steps of obtaining the optical parameters.
• The acquisition module 806 is used to acquire the optical parameter change amount of the third pixel group in third target images adjacent in acquisition time, where the third target images are the images in the N frames of first images other than the first target image, and the position of the third pixel group in the third target image corresponds to the position of the first pixel group in the first target image;
• The determination module 808 is used to determine the acquisition time at which the optical parameter change amount is greater than the preset change amount as the target time, and to determine at least one frame of the second target image among the N frames of first images according to the target time, where the acquisition time of the second target image is between the acquisition time of the first target image and the target time.
• By calculating the difference in the optical parameters of the third pixel groups at adjacent acquisition times, the optical parameter change amount corresponding to each acquisition time can be determined.
• If the change amount of the optical parameter exceeds the preset change amount, it is determined that a sudden change occurs at that acquisition time. The third pixel groups of the third target images collected between the target time and the acquisition time of the first target image all match the target pixel group, so all the first images between the target time and the acquisition time of the first target image are used as second target images. This ensures that, in the image obtained by fusing the second pixel groups of the second target images with the target pixel group, the image area of the target pixel group will not be blurred.
  • the image fusion device 800 further includes:
• a generation module configured to generate a fourth target image based on a plurality of third pixel groups, where a third pixel group is a pixel group obtained by fusing the target pixel group and the second pixel group, and the number of third pixel groups is the same as the number of first pixel groups.
• The electronic device uses the pixel group as the unit of fusion processing, ensuring that each part of the first target image is fused in a targeted manner and avoiding partially unclear areas in the fourth target image obtained after fusion. There is also no need to process each pixel individually, which reduces the calculation amount of the electronic device and improves rendering efficiency.
  • the image fusion device in the embodiment of the present application may be an electronic device or a component in the electronic device, such as an integrated circuit or chip.
  • the electronic device may be a terminal or other devices other than the terminal.
• The electronic device may be a mobile phone, a tablet computer, a notebook computer, a handheld computer, a vehicle-mounted electronic device, a mobile Internet device (MID), or an augmented reality (AR)/virtual reality (VR) device, etc.
  • the image fusion device in the embodiment of the present application may be a device with an operating system.
  • the operating system can be an Android operating system, an iOS operating system, or other possible operating systems, which are not specifically limited in the embodiments of this application.
  • the image fusion device provided by the embodiments of the present application can implement various processes implemented by the above method embodiments. To avoid duplication, they will not be described again here.
  • the embodiment of the present application also provides an electronic device, which includes the image fusion device as in any of the above embodiments, and thus has all the beneficial effects of the image fusion device in any embodiment, which will not be discussed here. Too much elaboration.
  • the embodiment of the present application also provides an electronic device.
  • Figure 9 shows a structural block diagram of the electronic device according to the embodiment of the present application.
• The electronic device 900 includes a processor 902, a memory 904, and a program or instruction stored in the memory 904 and runnable on the processor 902. When the program or instruction is executed by the processor 902, each process of the above image fusion method embodiments is implemented and the same technical effect can be achieved; to avoid duplication, details are not repeated here.
  • the electronic devices in the embodiments of the present application include the above-mentioned mobile electronic devices and non-mobile electronic devices.
  • FIG. 10 is a schematic diagram of the hardware structure of an electronic device implementing an embodiment of the present application.
• The electronic device 1000 includes but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, and other components.
• The electronic device 1000 may also include a power supply (such as a battery) that supplies power to the various components. The power supply may be logically connected to the processor 1010 through a power management system, thereby implementing functions such as charging, discharging, and power consumption management through the power management system.
• The structure of the electronic device shown in Figure 10 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or arrange components differently, which will not be described again here.
  • the processor 1010 is used to collect N frames of first images, each frame of the first image includes M pixels, and N and M are both positive integers;
  • the processor 1010 is configured to divide the first target image into P first pixel groups, where the first target image is any image in the N frames of first images, and P is a positive integer;
  • Processor 1010 configured to obtain optical parameters of a target pixel group in the P first pixel groups, where the target pixel group is any pixel group in the P first pixel groups;
  • Processor 1010 configured to determine at least one frame of the second target image among the N frames of first images according to the optical parameters of the target pixel group, and the second target image includes a second pixel group that matches the optical parameters of the target pixel group.
  • the second target image is an image in the N-frame first image, and the target pixel group includes Q pixels;
  • the processor 1010 is configured to perform fusion processing on a target pixel group and a second pixel group, where the position of the target pixel group in the first target image corresponds to the position of the second pixel group in the second target image.
• When the electronic device captures a subject in motion, multiple frames of the first image including the subject are collected, the first target image in the N frames of first images is divided into a plurality of first pixel groups, and a target pixel group among the plurality of first pixel groups is obtained.
• According to the optical parameters of the target pixel group, a second pixel group in the second target image is found for fusion processing. Since the second pixel group is found according to the optical parameters of the target pixel group, it can be ensured that the optical parameters of the second pixel group and the target pixel group match.
• The pixel group is used as the unit of image fusion, so that each target pixel group is fused with its matching second pixel group. This avoids blur in the fused image caused by fusing pixels whose optical parameters do not match, reducing motion blur in the image and improving the clarity when shooting moving subjects.
  • the pixel group is used as the unit of image fusion, and only the optical parameters of the entire pixel group are calculated, without the need to fuse each pixel in the pixel group separately. Since the correlation between pixels in the target pixel group is strong, using the pixel group as the unit of image fusion can reduce the calculation amount of electronic equipment while ensuring the accuracy of image fusion processing. This method ensures the clarity of the fused image while also reducing the amount of data processing required for image fusion.
  • the processor 1010 is used to determine X target pixels in the target pixel group, X is a positive integer, and X ⁇ Q;
• The processor 1010 is used to determine the optical parameters of the target pixel group according to the optical parameters of the X target pixels.
• The optical parameters of the target pixel group are determined from the target pixels, so there is no need to calculate the optical parameters of all pixels in the target pixel group, which reduces the amount of calculation required to determine the optical parameters corresponding to the target pixel group.
  • the processor 1010 is configured to determine X target pixels in the target pixel group according to the image content corresponding to the target pixel group in the first target image.
• The electronic device adaptively selects the target pixels in the first pixel group according to the image content, thereby reducing the calculation amount of determining the optical parameters and improving the accuracy of the determined optical parameters of the first pixel group.
  • processor 1010 is used to obtain the first resolution of the first target image
  • Processor 1010 configured to determine the preset number of pixels in each first pixel group according to the first resolution
  • the processor 1010 is configured to divide the first target image into P first pixel groups according to a preset number of pixels.
• The electronic device can automatically determine the number of pixels in each first pixel group according to the resolution of the first target image and divide the first target image accordingly, without manual operation by the user. The division can thus be completed with an equal number of pixels in each resulting first pixel group, further reducing the calculation amount required by the electronic device and simplifying the fusion processing of the first target image.
• The processor 1010 is configured to determine the preset number of pixels to be a first value when the first resolution is greater than the preset resolution, and to determine the preset number of pixels to be a second value when the first resolution is less than or equal to the preset resolution, where the first value is greater than the second value.
• The electronic device can select a different preset number of pixels for the first pixel groups based on the comparison between the preset resolution and the first resolution of the first target image: when the first resolution is higher, the number of pixels in each first pixel group can be larger, and when it is lower, the number can be smaller. This enables the electronic device to automatically set the number of pixels in the first pixel groups and divide the first target image accordingly.
  • processor 1010 is configured to identify the image content of the first target image
  • the processor 1010 is configured to divide the first target image into P first pixel groups according to the image content
  • the P first pixel groups include pixel groups with different numbers of pixels.
• Through image recognition, the electronic device can identify the main body area with more image content and the background area with less image content in the first target image. It sets a relatively small number of pixels for the first pixel groups in the main body area, ensuring that those groups do not lose image content after fusion, and sets a larger number of pixels for the first pixel groups in the background area to further reduce the calculation amount of the electronic device.
  • optical parameters include any of the following: luminous flux value, number of photoelectrons.
• Setting the optical parameters to either the luminous flux value or the number of photoelectrons ensures that the first moment at which the optical parameters change abruptly can be judged accurately, while simplifying the steps of obtaining the optical parameters.
• The processor 1010 is used to obtain the optical parameter change amount of the third pixel group in third target images adjacent in acquisition time, where the third target images are the images in the N frames of first images other than the first target image, and the position of the third pixel group in the third target image corresponds to the position of the first pixel group in the first target image;
  • the processor 1010 is configured to determine the acquisition time when the optical parameter change amount is greater than the preset change amount as the target time;
  • the processor 1010 is configured to determine at least one frame of the second target image among the N frames of first images according to the target time, and the collection time of the second target image is between the collection time of the first target image and the target time.
• By calculating the difference in the optical parameters of the third pixel groups at adjacent acquisition times, the optical parameter change amount corresponding to each acquisition time can be determined.
• If the change amount of the optical parameter exceeds the preset change amount, it is determined that a sudden change occurs at that acquisition time. The third pixel groups of the third target images collected between the target time and the acquisition time of the first target image all match the target pixel group, so all the first images between the target time and the acquisition time of the first target image are used as second target images. This ensures that, in the image obtained by fusing the second pixel groups of the second target images with the target pixel group, the image area of the target pixel group will not be blurred.
• The processor 1010 is configured to generate a fourth target image according to a plurality of third pixel groups, where a third pixel group is a pixel group obtained by fusing the target pixel group and the second pixel group, and the number of third pixel groups is the same as the number of first pixel groups.
• The electronic device ensures that each part of the first target image is subjected to targeted fusion processing, avoiding partially unclear areas in the fourth target image obtained after fusion. There is also no need to process each pixel individually, which reduces the calculation amount of the electronic device and improves rendering efficiency.
  • the input unit 1004 may include a graphics processor (Graphics Processing Unit, GPU) 10041 and a microphone 10042.
• The graphics processor 10041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode.
  • the display unit 1006 may include a display panel 10061, which may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the user input unit 1007 includes at least one of a touch panel 10071 and other input devices 10072 .
• The touch panel 10071 is also known as a touch screen.
  • the touch panel 10071 may include two parts: a touch detection device and a touch controller.
  • Other input devices 10072 may include but are not limited to physical keyboards, function keys (such as volume control keys, switch keys, etc.), trackballs, mice, and joysticks, which will not be described again here.
  • The memory 1009 may be used to store software programs as well as various data.
  • The memory 1009 may mainly include a first storage area for storing programs or instructions and a second storage area for storing data, where the first storage area may store an operating system, and an application program or instructions required for at least one function (such as a sound playback function, an image playback function, etc.).
  • The memory 1009 may include volatile memory or non-volatile memory, or the memory 1009 may include both volatile and non-volatile memory.
  • The non-volatile memory may be a read-only memory (Read-Only Memory, ROM), a programmable read-only memory (Programmable ROM, PROM), an erasable programmable read-only memory (Erasable PROM, EPROM), an electrically erasable programmable read-only memory (Electrically EPROM, EEPROM), or a flash memory.
  • The volatile memory may be a random access memory (Random Access Memory, RAM), a static random access memory (Static RAM, SRAM), a dynamic random access memory (Dynamic RAM, DRAM), a synchronous dynamic random access memory (Synchronous DRAM, SDRAM), a double data rate synchronous dynamic random access memory (Double Data Rate SDRAM, DDR SDRAM), an enhanced synchronous dynamic random access memory (Enhanced SDRAM, ESDRAM), a synchlink dynamic random access memory (Synchlink DRAM, SLDRAM), or a direct rambus random access memory (Direct Rambus RAM, DRRAM).
  • The processor 1010 may include one or more processing units. Optionally, the processor 1010 integrates an application processor and a modem processor, where the application processor mainly handles operations involving the operating system, user interface, application programs, etc., and the modem processor mainly handles wireless communication signals, such as a baseband processor. It can be understood that the modem processor may alternatively not be integrated into the processor 1010.
  • Embodiments of this application further provide a readable storage medium.
  • A program or instructions are stored on the readable storage medium.
  • When the program or instructions are executed by a processor, each process of the above method embodiments is implemented, and the same technical effect can be achieved. To avoid repetition, details are not repeated here.
  • The readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
  • An embodiment of this application further provides a chip.
  • The chip includes a processor and a communication interface.
  • The communication interface is coupled to the processor, and the processor is configured to run programs or instructions to implement each process of the above image fusion method embodiments, achieving the same technical effect. To avoid repetition, details are not repeated here.
  • The chip mentioned in the embodiments of this application may also be called a system-on-chip, a chip system, or a system-on-a-chip, etc.
  • Embodiments of this application provide a computer program product.
  • The program product is stored in a storage medium.
  • The program product is executed by at least one processor to implement each process of the above image fusion method embodiments, achieving the same technical effect. To avoid repetition, details are not repeated here.
  • The modules, units, and subunits can be implemented in one or more Application Specific Integrated Circuits (ASIC), Digital Signal Processors (DSP), Digital Signal Processing Devices (DSPD), Programmable Logic Devices (PLD), Field-Programmable Gate Arrays (FPGA), general-purpose processors, controllers, microcontrollers, microprocessors, or other electronic units, or combinations thereof, for performing the functions described in this disclosure.
  • The techniques described in the embodiments of this disclosure can be implemented through modules (such as procedures, functions, etc.) that perform the functions described in the embodiments of this disclosure.
  • Software code may be stored in memory and executed by a processor.
  • The memory can be implemented within the processor or external to the processor.
  • The disclosed devices and methods can be implemented in other ways.
  • The device embodiments described above are only illustrative.
  • The division of the units is only a logical function division; in actual implementation, there may be other division methods.
  • Multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • The coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
  • The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • Each functional unit in the various embodiments of this disclosure may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • The computer software product is stored in a storage medium (such as a ROM/RAM, magnetic disk, or optical disc) and includes several instructions to cause a terminal (which may be a mobile phone, computer, server, network device, etc.) to execute the methods of the various embodiments of this application.

Abstract

This application belongs to the technical field of photography and discloses an image fusion method and apparatus, an electronic device, and a storage medium. The image fusion method comprises: collecting N frames of first images, each frame of first image comprising M pixels, N and M both being positive integers; dividing a first target image into P first pixel groups, the first target image being any image among the N frames of first images, and P being a positive integer; obtaining optical parameters of a target pixel group among the P first pixel groups; determining at least one frame of a second target image among the N frames of first images according to the optical parameters of the target pixel group, the second target image comprising a second pixel group whose optical parameters match the optical parameters of the target pixel group, the second target image being an image among the N frames of first images, the target pixel group comprising Q pixels, and Q being less than M; and fusing the target pixel group with the second pixel group.
PCT/CN2023/117064 2022-09-06 2023-09-05 Procédé et appareil de fusion d'images, dispositif électronique et support de stockage WO2024051697A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211084167.1 2022-09-06
CN202211084167.1A CN115439386A (zh) 2022-09-06 2022-09-06 图像融合方法、装置、电子设备和存储介质

Publications (1)

Publication Number Publication Date
WO2024051697A1 true WO2024051697A1 (fr) 2024-03-14

Family

ID=84247466

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/117064 WO2024051697A1 (fr) 2022-09-06 2023-09-05 Procédé et appareil de fusion d'images, dispositif électronique et support de stockage

Country Status (2)

Country Link
CN (1) CN115439386A (fr)
WO (1) WO2024051697A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115439386A (zh) * 2022-09-06 2022-12-06 维沃移动通信有限公司 图像融合方法、装置、电子设备和存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170337667A1 (en) * 2016-05-18 2017-11-23 Thomson Licensing Method and device for obtaining a hdr image by graph signal processing
CN110189285A (zh) * 2019-05-28 2019-08-30 北京迈格威科技有限公司 一种多帧图像融合方法及装置
CN111770243A (zh) * 2020-08-04 2020-10-13 深圳市精锋医疗科技有限公司 内窥镜的图像处理方法、装置、存储介质
CN114119423A (zh) * 2021-12-08 2022-03-01 上海肇观电子科技有限公司 图像处理方法、装置、电子设备和存储介质
CN114511487A (zh) * 2022-02-16 2022-05-17 展讯通信(上海)有限公司 图像融合方法及装置、计算机可读存储介质、终端
CN115439386A (zh) * 2022-09-06 2022-12-06 维沃移动通信有限公司 图像融合方法、装置、电子设备和存储介质


Also Published As

Publication number Publication date
CN115439386A (zh) 2022-12-06

Similar Documents

Publication Publication Date Title
US11563897B2 (en) Image processing method and apparatus which determines an image processing mode based on status information of the terminal device and photographing scene information
CN110505411B (zh) 图像拍摄方法、装置、存储介质及电子设备
CN108322646B (zh) 图像处理方法、装置、存储介质及电子设备
CN109671106B (zh) 一种图像处理方法、装置与设备
CN107172345B (zh) 一种图像处理方法及终端
CN107592453B (zh) 一种拍摄方法及移动终端
CN111028189A (zh) 图像处理方法、装置、存储介质及电子设备
US7450756B2 (en) Method and apparatus for incorporating iris color in red-eye correction
US20150373248A1 (en) Intelligent Auto-Exposure Bracketing
CN107172296A (zh) 一种图像拍摄方法及移动终端
WO2015081556A1 (fr) Procédé de photographie destiné à un dispositif à double appareil photo et dispositif à double appareil photo
WO2022166944A1 (fr) Procédé et appareil de photographie, dispositif électronique et support
CN101998048A (zh) 数字图像信号处理方法和数字图像信号处理设备
WO2024051697A1 (fr) Procédé et appareil de fusion d'images, dispositif électronique et support de stockage
TW201902204A (zh) 通過按需感測器啟動來實現一多感測器攝像頭設備中的功率降低
CN110198418A (zh) 图像处理方法、装置、存储介质及电子设备
CN112668636A (zh) 摄像头遮挡检测方法及系统、电子设备及存储介质
CN110717871A (zh) 图像处理方法、装置、存储介质及电子设备
CN112422798A (zh) 拍照方法、装置、电子设备和存储介质
CN110929615B (zh) 图像处理方法、图像处理装置、存储介质与终端设备
US10769416B2 (en) Image processing method, electronic device and storage medium
CN103139521B (zh) 图像采集与选取方法及图像采集装置
CN108259767B (zh) 图像处理方法、装置、存储介质及电子设备
CN107180417B (zh) 照片处理方法、装置、计算机可读存储介质及电子设备
CN111885371A (zh) 图像遮挡检测方法、装置、电子设备和计算机可读介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23862391

Country of ref document: EP

Kind code of ref document: A1