CN115829888A - Exposure compensation value determining method and image fusion method - Google Patents


Info

Publication number
CN115829888A
CN115829888A
Authority
CN
China
Prior art keywords
compensation value
exposure compensation
image
area
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111086445.2A
Other languages
Chinese (zh)
Inventor
王涛 (Wang Tao)
Current Assignee
Beijing Jigan Technology Co ltd
Original Assignee
Beijing Jigan Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Jigan Technology Co ltd filed Critical Beijing Jigan Technology Co ltd
Priority to CN202111086445.2A
Publication of CN115829888A

Landscapes

  • Studio Devices (AREA)

Abstract

The application provides an exposure compensation value determining method and an image fusion method, applied in the field of image processing. The exposure compensation value determining method comprises: acquiring a reference image; determining a target area in the reference image that satisfies a brightness condition; and determining an exposure compensation value of the image to be fused according to the attributes of the target area, wherein the image to be fused and the reference image correspond to the same shooting scene. In this scheme, the exposure compensation value of the image to be fused is determined dynamically from the attributes of the target area that satisfies the brightness condition in the reference image, so different images to be fused can be captured flexibly for different scenes, improving the sharpness of the final image.

Description

Exposure compensation value determining method and image fusion method
Technical Field
The present disclosure relates to the field of image processing, and in particular, to an exposure compensation value determining method and an image fusion method.
Background
With the development of image processing technology, users' expectations for the quality of captured photos keep rising. In the prior art, a High Dynamic Range (HDR) algorithm is used on devices such as mobile phones to effectively improve the quality of captured photos.
To realize HDR, multiple frames with different exposure compensation values must be captured and fused. Existing methods generally capture with fixed exposure compensation values, for example three frames at EV+1, EV0 and EV-1. In some scenes, however, capturing with fixed exposure compensation values and performing HDR on the acquired frames yields an image of low sharpness.
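As background for these EV brackets: each exposure-compensation step corresponds to a doubling or halving of exposure, a standard photographic relation rather than anything specific to this application. A minimal sketch (function name and 10 ms base time are illustrative):

```python
def bracket_exposure_times(base_time_s, evs):
    """Map each EV step to an exposure time: one EV step doubles or
    halves the exposure, t = base * 2**ev."""
    return {ev: base_time_s * 2.0 ** ev for ev in evs}

# The fixed bracket mentioned above: EV-1, EV0, EV+1 around a 10 ms base.
times = bracket_exposure_times(0.01, [-1, 0, 1])
print(times)  # {-1: 0.005, 0: 0.01, 1: 0.02}
```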
Disclosure of Invention
An object of the embodiments of the present application is to provide an exposure compensation value determining method and an image fusion method, so as to solve the technical problem that the resulting image has low sharpness.
In a first aspect, an embodiment of the present application provides an exposure compensation value determining method, including: acquiring a reference image; determining a target area in the reference image that satisfies a brightness condition; and determining an exposure compensation value of the image to be fused according to the attributes of the target area, wherein the image to be fused and the reference image correspond to the same shooting scene. In this scheme, the exposure compensation value of the image to be fused is determined dynamically from the attributes of the target area that satisfies the brightness condition in the reference image, so different images to be fused can be captured flexibly for different scenes, improving the sharpness of the final image.
In an optional embodiment, the target area comprises a dark area and/or a bright area; if the target area comprises a dark area, the brightness condition includes the area brightness being less than a first brightness threshold; if the target area comprises a bright area, the brightness condition includes the area brightness being greater than a second brightness threshold. In the above scheme, the target area satisfying the brightness condition may include a dark area with brightness below the first brightness threshold and a bright area with brightness above the second brightness threshold. The method can then decide, from the attributes of the low-brightness dark area and the high-brightness bright area, whether images to be fused with other exposure compensation values are needed, so different images to be fused can be captured flexibly for different scenes, improving the sharpness of the final image.
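The two brightness conditions above can be sketched as simple per-pixel thresholds on an 8-bit grayscale reference frame. The function name and the two threshold values are illustrative placeholders, not values from the patent:

```python
import numpy as np

def split_target_areas(reference, first_thresh=60, second_thresh=200):
    """Dark area: luminance below the first brightness threshold.
    Bright area: luminance above the second brightness threshold."""
    dark = reference < first_thresh
    bright = reference > second_thresh
    return dark, bright

# Toy reference frame: a dark patch, a bright patch, midtones elsewhere.
ref = np.full((8, 8), 128, dtype=np.uint8)
ref[:4, :4] = 20    # dark patch
ref[4:, 4:] = 240   # bright patch
dark_mask, bright_mask = split_target_areas(ref)
print(int(dark_mask.sum()), int(bright_mask.sum()))  # 16 16
```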
In an alternative embodiment, the attribute includes at least one of brightness, area, and amount of edge information.
In an optional embodiment, the determining an exposure compensation value of the image to be fused according to the attributes of the target area includes: if the target area comprises the dark area, determining a maximum exposure compensation value of the image to be fused according to the attributes of the dark area; and if the target area comprises the bright area, determining a minimum exposure compensation value of the image to be fused according to the attributes of the bright area. In the above scheme, for a low-brightness dark area, the maximum exposure compensation value of the image to be fused can be determined from the dark area's attributes, and in the image to be fused captured with the maximum exposure compensation value the dark area appears sharper; for a high-brightness bright area, the minimum exposure compensation value can be determined from the bright area's attributes, and in the image to be fused captured with the minimum exposure compensation value the bright area appears sharper; fusing these images to be fused therefore improves the sharpness of the final image.
In an optional embodiment, after the determining the target region satisfying the brightness condition in the reference image, the method further includes: and determining the brightness of the target area according to the average value of the pixel values in the target area.
In an optional embodiment, after the determining the target region satisfying the brightness condition in the reference image, the method further includes: and determining the area of the target area according to the proportion of the number of the pixels in the target area to the number of the pixels in the reference image.
In an optional embodiment, after the determining the target area satisfying the brightness condition in the reference image, the method further comprises: extracting edge information from the reference image to obtain a corresponding edge image; and determining the edge information amount of the target area according to the number of edge pixels in the region of the edge image corresponding to the target area in the reference image.
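The three attributes defined above (brightness as a mean, area as a pixel-count ratio, edge information as an edge-pixel count) can be sketched as follows. A plain gradient-magnitude detector stands in for whatever edge extractor the patent assumes (e.g. Sobel or Canny), and the gradient threshold is an invented placeholder:

```python
import numpy as np

def region_brightness(reference, mask):
    """Brightness of a target area: mean of its pixel values."""
    return float(reference[mask].mean())

def region_area(mask):
    """Area of a target area: its pixel count as a fraction of the frame."""
    return float(mask.sum()) / mask.size

def region_edge_amount(reference, mask, grad_thresh=30.0):
    """Edge-information amount: number of edge pixels inside the mask,
    using gradient magnitude as a stand-in edge detector."""
    gy, gx = np.gradient(reference.astype(np.float32))
    edges = np.hypot(gx, gy) > grad_thresh
    return int(np.count_nonzero(edges & mask))

ref = np.full((8, 8), 128, dtype=np.uint8)
ref[:4, :4] = 20  # dark patch with a hard boundary, hence some edges
dark = ref < 60
print(region_brightness(ref, dark), region_area(dark))  # 20.0 0.25
```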
In an optional embodiment, the reference image has a first exposure compensation value, and the determining the maximum exposure compensation value of the image to be fused according to the attributes of the dark area includes: if the brightness of the dark area is less than a third brightness threshold, the area of the dark area is greater than a first area threshold, and the edge information amount of the dark area is greater than a first edge information amount threshold, determining the maximum exposure compensation value to be a second exposure compensation value; if the brightness of the dark area is greater than the third brightness threshold, or the area of the dark area is less than the first area threshold, or the edge information amount of the dark area is less than the first edge information amount threshold, determining the maximum exposure compensation value to be a third exposure compensation value; wherein the second exposure compensation value is greater than the third exposure compensation value, and the third exposure compensation value is greater than or equal to the first exposure compensation value. In the above scheme, when the dark area has low brightness, a large area and rich edge information, the maximum exposure compensation value can be set larger, improving the sharpness of the dark-area image; when the dark area is brighter, smaller, or poor in edge information, the maximum exposure compensation value can be set smaller, reducing frame-capture time and memory usage.
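The dark-area rule above can be rendered as a small decision function. Every threshold and EV step below is a made-up placeholder; only the shape of the rule (dim, large, edge-rich dark area earns the larger second value, otherwise the smaller third value) comes from the text:

```python
def max_exposure_compensation(brightness, area, edge_amount,
                              third_lum=40.0, first_area=0.1, first_edges=50,
                              second_ev=2.0, third_ev=1.0):
    """A dark area that is dim, large and edge-rich earns the larger
    positive compensation (second value); otherwise the smaller one
    (third value, still >= the reference frame's first value, EV0 here)."""
    if brightness < third_lum and area > first_area and edge_amount > first_edges:
        return second_ev
    return third_ev

print(max_exposure_compensation(20.0, 0.25, 120))  # 2.0 (all conditions met)
print(max_exposure_compensation(20.0, 0.02, 120))  # 1.0 (area too small)
```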
In an optional embodiment, the determining the minimum exposure compensation value of the image to be fused according to the attributes of the bright area includes: if the brightness of the bright area is less than a fourth brightness threshold or the area of the bright area is less than a second area threshold, determining the minimum exposure compensation value to be a fourth exposure compensation value; if the brightness of the bright area is greater than the fourth brightness threshold and the area of the bright area is greater than the second area threshold, acquiring a first candidate fused image with a fifth exposure compensation value, wherein the fifth exposure compensation value is less than the fourth exposure compensation value, which is less than or equal to the first exposure compensation value; and determining the minimum exposure compensation value according to the edge information amount, in the first candidate fused image, of a first region corresponding to the bright area in the reference image. In the above scheme, when the bright area has low brightness or a small area, the minimum exposure compensation value can be set larger, reducing frame-capture time and memory usage; when the bright area has both higher brightness and a larger area, whether the first candidate fused image contains rich edge information can be further assessed, improving the sharpness of the bright-area image.
In an optional embodiment, the determining the minimum exposure compensation value according to the edge information amount of the first region includes: if the edge information amount of the first region is greater than a second edge information amount threshold, determining the minimum exposure compensation value to be the fifth exposure compensation value; and if the edge information amount of the first region is less than the second edge information amount threshold, determining the minimum exposure compensation value to be the first exposure compensation value. In the above scheme, when the first candidate fused image contains rich edge information, the minimum exposure compensation value can be set smaller, improving the sharpness of the bright-area image; when it does not, the minimum exposure compensation value can be set larger, reducing frame-capture time and memory usage.
In an optional embodiment, the determining the minimum exposure compensation value according to the edge information amount of the first region includes: if the edge information amount of the first region is greater than a second edge information amount threshold, determining the minimum exposure compensation value to be the fifth exposure compensation value; if the edge information amount of the first region is less than the second edge information amount threshold, acquiring a second candidate fused image with a sixth exposure compensation value, wherein the sixth exposure compensation value is greater than the fifth exposure compensation value and less than the first exposure compensation value; and determining the minimum exposure compensation value according to the edge information amount, in the second candidate fused image, of a second region corresponding to the bright area in the reference image. In the above scheme, when the first candidate fused image contains rich edge information, the minimum exposure compensation value can be set smaller, improving the sharpness of the bright-area image; when the first candidate fused image lacks rich edge information, whether the second candidate fused image contains rich edge information can be further assessed, reducing frame-capture time and memory usage.
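The bright-area rule across the two preceding embodiments can be sketched as one function. All thresholds and EV steps are invented, and `candidate_edges(ev)` is a hypothetical callback standing in for capturing a candidate frame at the given EV and measuring the edge information in the region matching the reference image's bright area:

```python
def min_exposure_compensation(brightness, area, candidate_edges,
                              fourth_lum=220.0, second_area=0.05,
                              second_edges=50, first_ev=0.0,
                              fourth_ev=-1.0, fifth_ev=-2.0, sixth_ev=-1.5):
    """Hedged sketch of the claimed bright-area decision chain."""
    if brightness < fourth_lum or area < second_area:
        return fourth_ev        # small or dim highlight: mild compensation
    if candidate_edges(fifth_ev) > second_edges:
        return fifth_ev         # darkest candidate recovers highlight texture
    if candidate_edges(sixth_ev) > second_edges:
        return sixth_ev         # intermediate candidate still shows texture
    return first_ev             # highlight carries no texture: keep EV0

# Stub standing in for actual frame captures: pretend only the
# intermediate frame shows texture in the highlight region.
edges_by_ev = {-2.0: 10, -1.5: 80}
print(min_exposure_compensation(240.0, 0.25, edges_by_ev.get))  # -1.5
```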
In an optional embodiment, the determining the maximum exposure compensation value of the image to be fused according to the attributes of the dark area includes: determining a maximum exposure compensation value of the image to be fused and at least one transition exposure compensation value between the maximum exposure compensation value and the first exposure compensation value according to the attributes of the dark area; and the determining the minimum exposure compensation value of the image to be fused according to the attributes of the bright area includes: determining a minimum exposure compensation value of the image to be fused and at least one transition exposure compensation value between the minimum exposure compensation value and the first exposure compensation value according to the attributes of the bright area. In the above scheme, after the maximum and minimum exposure compensation values are determined, transition exposure compensation values between each of them and the first exposure compensation value may further be determined, so that transitions between regions of the final fused image are smooth rather than abrupt.
In an alternative embodiment, the determining a target region in the reference image that satisfies a luminance condition includes: dividing the reference image into a plurality of connected regions; and combining the connected regions meeting the same brightness condition according to the brightness of each connected region to obtain the target region. In the above scheme, the reference image may be divided into a plurality of connected regions, and then the plurality of connected regions are combined to obtain a corresponding target region, so as to dynamically determine the exposure compensation value of the image to be fused according to the attribute of the target region.
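The division-and-merge step above can be illustrated with plain connected-component labelling, a minimal stand-in for however the patent divides the reference image; components satisfying the same brightness condition are merged (here simply via the union mask) into one target area:

```python
import numpy as np
from collections import deque

def connected_regions(mask):
    """Label the 4-connected components of a boolean mask via BFS."""
    labels = np.zeros(mask.shape, dtype=int)
    count = 0
    for start in zip(*np.nonzero(mask)):
        if labels[start]:
            continue
        count += 1
        labels[start] = count
        queue = deque([start])
        while queue:
            y, x = queue.popleft()
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = count
                    queue.append((ny, nx))
    return labels, count

# Two separate dark patches satisfy the same brightness condition, so
# they merge into a single dark target area (the union mask).
ref = np.full((8, 8), 128, dtype=np.uint8)
ref[:2, :2] = 20
ref[6:, 6:] = 30
dark = ref < 60
labels, n = connected_regions(dark)
print(n, int(dark.sum()))  # 2 connected regions, 8 pixels in the merged area
```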
In an optional embodiment, the acquiring a reference image includes: acquiring a first reference image with a first exposure compensation value captured by a first camera of an image acquisition device, and a second reference image with a seventh exposure compensation value captured by a second camera of the image acquisition device, wherein the seventh exposure compensation value is less than the first exposure compensation value; and the determining an exposure compensation value of the image to be fused according to the attributes of the target area includes: determining the exposure compensation value of the image to be fused according to the attributes of the target area in the first reference image and the attributes of the target area in the second reference image. In the above scheme, when determining the exposure compensation value of the image to be fused, the bright area of a single reference image may be overexposed, so it cannot be known from that image alone whether the bright area contains rich edge information; if a dark frame is fused even though the bright area carries little edge information, the bright area of the fused image ends up too dark. Therefore, in a dual-camera scene, the two cameras can capture two reference images with different exposure compensation values, so that a dark frame is available in advance for analyzing whether a dark frame is actually needed.
In a second aspect, an embodiment of the present application provides an image fusion method, including: acquiring corresponding images to be fused according to the exposure compensation values, wherein each exposure compensation value is determined according to the exposure compensation value determining method of any one of the first aspects; and fusing the images to be fused to obtain a target image. In the above scheme, after the exposure compensation values of the images to be fused are dynamically determined according to the attributes of the target area satisfying the brightness condition in the reference image, the images to be fused can be fused into a final image of high sharpness.
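The fusion step itself is not specified by this text; as an illustration only, a minimal per-pixel exposure fusion (Mertens-style well-exposedness weighting, not the patent's algorithm) could look like this:

```python
import numpy as np

def fuse_exposures(frames, sigma=0.2):
    """Weight each frame per pixel by closeness to mid-grey, then blend
    with normalised weights. A toy stand-in for a real HDR fusion."""
    stack = np.stack([f.astype(np.float32) / 255.0 for f in frames])
    weights = np.exp(-((stack - 0.5) ** 2) / (2.0 * sigma ** 2)) + 1e-6
    weights /= weights.sum(axis=0, keepdims=True)
    fused = (weights * stack).sum(axis=0)
    return np.clip(fused * 255.0, 0, 255).astype(np.uint8)

dark_frame = np.full((4, 4), 30, dtype=np.uint8)     # e.g. shot at a low EV
bright_frame = np.full((4, 4), 230, dtype=np.uint8)  # e.g. shot at a high EV
fused = fuse_exposures([dark_frame, bright_frame])
print(int(fused.min()), int(fused.max()))  # values lie between the inputs
```

A production pipeline would typically also align the frames and blend across scales; this sketch only shows the weighting idea.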
In a third aspect, an embodiment of the present application provides an exposure compensation value determining apparatus, including: a first acquisition module for acquiring a reference image; a first determination module for determining a target area in the reference image that satisfies a brightness condition; and a second determination module for determining an exposure compensation value of the image to be fused according to the attributes of the target area, wherein the image to be fused and the reference image correspond to the same shooting scene. In this scheme, the exposure compensation value of the image to be fused is determined dynamically from the attributes of the target area that satisfies the brightness condition in the reference image, so different images to be fused can be captured flexibly for different scenes, improving the sharpness of the final image.
In an optional embodiment, the target area comprises a dark area and/or a bright area; if the target area comprises a dark area, the brightness condition includes the area brightness being less than a first brightness threshold; if the target area comprises a bright area, the brightness condition includes the area brightness being greater than a second brightness threshold. In the above scheme, the target area satisfying the brightness condition may include a dark area with brightness below the first brightness threshold and a bright area with brightness above the second brightness threshold. The apparatus can then decide, from the attributes of the low-brightness dark area and the high-brightness bright area, whether images to be fused with other exposure compensation values are needed, so different images to be fused can be captured flexibly for different scenes, improving the sharpness of the final image.
In an alternative embodiment, the attribute includes at least one of brightness, area, and amount of edge information.
In an optional embodiment, the second determining module is specifically configured to: if the target area comprises the dark area, determine a maximum exposure compensation value of the image to be fused according to the attributes of the dark area; and if the target area comprises the bright area, determine a minimum exposure compensation value of the image to be fused according to the attributes of the bright area. In the above scheme, for a low-brightness dark area, the maximum exposure compensation value of the image to be fused can be determined from the dark area's attributes, and in the image to be fused captured with the maximum exposure compensation value the dark area appears sharper; for a high-brightness bright area, the minimum exposure compensation value can be determined from the bright area's attributes, and in the image to be fused captured with the minimum exposure compensation value the bright area appears sharper; fusing these images to be fused therefore improves the sharpness of the final image.
In an optional embodiment, the exposure compensation value determining apparatus further includes: and the third determining module is used for determining the brightness of the target area according to the average value of the pixel values in the target area.
In an optional embodiment, the exposure compensation value determining apparatus further includes: a fourth determining module for determining the area of the target area according to the proportion of the number of pixels in the target area to the number of pixels in the reference image.
In an optional embodiment, the exposure compensation value determining apparatus further includes: an extraction module for extracting edge information from the reference image to obtain a corresponding edge image; and a fifth determining module for determining the edge information amount of the target area according to the number of edge pixels in the region of the edge image corresponding to the target area in the reference image.
In an optional embodiment, the reference image has a first exposure compensation value, and the second determining module is further configured to: if the brightness of the dark area is less than a third brightness threshold, the area of the dark area is greater than a first area threshold, and the edge information amount of the dark area is greater than a first edge information amount threshold, determine the maximum exposure compensation value to be a second exposure compensation value; if the brightness of the dark area is greater than the third brightness threshold, or the area of the dark area is less than the first area threshold, or the edge information amount of the dark area is less than the first edge information amount threshold, determine the maximum exposure compensation value to be a third exposure compensation value; wherein the second exposure compensation value is greater than the third exposure compensation value, and the third exposure compensation value is greater than or equal to the first exposure compensation value. In the above scheme, when the dark area has low brightness, a large area and rich edge information, the maximum exposure compensation value can be set larger, improving the sharpness of the dark-area image; when the dark area is brighter, smaller, or poor in edge information, the maximum exposure compensation value can be set smaller, reducing frame-capture time and memory usage.
In an optional embodiment, the reference image has a first exposure compensation value, and the second determining module is further configured to: if the brightness of the bright area is less than a fourth brightness threshold or the area of the bright area is less than a second area threshold, determine the minimum exposure compensation value to be a fourth exposure compensation value; if the brightness of the bright area is greater than the fourth brightness threshold and the area of the bright area is greater than the second area threshold, acquire a first candidate fused image with a fifth exposure compensation value, wherein the fifth exposure compensation value is less than the fourth exposure compensation value, which is less than or equal to the first exposure compensation value; and determine the minimum exposure compensation value according to the edge information amount, in the first candidate fused image, of a first region corresponding to the bright area in the reference image. In the above scheme, when the bright area has low brightness or a small area, the minimum exposure compensation value can be set larger, reducing frame-capture time and memory usage; when the bright area has both higher brightness and a larger area, whether the first candidate fused image contains rich edge information can be further assessed, improving the sharpness of the bright-area image.
In an optional embodiment, the second determining module is further configured to: if the edge information amount of the first region is greater than a second edge information amount threshold, determine the minimum exposure compensation value to be the fifth exposure compensation value; and if the edge information amount of the first region is less than the second edge information amount threshold, determine the minimum exposure compensation value to be the first exposure compensation value. In the above scheme, when the first candidate fused image contains rich edge information, the minimum exposure compensation value can be set smaller, improving the sharpness of the bright-area image; when it does not, the minimum exposure compensation value can be set larger, reducing frame-capture time and memory usage.
In an optional embodiment, the second determining module is further configured to: if the edge information amount of the first region is greater than a second edge information amount threshold, determine the minimum exposure compensation value to be the fifth exposure compensation value; if the edge information amount of the first region is less than the second edge information amount threshold, acquire a second candidate fused image with a sixth exposure compensation value, wherein the sixth exposure compensation value is greater than the fifth exposure compensation value and less than the first exposure compensation value; and determine the minimum exposure compensation value according to the edge information amount, in the second candidate fused image, of a second region corresponding to the bright area in the reference image. In the above scheme, when the first candidate fused image contains rich edge information, the minimum exposure compensation value can be set smaller, improving the sharpness of the bright-area image; when the first candidate fused image lacks rich edge information, whether the second candidate fused image contains rich edge information can be further assessed, reducing frame-capture time and memory usage.
In an optional embodiment, the reference image has a first exposure compensation value, and the second determining module is further configured to: determine a maximum exposure compensation value of the image to be fused and at least one transition exposure compensation value between the maximum exposure compensation value and the first exposure compensation value according to the attributes of the dark area; and determine a minimum exposure compensation value of the image to be fused and at least one transition exposure compensation value between the minimum exposure compensation value and the first exposure compensation value according to the attributes of the bright area. In the above scheme, after the maximum and minimum exposure compensation values are determined, transition exposure compensation values between each of them and the first exposure compensation value may further be determined, so that transitions between regions of the final fused image are smooth rather than abrupt.
In an optional embodiment, the first determining module is specifically configured to: dividing the reference image into a plurality of connected regions; and combining the connected regions meeting the same brightness condition according to the brightness of each connected region to obtain the target region. In the above scheme, the reference image may be divided into a plurality of connected regions, and then the plurality of connected regions are combined to obtain a corresponding target region, so as to dynamically determine the exposure compensation value of the image to be fused according to the attribute of the target region.
In an optional embodiment, the first acquisition module is specifically configured to: acquire a first reference image with a first exposure compensation value captured by a first camera of an image acquisition device, and a second reference image with a seventh exposure compensation value captured by a second camera of the image acquisition device, wherein the seventh exposure compensation value is less than the first exposure compensation value; and the second determination module determines the exposure compensation value of the image to be fused according to the attributes of the target area in the first reference image and the attributes of the target area in the second reference image. In the above scheme, when determining the exposure compensation value of the image to be fused, the bright area of a single reference image may be overexposed, so it cannot be known from that image alone whether the bright area contains rich edge information; if a dark frame is fused even though the bright area carries little edge information, the bright area of the fused image ends up too dark. Therefore, in a dual-camera scene, the two cameras can capture two reference images with different exposure compensation values, so that a dark frame is available in advance for analyzing whether a dark frame is actually needed.
In a fourth aspect, an embodiment of the present application provides an image fusion apparatus, including: a second acquisition module for acquiring corresponding images to be fused according to the exposure compensation values, wherein each exposure compensation value is determined according to the exposure compensation value determining method of any one of the first aspects; and a fusion module for fusing the images to be fused to obtain a target image. In the above scheme, after the exposure compensation values of the images to be fused are dynamically determined according to the attributes of the target area satisfying the brightness condition in the reference image, the images to be fused can be fused into a final image of high sharpness.
In a fifth aspect, an embodiment of the present application provides an electronic device, including: a processor, a memory, and a bus; the processor and the memory are communicated with each other through the bus; the memory stores program instructions executable by the processor, the processor invoking the program instructions to perform the exposure compensation value determination method of any one of the first aspects or the image fusion method of the second aspect.
In a sixth aspect, embodiments of the present application provide a computer-readable storage medium storing computer instructions, which when executed by a computer, cause the computer to perform the exposure compensation value determination method according to any one of the first aspects or the image fusion method according to the second aspect.
In a seventh aspect, an embodiment of the present application provides a computer program product comprising a computer program which, when executed by a processor, implements the exposure compensation value determination method according to any one of the first aspects or the image fusion method according to the second aspect.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a flowchart of an exposure compensation value determining method according to an embodiment of the present disclosure;
fig. 2 is a block diagram illustrating a structure of an exposure compensation value determining apparatus according to an embodiment of the present disclosure;
fig. 3 is a block diagram of an image fusion apparatus according to an embodiment of the present disclosure;
fig. 4 is a block diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
An HDR image is an image that provides a greater dynamic range and more image detail than a normal image. Low-Dynamic-Range (LDR) images can be captured at different exposure compensation values, and the LDR image with the best detail at each exposure compensation value is used for fusion to obtain an HDR image.
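To make the fusion idea concrete, here is a minimal sketch of exposure fusion for a single pixel seen across several LDR frames, assuming pixel values normalized to [0, 1] and a simple Gaussian well-exposedness weight; the function name and weighting scheme are illustrative assumptions, not the method claimed by this application.

```python
import math

def fuse_ldr_pixels(pixels):
    """Fuse one pixel across several LDR exposures (values in [0, 1]).

    Each exposure is weighted by how close it is to mid-gray, so
    well-exposed samples dominate the fused result.
    """
    weights = [math.exp(-((p - 0.5) ** 2) / (2 * 0.2 ** 2)) for p in pixels]
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, pixels)) / total

# The same pixel seen in a dark (ev-2), normal (ev0) and bright (ev+2) frame;
# the well-exposed ev0 sample dominates:
fused = fuse_ldr_pixels([0.05, 0.45, 0.95])
```

In a full implementation the same weighting would be applied per pixel over whole frames, typically with additional smoothing across region boundaries.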
Exposure compensation is an exposure control method: the exposure quantity of the camera is changed by intentionally changing the exposure parameters (such as an aperture and a shutter) automatically calculated by the camera, so that the shot picture is brighter or darker.
The exposure compensation value can be regarded as a quantitative representation of the exposure compensation strength, and its value range is typically an interval centered on 0, such as [-2, 2] or [-3, 3]. An exposure compensation value of 0 means that the camera shoots with its automatically determined exposure amount; a positive value (relative to 0) means that the camera shoots with an increased exposure amount; a negative value (relative to 0) means that the camera shoots with a reduced exposure amount; and the larger the absolute value of the exposure compensation value, the greater the increase or reduction. In some existing devices, exposure compensation values are taken at fixed intervals: for example, taking 1 as the interval in [-2, 2] gives five levels, ev-2, ev-1, ev0, ev+1, and ev+2. Adjacent exposure compensation values differ by 1 EV, which means their exposure amounts differ by a factor of two; for example, the exposure amount at ev+1 is double that at ev0. Intervals of 1/2 or 1/3 EV between adjacent exposure compensation values are also used.
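The doubling relationship between adjacent EV levels can be expressed as a one-line sketch (the function name is illustrative):

```python
def exposure_ratio(ev):
    """Relative exposure versus ev0: each +1 EV doubles the exposure amount."""
    return 2.0 ** ev

# Five-level bracket over [-2, 2] at 1 EV spacing:
bracket = [-2, -1, 0, 1, 2]
ratios = [exposure_ratio(ev) for ev in bracket]  # 0.25, 0.5, 1.0, 2.0, 4.0
```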
The definition of the HDR image obtained by fusion depends on the LDR images corresponding to the chosen exposure compensation values. In the prior art, a fixed set of exposure compensation values is generally used when shooting the LDR images, so for some scenes the definition of the resulting HDR image may be low.
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
Referring to fig. 1, fig. 1 is a flowchart of an exposure compensation value determining method according to an embodiment of the present disclosure, where the exposure compensation value determining method may be executed by an electronic device, and the method may specifically include the following steps:
step S101: a reference image is acquired.
Step S102: and determining a target area satisfying the brightness condition in the reference image.
Step S103: and determining an exposure compensation value of the image to be fused according to the attribute of the target area.
Specifically, in the solution of the present application, one or more images need to be acquired by the image acquisition device, and then the exposure compensation value required for image fusion is calculated based on the acquired images, and these images that need to be acquired first are referred to as reference images. For example, the reference image may be an image that the image pickup device picks up at a predetermined exposure compensation value when the user clicks a photographing button while the user is using the image pickup device. The image acquisition device can be various devices capable of acquiring images such as a mobile phone, a camera, a video camera and the like, and the shooting buttons are different for different image acquisition devices. For example: for a mobile phone, the shooting button can be a virtual button on a mobile phone screen; alternatively, for the camera, the shooting button may be an entity button on the camera body, and the like, which is not specifically limited in the embodiment of the present application.
As an embodiment, when the user clicks the shooting button, each camera in the image acquisition device can continuously acquire a plurality of reference images with different exposure compensation values; as another embodiment, each camera in the image acquisition device may acquire only one reference image when the user clicks the shooting button. Since the steps of the exposure compensation value determining method provided by the embodiment of the present application are the same for each reference image acquired by a camera, the following embodiments take the case where each camera acquires only one reference image as an example.
Depending on the type of electronic device, the reference image can be acquired in different ways. For example, when the electronic device is a processor located in the image acquisition device, it may acquire the reference image by receiving an image sent by a camera in the image acquisition device. Alternatively, when the electronic device is an external processor, it may acquire the reference image by receiving an image sent by the image acquisition device, reading an image previously stored by the image acquisition device from the cloud, reading such an image from local storage, and the like. The present application is not limited to the above embodiments, and those skilled in the art can make appropriate adjustments according to the actual situation.
It will be appreciated that an image capture device may include one or more cameras for capturing images, such as: a normal camera may include one camera; alternatively, a dual-camera phone may include two cameras; alternatively, a three-camera phone may include three cameras, etc.
For an image capturing device with only one camera working, the reference image in step S101 may be an image captured by the camera with the first exposure compensation value when the user clicks the shooting button. In this case, the step S101 may specifically include: a reference image with a first exposure compensation value acquired by a camera in an image acquisition device is acquired.
For an image acquisition device with two cameras working, when only one of the two cameras works, the reference image in step S101 may be an image acquired by the camera when the user clicks a shooting button; when both the cameras are operated, the reference image in step S101 may be images respectively acquired by the two cameras when the user clicks the shooting button. For the second case, the step S101 may specifically include: acquiring a first reference image with a first exposure compensation value acquired by a first camera in an image acquisition device and a second reference image with a seventh exposure compensation value acquired by a second camera in the image acquisition device; wherein the seventh exposure compensation value may be smaller than the first exposure compensation value.
For an image acquisition device with more than two cameras working, the manner of acquiring the reference image in step S101 is similar to the manner of acquiring the reference image by an image acquisition device with two cameras working, and details are not repeated here.
After acquiring the reference image, the electronic device may perform the subsequent steps S102 and S103. The above steps S102 and S103 will be described below by taking an example in which the image capturing device with only one camera operating captures one reference image and an example in which the image capturing device with two cameras operating captures two reference images.
First, a case where an image pickup device in which only one camera operates picks up a reference image will be described.
The electronic device can determine a target region in the reference image that satisfies the brightness condition. The reference image can be divided into a plurality of areas based on the pixel value of each pixel point in the reference image; the brightness condition may be a preset condition regarding the brightness of the pixel, and the target area is an area satisfying the brightness condition among the plurality of areas.
There are various criteria for dividing the reference image, and the embodiments of the present application do not specifically limit this. For example, regions of the reference image whose brightness is less than a first brightness threshold may be divided into dark regions, regions whose brightness is greater than the first brightness threshold and less than a second brightness threshold into medium regions, and regions whose brightness is greater than the second brightness threshold into bright regions. Alternatively, with a similar criterion, the reference image may be divided into dark, darker, medium, lighter, and bright regions according to regional brightness. Alternatively, the regions whose brightness ranks in the top first percentage of the reference image may be divided into bright regions, the regions whose brightness ranks in the bottom second percentage into dark regions, and the remaining regions into medium regions, etc.
In the above embodiment, the brightness of a region refers to a value calculated based on pixel values corresponding to all pixel points in the region. For example: the brightness of a region may be an average value of pixel values corresponding to all pixel points in the region.
In the process of dividing the reference image, the specific value can be adjusted according to actual conditions. For example, the first brightness threshold may be 60, the second brightness threshold may be 200; alternatively, the first percentage may be 30% and the second percentage may be 20%.
In addition, when the reference image is divided into a plurality of regions, as an embodiment, the division may be performed with each pixel point as a minimum unit, and at this time, the division may be performed directly according to the pixel value size of the pixel point. For example: the method comprises the steps of dividing pixel points with pixel values larger than 200 into bright areas, dividing pixel points with pixel values smaller than 200 and larger than 60 into medium areas, and dividing pixel points with pixel values smaller than 60 into dark areas.
As another embodiment, the area formed by a plurality of pixel points may be divided as a minimum unit, and at this time, the division may be performed according to an average value of pixel values of each pixel point in the area formed by the plurality of pixel points. For example: the area with the average value of the pixel values larger than 200 is divided into a bright area, the area with the average value of the pixel values smaller than 200 and larger than 60 is divided into a medium area, and the area with the average value of the pixel values smaller than 60 is divided into a dark area.
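Both the pixel-level and block-level divisions above reduce to classifying a set of pixel values by its mean; a minimal sketch, using the threshold values from the example in the text (the function name is illustrative):

```python
DARK_MAX, BRIGHT_MIN = 60, 200  # first and second brightness thresholds from the text

def classify_region(pixel_values):
    """Classify a region (a single pixel or a block) by its mean pixel value."""
    mean = sum(pixel_values) / len(pixel_values)
    if mean < DARK_MAX:
        return "dark"
    if mean > BRIGHT_MIN:
        return "bright"
    return "medium"

classify_region([30, 40, 50, 20])      # a dark block
classify_region([210, 220, 250, 205])  # a bright block
classify_region([100, 120, 90, 140])   # a medium block
```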
There are various ways to determine the region composed of multiple pixels. For example, a square formed by every four adjacent pixels can be used as an area according to the arrangement position of the pixels in the reference image; alternatively, the reference image may be divided into a plurality of regions based on an image division algorithm, and in this case, after step S101, the exposure compensation value determining method provided in the embodiment of the present application may further include the following steps:
the reference image is divided into a plurality of connected regions.
And combining the connected regions meeting the same brightness condition according to the brightness of each connected region to obtain a plurality of regions.
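A minimal sketch of this divide-then-merge step, assuming a small grayscale image represented as a 2-D list and 4-connectivity; the thresholds, function name, and use of flood fill as the segmentation step are illustrative assumptions, not the segmentation algorithm of this application:

```python
def label_and_group(img, dark_max=60, bright_min=200):
    """Flood-fill 4-connected regions of equal brightness class, then merge
    regions of the same class into one group.

    `img` is a 2-D list of pixel values; returns {class: set of (row, col)}.
    """
    h, w = len(img), len(img[0])
    cls = lambda v: "dark" if v < dark_max else ("bright" if v > bright_min else "medium")
    seen = [[False] * w for _ in range(h)]
    groups = {"dark": set(), "medium": set(), "bright": set()}
    for r in range(h):
        for c in range(w):
            if seen[r][c]:
                continue
            want = cls(img[r][c])
            stack = [(r, c)]  # flood-fill one connected region of class `want`
            while stack:
                y, x = stack.pop()
                if not (0 <= y < h and 0 <= x < w) or seen[y][x] or cls(img[y][x]) != want:
                    continue
                seen[y][x] = True
                groups[want].add((y, x))
                stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return groups

# Tiny 2x3 example: a dark L-shape, one medium pixel, a bright column.
groups = label_and_group([[30, 30, 220], [30, 100, 220]])
```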
It is to be understood that the brightness condition may differ depending on the criterion used to divide the reference image, and the embodiments of the present application do not specifically limit this either. For example, the brightness condition may be that the brightness is less than the first brightness threshold, i.e., the regions whose brightness is less than the first brightness threshold are the target regions; or that the brightness is greater than the second brightness threshold, i.e., the regions whose brightness is greater than the second brightness threshold are the target regions; or that the brightness is less than the first brightness threshold or greater than the second brightness threshold, i.e., both kinds of regions are target regions; or that the brightness ranks highest in the reference image, i.e., the region with the highest brightness in the reference image is the target region, and so on.
After determining the target area satisfying the brightness condition in the reference image, the electronic device may determine the exposure compensation value of the image to be fused according to the attribute of the target area. The images to be fused correspond to the same shooting scene as the reference image and are the images required to obtain the final fused image.
It is to be understood that the attribute of the target region may include at least one of brightness, area, and amount of edge information, depending on the actual situation. For example: the properties of the target area may include only brightness; alternatively, the attributes of the target region may include brightness, area, and amount of edge information, and the like.
Wherein, determining the brightness of the target area can adopt the following modes: the brightness of the target area is determined from the average of the pixel values in the target area. Determining the area of the target region may be performed as follows: and determining the area of the target area according to the proportion of the number of the pixels in the target area to the number of the pixels in the reference image. The edge information amount of the target area can be determined in the following way: extracting edge information in the reference image to obtain a corresponding edge image; and determining the edge information amount of the target area according to the number of pixel points of the edge image in the area corresponding to the target area in the reference image.
It is understood that the above-mentioned manner for determining the brightness, the area, and the amount of edge information of the target region is only an example provided in the embodiments of the present application, and those skilled in the art can flexibly adjust the manner for determining the brightness, the area, and the amount of edge information of the target region according to the actual situation in combination with the prior art. For example: determining the brightness of the target area by using the median of the pixel values in the target area; or, the area of the target area is determined by the number of the pixel points in the target area.
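The three attribute computations described above can be sketched as follows, using a mean for brightness, a pixel-count ratio for area, and a crude horizontal-gradient count as a stand-in for a real edge extractor; all names and the gradient threshold are illustrative assumptions:

```python
def region_attributes(img, region, edge_threshold=32):
    """Brightness, relative area, and edge information amount of one region.

    `img` is a 2-D list of pixel values; `region` is a set of (row, col) points.
    """
    values = [img[r][c] for r, c in region]
    brightness = sum(values) / len(values)            # mean pixel value
    area = len(region) / (len(img) * len(img[0]))     # share of image pixels
    # Count pixels whose horizontal gradient exceeds the threshold,
    # a simple proxy for the number of edge pixels in the region.
    edges = sum(
        1 for r, c in region
        if c + 1 < len(img[0]) and abs(img[r][c + 1] - img[r][c]) > edge_threshold
    )
    return brightness, area, edges

img = [[10, 200], [10, 10]]
b, a, e = region_attributes(img, {(0, 0), (0, 1)})  # top row as the region
```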
When the target area includes a plurality of regions (for example, both a dark region and a bright region), step S103 may determine the exposure compensation value of the image to be fused according to the attribute of each of the plurality of target regions; when the target area includes a single region (for example, only a dark region or only a bright region), step S103 may determine the exposure compensation value of the image to be fused according to the attribute of that region.
Taking as an example that the target area includes a dark area and a bright area, and the brightness condition includes that the brightness of the area is less than the first brightness threshold and the brightness of the area is greater than the second brightness threshold, the step S103 may specifically include the following steps:
and if the target area comprises a dark area, determining the maximum exposure compensation value of the image to be fused according to the attribute of the dark area.
And if the target area comprises a bright area, determining the minimum exposure compensation value of the image to be fused according to the attribute of the bright area.
For a dark area, after determining the maximum exposure compensation value of the image to be fused, as one implementation, the image to be fused corresponding to the maximum exposure compensation value may be directly fused with the reference image to obtain the image of the dark area. For example, if the exposure compensation value of the reference image is ev0 and the maximum exposure compensation value of the image to be fused is ev+2, the image corresponding to ev0 and the image corresponding to ev+2 are fused to obtain the image of the dark area.
As another embodiment, at least one transition exposure compensation value between the maximum exposure compensation value and the exposure compensation value corresponding to the reference image may be determined, and then the image to be fused corresponding to the maximum exposure compensation value, the image to be fused corresponding to the transition exposure compensation value, and the reference image are fused to obtain the image of the dark area. For example, assuming that the exposure compensation value of the reference image is ev0 and the maximum exposure compensation value of the image to be fused is ev+2, the transition exposure compensation value ev+1 between ev0 and ev+2 may be determined, and then the image of the dark area is obtained by fusing the image corresponding to ev0, the image corresponding to ev+1, and the image corresponding to ev+2.
Similarly, for a bright area, after determining the minimum exposure compensation value of the image to be fused, as one implementation, the image to be fused corresponding to the minimum exposure compensation value may be directly fused with the reference image to obtain the image of the bright area. For example, if the exposure compensation value of the reference image is ev0 and the minimum exposure compensation value of the image to be fused is ev-2, the image corresponding to ev0 and the image corresponding to ev-2 are fused to obtain the image of the bright area.
As another embodiment, at least one transition exposure compensation value between the minimum exposure compensation value and the exposure compensation value corresponding to the reference image may be determined, and then the image to be fused corresponding to the minimum exposure compensation value, the image to be fused corresponding to the transition exposure compensation value, and the reference image are fused to obtain the image of the bright area. For example, assuming that the exposure compensation value of the reference image is ev0 and the minimum exposure compensation value of the image to be fused is ev-2, the transition exposure compensation value ev-1 between ev0 and ev-2 can be determined, and then the image of the bright area is obtained by fusing the image corresponding to ev0, the image corresponding to ev-1, and the image corresponding to ev-2.
Therefore, after the maximum exposure compensation value and the minimum exposure compensation value are determined, transitional exposure compensation values between the maximum exposure compensation value and the first exposure compensation value and between the minimum exposure compensation value and the first exposure compensation value can be further determined, so that the transition of each region in the finally fused image is smooth and unobtrusive.
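Generating the transition exposure compensation values between the reference EV and a maximum or minimum EV can be sketched as follows (illustrative, assuming integer EV steps):

```python
def transition_evs(reference_ev, extreme_ev, step=1):
    """Integer transition EVs strictly between the reference EV and a max/min EV."""
    lo, hi = sorted((reference_ev, extreme_ev))
    return [ev for ev in range(lo + step, hi, step)]

transition_evs(0, 2)   # between ev0 and a maximum of ev+2
transition_evs(0, -2)  # between ev0 and a minimum of ev-2
transition_evs(0, 3)   # more than one transition level is possible
```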
Further, if the target region includes a dark region, the step of determining the maximum exposure compensation value of the image to be fused according to the attribute of the dark region may specifically include the following steps:
and if the brightness of the dark area is smaller than the third brightness threshold, the area of the dark area is larger than the first area threshold, and the edge information amount of the dark area is larger than the first edge information amount threshold, determining the maximum exposure compensation value as a second exposure compensation value.
And if the brightness of the dark area is greater than the third brightness threshold, or the area of the dark area is less than the first area threshold, or the edge information amount of the dark area is less than the first edge information amount threshold, determining the maximum exposure compensation value as a third exposure compensation value.
That is, the maximum exposure compensation value of the image to be fused may be determined according to the brightness, area, and edge information amount of the dark area. Since the dark area is the least bright part of the reference image, for a dark area with low brightness (its image definition is low), a large area (its influence on the whole image is large), and rich edge information, a larger maximum exposure compensation value can be chosen, so that after fusion the brightness of the dark area is raised and a dark-area image with clearer details is obtained. For a dark area with high brightness (its image definition is already high), a small area (its influence on the whole image is small), or sparse edge information, a smaller maximum exposure compensation value suffices.
Therefore, in the above step, the second exposure compensation value may be greater than the third exposure compensation value, and the third exposure compensation value may be greater than or equal to the first exposure compensation value. For example, the first exposure compensation value is ev0, the second exposure compensation value is ev+2, and the third exposure compensation value is ev+1; or the first exposure compensation value is ev0, the second exposure compensation value is ev+2, and the third exposure compensation value is ev0.
Depending on the case, the images to be fused obtained from the dark area analysis may include: an ev0 image and an ev+2 image (when the brightness of the dark area is less than the third brightness threshold, the area of the dark area is greater than the first area threshold, and the edge information amount of the dark area is greater than the first edge information amount threshold); or an ev0 image and an ev+1 image (when the brightness of the dark area is greater than the third brightness threshold, or the area of the dark area is less than the first area threshold, or the edge information amount of the dark area is less than the first edge information amount threshold); or an ev0 image, an ev+1 image, and an ev+2 image (when the maximum exposure compensation value is ev+2 and a transition exposure compensation value is also used); or only an ev0 image (when the third exposure compensation value equals the first exposure compensation value).
In the above scheme, when the dark area is dim, large, and rich in edge information, a larger maximum exposure compensation value can be chosen, improving the definition of the dark-area image; when the dark area is brighter, smaller, or poor in edge information, a smaller maximum exposure compensation value can be chosen. In the prior art, definition is improved by always acquiring multiple frames corresponding to different exposure compensation values (for example, acquiring five frames at ev-2, ev-1, ev0, ev+1, and ev+2 every time), so adopting this scheme reduces frame-capture time and memory usage.
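The dark-area decision of the two steps above can be sketched as a small function; every threshold and EV level here is an illustrative placeholder, since the text only fixes their ordering (second EV > third EV >= the reference EV):

```python
def max_ev_for_dark_area(brightness, area, edge_info,
                         luma_thr=40, area_thr=0.2, edge_thr=500,
                         second_ev=2, third_ev=1):
    """Pick the maximum EV of the frames to fuse from the dark area's attributes.

    brightness: mean pixel value of the dark area
    area:       dark-area pixels as a share of the whole image
    edge_info:  edge-pixel count of the dark area
    """
    if brightness < luma_thr and area > area_thr and edge_info > edge_thr:
        return second_ev  # dark, large, and detail-rich: push exposure up hard
    return third_ev       # otherwise a milder (or no) positive compensation
```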
Further, if the target area includes a bright area, the step of determining the minimum exposure compensation value of the image to be fused according to the property of the bright area may specifically include the following steps:
and if the brightness of the bright area is smaller than the fourth brightness threshold or the area of the bright area is smaller than the second area threshold, determining the minimum exposure compensation value as a fourth exposure compensation value.
And if the brightness of the bright area is greater than the fourth brightness threshold and the area of the bright area is greater than the second area threshold, acquiring a first candidate fusion image with a fifth exposure compensation value.
And determining a minimum exposure compensation value according to the edge information amount of a first area corresponding to the bright area in the reference image in the first candidate fusion image.
That is, the minimum exposure compensation value of the image to be fused may be determined according to the brightness of the bright area, the area of the bright area, and the edge information amount of the region corresponding to the bright area in a darker candidate frame. Unlike the dark area, the analysis of the bright area is not performed directly on the edge information amount of the bright area itself. This is because the bright area may be overexposed, so it cannot be known from the reference image whether its texture information is actually rich; if the bright area in fact contains little or no texture, forcibly fusing in a dark frame would make the bright area of the resulting image very dark. Therefore, for a scene with a large, overexposed bright area, a dark frame must first be captured for further analysis; that is, a first candidate fusion image is acquired and its edge information amount is analyzed.
Since the bright area is the brightest part of the reference image, for a bright area with low brightness (its image definition is relatively high) or a small area (its influence on the whole image is small), a larger minimum exposure compensation value can be chosen; for a bright area with high brightness (the bright-area image is overexposed and its definition is low) and a large area (its influence on the whole image is large), a first candidate fusion image with a low exposure compensation value can be acquired.
Therefore, in the above step, the fifth exposure compensation value is smaller than the fourth exposure compensation value, and the fourth exposure compensation value is smaller than or equal to the first exposure compensation value. For example, the first exposure compensation value is ev0, the fourth exposure compensation value is ev-1, and the fifth exposure compensation value is ev-2; or the first exposure compensation value is ev0, the fourth exposure compensation value is ev0, and the fifth exposure compensation value is ev-2.
In the above scheme, when the brightness of the bright area is low and its area is small, a larger minimum exposure compensation value can be chosen, reducing frame-capture time and memory usage. When the brightness or the area of the bright area is large, whether the edge information amount of the first candidate fusion image is rich can be further judged, thereby improving the definition of the bright-area image.
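The first-pass bright-area decision above can be sketched as follows; thresholds, EV levels, and the sentinel return value are illustrative assumptions:

```python
def bright_area_first_pass(brightness, area,
                           luma_thr=230, area_thr=0.2, fourth_ev=-1):
    """First-pass decision for a bright area.

    Returns (min_ev, None) when the bright area is small or not too bright,
    or (None, "need_dark_frame") when an underexposed candidate frame must
    be captured and inspected before the minimum EV can be fixed.
    """
    if brightness < luma_thr or area < area_thr:
        return fourth_ev, None
    return None, "need_dark_frame"
```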
As an embodiment, the step of determining the minimum exposure compensation value according to the edge information amount of the first region corresponding to the bright region in the reference image in the first candidate fused image may specifically include the following steps:
and if the edge information amount of the first area is larger than the second edge information amount threshold value, determining the minimum exposure compensation value as a fifth exposure compensation value.
And if the edge information amount of the first area is smaller than the second edge information amount threshold value, determining the minimum exposure compensation value as a first exposure compensation value.
That is, when the brightness of the bright area is high and its area is large, the minimum exposure compensation value of the image to be fused may be determined according to the edge information amount of the first region, i.e., the region of the first candidate fusion image corresponding to the bright area in the reference image. For a first region rich in edge information, a smaller minimum exposure compensation value can be chosen; for a first region poor in edge information, a larger minimum exposure compensation value can be chosen.
For example: the first exposure compensation value is ev0, and the fifth exposure compensation value is ev-2; alternatively, the first exposure compensation value is ev0 and the fifth exposure compensation value is ev-1.
In the above scheme, when the edge information amount of the first candidate fusion image is rich, a smaller minimum exposure compensation value can be chosen, improving the definition of the bright-area image; when it is not rich, a larger minimum exposure compensation value can be chosen. In the prior art, definition is improved by always acquiring multiple frames corresponding to different exposure compensation values (for example, acquiring five frames at ev-2, ev-1, ev0, ev+1, and ev+2 every time), so adopting this scheme reduces frame-capture time and memory usage.
As another embodiment, the step of determining the minimum exposure compensation value according to the edge information amount of the first region corresponding to the bright region in the reference image in the first candidate fused image may specifically include the following steps:
and if the edge information amount of the first area is larger than the second edge information amount threshold value, determining the minimum exposure compensation value as a fifth exposure compensation value.
And if the edge information amount of the first area is smaller than the second edge information amount threshold value, acquiring a second candidate fusion image with a sixth exposure compensation value.
And determining a minimum exposure compensation value according to the edge information amount of a second area corresponding to the bright area in the reference image in the second candidate fusion image.
Similar to the above-described embodiment, in the case where the luminance of a bright region is large and the area of the bright region is large, the minimum exposure compensation value of the image to be fused may be determined according to the amount of edge information of the first region corresponding to the bright region in the reference image in the first candidate fused image. Wherein, for the first area with rich edge information, the minimum exposure compensation value can be determined to be smaller; for a first area with a small amount of edge information, a second candidate fusion image with a slightly higher exposure compensation value can be obtained, and then the minimum exposure compensation value of the image to be fused is determined according to the amount of edge information of the second candidate fusion image.
That is, in the above step, the sixth exposure compensation value may be greater than the fifth exposure compensation value and less than the first exposure compensation value. For example: the first exposure compensation value is ev0, the fifth exposure compensation value is ev-2, and the sixth exposure compensation value is ev-1.
The embodiment of determining the minimum exposure compensation value of the image to be fused according to the edge information amount of the second candidate fused image is similar to the embodiment of determining the minimum exposure compensation value of the image to be fused according to the edge information amount of the first candidate fused image, and details are not repeated here.
In the above scheme, when the amount of edge information of the first candidate fusion image is rich, the minimum exposure compensation value can be determined to be smaller, so that the definition of the bright-area image is improved. When the amount of edge information of the first candidate fused image is not rich, whether the amount of edge information of the second candidate fused image is rich or not can be further judged, so that the frame taking time and the occupied memory are reduced.
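The fallback logic of this second embodiment can be sketched as follows. This is illustrative only; the function names, the callback used to probe the second candidate fused image, and the default EV values (from the example above) are assumptions:

```python
def min_ec_with_fallback(edge_info_first, edge_thr, get_second_edge_info,
                         ec_first=0, ec_fifth=-2, ec_sixth=-1):
    """Minimum exposure compensation value with a second-candidate probe.

    get_second_edge_info -- callable that captures the second candidate
                            fused image at the given EV and returns the
                            edge-pixel count of its second region
    """
    if edge_info_first > edge_thr:
        return ec_fifth                      # ev-2 frame is useful
    # ev-2 frame too dark to judge: probe an intermediate ev-1 frame.
    edge_info_second = get_second_edge_info(ec_sixth)
    if edge_info_second > edge_thr:
        return ec_sixth
    return ec_first
```

The probe is only invoked when the first candidate's edge information is below the threshold, which is what keeps frame-taking time and memory down in the common case.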
It is understood that, as can be seen from the above embodiments, when the exposure compensation value of the image to be fused cannot be determined based on the reference image, the first candidate fused image may be further analyzed; when it cannot be determined based on the first candidate fused image either, the second candidate fused image may be further analyzed. The first candidate fused image may be acquired first and the second candidate fused image acquired only when needed; alternatively, the second candidate fused image may be acquired at the same time as the first candidate fused image, and the previously acquired second candidate fused image may then be used directly when needed.
For the different situations above, the images to be fused obtained based on the bright-area analysis may include: (1) an ev0 image and an ev-2 image, when the brightness of the bright area is greater than the fourth brightness threshold, the area of the bright area is greater than the second area threshold, and the edge information amount of the first area is greater than the second edge information amount threshold; (2) an ev0 image and an ev-1 image, when the brightness of the bright area is less than the fourth brightness threshold or the area of the bright area is less than the second area threshold, or when the brightness and area both exceed their thresholds but the edge information amount of the first area is less than the second edge information amount threshold; (3) an ev0 image, an ev-1 image and an ev-2 image, when the brightness, area, and edge information amount all exceed their thresholds and a transition exposure compensation value exists; or (4) an ev0 image alone, when the brightness of the bright area is less than the fourth brightness threshold or the area of the bright area is less than the second area threshold, or when the brightness and area both exceed their thresholds but the edge information amount of the first area is less than the second edge information amount threshold. Which of the two outcomes covering the same conditions applies depends on the embodiment adopted.
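One consistent reading of the enumerated cases can be sketched as follows. This is illustrative: the threshold parameters are hypothetical, `transition` marks whether a transition exposure compensation value is produced, and when the first area's edge information is not rich the sketch follows the embodiment that keeps only the ev0 image:

```python
def frames_for_bright_area(brightness, area, edge_info_first,
                           b4, a2, e2, transition=False):
    """EV list of frames implied by the bright-area analysis.

    b4, a2, e2 -- stand-ins for the fourth brightness threshold, the
                  second area threshold, and the second edge information
                  amount threshold.
    """
    if brightness > b4 and area > a2:
        if edge_info_first > e2:
            # Dark frame is useful; optionally add an ev-1 transition.
            return [0, -1, -2] if transition else [0, -2]
        return [0]       # dark frame judged unnecessary
    return [0, -1]       # mild bright area: ev-1 as the fourth value
```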
Therefore, for a dark area with low brightness, the maximum exposure compensation value of the image to be fused can be determined according to the attribute of the dark area, and the image definition of the dark area is higher in the image to be fused with the maximum exposure compensation value. And aiming at the bright area with higher brightness, determining the minimum exposure compensation value of the image to be fused according to the attribute of the bright area, wherein the image definition of the bright area is higher in the image to be fused with the minimum exposure compensation value. The image to be fused is fused, so that the definition of the finally shot image can be improved.
Next, a case where two reference images are acquired by an image acquisition device in which two cameras operate will be described.
As one implementation, each of the two reference images may be analyzed in the manner of the embodiments described above in which the reference image is acquired by an image acquisition device with only one camera operating. That is, each reference image is partitioned, the partitioned target areas are determined, and the exposure compensation value of the image to be fused is determined according to the attributes of each target area. For details, refer to the above embodiments; they are not repeated here.
In another implementation, the first of the two reference images is partitioned, the partitioned target areas are determined, and some of those target areas are analyzed; for the remaining, unanalyzed target areas of the first reference image, the corresponding areas in the second reference image are analyzed instead.
Similarly, taking an example that the target area includes a dark area and a bright area, and the brightness condition includes that the brightness of the area is smaller than the first brightness threshold and the brightness of the area is greater than the second brightness threshold, the steps S102 to S103 may specifically include the following steps:
a bright region and a dark region in the first reference image are determined.
And determining an exposure compensation value of the image to be fused according to the attribute of the bright area in the first reference image and the attribute of the area corresponding to the dark area in the first reference image in the second reference image.
The step of determining the bright area and the dark area of the first reference image, the step of analyzing the bright area in the first reference image, and the step of analyzing the area corresponding to the dark area in the second reference image may refer to the above embodiments, and details thereof are not repeated herein.
In the above scheme, when the exposure compensation value of the image to be fused is determined from a single reference image, the bright area may be overexposed, so it cannot be known whether the edge information of the bright area is rich. If the edge information of the bright area is in fact not rich, forcibly fusing a dark frame would make the bright area of the finally fused image too dark. Therefore, in a dual-camera scene, the two cameras can respectively acquire two reference images with different exposure compensation, so that the dark frame is available first and can be analyzed to decide whether it is actually needed.
In summary, in the exposure compensation value determining method provided in the embodiment of the present application, the exposure compensation value of the image to be fused can be dynamically determined according to the attribute of the target region satisfying the brightness condition in the reference image, so that different images to be fused can be flexibly shot according to different scenes, and the definition of the image finally shot is improved.
Further, an embodiment of the present application further provides an image fusion method, where the image fusion method may include the following steps:
and acquiring a corresponding image to be fused according to the exposure compensation value.
And fusing the images to be fused to obtain a target image.
The exposure compensation value may be determined according to the exposure compensation value determination method in the above embodiment. That is, after determining the exposure compensation value of the image to be fused, the image capturing device may capture the corresponding image to be fused based on the exposure compensation value and fuse the image to be fused and the reference image. For example: the exposure compensation values of the images to be fused are ev-1 and ev +1, and the image acquisition device can acquire the image with ev-1 and the image with ev +1 respectively.
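The fusion step itself is not fixed by the application. As a minimal stand-in (an assumption, not the application's method), the frames captured at the determined exposure compensation values could be blended with per-pixel well-exposedness weights:

```python
import numpy as np

def fuse_exposures(images, sigma=0.2):
    """Blend same-scene frames of different exposure into one image.

    Each image is a float array with values in [0, 1].  A Gaussian
    weight peaking at mid-gray lets well-exposed pixels dominate,
    so bright areas come from the darker frames and dark areas from
    the brighter ones.
    """
    stack = np.stack([np.asarray(im, dtype=np.float64) for im in images])
    weights = np.exp(-0.5 * ((stack - 0.5) / sigma) ** 2) + 1e-12
    weights /= weights.sum(axis=0, keepdims=True)   # normalize per pixel
    return (weights * stack).sum(axis=0)
```

For example, fusing an ev-1 frame and an ev+1 frame of the same scene keeps each pixel closest to its best-exposed version; production pipelines would typically use a multi-scale variant to avoid seams.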
Referring to fig. 2, fig. 2 is a block diagram of an exposure compensation value determining apparatus according to an embodiment of the present disclosure, where the exposure compensation value determining apparatus 200 may include: a first obtaining module 201, configured to obtain a reference image; a first determining module 202, configured to determine a target region in the reference image that meets a brightness condition; a second determining module 203, configured to determine an exposure compensation value of an image to be fused according to the attribute of the target region, where the image to be fused and the reference image correspond to the same shooting scene.
In the embodiment of the application, the exposure compensation value of the image to be fused can be dynamically determined according to the attribute of the target area meeting the brightness condition in the reference image, so that different images to be fused can be flexibly shot according to different scenes, and the definition of the finally shot image is improved.
Further, the target area comprises a dark area and/or a bright area; wherein, if the target area comprises a dark area, the brightness condition comprises that the brightness of the area is less than a first brightness threshold; if the target area comprises a bright area, the brightness condition comprises that the brightness of the area is greater than a second brightness threshold.
In the embodiment of the present application, the target region satisfying the brightness condition may include a dark region having a brightness smaller than the first brightness threshold and a bright region having a brightness larger than the second brightness threshold. The method can determine whether images to be fused with other exposure compensation values are needed or not according to attributes of dark areas with low brightness and bright areas with high brightness, so that different images to be fused can be flexibly shot according to different scenes, and the definition of the finally shot images is improved.
Further, the attribute includes at least one of brightness, area, and amount of edge information.
Further, the second determining module 203 is specifically configured to: if the target area comprises the dark area, determining a maximum exposure compensation value of the image to be fused according to the attribute of the dark area; and if the target area comprises the bright area, determining the minimum exposure compensation value of the image to be fused according to the attribute of the bright area.
In the embodiment of the application, the maximum exposure compensation value of the image to be fused can be determined according to the attribute of the dark area with lower brightness, and the image definition of the dark area is higher in the image to be fused with the maximum exposure compensation value; aiming at the bright area with higher brightness, the minimum exposure compensation value of the image to be fused can be determined according to the attribute of the bright area, and the image definition of the bright area is higher in the image to be fused with the minimum exposure compensation value; the image to be fused is fused, so that the definition of the finally shot image can be improved.
Further, the exposure compensation value determining apparatus 200 further includes: and the third determining module is used for determining the brightness of the target area according to the average value of the pixel values in the target area.
Further, the exposure compensation value determining apparatus 200 further includes: and the fourth determining module is used for determining the area of the target area according to the proportion of the number of the pixel points in the target area to the number of the pixel points in the reference image.
Further, the exposure compensation value determining apparatus 200 further includes: the extraction module is used for extracting the edge information in the reference image to obtain a corresponding edge image; and the fifth determining module is used for determining the edge information content of the target area according to the number of pixel points of the edge image in the area corresponding to the target area in the reference image.
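The three attribute computations described by the modules above (mean pixel value, pixel-count ratio, and edge-pixel count) can be sketched as follows. The function name is illustrative and the choice of edge detector producing `edge_map` is left open, as the application does not specify one:

```python
import numpy as np

def region_attributes(gray, mask, edge_map):
    """Brightness, area, and edge information amount of one target region.

    gray     -- reference image as a 2-D array of pixel values
    mask     -- boolean array, True inside the target region
    edge_map -- boolean edge image of the reference (e.g. from a
                Sobel/Canny step, not fixed by the source)
    """
    brightness = gray[mask].mean()                # mean of region pixels
    area = mask.sum() / mask.size                 # region / image pixels
    edge_info = int(np.logical_and(edge_map, mask).sum())  # edge pixels
    return brightness, area, edge_info
```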
Further, the reference image has a first exposure compensation value, and the second determining module 203 is further configured to: if the brightness of the dark area is smaller than a third brightness threshold, the area of the dark area is larger than a first area threshold, and the edge information amount of the dark area is larger than a first edge information amount threshold, determining the maximum exposure compensation value as a second exposure compensation value; if the brightness of the dark area is greater than the third brightness threshold, or the area of the dark area is less than the first area threshold, or the edge information amount of the dark area is less than the first edge information amount threshold, determining that the maximum exposure compensation value is a third exposure compensation value; wherein the second exposure compensation value is greater than the third exposure compensation value, and the third exposure compensation value is greater than or equal to the first exposure compensation value.
In the embodiment of the application, when the brightness of the dark area is small, the area is large and the amount of edge information is rich, the maximum exposure compensation value can be determined to be large, so that the definition of the image in the dark area is improved; when the brightness of the dark area is larger, or the area is smaller, or the amount of the edge information is not rich, the maximum exposure compensation value can be determined to be smaller, so that the frame taking time and the occupied memory are reduced.
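This dark-area rule can be sketched as follows. The parameter names and the default values ev+2 and ev+1 for the second and third exposure compensation values are assumptions consistent with the stated constraint (second value greater than third value, third value at least the first value):

```python
def max_ec_for_dark_area(brightness, area, edge_info,
                         b3, a1, e1, ec_second=2, ec_third=1):
    """Maximum exposure compensation value for the dark area.

    b3, a1, e1 -- stand-ins for the third brightness threshold, the
                  first area threshold, and the first edge information
                  amount threshold.
    """
    if brightness < b3 and area > a1 and edge_info > e1:
        return ec_second   # dark, large, and textured: push exposure up
    return ec_third        # otherwise a smaller lift suffices
```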
Further, the reference image has a first exposure compensation value, and the second determining module 203 is further configured to: if the brightness of the bright area is smaller than a fourth brightness threshold or the area of the bright area is smaller than a second area threshold, determining the minimum exposure compensation value as a fourth exposure compensation value; if the brightness of the bright area is greater than the fourth brightness threshold and the area of the bright area is greater than the second area threshold, acquiring a first candidate fusion image with a fifth exposure compensation value; wherein the fifth exposure compensation value is less than the fourth exposure compensation value, which is less than or equal to the first exposure compensation value; and determining the minimum exposure compensation value according to the edge information amount of a first area corresponding to a bright area in the reference image in the first candidate fusion image.
In the embodiment of the application, when the brightness of the bright area is smaller and the area is smaller, the minimum exposure compensation value can be determined to be larger, so that the frame taking time and the occupied memory are reduced; when the brightness or the area of the bright area is larger, whether the edge information amount of the first candidate fusion image is rich or not can be further judged, and therefore the definition of the bright area image is improved.
Further, the second determining module 203 is further configured to: if the edge information amount of the first area is larger than a second edge information amount threshold value, determining the minimum exposure compensation value as the fifth exposure compensation value; and if the edge information amount of the first area is smaller than the second edge information amount threshold, determining the minimum exposure compensation value as the first exposure compensation value.
In the embodiment of the application, when the edge information amount of the first candidate fusion image is rich, the minimum exposure compensation value can be determined to be smaller, so that the definition of the bright area image is improved; when the edge information amount of the first candidate fusion image is not rich, the minimum exposure compensation value can be determined to be larger, so that the frame taking time and the occupied memory are reduced.
Further, the second determining module 203 is further configured to: if the edge information amount of the first area is larger than a second edge information amount threshold value, determining the minimum exposure compensation value as the fifth exposure compensation value; if the edge information amount of the first area is smaller than the second edge information amount threshold value, acquiring a second candidate fusion image with a sixth exposure compensation value; wherein the sixth exposure compensation value is greater than the fifth exposure compensation value and less than the first exposure compensation value; and determining the minimum exposure compensation value according to the edge information amount of a second area corresponding to the bright area in the reference image in the second candidate fusion image.
In the embodiment of the application, when the edge information amount of the first candidate fusion image is rich, the minimum exposure compensation value can be determined to be smaller, so that the definition of the bright-area image is improved; when the edge information amount of the first candidate fusion image is not rich, whether the edge information amount of the second candidate fusion image is rich or not can be further judged, so that the frame taking time and the occupied memory are reduced.
Further, the reference image has a first exposure compensation value, and the second determining module 203 is further configured to: determining a maximum exposure compensation value of the image to be fused and at least one transition exposure compensation value between the maximum exposure compensation value and the first exposure compensation value according to the attributes of the dark area; the determining the minimum exposure compensation value of the image to be fused according to the attributes of the bright area comprises the following steps: and determining a minimum exposure compensation value of the image to be fused and at least one transition exposure compensation value between the minimum exposure compensation value and the first exposure compensation value according to the attributes of the bright areas.
In this embodiment of the present application, after the maximum exposure compensation value and the minimum exposure compensation value are determined, transition exposure compensation values between the maximum exposure compensation value and the first exposure compensation value and between the minimum exposure compensation value and the first exposure compensation value may be further determined, so that transitions of each region in the finally fused image are smooth and non-abrupt.
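How transition values could be enumerated between the extremes and the first exposure compensation value can be sketched as follows. This is illustrative; the application does not specify the step size, assumed here to be 1 EV:

```python
def ev_sequence(first_ev, max_ev=None, min_ev=None, step=1):
    """All EVs to capture: the first value, the extremes, and every
    transition value in between, in ascending order."""
    evs = {first_ev}
    if max_ev is not None:
        evs.update(range(first_ev + step, max_ev + 1, step))  # up side
    if min_ev is not None:
        evs.update(range(min_ev, first_ev, step))             # down side
    return sorted(evs)
```

With a maximum of ev+2 and a minimum of ev-2 around a reference at ev0, this yields the full five-frame ladder, while a bright-area-only scene with a minimum of ev-2 yields just ev-2, ev-1, ev0.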
Further, the first determining module 202 is specifically configured to: dividing the reference image into a plurality of connected regions; and combining the connected regions meeting the same brightness condition according to the brightness of each connected region to obtain the target region.
In the embodiment of the application, the reference image may be firstly divided into a plurality of connected regions, and then the plurality of connected regions are combined to obtain the corresponding target region, so that the exposure compensation value of the image to be fused is dynamically determined according to the attribute of the target region.
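The partition-and-merge step can be sketched as follows. This is illustrative: a per-pixel threshold is assumed for forming connected regions and a region-level mean for the brightness condition, neither of which the application fixes:

```python
from collections import deque

def connected_components(mask):
    """4-connected components of a boolean grid (list of lists)."""
    h, w = len(mask), len(mask[0])
    label = [[-1] * w for _ in range(h)]
    comps = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and label[y][x] < 0:
                comp, q = [], deque([(y, x)])
                label[y][x] = len(comps)
                while q:                       # breadth-first flood fill
                    cy, cx = q.popleft()
                    comp.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                           and mask[ny][nx] and label[ny][nx] < 0:
                            label[ny][nx] = len(comps)
                            q.append((ny, nx))
                comps.append(comp)
    return comps

def bright_target_region(gray, pixel_thr, region_thr):
    """Merge connected regions whose mean brightness exceeds the
    second brightness threshold into one bright target region."""
    mask = [[v > pixel_thr for v in row] for row in gray]
    target = []
    for comp in connected_components(mask):
        mean = sum(gray[y][x] for y, x in comp) / len(comp)
        if mean > region_thr:                  # region-level condition
            target.extend(comp)
    return sorted(target)
```

A dark target region would be built symmetrically with `v < pixel_thr` and a mean below the first brightness threshold.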
Further, the first obtaining module 201 is specifically configured to: acquiring a first reference image with a first exposure compensation value acquired by a first camera in an image acquisition device and a second reference image with a seventh exposure compensation value acquired by a second camera in the image acquisition device; wherein the seventh exposure compensation value is less than the first exposure compensation value; the determining an exposure compensation value of the image to be fused according to the attribute of the target area comprises the following steps: and determining an exposure compensation value of the image to be fused according to the attribute of the target area in the first reference image and the attribute of the target area in the second reference image.
In the embodiment of the application, when the exposure compensation value of the image to be fused is determined from a single reference image, the bright area may be overexposed, so it cannot be known whether the edge information of the bright area is rich. If the edge information of the bright area is in fact not rich, forcibly fusing a dark frame would make the bright area of the finally fused image too dark. Therefore, in a dual-camera scene, the two cameras can respectively acquire two reference images with different exposure compensation, so that the dark frame is available first and can be analyzed to decide whether it is actually needed.
Referring to fig. 3, fig. 3 is a block diagram of an image fusion apparatus according to an embodiment of the present disclosure, where the image fusion apparatus 300 includes: the second obtaining module 301 is configured to obtain a corresponding image to be fused according to the exposure compensation value; wherein the exposure compensation value is determined according to the exposure compensation value determination method as described in any one of the preceding embodiments; and a fusion module 302, configured to fuse the images to be fused to obtain a target image.
In the embodiment of the application, after the exposure compensation value of the image to be fused is dynamically determined according to the attribute of the target area meeting the brightness condition in the reference image, the image to be fused can be fused to obtain a final fused image, and the definition of the finally obtained image is high.
Referring to fig. 4, fig. 4 is a block diagram of an electronic device according to an embodiment of the present disclosure, where the electronic device 400 includes: at least one processor 401, at least one communication interface 402, at least one memory 403 and at least one communication bus 404. Wherein the communication bus 404 is used for implementing direct connection communication of these components, the communication interface 402 is used for communicating signaling or data with other node devices, and the memory 403 stores machine-readable instructions executable by the processor 401. When the electronic device 400 is in operation, the processor 401 communicates with the memory 403 via the communication bus 404, and the machine-readable instructions, when invoked by the processor 401, perform the exposure compensation value determination method described above.
For example, the processor 401 of the embodiment of the present application may read the computer program from the memory 403 through the communication bus 404 and execute the computer program to implement the following method: step S101: a reference image is acquired. Step S102: and determining a target area satisfying the brightness condition in the reference image. Step S103: and determining an exposure compensation value of the image to be fused according to the attribute of the target area.
The processor 401 may be an integrated circuit chip having signal processing capabilities. The processor 401 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the various methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 403 may include, but is not limited to, Random Access Memory (RAM), Read-Only Memory (ROM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
It will be appreciated that the configuration shown in fig. 4 is merely illustrative and that the electronic device 400 may include more or fewer components than shown in fig. 4, or have a different configuration from that shown in fig. 4. The components shown in fig. 4 may be implemented in hardware, software, or a combination thereof. In the embodiment of the present application, the electronic device 400 may be, but is not limited to, a physical device such as a desktop computer, a laptop, a smartphone, a smart wearable device, or a vehicle-mounted device, and may also be a virtual device such as a virtual machine. In addition, the electronic device 400 is not necessarily a single device, but may also be a combination of multiple devices, such as a server cluster.
Embodiments of the present application further provide a computer program product, including a computer program stored on a non-transitory computer readable storage medium, the computer program including program instructions, when the program instructions are executed by a computer, the computer being capable of executing the steps of the exposure compensation value determination method in the above embodiments, for example, including: acquiring a reference image; determining a target area satisfying a brightness condition in the reference image; and determining an exposure compensation value of the image to be fused according to the attribute of the target area, wherein the image to be fused and the reference image correspond to the same shooting scene.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
In addition, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist alone, or two or more modules may be integrated to form an independent part.
It should be noted that the functions, if implemented in the form of software functional modules and sold or used as independent products, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (15)

1. An exposure compensation value determination method, comprising:
acquiring a reference image;
determining a target area satisfying a brightness condition in the reference image;
and determining an exposure compensation value of the image to be fused according to the attribute of the target area, wherein the image to be fused and the reference image correspond to the same shooting scene.
2. The exposure compensation value determination method according to claim 1, wherein the target area includes a dark area and/or a bright area;
wherein, if the target area comprises a dark area, the brightness condition comprises that the brightness of the area is less than a first brightness threshold; if the target area comprises a bright area, the brightness condition comprises that the brightness of the area is greater than a second brightness threshold.
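Purely as an illustrative aid to the reader (not part of the claimed subject matter), the brightness conditions of claims 1 and 2 can be sketched as follows. The numeric thresholds and the representation of a region by its mean gray level are assumptions made for the example only:

```python
def classify_region(region_brightness, first_threshold=60, second_threshold=200):
    """Classify a candidate region by the brightness condition of claim 2.

    Brightness is assumed to be a mean gray level in [0, 255]; the two
    threshold values are illustrative placeholders, not taken from the patent.
    """
    if region_brightness < first_threshold:
        return "dark"    # region brightness below the first brightness threshold
    if region_brightness > second_threshold:
        return "bright"  # region brightness above the second brightness threshold
    return "normal"      # not a target area


# Example: a region with mean gray level 30 is classified as a dark area.
print(classify_region(30))
```

A region classified as "dark" or "bright" would then be a target area whose attributes drive the exposure compensation value in the later claims.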
3. The exposure compensation value determination method according to claim 1 or 2, wherein the attribute comprises at least one of brightness, area, and edge information amount.
4. The exposure compensation value determination method according to claim 3, wherein the determining the exposure compensation value of the image to be fused according to the attribute of the target area comprises:
if the target area comprises the dark area, determining a maximum exposure compensation value of the image to be fused according to the attribute of the dark area;
and if the target area comprises the bright area, determining the minimum exposure compensation value of the image to be fused according to the attribute of the bright area.
5. The exposure compensation value determination method according to claim 4, wherein the reference image has a first exposure compensation value, and the determining the maximum exposure compensation value of the image to be fused according to the attribute of the dark area comprises:
if the brightness of the dark area is smaller than a third brightness threshold, the area of the dark area is larger than a first area threshold, and the edge information amount of the dark area is larger than a first edge information amount threshold, determining the maximum exposure compensation value as a second exposure compensation value;
if the brightness of the dark area is greater than the third brightness threshold, or the area of the dark area is less than the first area threshold, or the edge information amount of the dark area is less than the first edge information amount threshold, determining that the maximum exposure compensation value is a third exposure compensation value; wherein the second exposure compensation value is greater than the third exposure compensation value, and the third exposure compensation value is greater than or equal to the first exposure compensation value.
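The decision rule of claim 5 can be sketched as a single function. This is an illustrative reading, not part of the claims; the threshold values and the concrete exposure compensation numbers are hypothetical placeholders:

```python
def max_exposure_compensation(dark, third_brightness=40, first_area=5000,
                              first_edge=1000, second_ev=2.0, third_ev=1.0):
    """Sketch of claim 5: choose the maximum exposure compensation value.

    `dark` is a dict with 'brightness', 'area', and 'edge_info' keys for the
    dark target area. All thresholds and EV values are illustrative assumptions.
    """
    if (dark["brightness"] < third_brightness
            and dark["area"] > first_area
            and dark["edge_info"] > first_edge):
        # The dark area is very dark, large, and detail-rich, so a larger
        # (second) compensation value is warranted to recover its detail.
        return second_ev
    # Otherwise a smaller (third) compensation value suffices.
    return third_ev
```

The intuition is that a large, detailed, deeply underexposed region justifies pushing exposure compensation further up, while a small or featureless dark patch does not.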
6. The exposure compensation value determination method according to claim 4, wherein the reference image has a first exposure compensation value, and the determining the minimum exposure compensation value of the image to be fused according to the property of the bright area comprises:
if the brightness of the bright area is smaller than a fourth brightness threshold or the area of the bright area is smaller than a second area threshold, determining the minimum exposure compensation value as a fourth exposure compensation value;
if the brightness of the bright area is greater than the fourth brightness threshold and the area of the bright area is greater than the second area threshold, acquiring a first candidate fusion image with a fifth exposure compensation value; wherein the fifth exposure compensation value is less than the fourth exposure compensation value, and the fourth exposure compensation value is less than or equal to the first exposure compensation value;
and determining the minimum exposure compensation value according to the edge information amount of a first area corresponding to a bright area in the reference image in the first candidate fusion image.
7. The exposure compensation value determination method according to claim 6, wherein the determining the minimum exposure compensation value from an amount of edge information of a first region in the first candidate fused image corresponding to a bright region in the reference image includes:
if the edge information amount of the first area is greater than a second edge information amount threshold, determining that the minimum exposure compensation value is the fifth exposure compensation value;
and if the edge information amount of the first area is smaller than the second edge information amount threshold, determining the minimum exposure compensation value as the first exposure compensation value.
8. The exposure compensation value determination method according to claim 6, wherein the determining the minimum exposure compensation value from an amount of edge information of a first region in the first candidate fused image corresponding to a bright region in the reference image includes:
if the edge information amount of the first area is larger than a second edge information amount threshold value, determining the minimum exposure compensation value as the fifth exposure compensation value;
if the edge information amount of the first area is smaller than the second edge information amount threshold, acquiring a second candidate fusion image with a sixth exposure compensation value; wherein the sixth exposure compensation value is greater than the fifth exposure compensation value and less than the first exposure compensation value;
and determining the minimum exposure compensation value according to the edge information amount of a second area corresponding to the bright area in the reference image in the second candidate fusion image.
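The stepwise search of claims 6 to 8 can be illustrated as follows (again, as a reader's aid only, not as the claimed implementation). Here `edge_info_at(ev)` is an assumed helper that acquires a candidate fusion image at compensation value `ev` and returns the edge information amount of the region corresponding to the reference image's bright area; all EV numbers and the threshold are hypothetical:

```python
def min_exposure_compensation(edge_info_at, second_edge_threshold=800,
                              first_ev=0.0, fifth_ev=-2.0, sixth_ev=-1.0):
    """Sketch of claims 6-8: step back toward the reference exposure when the
    bright area in a candidate fusion image carries too little edge detail.
    """
    if edge_info_at(fifth_ev) > second_edge_threshold:
        return fifth_ev   # strongest compensation still preserves detail
    if edge_info_at(sixth_ev) > second_edge_threshold:
        return sixth_ev   # intermediate compensation preserves detail
    return first_ev       # fall back to the reference image's compensation value
```

The design choice this sketches: overexposed highlights are only worth a strongly underexposed frame if that frame actually recovers edge detail in them; otherwise a milder (or no) negative compensation is used.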
9. The exposure compensation value determination method according to any one of claims 4 to 8, wherein the reference image has a first exposure compensation value, and the determining the maximum exposure compensation value of the image to be fused according to the attribute of the dark area comprises:
determining a maximum exposure compensation value of the image to be fused and at least one transition exposure compensation value between the maximum exposure compensation value and the first exposure compensation value according to the attribute of the dark area;
and the determining the minimum exposure compensation value of the image to be fused according to the attribute of the bright area comprises:
determining a minimum exposure compensation value of the image to be fused and at least one transition exposure compensation value between the minimum exposure compensation value and the first exposure compensation value according to the attribute of the bright area.
10. The exposure compensation value determination method according to any one of claims 1 to 9, wherein the determining the target area satisfying a brightness condition in the reference image comprises:
dividing the reference image into a plurality of connected regions;
and combining the connected regions meeting the same brightness condition according to the brightness of each connected region to obtain the target region.
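Claim 10's divide-and-merge step can be illustrated with a simple block-based partition. This is an assumption-laden sketch (not part of the claims): the fixed-size block partition, the single brightness threshold, and the boolean-mask output are all choices made for the example:

```python
import numpy as np

def find_target_region(image, threshold=60, dark=True, block=16):
    """Sketch of claim 10: divide the reference image into block-sized
    regions and merge those satisfying the same brightness condition.

    `image` is a 2-D grayscale array; returns a boolean mask marking the
    merged target area. Block size and threshold are illustrative.
    """
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = image[y:y + block, x:x + block]
            mean = tile.mean()
            # A block joins the target area if it satisfies the (shared)
            # dark or bright brightness condition.
            if (dark and mean < threshold) or (not dark and mean > threshold):
                mask[y:y + block, x:x + block] = True
    return mask
```

A real implementation would more likely use connected-component labeling over thresholded pixels, but the grid version shows the merge-by-shared-condition idea compactly.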
11. The exposure compensation value determination method according to any one of claims 1 to 10, wherein the acquiring the reference image comprises:
acquiring a first reference image with a first exposure compensation value acquired by a first camera in an image acquisition device and a second reference image with a seventh exposure compensation value acquired by a second camera in the image acquisition device; wherein the seventh exposure compensation value is less than the first exposure compensation value;
the determining the exposure compensation value of the image to be fused according to the attribute of the target area comprises:
and determining an exposure compensation value of the image to be fused according to the attribute of the target area in the first reference image and the attribute of the target area in the second reference image.
12. An image fusion method, comprising:
acquiring a corresponding image to be fused according to an exposure compensation value; wherein the exposure compensation value is determined by the exposure compensation value determination method according to any one of claims 1 to 11;
and fusing the images to be fused to obtain a target image.
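For claim 12, a deliberately naive fusion step can be sketched; the patent does not specify a fusion algorithm, so simple averaging stands in here purely for illustration (real multi-exposure fusion would typically use per-pixel weights such as well-exposedness):

```python
import numpy as np

def fuse_images(images):
    """Naive sketch of claim 12: combine the exposure-bracketed images
    to be fused into a single target image by averaging.

    `images` is a list of same-shaped uint8 arrays; averaging is only an
    illustrative stand-in for a real fusion algorithm.
    """
    stack = np.stack([img.astype(np.float64) for img in images])
    return stack.mean(axis=0).astype(np.uint8)
```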
13. An electronic device, comprising: a processor, memory, and a bus;
the processor and the memory are communicated with each other through the bus;
the memory stores program instructions executable by the processor, and when the processor invokes the program instructions, the processor is capable of executing the exposure compensation value determination method according to any one of claims 1 to 11 or the image fusion method according to claim 12.
14. A computer-readable storage medium storing computer instructions which, when executed by a computer, cause the computer to execute the exposure compensation value determination method according to any one of claims 1 to 11 or the image fusion method according to claim 12.
15. A computer program product comprising a computer program, characterized in that the computer program, when being executed by a processor, implements the exposure compensation value determination method according to any one of claims 1 to 11 or the image fusion method according to claim 12.
CN202111086445.2A 2021-09-16 2021-09-16 Exposure compensation value determining method and image fusion method Pending CN115829888A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111086445.2A CN115829888A (en) 2021-09-16 2021-09-16 Exposure compensation value determining method and image fusion method


Publications (1)

Publication Number Publication Date
CN115829888A true CN115829888A (en) 2023-03-21

Family

ID=85515046

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111086445.2A Pending CN115829888A (en) 2021-09-16 2021-09-16 Exposure compensation value determining method and image fusion method

Country Status (1)

Country Link
CN (1) CN115829888A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination