CN116055891A - Image processing method and device - Google Patents

Image processing method and device

Info

Publication number
CN116055891A
Authority
CN
China
Prior art keywords
image
area
region
target
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310004425.9A
Other languages
Chinese (zh)
Inventor
刘思航
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202310004425.9A priority Critical patent/CN116055891A/en
Publication of CN116055891A publication Critical patent/CN116055891A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20172 Image enhancement details
    • G06T2207/20208 High dynamic range [HDR] image processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Abstract

The application belongs to the technical field of image processing, and discloses an image processing method and device. The method comprises the following steps: acquiring a first image, wherein the first image comprises a plurality of pixel units, and each pixel unit comprises a plurality of pixels; in a case where a first region of the first image overlaps with a motion region of the first image, performing a target operation on pixels of the first region to obtain M sets of region images, the target operation including: at least one of a sampling operation and a multi-pixel integration operation, M being a positive integer; obtaining a high dynamic range image corresponding to the first image based on the M groups of region images and the first image; wherein the brightness of the first area is outside a preset brightness range; the M sets of region images correspond to M luminance ranges.

Description

Image processing method and device
Technical Field
The application belongs to the technical field of image processing, and particularly relates to an image processing method and an image processing device.
Background
In a high dynamic scene, in order to avoid problems such as dead black in dark areas and overexposure in bright areas, exposure bracketing can generally be adopted, that is, a plurality of frames of images with different exposures are captured; then one frame of the captured multi-frame images is taken as a reference frame, and the images are fused to obtain a high dynamic range image.
However, with the above method, when there is motion in the overexposed area or dead black area of the reference frame, the registration accuracy of the pre-fusion images is limited, so that a ghost phenomenon may exist in the high dynamic range image, resulting in a poor display effect of the high dynamic range image.
Disclosure of Invention
An object of the embodiments of the present application is to provide an image processing method and an image processing device, which can solve the problem of poor display effect of a high dynamic range image.
In a first aspect, an embodiment of the present application provides an image processing method, including: acquiring a first image, wherein the first image comprises a plurality of pixel units, and each pixel unit comprises a plurality of pixels; in a case where a first region of the first image overlaps with a motion region of the first image, performing a target operation on pixels of the first region to obtain M sets of region images, the target operation including: at least one of a sampling operation and a multi-pixel integration operation, M being a positive integer; determining a high dynamic range image corresponding to the first image based on the M groups of region images and the first image; wherein the brightness of the first area is outside a preset brightness range; the M sets of region images correspond to M luminance ranges.
In a second aspect, an embodiment of the present application provides an image processing apparatus including: the device comprises an acquisition module and a processing module; the acquisition module is used for acquiring a first image, wherein the first image comprises a plurality of pixel units, and each pixel unit comprises a plurality of pixels; the processing module is configured to perform a target operation on pixels of the first area to obtain M groups of area images when the first area of the first image acquired by the acquiring module overlaps with a motion area of the first image, where the target operation includes: at least one of a sampling operation and a multi-pixel integration operation, M being a positive integer; the processing module is further used for determining a high dynamic range image corresponding to the first image based on the M groups of area images and the first image; wherein the brightness of the first area is outside a preset brightness range; the M sets of region images correspond to M luminance ranges.
In a third aspect, embodiments of the present application provide an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method as described in the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the method according to the first aspect.
In the embodiment of the application, a first image may be acquired, where the first image includes a plurality of pixel units, and each pixel unit includes a plurality of pixels; in a case that a first area of the first image overlaps with a motion area of the first image, a target operation is performed on the first area to obtain M groups of area images, the target operation including: at least one of a sampling operation and a multi-pixel integration operation, M being a positive integer; a high dynamic range image corresponding to the first image is determined based on the M groups of area images and the first image; wherein the brightness of the first area is outside a preset brightness range, and the M groups of area images correspond to M brightness ranges. According to this scheme, when a first area whose brightness is outside the preset brightness range (such as a highlight area or a dead black area) overlaps with the motion area of the image, for example when the first area contains a moving object, the sampling and/or multi-pixel integration operation can be performed on the first area to obtain M groups of area images corresponding to M brightness ranges, so that high registration accuracy between the M groups of area images and the first image can be ensured, and ghosts in the high dynamic range image can be avoided. This can improve the display effect of the high dynamic range image.
Drawings
Fig. 1 is a schematic flow chart of one possible image processing method according to an embodiment of the present application;
FIG. 2 is a schematic view of a first region in an embodiment of the present application;
FIG. 3 is a schematic diagram of a first region and a region image in an embodiment of the present application;
fig. 4 is a schematic structural diagram of an image processing apparatus provided in an embodiment of the present application;
fig. 5 is one of schematic structural diagrams of an electronic device according to an embodiment of the present application;
fig. 6 is a second schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Technical solutions in the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application are within the scope of the protection of the present application.
The terms "first", "second" and the like in the description and in the claims are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate, so that embodiments of the present application may be implemented in sequences other than those illustrated or described herein; the objects distinguished by "first", "second", etc. are generally of one type, and the number of objects is not limited, e.g., the first object may be one or more. Furthermore, in the description and claims, "and/or" means at least one of the connected objects, and the character "/" generally means that the associated objects are in an "or" relationship.
The following is a description of terms involved in the embodiments of the present application.
Four bayer image sensor (Quad bayer sensor): Quad bayer is a "four-pixel-in-one" image sensor technology; Quad means that four same-color pixels are arranged together to form one pixel unit; bayer refers to the bayer pattern. The bayer array is a color filter array commonly used in image sensors for acquiring color information of the three color channels Red (R), Green (G) and Blue (B), each pixel acquiring only one color. The four bayer image sensor increases the pixel density fourfold, and its output may be converted from a four bayer image into a normal bayer array image by a rearrangement demosaicing (remosaic) algorithm.
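As a rough, non-authoritative illustration of this pixel-unit layout and the four-pixel-in-one idea, the following Python sketch shows one quad bayer tile and a simple averaging bin; the array contents and function names are invented for this sketch, not taken from the patent:

```python
import numpy as np

# One 4x4 tile of a quad bayer mosaic: each 2x2 block of same-color
# pixels forms a pixel unit, so the tile mirrors a classic bayer
# tile (R, G / G, B) at four times the pixel density.
quad_tile = np.array([
    ["R", "R", "G", "G"],
    ["R", "R", "G", "G"],
    ["G", "G", "B", "B"],
    ["G", "G", "B", "B"],
])

def bin_units(raw):
    """Four-pixel-in-one binning: average each 2x2 same-color pixel
    unit, yielding a normal bayer image at half the resolution."""
    h, w = raw.shape
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

raw = np.arange(16, dtype=float).reshape(4, 4)
binned = bin_units(raw)  # 2x2 result, one value per pixel unit
```

A full remosaic interpolation is considerably more involved; this only shows the unit structure the method operates on.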
The image processing method and the device provided by the embodiment of the application are described in detail below by means of specific embodiments and application scenes thereof with reference to the accompanying drawings.
The existing methods for solving the ghosting problem can mainly be divided into two types. 1. From the algorithm perspective, ghosting can be reduced to a certain extent by either a deghost algorithm or an artificial intelligence (Artificial Intelligence, AI) fusion algorithm, but the effect of these two algorithms is still far from ideal when motion occurs in overexposed or dead black areas. 2. From the hardware perspective, a sensor that performs long exposure and short exposure almost simultaneously, such as a digital overlap (DOL) technology image sensor, has a short frame interval but still shows obvious displacement in the motion area; and a scheme that realizes different exposures by means of one exposure with two gains, such as a dual conversion gain (Dual Conversion Gain, DCG) image sensor, suffers from short-exposure image quality problems and the like.
In summary, the related art offers no effective solution for eliminating the ghost phenomenon in high dynamic range images.
To solve the above technical problems, the embodiments of the present application provide a photographing method that, in combination with a four bayer image sensor, produces ghost-free images of moving subjects in a high dynamic scene while still ensuring good image quality in dark areas.
Specifically, in a high dynamic scene, a first image may be acquired, where the first image includes a plurality of pixel units, and each pixel unit includes a plurality of pixels; in a case that a first region of the first image overlaps with a motion region of the first image, a target operation is performed on the first region to obtain M groups of region images, the target operation including: at least one of a sampling operation and a multi-pixel integration operation, M being a positive integer; a high dynamic range image corresponding to the first image is determined based on the M groups of region images and the first image; wherein the brightness of the first area is outside a preset brightness range, and the M groups of region images correspond to M luminance ranges. Therefore, when a first area whose brightness is outside the preset brightness range (such as a highlight area or a dead black area) overlaps with the motion area of the image, for example when the first area contains a moving object, the sampling and/or multi-pixel integration operation can be performed on the first area to obtain M groups of area images corresponding to M brightness ranges, so that high registration accuracy between the M groups of area images and the first image can be ensured, and the occurrence of ghosts in the high dynamic range image can be avoided. This can improve the display effect of the high dynamic range image.
An embodiment of the present application provides an image processing method, fig. 1 shows a possible flow chart of the image processing method provided by the embodiment of the present application, and as shown in fig. 1, the image processing method provided by the embodiment of the present application may include the following steps 101 to 103. The following is an illustration of an electronic device performing the method.
Step 101, the electronic device acquires a first image.
The first image may include a plurality of pixel units, each pixel unit including a plurality of pixels. For example, each pixel unit may include 4 pixels, 6 pixels, or 8 pixels, which may be determined according to a sensor used to capture the first image.
In this embodiment of the present application, each pixel unit corresponds to one color channel. Wherein the color channel may be one of: r channel, G channel, B channel.
Specifically, the color channel of the pixels in each pixel unit is the same as the color channel corresponding to the pixel unit. That is, each pixel unit includes a plurality of pixels of the same color channel, i.e., the pixels in each pixel unit have the same color.
For example, as shown in fig. 2, the first area 20 of the first image includes 4 pixel units, where the pixel unit 21 includes 4 red pixels, which are respectively: r1 to r4; the pixel unit 22 includes 4 green pixels, which are respectively: g11 to g14; the pixel unit 23 also includes 4 green pixels, which are respectively: g21 to g24; the pixel unit 24 includes 4 blue pixels, which are respectively: b1 to b4.
Optionally, the electronic device acquiring the first image includes: the electronic device captures a first image, downloads the first image from a network, or invokes a locally stored first image.
Alternatively, the first image may be an image acquired by a four bayer image sensor. In other words, the first image may be referred to as a four bayer image.
For a description of the four bayer image sensor, see the description of the four bayer image sensor in the above embodiment.
Step 102, the electronic device performs a target operation on the pixels of the first region under the condition that the first region of the first image overlaps with the motion region of the first image, so as to obtain M groups of region images.
Wherein the target operation may include: at least one of the sampling operation and the multi-pixel integration operation, M may be a positive integer.
In this embodiment, the brightness of the first area is outside the preset brightness range, and the M groups of area images correspond to M brightness ranges.
In the embodiment of the present application, after acquiring the first image, the electronic device may first determine whether a first area of the first image overlaps with a motion area of the first image. If not, a high dynamic range image obtained by processing the first image according to the existing method contains no ghost phenomenon, so the electronic device may directly perform rearrangement demosaicing processing and tone mapping processing on the first image in sequence. If so, a high dynamic range image obtained by processing the first image according to the existing method would contain a ghost phenomenon, so the electronic device performs the target operation on the pixels of the first area to obtain M groups of area images.
Optionally, the lower limit luminance and the upper limit luminance of the preset luminance range are respectively: a first preset brightness and a second preset brightness.
When the brightness of the first area is less than or equal to the first preset brightness, the first area is a dead black area, that is, the brightness of the first area is lower. When the brightness of the first area is greater than or equal to the second preset brightness, the first area is an overexposed area, namely the brightness of the first area is overhigh.
It can be seen that the preset brightness range is used to judge the overexposed area and the dead black area in the image. In other words, when the luminance of one image area is within the preset luminance range, it means that the image area is neither an overexposed area nor a dead black area.
In this embodiment of the present application, the first preset brightness and the second preset brightness may be set according to actual use requirements, which is not limited in this embodiment of the present application.
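A minimal sketch of this classification; the two threshold values below are hypothetical placeholders, since the patent leaves both preset brightnesses to the implementation:

```python
import numpy as np

# Hypothetical thresholds on luminance normalized to [0, 1].
FIRST_PRESET_BRIGHTNESS = 0.05   # lower limit: at or below -> dead black
SECOND_PRESET_BRIGHTNESS = 0.95  # upper limit: at or above -> overexposed

def classify_region(luma):
    """Classify an image region by its mean luminance."""
    mean = float(np.mean(luma))
    if mean <= FIRST_PRESET_BRIGHTNESS:
        return "dead_black"
    if mean >= SECOND_PRESET_BRIGHTNESS:
        return "overexposed"
    return "normal"  # within the preset brightness range
```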
In this embodiment of the present application, the M groups of area images are in one-to-one correspondence with M luminance ranges, and the M luminance ranges are different.
In the embodiment of the application, each set of region images in the M sets of region images includes at least one region image. The brightness of the at least one region image is within a brightness range corresponding to the region image.
When a plurality of area images are included in a set of area images, the difference in brightness between the plurality of area images is small.
In the embodiment of the present application, the first region may also be referred to as a region of interest of the first image.
It should be noted that, the pixel arrangement mode of the area image is the same as the pixel arrangement mode of the first area. And the pixel units in the area image are in one-to-one correspondence with the pixel units in the first area.
For example, the pixel arrangement of the region image 30 shown in (a) in fig. 3 is the same as that of the first region 31 shown in (a) in fig. 3.
In this embodiment of the present application, the size of the area image is the same as the size of the first area.
In this embodiment of the present application, M is less than or equal to the number of pixels included in the pixel unit of the first image.
Optionally, overlapping the first region with the motion region of the first image comprises: partially overlapping, fully overlapping.
The motion area of the first image refers to an area corresponding to a motion path of a moving object in the first image.
Optionally, taking the first image as an image captured by the electronic device as an example, the electronic device may buffer continuous preview frames in a preview stage, and perform motion detection on the buffered image of the preview frames by using a common manner such as a frame difference method or an optical flow method, so as to determine a first motion area in the captured scene. Then, after the electronic device captures images (such as the first image and the second image) of the captured scene, the region corresponding to the first motion region in the image may be directly used as the motion region of the image.
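The frame-difference overlap test described above can be sketched with a simple boolean mask; the threshold value and the mask conventions are assumptions for illustration, not taken from the patent:

```python
import numpy as np

def motion_mask(prev_frame, cur_frame, diff_thresh=0.08):
    """Frame-difference motion detection on two grayscale preview
    frames (float arrays in [0, 1]): a pixel whose absolute change
    exceeds diff_thresh is flagged as moving."""
    return np.abs(cur_frame - prev_frame) > diff_thresh

def regions_overlap(region_mask, moving):
    """True if the first region shares any pixel with the motion
    area, covering both partial and full overlap."""
    return bool(np.any(region_mask & moving))
```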
Alternatively, the electronic device may obtain the brightness of each pixel by converting the image into a color space containing a brightness channel (e.g., the HSV color space: Hue, Saturation, Value), compare the brightness value of each pixel with a preset dead black brightness threshold (i.e., the first preset brightness) and a preset overexposure brightness threshold (i.e., the second preset brightness), and determine contiguous image areas whose pixels fall below or above the corresponding threshold as dead black areas and overexposed areas, respectively.
Optionally, the electronic device performs a target operation on the pixels of the first region to obtain M groups of region images, which may include step a described below.
And step A, the electronic equipment executes the target operation on the pixels of the first area according to the M sampling numbers to obtain M groups of area images corresponding to the M sampling numbers one by one.
Wherein each of the sampling numbers is used for indicating the number of collected pixels from each pixel unit of the first area;
each sampling number corresponds to at least one sampling pattern, each sampling pattern being indicative of a pattern of capturing pixels from each pixel cell of the first region.
In this embodiment of the present application, the number of samples is less than or equal to the number of pixels in each pixel unit of the first area.
Alternatively, assuming that N pixels are included in each pixel unit of the first image, each sampling number may be: any one of 1 to N, N being an integer greater than 1.
It can be seen that the larger the sampling number, the higher the brightness of the set of area images corresponding to that sampling number.
It can be seen that, for each sampling number, the electronic device may perform at least one target operation on the pixels of the first area according to the sampling number and at least one sampling manner, so as to obtain a set of area images, where area images in the same set correspond to the same sampling number, and different area images in the same set correspond to different sampling manners.
It should be noted that, the electronic device may determine a group of pixels in the first area based on the target number of samples in the M number of samples and one sampling manner corresponding to the target number of samples, and generate a target area image according to the pixel values of the group of pixels, where the target area image is one area image in the group of area images corresponding to the target number of samples.
Specifically, assuming that each pixel unit of the first area includes X pixels, then: for a first pixel unit in the first area, the first pixel unit may be any pixel unit in the first area, and the electronic device may determine X sub-groups of pixels from the first pixel unit according to the target sampling number and the target sampling manner, where each sub-group of pixels includes the target sampling number of pixels; and determining the pixel value of one pixel in a second pixel unit according to the pixel values of the sub-groups of pixels, wherein the second pixel unit is a pixel unit corresponding to the first pixel unit in the target area image.
Wherein the position of the first pixel unit in the first region is the same as the position of the second pixel unit in the target region image.
In the embodiment of the present application, "determining the pixel value of one pixel in the second pixel unit according to the pixel values of each sub-group of pixels" may include any one of the following: 1) determining the sum of the pixel values of a sub-group of pixels as the pixel value of one pixel in the second pixel unit; 2) multiplying the sum of the pixel values of the target sampling number of pixels in a sub-group by a value, and determining the resulting product as the pixel value of one pixel in the second pixel unit, where the value may be any number greater than 0, such as 0.5, 0.8 or 1.1.
The image processing method provided in the embodiment of the present application is exemplarily described below, taking as an example the electronic device determining the pixel value of one pixel of one pixel unit in an area image a.
For one pixel unit d1 in the first area, the electronic device determines, in a sampling manner 1, a sampling number 1 of pixels from the pixel unit d1; the electronic device may then determine a pixel value based on the pixel value of the determined pixel, and take that pixel value as one pixel p in the area image a corresponding to the sampling number 1 and the sampling manner 1. Assuming that the pixel unit in which the pixel p is located in the area image a is pixel unit d2, then the position of the pixel unit d1 in the first area is the same as the position of the pixel unit d2 in the area image a.
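Step A and the worked example above can be sketched as follows for 2x2 pixel units; representing a sampling manner as four index subgroups (one subgroup per output pixel of a unit) is an illustrative assumption, not the patent's notation:

```python
import numpy as np

def sample_region(region, subgroups):
    """One target operation on a region made of 2x2 same-color pixel
    units: within each unit (pixels indexed 0..3 in row-major order),
    output pixel j is the sum of the input pixels picked by
    subgroups[j]. With k indices per subgroup, the resulting region
    image keeps the region's size and content while its brightness
    scales roughly with k."""
    h, w = region.shape
    out = np.empty_like(region, dtype=float)
    for uy in range(0, h, 2):
        for ux in range(0, w, 2):
            unit = region[uy:uy + 2, ux:ux + 2].ravel()
            vals = [unit[list(g)].sum() for g in subgroups]
            out[uy:uy + 2, ux:ux + 2] = np.array(vals).reshape(2, 2)
    return out

region = np.array([[1.0, 2.0], [3.0, 4.0]])  # a single pixel unit
# Sampling number 1: each output pixel copies one chosen input pixel.
img_k1 = sample_region(region, [(0,), (1,), (2,), (3,)])
# Sampling number 4: each output pixel is the full four-pixel sum.
img_k4 = sample_region(region, [(0, 1, 2, 3)] * 4)
```

Because every output pixel is computed from pixels of the corresponding input unit, the sampled image stays pixel-aligned with the region, which is the registration property the method relies on.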
The principle of obtaining M sets of area images by the electronic device will be described in detail below taking the first image as a four bayer image as an example.
In the embodiment of the present application, when the first region overlaps with the motion region in the first image, sampling and multiple-pixel unification (binning) operations are performed on the first region. By controlling the manner of fusing the three color components of the four bayer image and the number of each color component used, i.e., the number of samples, area images of different brightness ranges, also referred to as different exposure intensities, corresponding to the first area are generated. Specifically, the three color components of the four bayer image include: the R component, also known as R; the G component, also known as G; and the B component, also known as B.
It will be appreciated that a group of region images from the above-described M groups of region images may be constituted using the same number of color components.
Assuming that the first region is the image region shown in fig. 2, the 4 sets of region images include: group 1, group 2, group 3 and group 4, and the numbers of color components used by the 4 groups of area images are, in order: 1, 2, 3 and 4. Then:
i. For group 1, the pixel value of each pixel in the second pixel unit may be determined by any one pixel of r1 to r4, i.e., for each pixel, the pixel value of one pixel of the first pixel unit is taken as the pixel value of this pixel. That is, group 1 supports choosing 1 of 4, i.e., C(4, 1).
ii. For group 2, the pixel value of each pixel in the second pixel unit may be determined by any 2 pixels of r1 to r4, i.e., for each pixel, the pixel value of that pixel is determined from the pixel values of 2 pixels of the first pixel unit, i.e., C(4, 2).
iii. For group 3, the pixel value of each pixel in the second pixel unit may be determined by any 3 pixels of r1 to r4, i.e., for each pixel, the pixel value of that pixel is determined from the pixel values of 3 pixels of the first pixel unit, i.e., C(4, 3).
iv. For group 4, the pixel value of each pixel in the second pixel unit may be determined by the 4 pixels r1 to r4, i.e., for each pixel, the pixel value of that pixel is determined from the pixel values of the 4 pixels of the first pixel unit, i.e., C(4, 4).
The same set of area images includes at least one area image. For example, group 2 may include the area image 31 shown in (b) of fig. 3 and the area image 32 shown in (c) of fig. 3.
It can be seen that the area image in group 1 has the lowest brightness because the number of color components used is the smallest, so that overexposure of bright areas can be prevented better.
The area images in group 2 are relatively dark because the number of used color components is 2, so the area images in group 2 can contain more texture detail information, and different area images can contain different texture detail information.
The area images in group 3 are relatively bright because the number of used color components is 3.
The area images in group 4 have the highest brightness because the most color components are used, and their best noise performance better ensures the image quality of dark areas.
In addition, the fusion of multiple combinations of the same brightness can obtain different weak textures and detail expressions, so that the weak textures and the detail can be better reserved in the final high dynamic range image.
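The C(4, k) counts behind the four groups above can be enumerated mechanically; a short sketch:

```python
from itertools import combinations

# For a 4-pixel unit, group k draws on every way of choosing k of
# the 4 same-color pixels, so the numbers of sampling manners per
# group are C(4,1), C(4,2), C(4,3), C(4,4) = 4, 6, 4, 1.
groups = {k: list(combinations(range(4), k)) for k in (1, 2, 3, 4)}
sizes = {k: len(v) for k, v in groups.items()}
```

This is why group 2, for example, can contain several area images of the same brightness that preserve different weak textures and details.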
In the embodiment of the present application, the brightness range corresponding to a set of area images is positively correlated with the sampling number corresponding to the set of area images.
For example, if the luminance corresponding to group 4 is L, then group 3 corresponds to a luminance of 0.75L, group 2 corresponds to 0.5L, and group 1 corresponds to 0.25L.
In this way, since the pixel value of each pixel in the area image is determined by the pixel values of a fixed number of pixels in the corresponding pixel unit of the first area, the image content of the area image is the same as that of the first area while their brightness may differ, so that high registration accuracy between the area image and the first area can be ensured.
And 103, the electronic equipment obtains a high dynamic range image corresponding to the first image based on the M groups of area images and the first image.
In this embodiment of the present application, the electronic device may process the M groups of area images and the first image to obtain a high dynamic range image corresponding to the first image.
Alternatively, step 103 may be specifically implemented by steps 103a to 103d described below.
And 103a, the electronic equipment fuses the M groups of area images to obtain a target area image.
Alternatively, the above step 103a may be specifically realized by the following steps 103a1 and 103a 2.
Step 103a1, the electronic device determines fusion weights corresponding to the region images of the M groups of region images.
Step 103a2, the electronic device fuses the M area images according to the fusion weight to obtain a target area image.
Step 103b, the electronic device synthesizes the target area image and the first image to obtain a second image.
Step 103c, the electronic device performs rearrangement demosaicing processing on the second image to obtain a third image.
Step 103d, the electronic device performs tone mapping processing on the third image, so as to obtain a high dynamic range image.
Wherein the fusion weight comprises at least one of: good exposure estimation weight, sharpness weight.
For a description of step 103c and step 103d, reference may be made to the related art.
Alternatively, the electronic device may determine the good exposure estimation weight by the following formula (1):
W_i = exp(-(i - 0.5)^2 / (2 * sigma^2))    (1);
where i is a pixel value normalized to the range 0-1; 0.5 is the middle of the pixel value range; sigma is an adjustable parameter in the range 0-1, and adjusting sigma changes the brightness of the target area image; exp is the exponential function; and W_i is the good exposure estimation weight of a pixel whose normalized value is i in the area image.
Alternatively, the sharpness weight may be determined using a Sobel or Laplacian operator.
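A hedged Python sketch of steps 103a1 and 103a2, combining formula (1) with a per-pixel weighted average (the per-pixel normalization scheme, the choice of sigma, and the omission of sharpness weights are simplifying assumptions, not the patent's exact method):

```python
import math

def good_exposure_weight(p, sigma=0.2):
    # Formula (1): the weight peaks at the mid-tone p = 0.5 and
    # decays as the normalized pixel value p approaches 0 or 1.
    return math.exp(-((p - 0.5) ** 2) / (2 * sigma ** 2))

def fuse_area_images(images):
    """Step 103a2 sketched as a per-pixel weighted average of the M
    area images; weights are normalized to sum to 1 at each pixel.
    Pixel values are assumed normalized to [0, 1]."""
    h, w = len(images[0]), len(images[0][0])
    fused = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            ws = [good_exposure_weight(img[i][j]) for img in images]
            total = sum(ws)
            fused[i][j] = sum(wt * img[i][j]
                              for wt, img in zip(ws, images)) / total
    return fused
```

Under this weighting, well-exposed mid-tone pixels dominate the target area image, while near-black and near-white pixels contribute little — the intended effect of the good exposure estimation weight.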
Alternatively, the step 103b may be implemented in manner 1 or manner 2 below.
In manner 1, the electronic device performs pixel alignment on the target area image and the image area other than the first area in the first image to obtain the second image.
In manner 2, the electronic device may fuse the target area image with the first area to obtain the second image.
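As a hedged sketch of step 103b (the bounding-box placement and the direct overwrite are assumptions; the actual method may align or blend rather than overwrite), the target area image can be written back into the first image as follows:

```python
def synthesize(first_image, target_area, area_origin):
    """Write the fused target area image over the first area of the
    first image, whose top-left corner is at area_origin = (r0, c0).
    Returns a new image; the input first_image is left unmodified."""
    r0, c0 = area_origin
    out = [row[:] for row in first_image]  # shallow-copy each row
    for i, row in enumerate(target_area):
        for j, value in enumerate(row):
            out[r0 + i][c0 + j] = value
    return out
```

The second image produced here would then go through the rearrangement demosaicing of step 103c and the tone mapping of step 103d.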
Optionally, taking the first image as an example of an image captured by the electronic device, before step 101, the image processing method provided in the embodiment of the present application may further include the following steps 104 and 105, where step 101 may be specifically implemented by the following step 101 a.
And 104, the electronic equipment determines target coincidence parameters.
Wherein the target coincidence parameter may be used to indicate: a degree of coincidence between the second region of the fourth image and the moving region of the fourth image. The brightness of the second region is outside the preset brightness range.
Alternatively, the fourth image may be the last image taken before the first image is taken. Alternatively, the fourth image may be a preview image displayed in the image preview interface before the first image is captured.
Alternatively, the target coincidence parameter may be a coincidence ratio.
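A minimal illustration of computing such a coincidence ratio for step 104, assuming binary masks for the second region and the motion region and normalization by the whole frame area (both assumptions; the patent only says the parameter may be a coincidence ratio):

```python
def coincidence_ratio(second_mask, motion_mask):
    """Fraction of the frame covered by pixels that belong to both
    the second region (over-/under-exposed) and the motion region."""
    rows, cols = len(second_mask), len(second_mask[0])
    total = rows * cols
    overlap = sum(1 for i in range(rows) for j in range(cols)
                  if second_mask[i][j] and motion_mask[i][j])
    return overlap / total
```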
Step 105, the electronic device adjusts the exposure parameters based on the target coincidence parameter.
Step 101a, the electronic device adopts the adjusted exposure parameters to shoot a first image.
Optionally, the exposure parameters may include at least one of: exposure time, exposure gain.
Alternatively, it is assumed that the lower limit luminance and the upper limit luminance of the preset luminance range are respectively: first preset brightness and second preset brightness, then: when the brightness of the second area is smaller than or equal to the first preset brightness, namely the moving area is overlapped with the dead black area, the electronic equipment can increase the exposure time and reduce the exposure gain according to the target overlapping parameter; when the brightness of the second area is greater than or equal to the second preset brightness, that is, the movement area overlaps the overexposure area, the electronic device may reduce the exposure time and increase the exposure gain according to the target overlapping parameter.
Illustratively, assume that at the current sensitivity of the electronic device, the default exposure time is T and the default exposure gain is G. Taking the case that the motion area overlaps the overexposure area, that is, there is motion in the highlight area: when the overlapping area exceeds the preset picture proportion 0.0001, 0.0002, or 0.0005, the exposure time is adjusted to T/2, T/4, or T/8, respectively, and the exposure gain is adjusted to 2G, 4G, or 8G, respectively.
Accordingly, taking the case that the moving area overlaps the dead black area, that is, there is motion in the dead black area: when the overlapping area exceeds the preset picture proportion 0.0001, 0.0002, or 0.0005, the exposure time is adjusted to 2T, 4T, or 8T, respectively, and the exposure gain is adjusted to G/2, G/4, or G/8, respectively.
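The threshold table above can be sketched as a small lookup; the thresholds and powers of two come from the example, while the function shape and names are assumptions:

```python
def adjust_exposure(overlap_ratio, T, G, overexposed):
    """Map the coincidence ratio to exposure parameters. Thresholds
    0.0001 / 0.0002 / 0.0005 select scale factors 2 / 4 / 8. For an
    overexposed moving area: shorter exposure time, higher gain;
    for a dead-black moving area: the reverse."""
    factor = 1
    for n, threshold in enumerate((0.0001, 0.0002, 0.0005), start=1):
        if overlap_ratio > threshold:
            factor = 2 ** n
    if factor == 1:
        return T, G  # overlap too small, keep defaults
    if overexposed:
        return T / factor, G * factor
    return T * factor, G / factor
```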
In this way, in a high-dynamic motion scene, the exposure parameters can be adjusted first so that the exposure parameters of the first image are better suited to the scene, which further improves the display effect of the finally obtained high dynamic range image.
According to the image processing method provided by the embodiment of the application, the execution subject can be an image processing device. In the embodiment of the present application, an image processing apparatus provided in the embodiment of the present application will be described by taking an example in which the image processing apparatus executes an image processing method.
An image processing apparatus is provided in the embodiment of the present application. Fig. 4 is a schematic diagram showing a possible structure of the image processing apparatus provided in the embodiment of the present application. As shown in fig. 4, an image processing apparatus 40 provided in the embodiment of the present application may include: an acquisition module 41 and a processing module 42; the acquiring module 41 is configured to acquire a first image, where the first image includes a plurality of pixel units, and each pixel unit includes a plurality of pixels;
the processing module 42 is configured to perform, when the first area of the first image acquired by the acquiring module 41 overlaps with the motion area of the first image, a target operation on pixels of the first area to obtain M groups of area images, where the target operation includes: at least one of a sampling operation and a multi-pixel integration operation, M being a positive integer;
The processing module 42 is further configured to obtain a high dynamic range image corresponding to the first image based on the M sets of area images and the first image;
wherein the brightness of the first area is outside a preset brightness range;
the M groups of region images correspond to M luminance ranges.
In a possible implementation manner, the processing module 42 is specifically configured to determine a pixel value of one pixel in the second pixel unit based on a pixel value of a target number of pixels in the first pixel unit;
the first pixel unit is a pixel unit of the first area, the second pixel unit is a pixel unit of the area image, and the position of the first pixel unit in the area image is the same as the position of the second pixel unit in the first area;
wherein the M groups of region images correspond to M target numbers.
In one possible implementation, the processing module 42 is specifically configured to:
fusing the M groups of region images to obtain a target region image;
synthesizing the target area image and the first image to obtain a second image;
performing rearrangement demosaicing processing on the second image to obtain a third image;
And performing tone mapping processing on the third image to obtain the high dynamic range image.
In a possible implementation manner, the processing module 42 is specifically configured to:
determining fusion weights corresponding to the region images of the M groups of region images;
according to the fusion weight, fusing the M area images to obtain the target area image;
wherein the fusion weight includes at least one of: good exposure estimation weight, sharpness weight.
In a possible implementation manner, the processing module 42 is further configured to determine a target coincidence parameter before the acquiring module 41 acquires the first image; adjusting exposure parameters based on the target coincidence parameters;
the acquiring module 41 is specifically configured to take the first image by using the adjusted exposure parameter;
wherein the target coincidence parameter is used for indicating: a degree of coincidence between a second region of a fourth image and a moving region of the fourth image;
the fourth image is the last image shot before the first image is shot; the brightness of the second area is outside the preset brightness range.
In the image processing device provided in the embodiment of the present application, when a first area (a highlight area or a dead black area) whose brightness is outside the preset brightness range overlaps a moving area of the image, for example when the first area contains a moving object, sampling and/or multi-pixel integration operations may be performed on the first area to obtain M groups of area images corresponding to M brightness ranges. Higher registration accuracy between the M groups of area images and the first image can thus be ensured, and ghosting in the high dynamic range image can be avoided. This improves the display effect of the high dynamic range image.
The image processing apparatus in the embodiment of the present application may be an electronic device, or may be a component in an electronic device, for example, an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. By way of example, the electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a mobile internet device (Mobile Internet Device, MID), an augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) device, a robot, a wearable device, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, or a personal digital assistant (personal digital assistant, PDA), and may also be a server, a network attached storage (Network Attached Storage, NAS), a personal computer (personal computer, PC), a television (television, TV), a teller machine, or a self-service machine, etc.; this is not specifically limited in the embodiments of the present application.
The image processing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The image processing apparatus provided in this embodiment of the present application can implement each process implemented by the embodiments of the methods of fig. 1 to 3, and in order to avoid repetition, a description is omitted here.
Optionally, as shown in fig. 5, an embodiment of the present application further provides an electronic device 600, including
a processor 601 and a memory 602, where the memory 602 stores a program or instructions that can run on the processor 601, and the program or instructions, when executed by the processor 601, implement the steps of the above image processing method embodiments and achieve the same technical effects; to avoid repetition, details are not described here again.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 6 is a schematic hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 7000 includes, but is not limited to: a radio frequency unit 7001, a network module 7002, an audio output unit 7003, an input unit 7004, a sensor 7005, a display unit 7006, a user input unit 7007, an interface unit 7008, a memory 7009, a processor 7010, and the like.
Those skilled in the art will appreciate that the electronic device 7000 may further include a power source (e.g., a battery) for powering the various components, and the power source may be logically connected to the processor 7010 through a power management system, so as to perform functions such as managing charging, discharging, and power consumption. The electronic device structure shown in FIG. 6 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than illustrated, may combine certain components, or may arrange components differently, which is not described in detail herein.
An input unit 7004 for acquiring a first image, where the first image includes a plurality of pixel units, and each pixel unit includes a plurality of pixels;
the processor 7010 is configured to, in a case where a first area of the first image acquired by the input unit 7004 overlaps with a motion area of the first image, perform a target operation on pixels of the first area to obtain M groups of area images, where the target operation includes: at least one of a sampling operation and a multi-pixel integration operation, M being a positive integer;
the processor 7010 is further configured to obtain a high dynamic range image corresponding to the first image based on the M groups of area images and the first image;
wherein the brightness of the first area is outside a preset brightness range; the M groups of region images correspond to M luminance ranges.
In one possible implementation, the processor 7010 is specifically configured to determine a pixel value of one pixel in the second pixel unit based on a pixel value of a target number of pixels in the first pixel unit;
The first pixel unit is a pixel unit of the first area, the second pixel unit is a pixel unit of the area image, and the position of the first pixel unit in the area image is the same as the position of the second pixel unit in the first area;
wherein the M groups of region images correspond to M target numbers.
In one possible implementation, the processor 7010 is specifically configured to:
fusing the M groups of region images to obtain a target region image;
synthesizing the target area image and the first image to obtain a second image;
performing rearrangement demosaicing processing on the second image to obtain a third image;
and performing tone mapping processing on the third image to obtain the high dynamic range image.
In one possible implementation, the processor 7010 is specifically configured to:
determining fusion weights corresponding to the region images of the M groups of region images;
according to the fusion weight, fusing the M area images to obtain the target area image;
wherein the fusion weight includes at least one of: good exposure estimation weight, sharpness weight.
In a possible implementation, the processor 7010 is further configured to determine a target coincidence parameter before the input unit 7004 acquires the first image; adjusting exposure parameters based on the target coincidence parameters;
the input unit 7004 is specifically configured to capture the first image by using the adjusted exposure parameter;
wherein the target coincidence parameter is used for indicating: a degree of coincidence between a second region of a fourth image and a moving region of the fourth image;
the fourth image is the last image shot before the first image is shot; the brightness of the second area is outside the preset brightness range.
In the electronic device provided in the embodiment of the present application, when a first area (a highlight area or a dead black area) whose brightness is outside the preset brightness range overlaps a moving area of the image, for example when the first area contains a moving object, sampling and/or multi-pixel integration operations may be performed on the first area to obtain M groups of area images corresponding to M brightness ranges. Higher registration accuracy between the M groups of area images and the first image can thus be ensured, and ghosting in the high dynamic range image can be avoided. This improves the display effect of the high dynamic range image.
It should be appreciated that in embodiments of the present application, the input unit 7004 may include a graphics processor (Graphics Processing Unit, GPU) 70041 and a microphone 70042, the graphics processor 70041 processing image data of still pictures or video obtained by an image capture device (e.g., a camera) in a video capture mode or an image capture mode. The display unit 7006 may include a display panel 70061, and the display panel 70061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 7007 includes at least one of a touch panel 70071 and other input devices 70072. The touch panel 70071 is also referred to as a touch screen. The touch panel 70071 may include two parts, a touch detection device and a touch controller. Other input devices 70072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and so forth, which are not described in detail herein.
The memory 7009 can be used to store software programs and various data. The memory 7009 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system and application programs or instructions required for at least one function (such as a sound playing function, an image playing function, etc.). Further, the memory 7009 may include volatile memory or nonvolatile memory, or both. The nonvolatile memory may be a read-only memory (Read-Only Memory, ROM), a programmable ROM (Programmable ROM, PROM), an erasable PROM (Erasable PROM, EPROM), an electrically erasable PROM (Electrically EPROM, EEPROM), or a flash memory. The volatile memory may be a random access memory (Random Access Memory, RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchlink DRAM (SLDRAM), or a direct Rambus RAM (DRRAM). The memory 7009 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 7010 may include one or more processing units; the processor 7010 optionally integrates an application processor that primarily handles operations involving an operating system, user interfaces, applications, etc., and a modem processor that primarily handles wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into the processor 7010.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the embodiment of the image processing method, and the same technical effects can be achieved, so that repetition is avoided, and no further description is given here.
Wherein the processor is a processor in the electronic device described in the above embodiment. The readable storage medium includes computer readable storage medium such as computer readable memory ROM, random access memory RAM, magnetic or optical disk, etc.
The embodiment of the application further provides a chip, the chip includes a processor and a communication interface, the communication interface is coupled with the processor, and the processor is used for running a program or an instruction, so as to implement each process of the embodiment of the image processing method, and achieve the same technical effect, so that repetition is avoided, and no redundant description is provided here.
It should be understood that the chips referred to in the embodiments of the present application may also be referred to as system-on-chip chips, chip systems, or system-on-chip chips, etc.
The embodiments of the present application provide a computer program product, which is stored in a storage medium, and the program product is executed by at least one processor to implement the respective processes of the above method embodiments, and achieve the same technical effects, and are not repeated herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed, but may also include performing the functions in a substantially simultaneous manner or in an opposite order depending on the functions involved, e.g., the described methods may be performed in an order different from that described, and various steps may also be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general hardware platform, and of course also by means of hardware, but in many cases the former is the preferred implementation. Based on such understanding, the technical solutions of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a computer software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk), including several instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described specific embodiments, which are merely illustrative rather than restrictive. Enlightened by the present application, those of ordinary skill in the art may devise many other forms without departing from the spirit of the present application and the scope of the claims, all of which fall within the protection of the present application.

Claims (10)

1. An image processing method, the method comprising:
acquiring a first image, wherein the first image comprises a plurality of pixel units, and each pixel unit comprises a plurality of pixels;
and under the condition that a first area of the first image is overlapped with a motion area of the first image, performing a target operation on pixels of the first area to obtain M groups of area images, wherein the target operation comprises the following steps: at least one of a sampling operation and a multi-pixel integration operation, M being a positive integer;
obtaining a high dynamic range image corresponding to the first image based on the M groups of region images and the first image;
wherein the brightness of the first area is outside a preset brightness range;
the M groups of region images correspond to M luminance ranges.
2. The method of claim 1, wherein performing a target operation on pixels of the first region results in M sets of region images, comprising:
according to the M sampling numbers, executing the target operation on the pixels of the first region to obtain M groups of region images corresponding to the M sampling numbers one by one;
wherein each of the sampling numbers is used for indicating the number of collected pixels from each pixel unit of the first area;
Each of the sampling numbers corresponds to at least one sampling pattern, and each sampling pattern is used for indicating a pattern of collecting pixels from each pixel unit of the first area.
3. The method according to claim 1, wherein the obtaining a high dynamic range image corresponding to the first image based on the M sets of region images and the first image includes:
fusing the M groups of region images to obtain a target region image;
synthesizing the target area image and the first image to obtain a second image;
performing rearrangement demosaicing processing on the second image to obtain a third image;
and performing tone mapping processing on the third image to obtain the high dynamic range image.
4. A method according to claim 3, wherein the fusing the M sets of region images to obtain the target region image comprises:
determining fusion weights corresponding to the region images of the M groups of region images;
according to the fusion weight, fusing the M area images to obtain the target area image;
wherein the fusion weight includes at least one of: good exposure estimation weight, sharpness weight.
5. The method of claim 1, wherein prior to the acquiring the first image, the method further comprises:
determining a target coincidence parameter;
adjusting exposure parameters based on the target coincidence parameters;
the acquiring a first image includes:
shooting the first image by adopting the adjusted exposure parameters;
wherein the target coincidence parameter is used for indicating: a degree of coincidence between a second region of a fourth image and a moving region of the fourth image;
the fourth image is the last image shot before the first image is shot;
the brightness of the second area is outside the preset brightness range.
6. An image processing apparatus, characterized in that the apparatus comprises: the device comprises an acquisition module and a processing module;
the acquisition module is used for acquiring a first image, wherein the first image comprises a plurality of pixel units, and each pixel unit comprises a plurality of pixels;
the processing module is configured to perform a target operation on pixels of the first area to obtain M groups of area images when the first area of the first image acquired by the acquiring module overlaps with a motion area of the first image, where the target operation includes: at least one of a sampling operation and a multi-pixel integration operation, M being a positive integer;
The processing module is further configured to obtain a high dynamic range image corresponding to the first image based on the M groups of area images and the first image;
wherein the brightness of the first area is outside a preset brightness range;
the M groups of region images correspond to M luminance ranges.
7. The apparatus of claim 6, wherein the processing module is specifically configured to perform the target operation on the pixels of the first region according to M sample numbers, to obtain M groups of region images corresponding to the M sample numbers one to one;
wherein each of the sampling numbers is used for indicating the number of collected pixels from each pixel unit of the first area;
each of the sampling numbers corresponds to at least one sampling pattern, and each sampling pattern is used for indicating a pattern of collecting pixels from each pixel unit of the first area.
8. The apparatus of claim 6, wherein the processing module is specifically configured to:
fusing the M groups of region images to obtain a target region image;
synthesizing the target area image and the first image to obtain a second image;
performing rearrangement demosaicing processing on the second image to obtain a third image;
And performing tone mapping processing on the third image to obtain the high dynamic range image.
9. The apparatus according to claim 8, wherein the processing module is specifically configured to:
determining fusion weights corresponding to the region images of the M groups of region images;
according to the fusion weight, fusing the M area images to obtain the target area image;
wherein the fusion weight includes at least one of: good exposure estimation weight, sharpness weight.
10. The apparatus of claim 6, wherein the processing module is further configured to determine a target coincidence parameter prior to the acquisition module acquiring the first image; adjusting exposure parameters based on the target coincidence parameters;
the acquisition module is specifically used for shooting the first image by adopting the adjusted exposure parameters;
wherein the target coincidence parameter is used for indicating: a degree of coincidence between a second region of a fourth image and a moving region of the fourth image;
the fourth image is the last image shot before the first image is shot;
the brightness of the second area is outside the preset brightness range.