CN113012081A - Image processing method, device and electronic system - Google Patents


Info

Publication number
CN113012081A
CN113012081A (application number CN202110122061.5A)
Authority
CN
China
Prior art keywords
image
initial
images
brightness
initial images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110122061.5A
Other languages
Chinese (zh)
Inventor
蒋霆
饶青
韩明燕
刘帅成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Megvii Technology Co Ltd
Original Assignee
Beijing Megvii Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Megvii Technology Co Ltd filed Critical Beijing Megvii Technology Co Ltd
Priority to CN202110122061.5A priority Critical patent/CN113012081A/en
Publication of CN113012081A publication Critical patent/CN113012081A/en
Priority to PCT/CN2021/132503 priority patent/WO2022160895A1/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration › G06T 5/50 using two or more images, e.g. averaging or subtraction
    • G06T 7/00 Image analysis › G06T 7/10 Segmentation; Edge detection › G06T 7/194 involving foreground-background segmentation
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement › G06T 2207/20 Special algorithmic details › G06T 2207/20172 Image enhancement details › G06T 2207/20208 High dynamic range [HDR] image processing
    • G06T 2207/20 Special algorithmic details › G06T 2207/20212 Image combination › G06T 2207/20221 Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention provides an image processing method, an image processing device and an electronic system. A plurality of initial images collected under different exposure degrees are acquired, together with a mask image corresponding to the initial images, the mask image being used to indicate the foreground region in the initial images; the brightness information of the plurality of initial images is fused to obtain initial brightness information; based on the mask image, the initial brightness information is fused with the brightness information of a reference frame image among the plurality of initial images to obtain final brightness information; and a target image is obtained based on the final brightness information and the color information of the reference frame image. In this method, a mask image indicating the foreground region of the image is obtained from a plurality of images with different exposure degrees, and the mask image is then used to fuse the brightness information of the foreground region in the reference frame image with that of the non-foreground region in the initial brightness information, so that the foreground region in the fused image matches the reference frame image. This improves the visual effect of the foreground region in the fused image and the overall effect of the fused image.

Description

Image processing method, device and electronic system
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, and an electronic system.
Background
HDR (High Dynamic Range) images can provide a greater dynamic range and more image detail than ordinary images. An HDR image is formed by taking a plurality of pictures at different exposures and then combining the differently exposed pictures into one image with software, so that the image better reflects the visual effect of the real environment.
In the related art, HDR image synthesis usually adopts a linear fusion method, but the synthesized image obtained in this way has low contrast and tends to look gray, so the portrait area of the synthesized image appears visually washed out and the overall effect of the image is not natural enough.
Disclosure of Invention
The invention aims to provide an image processing method, an image processing device and an electronic system, which are used for improving the effect of a portrait area in a synthetic image and simultaneously improving the overall effect of the synthetic image.
In a first aspect, the present invention provides an image processing method, comprising: acquiring a plurality of initial images acquired under different exposure degrees and mask images corresponding to the initial images; wherein the mask image is generated based on at least a portion of the images in the plurality of initial images; the mask image is used for indicating foreground areas in a plurality of initial images; fusing the brightness information of a plurality of initial images to obtain initial brightness information; based on the mask image, fusing the initial brightness information and the pixel brightness information of the reference frame image in the plurality of initial images to obtain final brightness information; and obtaining a target image based on the final brightness information and the color information of the reference frame image.
In an optional embodiment, the step of performing fusion processing on the initial luminance information and the pixel luminance information of the reference frame image in the plurality of initial images based on the mask image to obtain final luminance information includes: and carrying out alpha fusion on the initial brightness information and the pixel brightness information of the reference frame image through the mask image to obtain final brightness information.
In an optional embodiment, the pixel value of each pixel point in the mask image is 0 or 255; the step of performing alpha fusion on the initial brightness information and the pixel brightness information of the reference frame image through the mask image to obtain the final brightness information includes: calculating the final brightness information by the following formula: new_res = ev0 × (M/255) + (1 − M/255) × res;
wherein new_res represents the final luminance information; ev0 denotes the pixel luminance information of the reference frame image; M represents the pixel value of each pixel point in the mask image; res denotes the initial luminance information.
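The alpha-fusion formula above can be sketched in plain Python over flat pixel lists (the function name and list representation are illustrative, not from the patent):

```python
def alpha_fuse(ev0, res, mask):
    """Alpha-fuse reference-frame luminance (ev0) with the initially
    fused luminance (res) using a 0/255 mask, per
    new_res = ev0 * (M/255) + (1 - M/255) * res."""
    new_res = []
    for e, r, m in zip(ev0, res, mask):
        a = m / 255.0          # 1.0 inside the foreground, 0.0 outside
        new_res.append(e * a + (1.0 - a) * r)
    return new_res

# Foreground pixels (mask 255) take the reference-frame luminance;
# background pixels (mask 0) keep the initially fused luminance.
print(alpha_fuse([200, 200], [90, 90], [255, 0]))  # [200.0, 90.0]
```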
In an optional embodiment, the step of acquiring a plurality of initial images acquired at different exposures and mask images corresponding to the plurality of initial images includes: acquiring a plurality of initial images acquired under different exposure levels; wherein each initial image corresponds to one exposure level; selecting a specified number of initial images from a plurality of initial images; performing fusion processing on the brightness information of the specified number of initial images to obtain a fused image; and carrying out binarization processing on the fused image to obtain a mask image.
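The mask-generation steps above can be sketched as follows; a plain average stands in for the patent's weighted luminance fusion, and the threshold value is an assumption, since the patent does not fix the binarization parameters:

```python
def make_mask(luma_a, luma_b, threshold=128):
    """Sketch of mask generation: fuse the luminance of two selected
    initial images (a plain average stands in for the patent's weighted
    fusion), then binarize the result into a 0/255 mask.
    `threshold` is an assumed parameter, not specified by the patent."""
    fused = [(a + b) / 2.0 for a, b in zip(luma_a, luma_b)]
    return [255 if v >= threshold else 0 for v in fused]

print(make_mask([250, 10], [240, 30]))  # [255, 0]
```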
In an alternative embodiment, the above specified number is two; the step of selecting a predetermined number of initial images from the plurality of initial images includes: selecting an initial image with the maximum exposure and an initial image with the minimum exposure from the plurality of initial images.
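A minimal sketch of selecting the maximum- and minimum-exposure initial images, assuming each image is paired with its exposure value (the tuple representation is illustrative):

```python
def pick_extremes(images):
    """From (exposure, luminance) pairs, select the initial image with
    the minimum exposure and the one with the maximum exposure."""
    lo = min(images, key=lambda im: im[0])
    hi = max(images, key=lambda im: im[0])
    return lo, hi

frames = [(-2, [10]), (0, [100]), (2, [240])]
print(pick_extremes(frames))  # ((-2, [10]), (2, [240]))
```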
In an optional embodiment, the step of performing fusion processing on the luminance information of the specified number of initial images to obtain a fused image includes: for each pair of initial images adjacent in exposure among the specified number of initial images, fusing the two exposure-adjacent initial images to obtain a first fused image of that pair; for each pair of exposure-adjacent first fused images among the first fused images so obtained, fusing the two exposure-adjacent first fused images to obtain a second fused image of that pair; and taking the second fused images as new first fused images and repeating the previous step until a single final fused image is obtained.
In an optional embodiment, the step of performing fusion processing on two initial images adjacent to the current exposure level to obtain a first fused image of the two initial images adjacent to the current exposure level includes: aiming at each pixel point in two initial images adjacent to the current exposure, the following operations are executed: searching a weighting weight from a preset weight matrix according to the brightness values of current pixel points in two initial images adjacent to the current exposure; wherein, the weighting matrix comprises weighting weights corresponding to any two brightness values; and determining the brightness value corresponding to the current pixel point in the first fusion image according to the searched weighting weight and the brightness value of the current pixel point in the two initial images adjacent to the current exposure.
In an optional embodiment, the step of searching for a weighting weight in a preset weight matrix according to the brightness values of the current pixel point in the two exposure-adjacent initial images includes: determining a first brightness value of the current pixel point in the first initial image and a second brightness value of the current pixel point in the second initial image of the two exposure-adjacent initial images; determining the first brightness value plus one as the target column number; determining the second brightness value plus one as the target row number; and determining the position of the weighting weight in the weight matrix according to the target column number and the target row number, the value at that position in the weight matrix being the weighting weight searched for.
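The row/column lookup can be sketched as follows; the patent's indices are 1-based ("plus one"), which maps to plain 0-based indexing in Python. The toy 3x3 matrix is illustrative only; a real weight matrix covering all luminance pairs 0-255 would be 256x256:

```python
def lookup_weight(weight_matrix, luma1, luma2):
    """Look up the weighting weight for a pixel pair: the target column
    number is the first brightness value plus one and the target row
    number is the second brightness value plus one (1-based); with
    0-based indexing this is simply weight_matrix[luma2][luma1]."""
    return weight_matrix[luma2][luma1]

# Toy matrix for luminance values 0..2 (assumed values, for illustration).
W = [[0.5, 0.6, 0.7],
     [0.4, 0.5, 0.6],
     [0.3, 0.4, 0.5]]
print(lookup_weight(W, 2, 0))  # target row 1, target column 3 -> 0.7
```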
In an optional embodiment, the step of determining the brightness value corresponding to the current pixel point in the first fused image according to the found weighting weight and the brightness values of the current pixel point in the two exposure-adjacent initial images includes: determining the brightness value corresponding to the current pixel point in the first fused image according to the following formula: B_i = w1_i × A1_i + w2_i × A2_i; wherein B_i represents the brightness value of the i-th pixel point in the first fused image, the i-th pixel point being the current pixel point; w1_i represents the weighting weight found in the weight matrix, which is the weighting weight corresponding to the i-th pixel point in the first initial image A1 of the two exposure-adjacent initial images; w2_i represents the weighting weight corresponding to the i-th pixel point in the second initial image A2 of the two exposure-adjacent initial images, where w2_i = 1 − w1_i; A1_i represents the luminance value of the i-th pixel point in the first initial image A1; A2_i represents the luminance value of the i-th pixel point in the second initial image A2.
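A sketch of this per-pixel weighted fusion over flat luminance lists (names are illustrative, not from the patent):

```python
def fuse_pair(luma1, luma2, weights):
    """Per-pixel fusion B_i = w1_i*A1_i + w2_i*A2_i with w2_i = 1 - w1_i,
    where luma1/luma2 are the luminance values of the two exposure-adjacent
    images and weights holds the w1_i looked up from the weight matrix."""
    return [w * a1 + (1.0 - w) * a2
            for w, a1, a2 in zip(weights, luma1, luma2)]

print(fuse_pair([255, 100], [0, 100], [0.25, 0.5]))  # [63.75, 100.0]
```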
In an optional embodiment, the step of performing binarization processing on the fused image to obtain a mask image includes: and performing guided filtering processing on the fusion image after the binarization processing to obtain a filtered mask image.
In an optional embodiment, the step of performing fusion processing on the luminance information of the plurality of initial images to obtain the initial luminance information includes: for each pair of initial images adjacent in exposure among the plurality of initial images, fusing the two exposure-adjacent initial images to obtain a first brightness map of that pair; for each pair of exposure-adjacent first brightness maps among the first brightness maps so obtained, fusing the two exposure-adjacent first brightness maps to obtain a second brightness map of that pair; taking the second brightness maps as new first brightness maps and repeating the previous step until a single brightness map is obtained; and determining the brightness information of the obtained brightness map as the initial luminance information.
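The iterative pairwise reduction described above, in which N brightness maps become N−1, then N−2, and so on until one map remains, can be sketched as (the helper names and the averaging stand-in are assumptions, not the patent's weighted fusion):

```python
def pyramid_fuse(luma_maps, fuse_two):
    """Repeatedly fuse each pair of exposure-adjacent maps until a
    single map remains; `fuse_two` is any two-map fusion function."""
    maps = list(luma_maps)
    while len(maps) > 1:
        maps = [fuse_two(maps[i], maps[i + 1]) for i in range(len(maps) - 1)]
    return maps[0]

# A plain average stands in for the weight-matrix fusion.
avg = lambda a, b: [(x + y) / 2.0 for x, y in zip(a, b)]
print(pyramid_fuse([[0], [100], [200]], avg))  # [100.0]
```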
In an optional embodiment, the color space of the initial image is in a YUV form, the luminance information is a Y channel, and the color information is a UV channel.
In a second aspect, the present invention provides an image processing apparatus comprising: the image acquisition module is used for acquiring a plurality of initial images acquired under different exposure degrees and mask images corresponding to the plurality of initial images; wherein the mask image is generated based on at least a portion of the images in the plurality of initial images; the mask image is used for indicating foreground areas in a plurality of initial images; the first fusion module is used for carrying out fusion processing on the brightness information of a plurality of initial images to obtain initial brightness information; the second fusion module is used for carrying out fusion processing on the initial brightness information and the pixel brightness information of the reference frame image in the initial images based on the mask image to obtain final brightness information; and the information integration module is used for obtaining a target image based on the final brightness information and the color information of the reference frame image.
In a third aspect, the present invention provides an electronic system comprising: the device comprises an image acquisition device, a processing device and a storage device; the image acquisition equipment is used for acquiring preview video frames or image data; the storage device has stored thereon a computer program that, when executed by a processing apparatus, performs the image processing method of any of the preceding embodiments.
In a fourth aspect, the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processing apparatus, performs the image processing method of any one of the preceding embodiments.
The embodiment of the invention has the following beneficial effects:
the invention provides an image processing method, an image processing device and an electronic system.A plurality of initial images collected under different exposure degrees and mask images corresponding to the initial images are firstly acquired; wherein the mask image is generated based on at least a portion of the images in the plurality of initial images; the mask image is used for indicating foreground areas in a plurality of initial images; then, fusing the brightness information of the plurality of initial images to obtain initial brightness information; based on the mask image, fusing the initial brightness information and the pixel brightness information of the reference frame image in the plurality of initial images to obtain final brightness information; and then, obtaining a target image, namely a fused image, based on the final brightness information and the color information of the reference frame image. According to the method, the mask image used for indicating the image foreground region is obtained based on a plurality of images with different exposure degrees, then the luminance information of the foreground region in the reference frame image and the non-foreground region in the initial luminance information are fused through the mask image, and the foreground region in the reference frame image has a good effect, so that the foreground region in the fused image is matched with the foreground region in the reference frame image, the visual effect of the foreground region in the fused image is improved, and the overall effect of the fused image is also improved.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the invention as set forth herein.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a schematic structural diagram of an electronic system according to an embodiment of the present invention;
FIG. 2 is a flowchart of an image processing method according to an embodiment of the present invention;
FIG. 3 is a flow chart of another image processing method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a mask image according to an embodiment of the present invention;
FIG. 5 is a flow chart of another image processing method according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
With the continuous improvement of mobile phone functions, users' demands on functions such as the mobile phone camera are ever higher, and HDR has been widely applied as a function of the mobile phone. HDR, also called wide dynamic range technology, enables a camera to capture the features of an image under very strong contrast. When a high-brightness area lit by a strong light source (sunlight, lamps, reflected light and the like) and a relatively dark area such as a shadow or backlit region exist in an image at the same time, the image output by the camera may turn white in the bright area due to overexposure or black in the dark area due to underexposure, seriously affecting image quality. A camera is limited in how it can render the brightest and darkest areas of the same scene, a limit commonly referred to as the "dynamic range". An HDR image is a picture obtained by taking multiple different exposures and then combining them into one picture with software. The advantage is that the final picture has details both in the shadows and in the highlights.
In the HDR image synthesis method of the related art, a linear fusion method is generally used, but the synthesized image obtained in this way has low contrast and tends to look gray, so the portrait area of the synthesized image appears visually washed out and the overall effect of the image is not natural enough.
The first embodiment is as follows:
first, an example electronic system 100 for implementing the image processing method, apparatus, and electronic system of the embodiments of the present invention is described with reference to fig. 1.
As shown in FIG. 1, an electronic system 100 includes one or more processing devices 102, one or more memory devices 104, an input device 106, an output device 108, and one or more image capture devices 110, which are interconnected via a bus system 112 and/or other type of connection mechanism (not shown). It should be noted that the components and structure of the electronic system 100 shown in fig. 1 are exemplary only, and not limiting, and that the electronic system may have other components and structures as desired.
The processing device 102 may be a gateway or an intelligent terminal, or a device including a Central Processing Unit (CPU) or other form of processing unit having data processing capability and/or instruction execution capability, and may process data of other components in the electronic system 100 and may control other components in the electronic system 100 to perform desired functions.
The storage 104 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM), cache memory (cache), and/or the like. The non-volatile memory may include, for example, Read Only Memory (ROM), hard disk, flash memory, etc. On which one or more computer program instructions may be stored that may be executed by processing device 102 to implement client functionality (implemented by the processing device) and/or other desired functionality in embodiments of the present invention described below. Various applications and various data, such as various data used and/or generated by the applications, may also be stored in the computer-readable storage medium.
The input device 106 may be a device used by a user to input instructions and may include one or more of a keyboard, a mouse, a microphone, a touch screen, and the like.
The output device 108 may output various information (e.g., images or sounds) to the outside (e.g., a user), and may include one or more of a display, a speaker, and the like.
The image capture device 110 may capture preview video frames or image data and store the captured preview video frames or image data in the storage 104 for use by other components.
For example, the devices in the electronic system for implementing the image processing method, the image processing apparatus, and the electronic system according to the embodiments of the present invention may be integrally disposed, or may be disposed in a distributed manner, such as integrally disposing the processing device 102, the storage device 104, the input device 106, and the output device 108, and disposing the image capturing device 110 at a designated position where a target image can be captured. When the above-described devices in the electronic system are integrally provided, the electronic system may be implemented as an intelligent terminal such as a camera, a smart phone, a tablet computer, a vehicle-mounted terminal, and the like.
Example two:
the embodiment provides an image processing method, which is executed by a processing device in the electronic system; the processing device may be any device or chip having data processing capabilities. The processing equipment can independently process the received information, and can also be connected with a server to jointly analyze and process the information. As shown in fig. 2, the image processing method includes the following specific steps:
step S202, acquiring a plurality of initial images collected under different exposure degrees and mask images corresponding to the plurality of initial images; wherein the mask image is generated based on at least a portion of the images in the plurality of initial images; the mask image is used to indicate foreground regions in the plurality of initial images.
The initial image may be a picture or photograph taken by an image pickup device or a camera. In a specific implementation, the initial image may be acquired by shooting through a camera, or the like connected to the communication device, or by acquiring the initial image from a storage device storing the image to be processed that has been shot. The scene shot by each initial image in the multiple acquired initial images is the same, but the exposure corresponding to each initial image is different, the exposure refers to the intensity and time of the sensed light, and generally, the higher the exposure is, the more whitish the shot image is; the lower the exposure, the darker the image taken.
The mask image is generated based on at least a part of the images of the plurality of initial images, and the at least a part of the images may be at least two images of the plurality of initial images. In specific implementation, at least a part of images can be selected from a plurality of initial images, and then the gray level images corresponding to the selected initial images are subjected to fusion processing, so that a mask image can be obtained. The mask image is used for indicating a foreground region in the plurality of images, and the pixel value of an image region corresponding to the foreground region in the mask image is different from the pixel value of an image region except the foreground region, so that the foreground region and the non-foreground region can be distinguished. The foreground region generally refers to an image region containing a portrait.
And step S204, fusing the brightness information of the plurality of initial images to obtain initial brightness information.
The brightness information of the initial image generally refers to the brightness corresponding to each pixel point in the initial image, and the brightness may also be referred to as a gray-scale value, or a brightness value, and is used for indicating the brightness of the image; the brightness value range is 0-255. In some embodiments, the color space of the initial image may be in the form of YUV (Y represents brightness, and U and V represent chrominance), and the brightness information of the initial image is the corresponding data of the initial image on the Y channel.
During specific implementation, the brightness information of a plurality of initial images with different exposure levels is fused together, so that the brightness information corresponding to one fused image can be obtained, and the brightness information is also the initial brightness information; the initial brightness information includes brightness corresponding to each pixel point in the fused image.
And step S206, fusing the initial brightness information and the pixel brightness information of the reference frame image in the plurality of initial images based on the mask image to obtain final brightness information.
The reference frame image may be an initial image specified by a user according to a preset rule from among a plurality of initial images, for example, the reference frame image may be an initial image with an exposure level of a specified value (the specified value may be set according to a user requirement, and may be set to 0 or 1, for example), or may be an image with the best foreground region effect among the plurality of initial images, and the like. The pixel brightness information of the reference frame image includes brightness corresponding to each pixel point in the reference frame image.
Because the foreground region in the reference frame image has the best effect, the foreground region indicated by the mask image is used to locate the foreground region in the reference frame image, and the pixel brightness information of the foreground region in the reference frame image is taken as the brightness of the corresponding pixel points of the foreground region in the final brightness information; the brightness information of the image areas outside the foreground region, also located through the mask image, is taken from the initial brightness information as the brightness of the corresponding pixel points outside the foreground region in the final brightness information.
And step S208, obtaining a target image based on the final brightness information and the color information of the reference frame image.
The color information of the reference frame image may refer to the chromaticity corresponding to each pixel point in the reference frame image, where the chromaticity is used to describe the color and saturation of the reference frame image and to specify the color of a pixel. In some embodiments, the color space of the reference frame image may be in YUV form, and the color information of the reference frame image is the corresponding data of the reference frame image on the UV channels.
In a specific implementation, the final luminance information and the color information of the reference frame image are fused to obtain the target image, that is, the pixel value corresponding to each pixel point in the target image is determined by the brightness of each pixel point in the final luminance information and the chromaticity of each pixel point in the reference frame image.
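A minimal sketch of combining the final luminance with the reference frame's chrominance, with pixels represented as Y values and (U, V) tuples (this data layout is illustrative, not the patent's):

```python
def merge_yuv(final_y, ref_uv):
    """Build the target image: each pixel takes its luminance from the
    final (fused) Y channel and its chrominance from the reference
    frame's UV channels; the result is a list of (Y, U, V) tuples."""
    return [(y, u, v) for y, (u, v) in zip(final_y, ref_uv)]

print(merge_yuv([120, 80], [(100, 140), (128, 128)]))
# [(120, 100, 140), (80, 128, 128)]
```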
The image processing method provided by the embodiment of the invention comprises the steps of firstly, acquiring a plurality of initial images acquired under different exposure degrees and mask images corresponding to the plurality of initial images; wherein the mask image is generated based on at least a portion of the images in the plurality of initial images; the mask image is used for indicating foreground areas in a plurality of initial images; then, fusing the brightness information of the plurality of initial images to obtain initial brightness information; based on the mask image, fusing the initial brightness information and the pixel brightness information of the reference frame image in the plurality of initial images to obtain final brightness information; and then, obtaining a target image, namely a fused image, based on the final brightness information and the color information of the reference frame image. According to the method, the mask image used for indicating the image foreground region is obtained based on a plurality of images with different exposure degrees, then the luminance information of the foreground region in the reference frame image and the non-foreground region in the initial luminance information are fused through the mask image, and the foreground region in the reference frame image has a good effect, so that the foreground region in the fused image is matched with the foreground region in the reference frame image, the visual effect of the foreground region in the fused image is improved, and the overall effect of the fused image is also improved.
Example three:
the embodiment of the invention also provides another image processing method which is realized on the basis of the method in the embodiment; the method mainly describes a specific process of acquiring a plurality of initial images acquired under different exposure levels and mask images corresponding to the plurality of initial images (realized by the following steps S302-S308); as shown in fig. 3, the method comprises the following specific steps:
step S302, acquiring a plurality of initial images collected under different exposure levels; wherein each initial image corresponds to one exposure level.
In step S304, a specified number of initial images are selected from the plurality of initial images.
The specified number is at least two, and its specific value can be set according to development requirements, for example two or three. In some embodiments, to improve computational efficiency, the specified number may be set to two; that is, two initial images may be selected randomly from the plurality of initial images, or the initial image with the highest exposure and the initial image with the lowest exposure may be selected.
And step S306, carrying out fusion processing on the brightness information of the initial images of the specified number to obtain a fused image.
The luminance information of the specified number of initial images is fused into a single fused image, which contains only the luminance value of each pixel. In a specific implementation, if the specified number is greater than two, step S306 can be implemented through the following steps 10-12:
Step 10: for each pair of exposure-adjacent initial images among the specified number of initial images, fuse the current pair to obtain the first fused image of that pair.
In a specific implementation, the specified number of initial images may be sorted in order of increasing or decreasing exposure. If the specified number is three, there are two pairs of exposure-adjacent initial images, so two first fused images are obtained; if the specified number is two, there is only one such pair, so a single first fused image is obtained, and that first fused image is the fused image of step S306.
Specifically, the first fused image of the current pair of exposure-adjacent initial images is obtained as follows: for each pixel in the current pair of initial images, perform the following steps 20-21:
Step 20: look up a weighting weight in a preset weight matrix according to the luminance values of the current pixel in the current pair of exposure-adjacent initial images, where the weight matrix contains a weighting weight for any two luminance values.
The luminance value corresponds to brightness on the Y channel and may also be called a gray value; it is an integer between 0 and 255. The weight matrix is set in advance by developers according to development requirements; it contains a weighting weight for any two gray values, and every value in the matrix is a positive number smaller than one. The weight matrix is a 256 × 256 matrix, i.e., it has 256 rows and 256 columns, where the 256 rows correspond in order to the integers 0 to 255, and likewise the 256 columns correspond in order to the integers 0 to 255.
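The 256 × 256 lookup can be sketched as follows. The generation rule used here (favouring well-exposed mid-tone pixels, as in classic exposure fusion) is an assumption for illustration only; the patent specifies nothing beyond the matrix being preset by developers with every entry a positive number smaller than one.

```python
import numpy as np

def build_weight_matrix():
    """Illustrative 256x256 weight matrix: entry [y1, y2] is the weight given
    to the pixel with luminance y1; the other pixel receives 1 - w."""
    y = np.arange(256, dtype=np.float64)
    # well-exposedness score: Gaussian centred on mid-grey 128 (assumed rule)
    score = np.exp(-((y - 128.0) ** 2) / (2.0 * 64.0 ** 2))
    w = score[:, None] / (score[:, None] + score[None, :])
    return np.clip(w, 1e-6, 1.0 - 1e-6)  # keep every entry in (0, 1)

def lookup_weight(weights, y1, y2):
    # 0-based indexing: luminance value y maps to row/column index y
    # (the patent's "value + 1" counts rows and columns from 1)
    return weights[y1, y2]
```

By construction `lookup_weight(W, a, b) + lookup_weight(W, b, a)` equals one, which matches the later constraint w2_i = 1 - w1_i.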
In some embodiments, step 20 above may be implemented by steps 30-32 below:
Step 30: from the current pair of exposure-adjacent initial images, determine the first luminance value of the current pixel in the first initial image and the second luminance value of the current pixel in the second initial image.
The first initial image may be either the image with the larger exposure or the image with the smaller exposure of the pair.
Step 31: determine the sum of the first luminance value and one as the target column number, and the sum of the second luminance value and one as the target row number.
Because luminance values start from 0 while matrix row and column numbering starts from 1, the sum of the first luminance value and one is taken as the target column number to look up, and the sum of the second luminance value and one is taken as the target row number to look up.
And step 32, determining the position of the weighting weight in the weighting matrix according to the target column number and the target row number, and determining the numerical value corresponding to the position in the weighting matrix as the searched weighting weight.
That is, the position of the weight to be looked up is located in the weight matrix by the target row number and target column number, and the value at that position is taken as the looked-up weighting weight.
Step 21: determine the luminance value of the current pixel in the first fused image from the looked-up weighting weight and the luminance values of the current pixel in the current pair of exposure-adjacent initial images.
In a specific implementation, the luminance value corresponding to the current pixel point in the first fused image may be determined by the following equation:
B_i = w1_i * A1_i + w2_i * A2_i
where B_i denotes the luminance value of the i-th pixel (the current pixel) in the first fused image; w1_i denotes the weighting weight looked up from the weight matrix, i.e., the weight corresponding to the i-th pixel of the first initial image A1 of the current pair of exposure-adjacent initial images; w2_i denotes the weight corresponding to the i-th pixel of the second initial image A2 of the pair, with w2_i = 1 - w1_i; A1_i denotes the luminance value of the i-th pixel in the first initial image A1; and A2_i denotes the luminance value of the i-th pixel in the second initial image A2.
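The per-pixel formula vectorises directly over whole Y planes; a minimal sketch (the weight matrix is any preset 256 × 256 array with entries in (0, 1)):

```python
import numpy as np

def fuse_pair(y1, y2, weights):
    """B_i = w1_i * A1_i + w2_i * A2_i, applied to whole luminance planes.

    y1, y2  : uint8 Y planes of the current pair of exposure-adjacent images
    weights : preset 256x256 weight matrix, entries in (0, 1)
    """
    w1 = weights[y1.astype(np.intp), y2.astype(np.intp)]  # looked-up w1_i
    w2 = 1.0 - w1                                         # w2_i = 1 - w1_i
    fused = w1 * y1.astype(np.float64) + w2 * y2.astype(np.float64)
    return np.clip(np.rint(fused), 0, 255).astype(np.uint8)
```

With a uniform matrix of 0.5 this reduces to a plain average; a tuned matrix instead steers each output pixel toward the better-exposed input.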
Step 11: for each pair of exposure-adjacent first fused images among the plurality of first fused images obtained from the specified number of initial images, fuse the current pair to obtain the second fused image of that pair.
Step 12: take the second fused images as new first fused images and repeat the step of fusing each pair of exposure-adjacent first fused images into a second fused image, until a single final fused image is obtained.
If the specified number is three or more, there are at least two pairs of exposure-adjacent initial images, so at least two first fused images are obtained; the fusion then continues on each pair of exposure-adjacent first fused images in the manner of steps 20-21 (replacing the initial images in steps 20-21 with the first fused images), yielding a second fused image for each such pair. If the specified number is three, the resulting second fused image is the fused image of step S306; if the specified number exceeds three, step 11 produces at least two second fused images rather than one, so the fusion must continue on these second fused images until a single final fused image is obtained. The fusion of the second fused images proceeds in the same way as the fusion of the first fused images.
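Steps 10-12 amount to repeatedly fusing every exposure-adjacent pair until one image remains. A sketch of the cascade, with a simple average standing in for the weight-matrix fusion of steps 20-21:

```python
import numpy as np

def fuse_pair(y1, y2):
    # placeholder for the weight-matrix fusion of steps 20-21
    return ((y1.astype(np.int32) + y2.astype(np.int32)) // 2).astype(np.uint8)

def cascade_fuse(planes):
    """planes: list of Y planes sorted by exposure.  Each round fuses every
    exposure-adjacent pair (steps 10-11), so n images yield n-1 fused images;
    the loop (step 12) repeats until a single fused image remains."""
    while len(planes) > 1:
        planes = [fuse_pair(a, b) for a, b in zip(planes, planes[1:])]
    return planes[0]
```

For three inputs this performs exactly two rounds: two first fused images, then one second fused image, matching the text above.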
And step S308, carrying out binarization processing on the fused image to obtain a mask image.
Because the fused image contains only the luminance value of each pixel, it can also be called a gray-scale image, and the luminance (gray) value of each pixel lies between 0 and 255. Luminance values below a set value in the fused image are therefore set to a first pixel value, and luminance values greater than or equal to the set value are set to a second pixel value, completing the binarization and producing a mask image containing only two values. In some embodiments, the first pixel value may be set to 0 and the second pixel value to 255, yielding a purely black-and-white mask image, as shown in the schematic diagram of fig. 4. The white area (pixel value 255) in fig. 4 is the foreground region, which contains the image area where the portrait is located; in other words, the mask image is obtained by extracting the area where the portrait is located and separating the portrait area from the other areas.
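Step S308 is a plain threshold. The set value of 128 below is an illustrative choice; the patent does not fix it:

```python
import numpy as np

def binarize(fused_y, threshold=128, first_value=0, second_value=255):
    """Luminance below the set value -> first pixel value,
    otherwise -> second pixel value (step S308)."""
    return np.where(fused_y < threshold, first_value, second_value).astype(np.uint8)
```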
In some embodiments, the process of binarizing the fused image to obtain the mask image may further include: performing guided filtering on the binarized fused image to obtain a filtered mask image, so that the foreground region of the resulting mask image transitions smoothly into the image region outside it. That is, guided filtering yields a mask image in which the foreground region and the background region (the image region outside the foreground region) transition smoothly, which facilitates the subsequent fusion and improves the overall effect of the image. In a specific implementation, the foreground region corresponds to the white area in fig. 4 and the background region to the black area in fig. 4.
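A minimal grey-scale guided filter (in the style of He et al.) can be sketched with box means; guiding the hard 0/255 mask by the fused luminance image softens the mask edges so foreground and background blend smoothly. The radius and regularisation values are illustrative, not from the patent, and opencv-contrib's `cv2.ximgproc.guidedFilter` is an off-the-shelf alternative.

```python
import numpy as np

def box_mean(x, size):
    """Mean filter via an integral image, with edge-replicated borders."""
    pad = size // 2
    xp = np.pad(x.astype(np.float64), pad, mode="edge")
    c = np.cumsum(np.cumsum(xp, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))
    h, w = x.shape
    s = (c[size:size + h, size:size + w] - c[:h, size:size + w]
         - c[size:size + h, :w] + c[:h, :w])
    return s / (size * size)

def guided_filter(guide, mask, radius=8, eps=1e-3):
    """Edge-preserving smoothing of a binary mask, guided by `guide`."""
    I = guide.astype(np.float64) / 255.0
    p = mask.astype(np.float64) / 255.0
    size = 2 * radius + 1
    mean_I, mean_p = box_mean(I, size), box_mean(p, size)
    cov_Ip = box_mean(I * p, size) - mean_I * mean_p
    var_I = box_mean(I * I, size) - mean_I * mean_I
    a = cov_Ip / (var_I + eps)          # local linear model: p ~ a*I + b
    b = mean_p - a * mean_I
    q = box_mean(a, size) * I + box_mean(b, size)
    return np.clip(q * 255.0, 0.0, 255.0)
```

Unlike a Gaussian blur, the local linear model keeps the softened mask aligned with edges in the guide image, so the smooth transition follows the portrait boundary.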
Step S310, the brightness information of a plurality of initial images is subjected to fusion processing to obtain initial brightness information.
Step S312, based on the mask image, performing fusion processing on the initial luminance information and the pixel luminance information of the reference frame image in the multiple initial images to obtain final luminance information.
In step S314, a target image is obtained based on the final luminance information and the color information of the reference frame image.
In the image processing method described above, the mask image can be obtained thanks to the light-and-dark characteristics of images with different exposure levels, and this mask image segments the foreground region of the original image, i.e., the portrait region, well. Fusing the foreground region indicated by the mask image with the reference frame image, whose foreground region is well exposed, protects the portrait, so the portrait region of the fused image looks better. At the same time, no pre-trained model is required to segment the foreground region, which effectively reduces running time and improves image processing efficiency.
Example four:
the embodiment of the invention also provides another image processing method which is realized on the basis of the method in the embodiment; the method mainly describes a specific process of performing fusion processing on the brightness information of the plurality of initial images to obtain initial brightness information (realized by the following steps S504-S510), and a specific process of performing fusion processing on the initial brightness information and the pixel brightness information of a reference frame image in the plurality of initial images to obtain final brightness information based on a mask image (realized by the following step S512); as shown in fig. 5, the method includes the following specific steps:
step S502, acquiring a plurality of initial images collected under different exposure degrees and mask images corresponding to the plurality of initial images; the mask image is used to indicate foreground regions in the plurality of initial images.
Step S504, aiming at each pair of two initial images with adjacent exposure degrees in the plurality of initial images, carrying out fusion processing on the two initial images with adjacent current exposure degrees to obtain a first brightness image of the two initial images with adjacent current exposure degrees.
The specific implementation of step S504 is the same as that of step 10; that is, replacing the specified number of initial images in step 10 with the plurality of initial images, and the first fused image with the first luminance map, gives the implementation of step S504. Specifically, the first luminance map of the current pair of exposure-adjacent initial images can be obtained as in steps 20-21 above, which is not repeated here.
Step S506, for each pair of two adjacent first luminance graphs of the plurality of first luminance graphs obtained from the plurality of initial images, performing fusion processing on the two adjacent first luminance graphs of the current exposure to obtain a second luminance graph of the two adjacent first luminance graphs of the current exposure.
Step S508: take the second luminance maps as new first luminance maps and repeat the step of fusing each pair of exposure-adjacent first luminance maps into a second luminance map, until a single final luminance map is obtained.
In a specific implementation, the specific processes of the above steps S506 to S508 may refer to steps 11 to 12 in the third embodiment, and are not described herein again.
In step S510, the luminance information of the obtained luminance map is determined as the initial luminance information.
And S512, performing alpha fusion on the initial brightness information and the pixel brightness information of the reference frame image through the mask image to obtain final brightness information.
In a specific implementation, the alpha fusion may proceed as follows: determine the foreground region of the reference frame image from the foreground region of the mask image, and take the pixel luminance information of that foreground region as the first luminance information, i.e., the foreground part of the final luminance information; determine, from the image region of the mask image outside the foreground region, the corresponding luminance information in the initial luminance information, and take it as the second luminance information, i.e., the non-foreground part of the final luminance information; then combine the first luminance information and the second luminance information to obtain the final luminance information.
In some embodiments, the pixel value of each pixel point in the mask image is 0 or 255; when the pixel value is 0, the corresponding image area is black, that is, the black area in fig. 4; when the pixel value is 255, the corresponding image area is white, that is, the white area in fig. 4, and the white area is a foreground area, so that the final luminance information can be obtained by the following equation:
new_res=ev0*(M/255)+(1-M/255)*res;
where new_res denotes the final luminance information; ev0 denotes the pixel luminance information of the reference frame image, which includes the luminance value of each pixel in the reference frame image; M denotes the pixel value of each pixel in the mask image (0 or 255); and res denotes the initial luminance information, which includes the luminance value of each pixel in the image obtained by fusing the plurality of initial images.
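The formula transcribes directly into array code; all operands are per-pixel planes of the same shape:

```python
import numpy as np

def alpha_fuse(ev0_y, mask, res_y):
    """new_res = ev0*(M/255) + (1 - M/255)*res, applied per pixel.

    ev0_y : Y plane of the reference frame image
    mask  : mask image, pixel values 0 or 255
    res_y : initial luminance information (fused Y plane)
    """
    alpha = mask.astype(np.float64) / 255.0
    new_res = ev0_y * alpha + (1.0 - alpha) * res_y
    return np.clip(np.rint(new_res), 0, 255).astype(np.uint8)
```

Where the mask is 255 (foreground) the reference frame's luminance passes through unchanged; where it is 0, the multi-exposure result does. With a guided-filtered mask, alpha takes intermediate values and the two blend near the foreground boundary.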
In step S514, a target image is obtained based on the final luminance information and the color information of the reference frame image.
To facilitate understanding of the embodiment of the present invention, the invention is further described below using three initial images with different exposure levels as an example. First, three initial images with different exposure levels, whose color spaces are in YUV form, are obtained: ev+1, ev0 and ev-2, with exposure levels of +1, 0 and -2 respectively. The three initial images are then split into a Y channel and a UV channel, the luminance information of the initial images being obtained on the Y channel and the color information on the UV channel. Next, on the Y channel, ev+1 and ev-2 are selected from the three initial images and fused (also called filter fusion) to obtain a fused image, and the fused image is binarized to obtain a mask image (also called an image mask); guided filtering is then applied to the mask image so that its foreground and background regions transition smoothly, giving the final mask image. The three acquired initial images with different exposure levels are then fused on the Y channel to obtain the initial luminance information. Further, the final luminance information is obtained using the formula above for computing the final luminance information, the reference frame image here being ev0, i.e., the initial image with exposure 0. Finally, the final luminance information obtained on the Y channel is combined with the color information of the reference frame image on the UV channel to obtain the target image.
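The walkthrough above can be sketched end to end. Frames are assumed to be H × W × 3 YUV arrays, and the mask and initial-luminance steps are stubbed with simple averages standing in for the weight-matrix fusion:

```python
import numpy as np

def process(ev_plus1, ev0, ev_minus2, threshold=128):
    ys = [f[..., 0] for f in (ev_plus1, ev0, ev_minus2)]        # split Y channel
    # mask from the extreme exposures ev+1 and ev-2 (S304-S308); average is a stub
    extremes = ((ys[0].astype(np.int32) + ys[2]) // 2).astype(np.uint8)
    mask = np.where(extremes < threshold, 0, 255).astype(np.uint8)
    # initial luminance from all three frames (S310); average is a stub
    res = (sum(y.astype(np.int32) for y in ys) // 3).astype(np.uint8)
    # alpha fusion with the reference frame ev0 (S512)
    alpha = mask.astype(np.float64) / 255.0
    new_res = np.rint(ys[1] * alpha + (1.0 - alpha) * res).astype(np.uint8)
    # final Y recombined with the reference frame's UV channel (S514)
    target = ev0.copy()
    target[..., 0] = new_res
    return target
```

Only the Y plane of the output differs from the reference frame; its UV plane is carried over unchanged, exactly as in the example above.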
According to the image processing method, the mask image used for indicating the foreground region of the image is obtained based on the multiple images with different exposure degrees, and then the luminance information of the foreground region in the reference frame image and the non-foreground region in the initial luminance information are fused through the mask image, so that the foreground region in the fused image is consistent with the foreground region in the reference frame image, the visual effect of the foreground region in the fused image is improved, and the overall effect of the fused image is also improved.
Example five:
corresponding to the above-described image processing method embodiment, an embodiment of the present invention provides an image processing apparatus, as shown in fig. 6, including:
an image obtaining module 60, configured to obtain multiple initial images collected under different exposure levels and mask images corresponding to the multiple initial images; wherein the mask image is generated based on at least a portion of the images in the plurality of initial images; the mask image is used to indicate foreground regions in the plurality of initial images.
The first fusion module 61 is configured to perform fusion processing on the luminance information of the multiple initial images to obtain initial luminance information.
And a second fusion module 62, configured to perform fusion processing on the initial luminance information and the pixel luminance information of the reference frame image in the multiple initial images based on the mask image to obtain final luminance information.
And an information integration module 63, configured to obtain the target image based on the final luminance information and the color information of the reference frame image.
The color space of the initial image is in a YUV form, the brightness information is a Y channel, and the color information is a UV channel.
Further, the second fusion module 62 is configured to: and carrying out alpha fusion on the initial brightness information and the pixel brightness information of the reference frame image through the mask image to obtain final brightness information.
In specific implementation, the pixel value of each pixel point in the mask image is 0 or 255; the second fusion module 62 is further configured to: and calculating to obtain final brightness information by the following formula:
new_res=ev0*(M/255)+(1-M/255)*res;
where new_res denotes the final luminance information; ev0 denotes the pixel luminance information of the reference frame image; M denotes the pixel value of each pixel in the mask image; and res denotes the initial luminance information.
Specifically, the image obtaining module 60 includes: the initial image acquisition module is used for acquiring a plurality of initial images acquired under different exposure degrees; wherein each initial image corresponds to one exposure level; the image selection module is used for selecting a specified number of initial images from a plurality of initial images; the fusion processing module is used for carrying out fusion processing on the brightness information of the initial images in the specified number to obtain a fusion image; and the binarization processing module is used for carrying out binarization processing on the fusion image to obtain a mask image.
Further, the fusion processing module is configured to: for each pair of exposure-adjacent initial images among the specified number of initial images, fuse the current pair to obtain the first fused image of that pair; for each pair of exposure-adjacent first fused images among the plurality of first fused images obtained from the specified number of initial images, fuse the current pair to obtain the second fused image of that pair; and take the second fused images as new first fused images and repeat the step of fusing each pair of exposure-adjacent first fused images into a second fused image, until a final fused image is obtained.
Further, the above-mentioned specified number may be two; the image selecting module is configured to: selecting an initial image with the maximum exposure and an initial image with the minimum exposure from the plurality of initial images.
Specifically, the fusion processing module is further configured to: aiming at each pixel point in two initial images adjacent to the current exposure, the following operations are executed: searching a weighting weight from a preset weight matrix according to the brightness values of current pixel points in two initial images adjacent to the current exposure; wherein, the weighting matrix comprises weighting weights corresponding to any two brightness values; and determining the brightness value corresponding to the current pixel point in the first fusion image according to the searched weighting weight and the brightness value of the current pixel point in the two initial images adjacent to the current exposure.
In a specific implementation, the fusion processing module is further configured to: determining a first brightness value of a current pixel point in a first initial image and a second brightness value of the current pixel point in a second initial image from two initial images adjacent to the current exposure; determining the sum of the first brightness value and one as the target column number; adding the second brightness value and one to determine the target line number; and determining the position of the weighting weight in the weight matrix according to the target column number and the target row number, and determining the numerical value corresponding to the position in the weight matrix as the searched weighting weight.
In some embodiments, the fusion processing module is further configured to: and determining the brightness value corresponding to the current pixel point in the first fusion image according to the following formula:
B_i = w1_i * A1_i + w2_i * A2_i
where B_i denotes the luminance value of the i-th pixel (the current pixel) in the first fused image; w1_i denotes the weighting weight looked up from the weight matrix, i.e., the weight corresponding to the i-th pixel of the first initial image A1 of the current pair of exposure-adjacent initial images; w2_i denotes the weight corresponding to the i-th pixel of the second initial image A2 of the pair, with w2_i = 1 - w1_i; A1_i denotes the luminance value of the i-th pixel in the first initial image A1; and A2_i denotes the luminance value of the i-th pixel in the second initial image A2.
Further, the binarization processing module is further configured to: and performing guided filtering processing on the fusion image after the binarization processing to obtain a filtered mask image.
Further, the first fusion module 61 is configured to: for each pair of exposure-adjacent initial images among the plurality of initial images, fuse the current pair to obtain the first luminance map of that pair; for each pair of exposure-adjacent first luminance maps among the plurality of first luminance maps obtained from the plurality of initial images, fuse the current pair to obtain the second luminance map of that pair; take the second luminance maps as new first luminance maps and repeat the step of fusing each pair of exposure-adjacent first luminance maps into a second luminance map, until a single luminance map is obtained; and determine the luminance information of that luminance map as the initial luminance information.
The image processing apparatus first acquires a plurality of initial images collected under different exposure levels and a mask image corresponding to the plurality of initial images, where the mask image is generated based on at least a portion of the plurality of initial images and is used to indicate the foreground region in the plurality of initial images. The luminance information of the plurality of initial images is then fused to obtain initial luminance information; based on the mask image, the initial luminance information and the pixel luminance information of a reference frame image among the plurality of initial images are fused to obtain final luminance information; the target image, namely the fused image, is then obtained from the final luminance information and the color information of the reference frame image. In this apparatus, the mask image indicating the image foreground region is obtained from a plurality of images with different exposure levels, and the mask image is then used to fuse the luminance information of the foreground region in the reference frame image with the non-foreground region of the initial luminance information. Because the foreground region of the reference frame image is well exposed, the foreground region of the fused image matches the foreground region of the reference frame image, which improves the visual effect of the foreground region and thereby the overall effect of the fused image.
Example six:
an embodiment of the present invention provides an electronic system, including: the device comprises an image acquisition device, a processing device and a storage device; the image acquisition equipment is used for acquiring preview video frames or image data; the storage means has stored thereon a computer program which, when run by a processing apparatus, performs the image processing method as described above, or the steps of the image processing method as described above.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working process of the electronic system described above may refer to the corresponding process in the foregoing method embodiments, and is not described herein again.
An embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, and when being executed by a processing device, the computer program performs the image processing method.
The image processing method, the image processing apparatus, and the computer program product of the electronic system provided in the embodiments of the present invention include a computer-readable storage medium storing a program code, where instructions included in the program code may be used to execute the method described in the foregoing method embodiments, and specific implementation may refer to the method embodiments, and will not be described herein again.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and/or the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In addition, in the description of the embodiments of the present invention, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly, e.g., as a fixed connection, a removable connection, or an integral connection; as a mechanical or electrical connection; or as a direct connection, an indirect connection through an intermediate medium, or communication between the interiors of two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art on a case-by-case basis.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on those shown in the drawings, are used only for convenience and simplicity of description, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation; they should therefore not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the above-mentioned embodiments are only specific embodiments of the present invention, used to illustrate its technical solutions rather than to limit them, and the protection scope of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the technical field may, within the technical scope of the present disclosure, modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions for some technical features; such modifications, changes, or substitutions do not depart from the spirit and scope of the embodiments of the present invention and shall be covered by them. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (15)

1. An image processing method, characterized in that the method comprises:
acquiring a plurality of initial images acquired under different exposure degrees and mask images corresponding to the initial images; wherein the mask image is generated based on at least a portion of the images in the plurality of initial images; the mask image is used for indicating foreground regions in the plurality of initial images;
fusing the brightness information of the initial images to obtain initial brightness information;
based on the mask image, carrying out fusion processing on the initial brightness information and pixel brightness information of a reference frame image in the plurality of initial images to obtain final brightness information;
and obtaining a target image based on the final brightness information and the color information of the reference frame image.
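The four steps of claim 1 can be illustrated with a minimal per-pixel sketch in Python. Luminance channels are plain lists, and simple averaging stands in for the patent's weight-matrix pyramid fusion; the function name `process` and the averaging step are assumptions for illustration, not the patented implementation.

```python
def process(initial_images_y, ref_y, ref_uv, mask):
    # Step 2 (claim 1): fuse the luminance of all initial images.
    # Per-pixel averaging is a stand-in for the patented fusion.
    n = len(initial_images_y)
    initial_y = [sum(px) / n for px in zip(*initial_images_y)]
    # Step 3: mask-guided blend of the fused luminance with the
    # reference frame's luminance (the alpha fusion of claim 3).
    final_y = [r * (m / 255) + (1 - m / 255) * f
               for r, f, m in zip(ref_y, initial_y, mask)]
    # Step 4: the target image keeps the reference frame's color.
    return final_y, ref_uv
```

Where the mask marks foreground (255), the reference frame's luminance is preserved; elsewhere the multi-exposure fusion result is used.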
2. The method according to claim 1, wherein the step of fusing the initial luminance information and the pixel luminance information of the reference frame image of the plurality of initial images based on the mask image to obtain final luminance information comprises:
and carrying out alpha fusion on the initial brightness information and the pixel brightness information of the reference frame image through the mask image to obtain final brightness information.
3. The method of claim 2, wherein the pixel value of each pixel point in the mask image is 0 or 255; the step of performing alpha fusion on the initial brightness information and the pixel brightness information of the reference frame image through the mask image to obtain final brightness information comprises the following steps:
and calculating to obtain the final brightness information by the following formula:
new_res=ev0*(M/255)+(1-M/255)*res;
wherein new_res represents the final luminance information; ev0 denotes the pixel luminance information of the reference frame image; M represents the pixel value of each pixel point in the mask image; and res denotes the initial luminance information.
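The alpha-fusion formula of claim 3 can be expressed per pixel as a short Python sketch (pure-Python lists; the function name is illustrative):

```python
def alpha_fuse(ev0, res, M):
    # new_res = ev0*(M/255) + (1 - M/255)*res, applied per pixel:
    # where the mask M is 255 (foreground) the reference frame ev0
    # dominates; where M is 0 the fused luminance res dominates.
    return [e * (m / 255) + (1 - m / 255) * r
            for e, r, m in zip(ev0, res, M)]
```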
4. The method of any one of claims 1 to 3, wherein the step of acquiring a plurality of initial images acquired at different exposures and corresponding mask images for the plurality of initial images comprises:
acquiring a plurality of initial images acquired under different exposure levels; wherein each initial image corresponds to one exposure level;
selecting a specified number of initial images from the plurality of initial images;
performing fusion processing on the brightness information of the specified number of the initial images to obtain a fused image;
and carrying out binarization processing on the fused image to obtain the mask image.
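The mask-generation steps of claim 4 can be sketched as follows. Simple averaging stands in for the patented luminance fusion, and the binarization threshold value is an assumption:

```python
def make_mask(img_a, img_b, thresh=128):
    # Fuse the luminance of two selected initial images (averaging is
    # a stand-in for the patented fusion), then binarize the result
    # into a 0/255 mask.  The threshold of 128 is assumed.
    fused = [(a + b) / 2 for a, b in zip(img_a, img_b)]
    return [255 if v >= thresh else 0 for v in fused]
```

Per claim 5, `img_a` and `img_b` would be the maximum- and minimum-exposure initial images.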
5. The method of claim 4, wherein the specified number is two; the step of selecting a specified number of the initial images from the plurality of initial images includes:
and selecting the initial image with the maximum exposure and the initial image with the minimum exposure from the plurality of initial images.
6. The method according to claim 4, wherein the step of performing the fusion processing on the brightness information of the specified number of the initial images to obtain a fused image comprises:
performing fusion processing on each pair of exposure-adjacent initial images among the specified number of initial images to obtain a first fused image of the two exposure-adjacent initial images;
performing fusion processing on each pair of exposure-adjacent first fused images among the plurality of first fused images obtained from the specified number of initial images to obtain a second fused image of the two exposure-adjacent first fused images;
and taking the second fused images as new first fused images and repeating the step of fusing each pair of exposure-adjacent first fused images until a final fused image is obtained.
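The iterative pairing in claim 6 reduces N exposure-ordered images to a single fused image. A minimal sketch, with per-pixel averaging standing in for the weight-matrix fusion of claim 7:

```python
def fuse_pair(a, b):
    # Stand-in pairwise fusion; the patent instead looks up per-pixel
    # weights in a preset weight matrix (claim 7).
    return [(x + y) / 2 for x, y in zip(a, b)]

def pyramid_fuse(images):
    # Fuse every exposure-adjacent pair, then repeat on the resulting
    # images until a single fused image remains.
    while len(images) > 1:
        images = [fuse_pair(images[i], images[i + 1])
                  for i in range(len(images) - 1)]
    return images[0]
```

With three inputs, the first round yields two fused images and the second round yields the final one, matching the first/second fused image terminology of the claim.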
7. The method according to claim 6, wherein the step of performing the fusion processing on the two initial images adjacent to the current exposure level to obtain a first fused image of the two initial images adjacent to the current exposure level comprises:
performing the following operations for each pixel point in the two exposure-adjacent initial images:
searching for a weighting weight in a preset weight matrix according to the brightness values of the current pixel point in the two exposure-adjacent initial images; wherein the weight matrix comprises a weighting weight corresponding to any two brightness values;
and determining the brightness value corresponding to the current pixel point in the first fused image according to the found weighting weight and the brightness values of the current pixel point in the two exposure-adjacent initial images.
8. The method according to claim 7, wherein the step of searching for the weighting weight from a preset weight matrix according to the brightness values of the current pixel points in the two initial images adjacent to the current exposure comprises:
determining a first brightness value of a current pixel point in a first initial image and a second brightness value of the current pixel point in a second initial image from two initial images adjacent to the current exposure;
determining the sum of the first brightness value and one as a target column number; determining the sum of the second brightness value and one as a target line number;
and determining the position of the weighting weight in the weight matrix according to the target column number and the target row number, and determining the numerical value corresponding to the position in the weight matrix as the searched weighting weight.
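Claim 8's one-based row/column arithmetic maps to zero-based array indexing as shown below (a sketch; the weight-matrix contents are assumed):

```python
def lookup_weight(weight_matrix, first_lum, second_lum):
    # Claim 8: target column number = first brightness value + 1,
    # target row number = second brightness value + 1 (both 1-based).
    target_col = first_lum + 1
    target_row = second_lum + 1
    # Convert the 1-based row/column numbers back to 0-based indices.
    return weight_matrix[target_row - 1][target_col - 1]
```

For 8-bit luminance this implies a 256x256 weight matrix, one entry per pair of brightness values.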
9. The method according to claim 7, wherein the step of determining the brightness value corresponding to the current pixel point in the first fused image according to the found weighting weight and the brightness value of the current pixel point in the two initial images adjacent to the current exposure comprises:
determining the brightness value corresponding to the current pixel point in the first fusion image according to the following formula:
B_i = w1_i * A1_i + w2_i * A2_i;
wherein B_i represents the brightness value of the i-th pixel point in the first fused image, the i-th pixel point being the current pixel point; w1_i represents the weighting weight found in the weight matrix, corresponding to the i-th pixel point in the first initial image A1 of the two exposure-adjacent initial images; w2_i represents the weighting weight corresponding to the i-th pixel point in the second initial image A2 of the two exposure-adjacent initial images, where w2_i = 1 - w1_i; A1_i represents the brightness value of the i-th pixel point in the first initial image A1; and A2_i represents the brightness value of the i-th pixel point in the second initial image A2.
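The per-pixel blend of claim 9 is a convex combination of the two brightness values, since the two weights sum to one. A minimal sketch:

```python
def blend(w1, A1, A2):
    # B_i = w1_i*A1_i + w2_i*A2_i with w2_i = 1 - w1_i (claim 9).
    return [w * a1 + (1 - w) * a2 for w, a1, a2 in zip(w1, A1, A2)]
```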
10. The method according to claim 4, wherein the step of binarizing the fused image to obtain the mask image comprises:
and performing guided filtering processing on the fusion image after the binarization processing to obtain the filtered mask image.
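Guided filtering, as in claim 10, softens the hard 0/255 transitions of the binarized mask so the later alpha fusion blends smoothly. A real guided filter also uses a guidance image and edge-preserving statistics; the 1-D box filter below is only a stand-in to illustrate the edge-softening effect:

```python
def soften_mask(mask, radius=1):
    # Stand-in for guided filtering: a 1-D box filter that softens the
    # hard 0/255 edges of the binarized mask.  A true guided filter
    # additionally preserves edges present in a guidance image.
    out = []
    for i in range(len(mask)):
        lo, hi = max(0, i - radius), min(len(mask), i + radius + 1)
        out.append(sum(mask[lo:hi]) / (hi - lo))
    return out
```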
11. The method according to any one of claims 1 to 10, wherein the step of performing fusion processing on the luminance information of the plurality of initial images to obtain initial luminance information comprises:
performing fusion processing on each pair of exposure-adjacent initial images among the plurality of initial images to obtain a first brightness map of the two exposure-adjacent initial images;
performing fusion processing on each pair of exposure-adjacent first brightness maps among the plurality of first brightness maps obtained from the plurality of initial images to obtain a second brightness map of the two exposure-adjacent first brightness maps;
taking the second brightness maps as new first brightness maps and repeating the step of fusing each pair of exposure-adjacent first brightness maps until a single brightness map is obtained;
and determining the brightness information of the obtained brightness map as the initial brightness information.
12. The method according to any one of claims 1-11, wherein the color space of the initial image is in YUV form, the luminance information is Y channel, and the color information is UV channel.
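Under the YUV split of claim 12, the target image is assembled by pairing the fused Y channel with the reference frame's untouched U and V channels. An illustrative structure (the function and field names are assumptions):

```python
def assemble_yuv(final_y, ref_u, ref_v):
    # Luminance (Y) comes from the fusion result; chrominance (U, V)
    # is copied unchanged from the reference frame (claim 12).
    return {"Y": final_y, "U": ref_u, "V": ref_v}
```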
13. An image processing apparatus, characterized in that the apparatus comprises:
an image acquisition module, configured to acquire a plurality of initial images acquired at different exposure levels and mask images corresponding to the plurality of initial images; wherein the mask image is generated based on at least a portion of the images in the plurality of initial images; the mask image is used for indicating foreground regions in the plurality of initial images;
the first fusion module is used for carrying out fusion processing on the brightness information of the plurality of initial images to obtain initial brightness information;
the second fusion module is used for carrying out fusion processing on the initial brightness information and pixel brightness information of a reference frame image in the initial images based on the mask image to obtain final brightness information;
and the information integration module is used for obtaining a target image based on the final brightness information and the color information of the reference frame image.
14. An electronic system, characterized in that the electronic system comprises: the device comprises an image acquisition device, a processing device and a storage device;
the image acquisition equipment is used for acquiring preview video frames or image data;
the storage means has stored thereon a computer program which, when executed by the processing apparatus, performs the image processing method of any one of claims 1 to 12.
15. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processing device, performs the image processing method according to any one of claims 1 to 12.
CN202110122061.5A 2021-01-28 2021-01-28 Image processing method, device and electronic system Pending CN113012081A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110122061.5A CN113012081A (en) 2021-01-28 2021-01-28 Image processing method, device and electronic system
PCT/CN2021/132503 WO2022160895A1 (en) 2021-01-28 2021-11-23 Image processing method, image processing apparatus, electronic system and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110122061.5A CN113012081A (en) 2021-01-28 2021-01-28 Image processing method, device and electronic system

Publications (1)

Publication Number Publication Date
CN113012081A true CN113012081A (en) 2021-06-22

Family

ID=76384946

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110122061.5A Pending CN113012081A (en) 2021-01-28 2021-01-28 Image processing method, device and electronic system

Country Status (2)

Country Link
CN (1) CN113012081A (en)
WO (1) WO2022160895A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114143517A (en) * 2021-10-26 2022-03-04 深圳华侨城卡乐技术有限公司 Fusion mask calculation method and system based on overlapping area and storage medium
WO2022160895A1 (en) * 2021-01-28 2022-08-04 北京迈格威科技有限公司 Image processing method, image processing apparatus, electronic system and readable storage medium

Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
CN115953543B (en) * 2023-03-14 2023-05-12 北京天图万境科技有限公司 Method and device for pixel-by-pixel time-sequence-containing analog consistency processing
CN116579960B (en) * 2023-05-06 2023-12-08 广州纳诺科技股份有限公司 Geospatial data fusion method
CN116630220B (en) * 2023-07-25 2023-11-21 江苏美克医学技术有限公司 Fluorescent image depth-of-field fusion imaging method, device and storage medium
CN118155551A (en) * 2024-05-09 2024-06-07 歌尔股份有限公司 Display control method, device, equipment and storage medium for LED display screen

Citations (12)

Publication number Priority date Publication date Assignee Title
US20140307117A1 (en) * 2013-04-15 2014-10-16 Htc Corporation Automatic exposure control for sequential images
US20170208234A1 (en) * 2014-07-17 2017-07-20 Nokia Technologies Oy Method and apparatus for detecting imaging conditions
CN108668093A (en) * 2017-03-31 2018-10-16 华为技术有限公司 The generation method and device of HDR image
CN110060213A (en) * 2019-04-09 2019-07-26 Oppo广东移动通信有限公司 Image processing method, device, storage medium and electronic equipment
CN110087003A (en) * 2019-04-30 2019-08-02 深圳市华星光电技术有限公司 More exposure image fusion methods
WO2019183813A1 (en) * 2018-03-27 2019-10-03 华为技术有限公司 Image capture method and device
WO2020057198A1 (en) * 2018-09-20 2020-03-26 Oppo广东移动通信有限公司 Image processing method and device, electronic device and storage medium
CN111028189A (en) * 2019-12-09 2020-04-17 Oppo广东移动通信有限公司 Image processing method, image processing device, storage medium and electronic equipment
CN111986129A (en) * 2020-06-30 2020-11-24 普联技术有限公司 HDR image generation method and device based on multi-shot image fusion and storage medium
CN112132769A (en) * 2020-08-04 2020-12-25 绍兴埃瓦科技有限公司 Image fusion method and device and computer equipment
CN112150399A (en) * 2020-09-27 2020-12-29 安谋科技(中国)有限公司 Image enhancement method based on wide dynamic range and electronic equipment
CN112215875A (en) * 2020-09-04 2021-01-12 北京迈格威科技有限公司 Image processing method, device and electronic system

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
CN113012081A (en) * 2021-01-28 2021-06-22 北京迈格威科技有限公司 Image processing method, device and electronic system


Non-Patent Citations (2)

Title
ZHANG Lifang; ZHOU Jun: "Dynamic Range Enhancement of Images Using Multiple Exposures", Journal of Data Acquisition and Processing, no. 04
ZHANG Shufang; DING Wenxin; HAN Zexin; LIU Mengya; GUO Zhipeng: "High Dynamic Range Image Generation Method Using Principal Component Analysis and Gradient Pyramid", Journal of Xi'an Jiaotong University, no. 04


Also Published As

Publication number Publication date
WO2022160895A1 (en) 2022-08-04

Similar Documents

Publication Publication Date Title
CN113012081A (en) Image processing method, device and electronic system
CN111402135B (en) Image processing method, device, electronic equipment and computer readable storage medium
CN108810418B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN108537155B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN108537749B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN108734676B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN108419028B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN110766621B (en) Image processing method, image processing device, storage medium and electronic equipment
CN108717530B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN107395991B (en) Image synthesis method, image synthesis device, computer-readable storage medium and computer equipment
CN112818732B (en) Image processing method, device, computer equipment and storage medium
CN109242794B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN108616700B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN108875619A (en) Method for processing video frequency and device, electronic equipment, computer readable storage medium
CN112351195B (en) Image processing method, device and electronic system
CN110443766B (en) Image processing method and device, electronic equipment and readable storage medium
CN112634183A (en) Image processing method and device
US10645304B2 (en) Device and method for reducing the set of exposure times for high dynamic range video/imaging
CN113313626A (en) Image processing method, image processing device, electronic equipment and storage medium
CN113781370A (en) Image enhancement method and device and electronic equipment
CN108401109B (en) Image acquisition method and device, storage medium and electronic equipment
CN108881876B (en) Method and device for carrying out white balance processing on image and electronic equipment
CN112822413A (en) Shooting preview method, device, terminal and computer readable storage medium
CN109040598B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN107481199B (en) Image defogging method and device, storage medium and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination