CN111192218B - Image processing method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN111192218B
CN111192218B (application CN201911415910.5A)
Authority
CN
China
Prior art keywords
image
pixel
processed
mask
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911415910.5A
Other languages
Chinese (zh)
Other versions
CN111192218A (en)
Inventor
叶鑫
王昊然
王甜甜
严琼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Sensetime Technology Co Ltd
Original Assignee
Shenzhen Sensetime Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Sensetime Technology Co Ltd filed Critical Shenzhen Sensetime Technology Co Ltd
Priority to CN201911415910.5A priority Critical patent/CN111192218B/en
Publication of CN111192218A publication Critical patent/CN111192218A/en
Application granted granted Critical
Publication of CN111192218B publication Critical patent/CN111192218B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/77 Retouching; Inpainting; Scratch removal
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure relates to an image processing method and apparatus, an electronic device, and a storage medium. The method includes: obtaining a first image according to an image to be processed; obtaining a mask image of the image to be processed according to the pixel value of each pixel point of the image to be processed; and fusing the image to be processed and the first image according to the mask image to obtain a halo-removed image. Embodiments of the disclosure can improve the precision of the halo-removed image.

Description

Image processing method and device, electronic equipment and storage medium
Technical Field
The disclosure relates to the field of computer technology, and in particular, to an image processing method and apparatus, an electronic device, and a storage medium.
Background
With the progress of technology, terminal devices such as mobile phones have become capable of replacing dedicated cameras as photographing devices.
In night-scene shooting there are typically many artificial light sources in the scene, such as street lamps. Through multiple scattering between different optical surfaces in the optical system, the light emitted by these sources produces a relatively large halo near the light-source region in the final image, which both detracts from the aesthetics of the image and impairs the faithful reproduction of detail in the real scene.
To meet users' demand for night photography, many manufacturers provide night-scene modes on mobile phones and other terminal devices. In these modes an image can be processed by a simple iterative algorithm or a deep-learning neural network to remove halos, but the removal effect is often unsatisfactory.
Disclosure of Invention
The present disclosure proposes a solution for removing halos from images.
According to an aspect of the present disclosure, there is provided an image processing method including:
obtaining a first image according to an image to be processed;
obtaining a mask image of the image to be processed according to the pixel value of each pixel point of the image to be processed;
and fusing the image to be processed and the first image according to the mask image to obtain a halo-removed image.
In one possible implementation, the mask image includes a luminance mask image; the obtaining a mask image of the image to be processed according to the pixel value of each pixel point of the image to be processed comprises the following steps:
determining a brightness mask value of each pixel point according to the pixel value of each pixel point in the image to be processed, wherein the brightness mask value of each pixel point is inversely related to a pixel difference value, and the pixel difference value represents the difference between the pixel value of each pixel point and the intermediate pixel value of the image to be processed;
and generating a brightness mask image of the image to be processed according to the brightness mask value of each pixel point.
In one possible implementation, the mask image includes a distance mask image; the obtaining a mask image of the image to be processed according to the pixel value of each pixel point of the image to be processed comprises the following steps:
determining highlight pixel points in the image to be processed;
determining a distance mask value of any pixel point in the image to be processed according to the distance from the pixel point to the highlight pixel point;
and generating a distance mask image of the image to be processed according to the distance mask value of each pixel point.
In a possible implementation manner, the determining a distance mask value of any pixel point in the image to be processed according to the distance from the pixel point to a highlight pixel point includes:
and determining a distance mask value of the pixel point according to the distance from any pixel point in the image to be processed to a first highlight pixel point, wherein the first highlight pixel point is the highlight pixel point closest to the pixel point.
In a possible implementation manner, the determining a highlight pixel point in the image to be processed includes:
downsampling the image to be processed to obtain a downsampled image;
determining the highlight pixel point in the downsampled image;
the generating a distance mask image of the image to be processed according to the distance mask value of each pixel point comprises the following steps:
generating a downsampled distance mask image according to the distance mask value of each pixel point in the downsampled image;
and upsampling the downsampled distance mask image to obtain a distance mask image of the image to be processed.
In a possible implementation manner, the upsampling of the downsampled distance mask image to obtain the distance mask image of the image to be processed includes:
obtaining a corresponding grayscale image according to the downsampled image;
performing guided filtering on the downsampled distance mask image with the grayscale image as the guide, to obtain a filtered downsampled distance mask image;
and upsampling the filtered downsampled distance mask image to obtain the distance mask image of the image to be processed.
In one possible implementation manner, the obtaining a first image according to the image to be processed includes:
performing halo extraction on the image to be processed to obtain a first image;
the fusing of the image to be processed and the first image according to the mask image to obtain a halo-removed image includes:
determining, for any pixel point in the image to be processed, a first mask value corresponding to the pixel point in the mask image;
and obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the pixel value corresponding to the pixel point in the first image, and the first mask value corresponding to the pixel point.
In one possible implementation manner, the obtaining a first image according to the image to be processed includes:
performing halo extraction on the image to be processed to obtain a halo map;
and performing halo removal processing on the image to be processed according to the halo map to obtain the first image.
In a possible implementation manner, the fusing of the image to be processed and the first image according to the mask image to obtain a halo-removed image includes:
determining, for any pixel point in the image to be processed, a first mask value corresponding to the pixel point in the mask image;
determining a second mask value from the first mask value;
determining a first pixel value according to the first mask value and the pixel value of the pixel point in the first image, and determining a second pixel value according to the second mask value and the pixel value of the pixel point in the image to be processed;
and obtaining the halo-removed image according to the first pixel value and the second pixel value.
According to an aspect of the present disclosure, there is provided an image processing apparatus including:
the first processing module is used for obtaining a first image according to the image to be processed;
the second processing module is used for obtaining a mask image of the image to be processed according to the pixel values of the pixel points of the image to be processed;
and the fusion module is used for fusing the image to be processed and the first image according to the mask image to obtain a halo-removed image.
In one possible implementation, the mask image includes a luminance mask image; the second processing module is further configured to:
determining a brightness mask value of each pixel point according to the pixel value of each pixel point in the image to be processed, wherein the brightness mask value of each pixel point is inversely related to a pixel difference value, and the pixel difference value represents the difference between the pixel value of each pixel point and the intermediate pixel value of the image to be processed;
and generating a brightness mask image of the image to be processed according to the brightness mask value of each pixel point.
In one possible implementation, the mask image includes a distance mask image; the second processing module is further configured to:
determining highlight pixel points in the image to be processed;
determining a distance mask value of any pixel point in the image to be processed according to the distance from the pixel point to the highlight pixel point;
and generating a distance mask image of the image to be processed according to the distance mask value of each pixel point.
In one possible implementation manner, the second processing module is further configured to:
and determining a distance mask value of the pixel point according to the distance from any pixel point in the image to be processed to a first highlight pixel point, wherein the first highlight pixel point is the highlight pixel point closest to the pixel point.
In one possible implementation manner, the second processing module is further configured to:
downsampling the image to be processed to obtain a downsampled image;
determining the highlight pixel point in the downsampled image;
the generating a distance mask image of the image to be processed according to the distance mask value of each pixel point comprises the following steps:
generating a downsampled distance mask image according to the distance mask value of each pixel point in the downsampled image;
and upsampling the downsampled distance mask image to obtain a distance mask image of the image to be processed.
In one possible implementation manner, the second processing module is further configured to:
obtaining a corresponding grayscale image according to the downsampled image;
performing guided filtering on the downsampled distance mask image with the grayscale image as the guide, to obtain a filtered downsampled distance mask image;
and upsampling the filtered downsampled distance mask image to obtain the distance mask image of the image to be processed.
In one possible implementation manner, the first processing module is further configured to:
performing halo extraction on the image to be processed to obtain a first image;
the fusion module is further configured to:
determine, for any pixel point in the image to be processed, a first mask value corresponding to the pixel point in the mask image;
and obtain the halo-removed image according to the pixel value of the pixel point in the image to be processed, the pixel value corresponding to the pixel point in the first image, and the first mask value corresponding to the pixel point.
In one possible implementation manner, the first processing module is further configured to:
performing halo extraction on the image to be processed to obtain a halo map;
and performing halo removal processing on the image to be processed according to the halo map to obtain the first image.
In one possible implementation, the fusion module is further configured to:
determining, for any pixel point in the image to be processed, a first mask value corresponding to the pixel point in the mask image;
determining a second mask value from the first mask value;
determining a first pixel value according to the first mask value and the pixel value of the pixel point in the first image, and determining a second pixel value according to the second mask value and the pixel value of the pixel point in the image to be processed;
and obtaining the halo-removed image according to the first pixel value and the second pixel value.
According to an aspect of the present disclosure, there is provided an electronic apparatus including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to invoke the instructions stored in the memory to perform the above method.
According to an aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the above-described method.
In this way, the image processing method and apparatus, electronic device, and storage medium provided by the disclosure obtain the mask image of the image to be processed according to the pixel value of each pixel point in the image to be processed, and then use the mask image to guide the fusion of the image to be processed with the first image obtained from it, so as to obtain the halo-removed image; this can improve the precision of the halo-removed image.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure. Other features and aspects of the present disclosure will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the technical aspects of the disclosure.
FIG. 1 shows a flow chart of an image processing method according to an embodiment of the present disclosure;
FIG. 2 shows a schematic diagram of an image processing method according to an embodiment of the present disclosure;
FIG. 3 shows a schematic diagram of an image processing method according to an embodiment of the present disclosure;
fig. 4 shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure;
fig. 5 illustrates a block diagram of an electronic device 800, according to an embodiment of the disclosure;
fig. 6 illustrates a block diagram of an electronic device 1900 according to an embodiment of the disclosure.
Detailed Description
Various exemplary embodiments, features and aspects of the disclosure will be described in detail below with reference to the drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Although various aspects of the embodiments are illustrated in the accompanying drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration. Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
The term "and/or" is herein merely an association relationship describing an associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality, for example, including at least one of A, B, C, and may mean including any one or more elements selected from the group consisting of A, B and C.
Furthermore, numerous specific details are set forth in the following detailed description in order to provide a better understanding of the present disclosure. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, elements, and circuits well known to those skilled in the art have not been described in detail in order not to obscure the present disclosure.
Fig. 1 shows a flowchart of an image processing method according to an embodiment of the present disclosure. The method may be performed by an electronic device such as a terminal device or a server. The terminal device may be user equipment (UE), a mobile device, a user terminal, a cellular phone, a cordless phone, a personal digital assistant (PDA), a handheld device, a computing device, an in-vehicle device, a wearable device, or the like, and the method may be implemented by a processor invoking computer-readable instructions stored in a memory. Alternatively, the method may be performed by a server.
As shown in fig. 1, the image processing method may include:
in step S11, a first image is obtained from the image to be processed.
For example, the image to be processed may be subjected to halo-removal processing by a simple iterative algorithm or by a neural network pre-trained for halo removal, to obtain a first image; in this case the first image is the result of a first, preliminary halo removal. Alternatively, halo extraction may be performed on the image to be processed by a network pre-trained for halo extraction, to obtain a first image; in this case the first image is a halo map corresponding to the image to be processed.
In step S12, a mask image of the image to be processed is obtained according to the pixel values of the pixels of the image to be processed.
For example, the pixel value of each pixel point in the image to be processed can be obtained, including the pixel value on each color channel (for a three-channel RGB (red, green, blue) image, the pixel values on the three channels are obtained separately). A mask value is then determined for each pixel point from its pixel value, and the mask image of the image to be processed is obtained from the mask values of all pixel points. When the first image is the image obtained after the first halo removal, the mask values represent the weights of the pixel values of the image to be processed and of the first image when the two are fused to obtain the halo-removed image; when the first image is a halo map, the mask values represent the weights of the pixel values of the first image in that fusion.
In a possible implementation manner, the mask image may include a luminance mask image and/or a distance mask image. The luminance mask image identifies, for each pixel point in the image to be processed, the relationship between its pixel value and an intermediate pixel value, where the intermediate pixel value is the midpoint between the upper and lower limits of the pixel values allowed in the image to be processed. The distance mask image identifies the distance relationship between each pixel point in the image to be processed and the highlight pixel points, where a highlight pixel point is a pixel point whose pixel value on any color channel is greater than a pixel threshold.
For example, a luminance mask image may be determined from the pixel value of each pixel in the image to be processed. The luminance mask value of any pixel identifies how close that pixel's value is to the intermediate pixel value of the image to be processed (the mean of the upper and lower pixel-value limits allowed by the image format may be taken as the intermediate pixel value). For example, the closer a pixel value is to the intermediate pixel value, the larger the corresponding luminance mask value (the maximum may be set to 1). For any pixel point, the luminance mask value on a given color channel may be used as the weight of that pixel's value on that channel in the first image when the image to be processed and the first image are fused to obtain the halo-removed image, and (1 - luminance mask value) as the weight of that pixel's value on that channel in the image to be processed.
A distance mask image may be determined from the pixel value and spatial position of each pixel in the image to be processed. The distance mask value of any pixel identifies the distance relationship between that pixel and the highlight pixel points (after normalizing the pixel values of the image to be processed to [0, 1], a pixel whose value on any color channel is greater than a pixel threshold, e.g. 0.99, is treated as a highlight pixel point). For example, the closer a pixel is to a highlight pixel point, the smaller its distance mask value (the minimum may be set to 0). The distance mask value of any pixel may be used as the weight of that pixel's value in the image to be processed when the image to be processed and the first image are fused to obtain the halo-removed image, and (1 - distance mask value) as the weight of that pixel's value in the first image.
In step S13, the image to be processed and the first image are fused according to the mask image, so as to obtain a halo-removed image.
For example, when the first image is a preliminary halo-removed image, the weights of the pixel values of the image to be processed and of the first image can be determined according to the mask image, and the two images are then fused with these weights to obtain the halo-removed image (for example, for any pixel point, the sum of the product of its pixel value in the image to be processed and the corresponding weight and the product of its pixel value in the first image and the corresponding weight is taken as its pixel value in the halo-removed image). Alternatively, when the first image is a halo map, the weights of the pixel values of the first image can be determined according to the mask image, and the fusion subtracts the weighted halo (for example, for any pixel point, the difference between its pixel value in the image to be processed and the product of its pixel value in the first image and the corresponding weight is taken as its pixel value in the halo-removed image).
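The two fusion rules just described can be sketched as follows. This is a minimal NumPy illustration; the function name, the `first_is_halo_map` flag, and the final clipping step are assumptions for the sketch, not details from the disclosure.

```python
import numpy as np

def fuse_with_mask(src, first, mask, first_is_halo_map=False):
    """Fuse the image to be processed (src) with the first image under mask guidance.

    src, first: float arrays in [0, 1], shape (H, W, 3).
    mask: float array in [0, 1], shape (H, W) or (H, W, 3), giving per-pixel weights.
    """
    if mask.ndim == 2:
        mask = mask[..., None]  # broadcast one mask value over all color channels
    if first_is_halo_map:
        # first image is an extracted halo map: subtract the weighted halo
        out = src - mask * first
    else:
        # first image is a preliminary halo-removed image: weighted blend
        out = mask * first + (1.0 - mask) * src
    return np.clip(out, 0.0, 1.0)
```

With `mask` close to 1 the output follows the first (halo-removed) image; with `mask` close to 0 the original pixel is preserved.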
In this way, according to the image processing method provided by the disclosure, the mask image of the image to be processed is obtained from the pixel value of each pixel point in the image to be processed, and the mask image then guides the fusion of the image to be processed with the first image obtained from it, yielding the halo-removed image; this can improve the precision of the halo-removed image.
In a possible implementation manner, the mask image includes a luminance mask image, and the obtaining the mask image of the to-be-processed image according to the pixel value of each pixel point of the to-be-processed image may include:
determining a brightness mask value of each pixel point according to the pixel value of each pixel point in the image to be processed, wherein the brightness mask value of each pixel point is inversely related to a pixel difference value, and the pixel difference value represents the difference between the pixel value of each pixel point and the intermediate pixel value of the image to be processed;
and generating a brightness mask image of the image to be processed according to the brightness mask value of each pixel point.
For example, an upper limit of pixel values and a lower limit of pixel values allowed for the image to be processed may be determined according to an image format to which the image to be processed belongs, for example: for an RGB image, the upper limit of the allowable pixel value is 255 and the lower limit is 0, and the intermediate pixel value can be determined to be 127.5.
The luminance mask value of each pixel point in the image to be processed can be determined from its pixel value, where the luminance mask values corresponding to the upper and lower pixel-value limits are a first value, the luminance mask value corresponding to the intermediate pixel value is a second value, and, as a pixel value moves away from the intermediate pixel value toward either the upper or the lower limit, its luminance mask value transitions from the second value toward the first value.
In one possible implementation, the first value may be set to 0, and the second value may be set to 1, and the range of values of the luminance mask values in the luminance mask image is [0,1]. After determining the luminance mask value of each pixel, a luminance mask image of the image to be processed may be generated according to the luminance mask value of each pixel point.
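With the first value set to 0 and the second value set to 1, one concrete realization is a linear falloff around the intermediate pixel value. The linear shape is an assumption for this sketch; the disclosure only requires that the mask value be inversely related to the distance of the pixel value from the intermediate value.

```python
import numpy as np

def luminance_mask(img, lower=0.0, upper=255.0):
    """Per-channel luminance mask in [0, 1]: 1 at the intermediate pixel value,
    falling linearly to 0 at the upper and lower pixel-value limits."""
    mid = (upper + lower) / 2.0          # e.g. 127.5 for an 8-bit RGB image
    half_range = (upper - lower) / 2.0
    # inversely related to |pixel value - intermediate pixel value|
    return 1.0 - np.abs(np.asarray(img, dtype=np.float64) - mid) / half_range
```

A pixel at value 127.5 gets mask 1 (strongest halo removal); pixels at 0 or 255 get mask 0 (original pixel preserved).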
In this way, the luminance mask image can guide the fusion of the image to be processed and the first image. The luminance mask image identifies how close each pixel value is to the intermediate pixel value, and prior knowledge indicates that halos occur mostly in regions of intermediate brightness, so the larger the difference between a pixel value and the intermediate pixel value, the smaller the probability that the pixel lies in a halo region, i.e. the smaller its halo. The strength of halo removal at each pixel can therefore be set by its luminance mask value: when fusing the image to be processed and the first image to obtain the halo-removed image, the luminance mask values determine the weights of the pixel values of the first image and of the image to be processed, and the pixel values of the halo-removed image are then computed from these weights. This alleviates false processing of dark regions, or dark regions turning dead black, during halo removal, and improves the halo-removal precision.
In a possible implementation manner, the mask image includes a distance mask image, and the obtaining the mask image of the to-be-processed image according to the pixel value of each pixel point of the to-be-processed image may include:
determining highlight pixel points in the image to be processed;
determining a distance mask value of any pixel point in the image to be processed according to the distance from the pixel point to the highlight pixel point;
and generating a distance mask image of the image to be processed according to the distance mask value of each pixel point.
For example, after normalizing the pixel values of the pixels in the image to be processed, a pixel whose value on any color channel is greater than a pixel threshold (which may be set to 0.99, for example) may be determined to be a highlight pixel point. For any pixel point, its distance mask value can be determined from its distance to the highlight pixel points, and the distance mask image of the image to be processed can then be generated from the distance mask values of all pixel points.
In a possible implementation manner, the determining the distance mask value of the pixel according to the distance from any pixel in the image to be processed to the highlight pixel may include:
And determining a distance mask value of the pixel point according to the distance from any pixel point in the image to be processed to a first highlight pixel point, wherein the first highlight pixel point is the highlight pixel point closest to the pixel point.
For example, the distance between a pixel point and the nearest first highlight pixel point may be determined, and the distance mask value of the pixel point (the same distance mask value applies to all color channels of a pixel point) may be determined from this distance. For example, distances larger than a preset distance threshold may be clamped to the threshold, the clamped distances may be normalized to [0, 1], and the normalized distance may be taken as the distance mask value of each pixel point; a distance mask image can then be generated from the distance mask values of all pixel points.
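The distance-to-nearest-highlight computation above can be sketched with a Euclidean distance transform; the function name, the 5×5 toy image, and the 0.99 / `d_max` thresholds are illustrative, not values prescribed by this disclosure.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def distance_mask(img, highlight_thresh=0.99, d_max=2.0):
    """Distance mask in [0, 1]: 0 at highlight pixels, 1 at/beyond d_max."""
    highlight = img > highlight_thresh               # highlight pixel points
    if not highlight.any():
        return np.ones_like(img, dtype=np.float64)
    # EDT gives the distance to the nearest zero, so invert the mask
    # to measure the distance to the nearest highlight pixel.
    dist = distance_transform_edt(~highlight)
    dist = np.minimum(dist, d_max)                   # clamp at the threshold
    return dist / d_max                              # normalize to [0, 1]

img = np.zeros((5, 5))
img[2, 2] = 1.0                                      # a single highlight pixel
mask = distance_mask(img)
# mask[2, 2] is 0 (the highlight itself); neighbours grow toward 1
```

Pixels at or beyond the clamping distance all receive the maximum mask value, matching the clamp-then-normalize description above.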
In one possible implementation manner, the determining a highlight pixel point in the image to be processed may include:
downsampling the image to be processed to obtain a downsampled image;
and determining the highlight pixel points in the downsampled image.
The generating a distance mask image of the image to be processed according to the distance mask value of each pixel point may include:
generating a downsampled distance mask image according to the distance mask value of each pixel point in the downsampled image;
and upsampling the downsampled distance mask image to obtain a distance mask image of the image to be processed.
For example, the image to be processed may be downsampled to obtain a downsampled image, for instance by area averaging; the downsampling manner is not particularly limited in the present disclosure. The distance mask value of each pixel point in the downsampled image is determined, a downsampled distance mask image is generated from these values, and the downsampled distance mask image is upsampled to obtain a distance mask image of the same size as the image to be processed. In this way, the efficiency of image processing can be improved.
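The down/up-sampling round trip described above can be sketched as follows; area averaging for the downsample, nearest-neighbour for the upsample, and the factor of 2 are illustrative choices, since the disclosure does not limit the sampling manner.

```python
import numpy as np

def downsample_area(img, f=2):
    """Area-averaging downsample: each output pixel is the mean of an f×f block."""
    h, w = img.shape
    return img[:h - h % f, :w - w % f].reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def upsample_nearest(img, f=2):
    """Nearest-neighbour upsample back to f times the size."""
    return np.repeat(np.repeat(img, f, axis=0), f, axis=1)

img = np.arange(16, dtype=np.float64).reshape(4, 4)
small = downsample_area(img)       # 2×2 image of block means
big = upsample_nearest(small)      # restored to the 4×4 size
```

Computing the distance mask on `small` and then upsampling is what makes the mask cheap to build on large inputs.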
In a possible implementation manner, the upsampling the downsampled distance mask image to obtain the distance mask image of the image to be processed may include:
Obtaining a corresponding gray level image according to the downsampled image;
guiding the downsampling distance mask image to filter through the gray level image to obtain a filtered downsampling distance mask image;
and upsampling the filtered downsampled distance mask image to obtain a distance mask image of the image to be processed.
For example, a corresponding gray level image can be obtained from the downsampled image, and the downsampled distance mask image can then be filtered under the guidance of the gray level image (for example by bilateral filtering, guided filtering, or a domain transform recursive filter) to obtain a filtered downsampled distance mask image. The filtered downsampled distance mask image adheres better to the edges of the corresponding content in the image to be processed, which avoids introducing unnatural artifacts. The filtered downsampled distance mask image is then upsampled to obtain a distance mask image of the same size as the image to be processed.
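The edge-aware filtering step can be illustrated with a minimal box-filter guided filter, one of the filter options named above; the radius `r`, regularization `eps`, and the random guide image are illustrative, not parameters from this disclosure.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, r=2, eps=1e-3):
    """Box-filter guided filter: smooth `src` while following edges of `guide`."""
    size = 2 * r + 1
    mean_g = uniform_filter(guide, size)
    mean_s = uniform_filter(src, size)
    corr_gs = uniform_filter(guide * src, size)
    corr_gg = uniform_filter(guide * guide, size)
    var_g = corr_gg - mean_g * mean_g          # local variance of the guide
    cov_gs = corr_gs - mean_g * mean_s         # local guide/source covariance
    a = cov_gs / (var_g + eps)                 # per-window linear coefficients
    b = mean_s - a * mean_g
    # average the coefficients over overlapping windows, then apply the model
    return uniform_filter(a, size) * guide + uniform_filter(b, size)

rng = np.random.default_rng(0)
gray = rng.random((16, 16))                    # stands in for the gray level image
flat_mask = np.full((16, 16), 0.5)             # stands in for the distance mask
out = guided_filter(gray, flat_mask)
# a constant mask is preserved (a ≈ 0, b ≈ the constant)
```

Near guide edges, the local linear model lets the output snap to those edges rather than blur across them, which is why the filtered mask fits image content better.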
In this way, the distance mask image can guide the fusion of the image to be processed and the first image. The distance mask image identifies the distance between each pixel point in the image to be processed and the highlight pixel points, and according to prior knowledge, the halo at a pixel point decreases approximately linearly as its distance from the light source increases, so the halo-removal strength of each pixel point can be determined from its distance mask value in the distance mask image. For example, when the image to be processed and the first image are fused to obtain the halo-removed image, the weight of the pixel value of each pixel point in the first image and the weight of the pixel value of each pixel point in the image to be processed can be determined, and the pixel value of each pixel point in the halo-removed image is then determined from these weighted values. This alleviates mis-processing of the light-source region and unnatural light-source transitions, and improves the accuracy of halo removal.
In a possible implementation manner, the obtaining a first image according to the image to be processed may include:
performing halation extraction on the image to be processed to obtain a first image;
the fusing the image to be processed and the first image according to the mask image to obtain a halo-removed image may include:
for any pixel point in the image to be processed, determining a first mask value corresponding to the pixel point in the mask image;
and obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the corresponding pixel value of the pixel point in the first image, and the first mask value corresponding to the pixel point.
For example, a pre-trained halo-extraction neural network may be used to extract, from the image to be processed, the halo map caused by the light source, and the halo map may be used as the first image.
In a possible implementation manner, the mask image may include a luminance mask image, the first mask value may be a first luminance mask value of the pixel point in the luminance mask image, and obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the corresponding pixel value of the pixel point in the first image, and the first mask value corresponding to the pixel point may include:
obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the pixel value of the pixel point in the first image, and the first luminance mask value corresponding to the pixel point.
For example, the first luminance mask value corresponding to any pixel point in the luminance mask image may be determined, and the difference between the pixel value of the pixel point in the image to be processed and the product of the pixel value of the pixel point in the first image and the first luminance mask value may be taken as the pixel value of the pixel point in the halo-removed image, so as to obtain the halo-removed image.
For example, assume that pixel point A corresponds to the first luminance mask values (first luminance mask value 1, first luminance mask value 2, first luminance mask value 3) in the channels of the luminance mask image, to the pixel value (a, b, c) in the image to be processed, and to the pixel value (a11, b11, c11) in the first image. The pixel value of pixel point A in the halo-removed image is then (a21, b21, c21), where a21 = a − first luminance mask value 1 × a11, b21 = b − first luminance mask value 2 × b11, and c21 = c − first luminance mask value 3 × c11.
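The per-channel arithmetic above can be written out directly; all numeric values are illustrative.

```python
import numpy as np

# Pixel A across the three channels (illustrative values).
to_process = np.array([0.6, 0.5, 0.4])   # (a, b, c) in the image to be processed
halo = np.array([0.2, 0.1, 0.1])         # (a11, b11, c11) in the first image (halo map)
lum_mask = np.array([1.0, 0.9, 0.8])     # first luminance mask values per channel

# Scale the halo map by the luminance mask and subtract it.
dehaloed = to_process - lum_mask * halo  # (a21, b21, c21)
```

Where the mask is 1 the full halo estimate is removed; where it approaches 0 (very bright or very dark pixels) the original value is kept.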
In this way, the closer a pixel value is to the middle pixel value, the stronger the halo removal and the larger the weight of the pixel value of the pixel point in the first image; the closer a pixel value is to the upper or lower pixel-value limit, the weaker the halo removal and the smaller that weight. The halo-removal strength of a pixel point can thus be determined from its pixel value in the image to be processed, which alleviates mis-processing and dead-black dark regions during halo removal and improves the accuracy of halo removal.
In a possible implementation manner, the mask image may include a distance mask image, the first mask value may be a first distance mask value of the pixel point in the distance mask image, and obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the corresponding pixel value of the pixel point in the first image, and the first mask value corresponding to the pixel point may include:
obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the pixel value of the pixel point in the first image, and the first distance mask value corresponding to the pixel point.
For example, the first distance mask value corresponding to any pixel point in the distance mask image may be determined, and the difference between the pixel value of the pixel point in the image to be processed and the product of the pixel value of the pixel point in the first image and (1 − the first distance mask value) may be taken as the pixel value of the pixel point in the halo-removed image, so as to obtain the halo-removed image.
For example, assume that pixel point A corresponds to the first distance mask values (first distance mask value 1, first distance mask value 2, first distance mask value 3) in the channels of the distance mask image, to the pixel value (a, b, c) in the image to be processed, and to the pixel value (a11, b11, c11) in the first image. The pixel value of pixel point A in the halo-removed image is then (a21, b21, c21), where a21 = a − (1 − first distance mask value 1) × a11, b21 = b − (1 − first distance mask value 2) × b11, and c21 = c − (1 − first distance mask value 3) × c11.
In this way, the closer a pixel point is to a highlight pixel point, the stronger the halo removal and the larger the weight of the pixel value of the pixel point in the first image; the farther away it is, the weaker the halo removal and the smaller that weight. The halo-removal strength of a pixel point can thus be determined from its distance to the light source (the highlight pixels) in the image to be processed, which alleviates mis-processing of the light-source region and unnatural light-source transitions, and improves the accuracy of halo removal.
In a possible implementation manner, the mask image may include a luminance mask image and a distance mask image, the first mask value may include a first luminance mask value of the pixel point in the luminance mask image and a first distance mask value in the distance mask image, and obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the corresponding pixel value of the pixel point in the first image, and the first mask value corresponding to the pixel point may include:
obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the pixel value of the pixel point in the first image, the first luminance mask value corresponding to the pixel point, and the first distance mask value.
For example, the first luminance mask value of any pixel point in the luminance mask image and the first distance mask value corresponding to the pixel point in the distance mask image may be determined, and the weight of the pixel point in the first image may be determined from them, for example: the product of the first luminance mask value and (1 − the first distance mask value) may be taken as the weight of the pixel point in the first image. The difference between the pixel value of the pixel point in the image to be processed and the product of the pixel value of the pixel point in the first image and this weight is then taken as the pixel value of the pixel point in the halo-removed image, so as to obtain the halo-removed image.
For example, assume that pixel point A corresponds to the first luminance mask values (first luminance mask value 1, first luminance mask value 2, first luminance mask value 3) in the channels of the luminance mask image and to the first distance mask values (first distance mask value 1, first distance mask value 2, first distance mask value 3) in the channels of the distance mask image, and that pixel point A corresponds to the pixel value (a, b, c) in the image to be processed and the pixel value (a11, b11, c11) in the first image. The pixel value of pixel point A in the halo-removed image is then (a21, b21, c21), where a21 = a − first luminance mask value 1 × (1 − first distance mask value 1) × a11, b21 = b − first luminance mask value 2 × (1 − first distance mask value 2) × b11, and c21 = c − first luminance mask value 3 × (1 − first distance mask value 3) × c11.
In this way, for a halo-removed image obtained from both the distance mask image and the luminance mask image, mis-processing and dead-black dark regions produced during halo removal can be alleviated, mis-processing of regions far from the light source and unnatural light-source transitions can also be alleviated, and the accuracy of halo removal is improved.
In one possible implementation manner, the performing halo-removal processing on the image to be processed to obtain the first image may include:
performing halo extraction on the image to be processed to obtain a halo map;
and obtaining the first image according to the image to be processed and the halo map.
For example, the image to be processed may be regarded as composed of the first image (the desired halo-free image) and the halo map, so the pixel value of any pixel point in the image to be processed is composed of the pixel value of that pixel point in the first image and its pixel value in the halo map. A pre-trained halo-extraction neural network can be used to extract, from the image to be processed, the halo map caused by the light source. Subtracting the halo map from the image to be processed then yields the first image: the pixel value of any pixel point in the first image is the difference between the pixel value of that pixel point in the image to be processed and its pixel value in the halo map. For example, if the image to be processed is an RGB image in which the pixel value of a certain pixel point is (a, b, c), its pixel value in the first image is (a1, b1, c1) and its pixel value in the halo map is (a2, b2, c2), where a1 = a − a2, b1 = b − b2, and c1 = c − c2.
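The additive decomposition above can be written out directly; the RGB values are illustrative, and the halo map would in practice come from the halo-extraction network.

```python
import numpy as np

# One RGB pixel of the image to be processed (illustrative values).
to_process = np.array([0.7, 0.6, 0.5])   # (a, b, c)
halo_map = np.array([0.2, 0.1, 0.05])    # (a2, b2, c2), e.g. network output

# image to be processed = first image + halo map, so:
first = to_process - halo_map            # (a1, b1, c1)
```

The same subtraction applies pixel-wise over whole arrays, so a full-image halo map is removed in one vectorized operation.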
In a possible implementation manner, the fusing the image to be processed and the first image according to the mask image to obtain a halo-removed image may include:
for any pixel point in the image to be processed, determining a first mask value corresponding to the pixel point in the mask image;
determining a second mask value from the first mask value;
determining a first pixel value according to the first mask value and the pixel value of the pixel point in the first image, and determining a second pixel value according to the second mask value and the pixel value of the pixel point in the image to be processed;
and obtaining the halo-removed image according to the first pixel value and the second pixel value.
For example, the mask image may include a luminance mask image, the first mask value may be a first luminance mask value corresponding to the pixel point in the luminance mask image, and the second mask value may be a second luminance mask value determined according to the first luminance mask value.
A first pixel value is determined according to the first luminance mask value and the pixel value of the pixel point in the first image, a second pixel value is determined according to the second luminance mask value and the pixel value of the pixel point in the image to be processed, and the halo-removed image is obtained according to the first pixel value and the second pixel value.
For example, the first luminance mask value corresponding to any pixel point in the luminance mask image may be determined, and the second luminance mask value may be determined as 1 − the first luminance mask value. A first pixel value may be determined from the first luminance mask value and the pixel value of the pixel point in the first image, for example as their product. A second pixel value may be determined from the second luminance mask value and the pixel value of the pixel point in the image to be processed, for example as their product. The sum of the first pixel value and the second pixel value is then taken as the pixel value of the pixel point in the halo-removed image, so as to obtain the halo-removed image.
For example, assume that pixel point A corresponds to the first luminance mask values (first luminance mask value 1, first luminance mask value 2, first luminance mask value 3) in the channels of the luminance mask image, to the pixel value (a, b, c) in the image to be processed, and to the pixel value (a1, b1, c1) in the first image. The pixel value of pixel point A in the halo-removed image is then (a2, b2, c2), where a2 = first luminance mask value 1 × a1 + (1 − first luminance mask value 1) × a, b2 = first luminance mask value 2 × b1 + (1 − first luminance mask value 2) × b, and c2 = first luminance mask value 3 × c1 + (1 − first luminance mask value 3) × c.
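The weighted fusion above can be written out directly; all numeric values are illustrative.

```python
import numpy as np

# Pixel A across the three channels (illustrative values).
to_process = np.array([0.6, 0.5, 0.4])   # (a, b, c) in the image to be processed
first = np.array([0.3, 0.25, 0.2])       # (a1, b1, c1) in the first image
lum_mask = np.array([1.0, 0.5, 0.0])     # first luminance mask values per channel

# mask weights the first image; (1 - mask) weights the image to be processed.
dehaloed = lum_mask * first + (1.0 - lum_mask) * to_process  # (a2, b2, c2)
```

The first channel takes the first image entirely, the third keeps the original, and the middle channel blends the two — exactly the per-channel weighting described above.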
In this way, the closer a pixel value is to the middle pixel value, the stronger the halo removal and the larger the weight of the pixel value of the pixel point in the first image; the closer a pixel value is to the upper or lower pixel-value limit, the weaker the halo removal and the smaller that weight. The halo-removal strength of a pixel point can thus be determined from its pixel value in the image to be processed, which alleviates mis-processing and dead-black dark regions during halo removal and improves the accuracy of halo removal.
For example, the mask image may include a distance mask image, the first mask value may be a first distance mask value corresponding to the pixel point in the distance mask image, and the second mask value may be a second distance mask value determined according to the first distance mask value.
A first pixel value is determined according to the first distance mask value and the pixel value of the pixel point in the image to be processed, a second pixel value is determined according to the second distance mask value and the pixel value of the pixel point in the first image, and the halo-removed image is obtained according to the first pixel value and the second pixel value.
For example, the first distance mask value corresponding to any pixel point in the distance mask image may be determined, and the second distance mask value may be determined as 1 − the first distance mask value. A first pixel value may be determined from the first distance mask value and the pixel value of the pixel point in the image to be processed, for example as their product. A second pixel value may be determined from the second distance mask value and the pixel value of the pixel point in the first image, for example as their product. The sum of the first pixel value and the second pixel value is then taken as the pixel value of the pixel point in the halo-removed image, so as to obtain the halo-removed image.
For example, assume that pixel point A corresponds to a first distance mask value in the distance mask image, to the pixel value (a, b, c) in the image to be processed, and to the pixel value (a1, b1, c1) in the first image. The pixel value of pixel point A in the halo-removed image is then (a2, b2, c2), where a2 = first distance mask value × a + (1 − first distance mask value) × a1, b2 = first distance mask value × b + (1 − first distance mask value) × b1, and c2 = first distance mask value × c + (1 − first distance mask value) × c1.
In this way, the closer a pixel point is to a highlight pixel point, the stronger the halo removal and the larger the weight of the pixel value of the pixel point in the first image; the farther a pixel point is from a highlight pixel point, the weaker the halo removal and the smaller that weight. The halo-removal strength of a pixel point can thus be determined from its distance to the light source (the highlight pixels) in the image to be processed, which alleviates mis-processing of the light-source region and unnatural light-source transitions, and improves the accuracy of halo removal.
For example, when the mask image includes a luminance mask image and a distance mask image, obtaining the halo-removed image according to the first pixel value and the second pixel value may include:
obtaining a second image according to the first pixel value and the second pixel value;
determining a first distance mask value corresponding to the pixel point in the distance mask image;
determining a second distance mask value according to the first distance mask value;
determining a third pixel value according to the first distance mask value and the pixel value of the pixel point in the image to be processed, and determining a fourth pixel value according to the second distance mask value and the pixel value of the pixel point in the second image;
and obtaining the halo-removed image according to the third pixel value and the fourth pixel value.
Referring to fig. 2, the halo-removed image may be obtained by combining the luminance mask image with the distance mask image. The image to be processed and the first image may first be fused using the luminance mask image (for the fusion process, refer to the foregoing embodiment, which is not repeated here) to obtain a second image, which is a transition image. The image to be processed and the second image may then be fused using the distance mask image, specifically: the product of the first distance mask value and the pixel value of the pixel point in the image to be processed may be taken as a third pixel value, the product of the second distance mask value and the pixel value of the pixel point in the second image may be taken as a fourth pixel value, and the sum of the third pixel value and the fourth pixel value is taken as the pixel value of the pixel point in the halo-removed image, so as to obtain the halo-removed image.
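The two-stage fusion above can be sketched as follows; the luminance-mask blend first produces the transition (second) image, then the distance-mask blend combines it with the image to be processed. All numeric values are illustrative.

```python
import numpy as np

# Pixel A across the three channels (illustrative values).
to_process = np.array([0.6, 0.5, 0.4])   # image to be processed
first = np.array([0.3, 0.25, 0.2])       # first image
lum_mask = np.array([1.0, 0.5, 0.0])     # first luminance mask values
dist_mask = np.array([0.0, 0.5, 1.0])    # first distance mask values

# Stage 1: luminance-mask fusion gives the transition (second) image.
second = lum_mask * first + (1.0 - lum_mask) * to_process
# Stage 2: distance-mask fusion of the image to be processed with it.
dehaloed = dist_mask * to_process + (1.0 - dist_mask) * second
```

Where the distance mask is 1 (far from any highlight) the original pixel survives untouched; where it is 0 the luminance-fused result is used in full.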
In this way, obtaining the halo-removed image by combining the luminance mask image with the distance mask image alleviates mis-processing and dead-black dark regions during halo removal, and also alleviates mis-processing and unnatural light-source transitions in regions far from the light source.
For example, when the mask image includes a distance mask image and a luminance mask image, the obtaining the halo-removed image according to the first pixel value and the second pixel value may include:
obtaining a third image according to the first pixel value and the second pixel value;
determining a first brightness mask value corresponding to the pixel point in the brightness mask image;
determining a second luminance mask value from the first luminance mask value;
determining a fifth pixel value according to the first brightness mask value and the pixel value of the pixel point in the third image, and determining a sixth pixel value according to the second brightness mask value and the pixel value of the pixel point in the image to be processed;
and obtaining the halo-removed image according to the fifth pixel value and the sixth pixel value.
Referring to fig. 3, the halo-removed image may be obtained by combining the distance mask image with the luminance mask image. The image to be processed and the first image may first be fused using the distance mask image (for the fusion process, refer to the foregoing embodiment, which is not repeated here) to obtain a third image, which is a transition image. The image to be processed and the third image may then be fused using the luminance mask image, specifically: the product of the first luminance mask value and the pixel value of the pixel point in the third image may be taken as a fifth pixel value, the product of the second luminance mask value and the pixel value of the pixel point in the image to be processed may be taken as a sixth pixel value, and the sum of the fifth pixel value and the sixth pixel value is taken as the pixel value of the pixel point in the halo-removed image, so as to obtain the halo-removed image.
In this way, obtaining the halo-removed image by combining the distance mask image with the luminance mask image alleviates mis-processing and dead-black dark regions during halo removal, and also alleviates mis-processing of the light-source region and unnatural light-source transitions.
It will be appreciated that the above-mentioned method embodiments of the present disclosure may be combined with each other to form combined embodiments without departing from their principles and logic; limited by space, these combinations are not described again in the present disclosure. It will also be appreciated by those skilled in the art that, in the above methods of the embodiments, the specific execution order of the steps should be determined by their functions and possible inherent logic.
In addition, the present disclosure further provides an image processing apparatus, an electronic device, a computer-readable storage medium, and a program, all of which may be used to implement any of the image processing methods provided in the present disclosure; for the corresponding technical solutions and descriptions, refer to the corresponding descriptions of the method parts, which are not repeated here.
Fig. 4 shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure, as shown in fig. 4, including:
the first processing module 401 may be configured to obtain a first image according to an image to be processed;
The second processing module 402 may be configured to obtain a mask image of the image to be processed according to pixel values of each pixel point of the image to be processed;
the fusion module 403 may be configured to fuse the image to be processed and the first image according to the mask image to obtain a halo-removed image.
In this way, the image processing apparatus provided by the present disclosure obtains the mask image of the image to be processed according to the pixel values of the pixel points in the image to be processed, and then, guided by the mask image, fuses the image to be processed with the first image obtained from the image to be processed to obtain the halo-removed image, which can improve the accuracy of halo removal.
In one possible implementation, the mask image may include a luminance mask image; the second processing module may be further configured to:
determining a brightness mask value of each pixel point according to the pixel value of each pixel point in the image to be processed, wherein the brightness mask value of each pixel point is inversely related to a pixel difference value, and the pixel difference value represents the difference value between the pixel value of each pixel point and the middle pixel value of the image to be processed;
and generating a brightness mask image of the image to be processed according to the brightness mask value of each pixel point.
In one possible implementation, the mask image may include a distance mask image; the second processing module may be further configured to:
determining highlight pixel points in the image to be processed;
determining a distance mask value of any pixel point in the image to be processed according to the distance from the pixel point to the highlight pixel point;
and generating a distance mask image of the image to be processed according to the distance mask value of each pixel point.
In one possible implementation, the second processing module may be further configured to:
and determining a distance mask value of the pixel point according to the distance from any pixel point in the image to be processed to a first highlight pixel point, wherein the first highlight pixel point is the highlight pixel point closest to the pixel point.
In one possible implementation, the second processing module may be further configured to:
downsampling the image to be processed to obtain a downsampled image;
determining the highlight pixel point in the downsampled image;
the generating a distance mask image of the image to be processed according to the distance mask value of each pixel point comprises the following steps:
generating a downsampled distance mask image according to the distance mask value of each pixel point in the downsampled image;
And upsampling the downsampled distance mask image to obtain a distance mask image of the image to be processed.
In one possible implementation, the second processing module may be further configured to:
obtaining a corresponding gray level image according to the downsampled image;
guiding the downsampling distance mask image to filter through the gray level image to obtain a filtered downsampling distance mask image;
and upsampling the filtered downsampled distance mask image to obtain a distance mask image of the image to be processed.
In one possible implementation manner, the first processing module may be further configured to:
performing halation extraction on the image to be processed to obtain a first image;
the fusion module may be further configured to:
determining a first mask value corresponding to any pixel point in the mask image aiming at any pixel point in the image to be processed;
and obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the corresponding pixel value of the pixel point in the first image, and the first mask value corresponding to the pixel point.
In one possible implementation manner, the first processing module may be further configured to:
Performing halation extraction on the image to be processed to obtain a halation image;
and performing halation removal processing on the image to be processed according to the halation map to obtain the first image.
In one possible implementation, the fusion module may be further configured to:
for any pixel point in the image to be processed, determining a first mask value corresponding to the pixel point in the mask image;
determining a second mask value from the first mask value;
determining a first pixel value according to the first mask value and the pixel value of the pixel point in the first image, and determining a second pixel value according to the second mask value and the pixel value of the pixel point in the image to be processed;
and obtaining the halation-removed image according to the first pixel value and the second pixel value.
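The four fusion steps above can be made concrete under the common assumption that the second mask value is the complement of the first and that the two pixel values are summed; the text does not fix these choices, and the function name is hypothetical.

```python
import numpy as np

def fuse_two_masks(src, first, mask1):
    """Fuse the image to be processed (`src`) with the first image
    (`first`) using first mask values `mask1` in [0, 1]."""
    m1 = mask1[..., None] if src.ndim == 3 else mask1
    m2 = 1.0 - m1        # second mask value determined from the first
    p1 = m1 * first      # first pixel value (from the first image)
    p2 = m2 * src        # second pixel value (from the image to be processed)
    return p1 + p2       # halation-removed image
```

With complementary weights the result is a convex combination, so each output pixel stays within the range spanned by the two input pixels.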
In some embodiments, functions or modules included in an apparatus provided by the embodiments of the present disclosure may be used to perform a method described in the foregoing method embodiments, and specific implementations thereof may refer to descriptions of the foregoing method embodiments, which are not repeated herein for brevity.
The disclosed embodiments also provide a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the above-described method. The computer readable storage medium may be a non-volatile computer readable storage medium.
The embodiment of the disclosure also provides an electronic device, which comprises: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to invoke the instructions stored in the memory to perform the above method.
The disclosed embodiments also provide a computer program product comprising computer readable code which, when run on a device, causes a processor in the device to execute instructions for implementing the image processing method as provided in any of the embodiments above.
The disclosed embodiments also provide another computer program product for storing computer readable instructions that, when executed, cause a computer to perform the operations of the image processing method provided in any of the above embodiments.
The electronic device may be provided as a terminal, server or other form of device.
Fig. 5 illustrates a block diagram of an electronic device 800, according to an embodiment of the disclosure. For example, electronic device 800 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, exercise device, personal digital assistant, or the like.
Referring to fig. 5, an electronic device 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interactions between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the electronic device 800. Examples of such data include instructions for any application or method operating on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply component 806 provides power to the various components of the electronic device 800. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the electronic device 800.
The multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. When the electronic device 800 is in an operational mode, such as a shooting mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focus and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 further includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 814 includes one or more sensors for providing status assessment of various aspects of the electronic device 800. For example, the sensor assembly 814 may detect an on/off state of the electronic device 800 and the relative positioning of components, such as the display and keypad of the electronic device 800; the sensor assembly 814 may also detect a change in position of the electronic device 800 or a component of the electronic device 800, the presence or absence of a user's contact with the electronic device 800, an orientation or acceleration/deceleration of the electronic device 800, and a change in temperature of the electronic device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 816 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for executing the methods described above.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 804 including computer program instructions executable by processor 820 of electronic device 800 to perform the above-described methods.
Fig. 6 illustrates a block diagram of an electronic device 1900 according to an embodiment of the disclosure. For example, electronic device 1900 may be provided as a server. Referring to FIG. 6, electronic device 1900 includes a processing component 1922 that further includes one or more processors and memory resources represented by memory 1932 for storing instructions, such as application programs, that can be executed by processing component 1922. The application programs stored in memory 1932 may include one or more modules each corresponding to a set of instructions. Further, processing component 1922 is configured to execute instructions to perform the methods described above.
The electronic device 1900 may also include a power component 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input/output (I/O) interface 1958. The electronic device 1900 may operate based on an operating system stored in memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 1932, including computer program instructions executable by processing component 1922 of electronic device 1900 to perform the methods described above.
The present disclosure may be a system, method, and/or computer program product. The computer program product may include a computer readable storage medium having computer readable program instructions embodied thereon for causing a processor to implement aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: portable computer disks, hard disks, Random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), Static Random Access Memory (SRAM), portable compact disk read-only memory (CD-ROM), Digital Versatile Disks (DVD), memory sticks, floppy disks, mechanically encoded devices such as punched cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. Computer-readable storage media, as used herein, are not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through waveguides or other transmission media (e.g., optical pulses through fiber optic cables), or electrical signals transmitted through wires.
The computer readable program instructions described herein may be downloaded from a computer readable storage medium to a respective computing/processing device or to an external computer or external storage device over a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmissions, wireless transmissions, routers, firewalls, switches, gateway computers and/or edge servers. The network interface card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium in the respective computing/processing device.
Computer program instructions for performing the operations of the present disclosure can be assembly instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer readable program instructions may be executed entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present disclosure are implemented by personalizing electronic circuitry, such as programmable logic circuitry, Field Programmable Gate Arrays (FPGAs), or Programmable Logic Arrays (PLAs), with state information of computer readable program instructions, which can execute the computer readable program instructions.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The computer program product may be realized in particular by means of hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium, and in another alternative embodiment, the computer program product is embodied as a software product, such as a software development kit (Software Development Kit, SDK), or the like.
The foregoing description of the embodiments of the present disclosure has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (12)

1. An image processing method, comprising:
obtaining a first image according to an image to be processed, wherein the first image is an image obtained after a first de-halation of the image to be processed, or the first image is a halation image corresponding to the image to be processed;
obtaining a mask image of the image to be processed according to the pixel values of the pixel points of the image to be processed, wherein, when the first image is the image obtained after the first de-halation, the mask values in the mask image are used to represent the weights of the pixel values of the pixel points in the image to be processed and in the first image when the image to be processed and the first image are fused to obtain a halation-removed image; and when the first image is the halation image, the mask values in the mask image are used to represent the weights corresponding to the pixel values of the pixel points in the first image when the image to be processed and the first image are fused to obtain the halation-removed image;
fusing the image to be processed and the first image according to the mask image to obtain the halation-removed image, comprising: determining the pixel value of each pixel point in the halation-removed image according to the weights, the pixel values of the pixel points in the image to be processed, and the pixel values of the pixel points in the first image.
2. The method of claim 1, wherein the mask image comprises a luminance mask image; the obtaining a mask image of the image to be processed according to the pixel value of each pixel point of the image to be processed comprises the following steps:
Determining a brightness mask value of each pixel point according to the pixel value of each pixel point in the image to be processed, wherein the brightness mask value of each pixel point is inversely related to a pixel difference value, and the pixel difference value represents the difference value between the pixel value of each pixel point and the middle pixel value of the image to be processed;
and generating a brightness mask image of the image to be processed according to the brightness mask value of each pixel point.
3. The method of claim 1, wherein the mask image comprises a distance mask image; the obtaining a mask image of the image to be processed according to the pixel value of each pixel point of the image to be processed comprises the following steps:
determining highlight pixel points in the image to be processed;
determining a distance mask value of any pixel point in the image to be processed according to the distance from the pixel point to the highlight pixel point;
and generating a distance mask image of the image to be processed according to the distance mask value of each pixel point.
4. The method according to claim 3, wherein the determining a distance mask value of any pixel point in the image to be processed according to the distance from the pixel point to a highlight pixel point comprises:
and determining a distance mask value of the pixel point according to the distance from any pixel point in the image to be processed to a first highlight pixel point, wherein the first highlight pixel point is the highlight pixel point closest to the pixel point.
5. The method according to claim 3 or 4, wherein said determining highlight pixels in said image to be processed comprises:
downsampling the image to be processed to obtain a downsampled image;
determining the highlight pixel point in the downsampled image;
the generating a distance mask image of the image to be processed according to the distance mask value of each pixel point comprises the following steps:
generating a downsampled distance mask image according to the distance mask value of each pixel point in the downsampled image;
and upsampling the downsampled distance mask image to obtain a distance mask image of the image to be processed.
6. The method of claim 5, wherein upsampling the downsampled distance mask image to obtain the distance mask image of the image to be processed comprises:
obtaining a corresponding grayscale image from the downsampled image;
performing guided filtering on the downsampled distance mask image, using the grayscale image as the guide, to obtain a filtered downsampled distance mask image;
and upsampling the filtered downsampled distance mask image to obtain the distance mask image of the image to be processed.
7. The method according to any one of claims 1 to 4, wherein the obtaining a first image from the image to be processed comprises:
performing halation extraction on the image to be processed to obtain a first image;
the fusing of the image to be processed and the first image according to the mask image to obtain a halation-removed image comprises:
for any pixel point in the image to be processed, determining a first mask value corresponding to the pixel point in the mask image;
and obtaining the halation-removed image according to the pixel value of the pixel point in the image to be processed, the corresponding pixel value of the pixel point in the first image, and the first mask value corresponding to the pixel point.
8. The method according to any one of claims 1 to 4, wherein the obtaining a first image from the image to be processed comprises:
performing halation extraction on the image to be processed to obtain a halation image;
and performing halation-removal processing on the image to be processed according to the halation image to obtain the first image.
9. The method according to claim 8, wherein the fusing of the image to be processed and the first image according to the mask image to obtain a halation-removed image comprises:
for any pixel point in the image to be processed, determining a first mask value corresponding to the pixel point in the mask image;
determining a second mask value from the first mask value;
determining a first pixel value according to the first mask value and the pixel value of the pixel point in the first image, and determining a second pixel value according to the second mask value and the pixel value of the pixel point in the image to be processed;
and obtaining the halation-removed image according to the first pixel value and the second pixel value.
10. An image processing apparatus, comprising:
the first processing module is configured to obtain a first image according to an image to be processed, wherein the first image is an image obtained after a first de-halation of the image to be processed, or the first image is a halation image corresponding to the image to be processed;
the second processing module is configured to obtain a mask image of the image to be processed according to the pixel values of the pixel points of the image to be processed, wherein, when the first image is the image obtained after the first de-halation, the mask values in the mask image are used to represent the weights of the pixel values of the pixel points in the image to be processed and in the first image when the image to be processed and the first image are fused to obtain a halation-removed image; and when the first image is the halation image, the mask values in the mask image are used to represent the weights corresponding to the pixel values of the pixel points in the first image when the image to be processed and the first image are fused to obtain the halation-removed image;
the fusion module is configured to fuse the image to be processed and the first image according to the mask image to obtain the halation-removed image, comprising: determining the pixel value of each pixel point in the halation-removed image according to the weights, the pixel values of the pixel points of the image to be processed, and the pixel values of the pixel points in the first image.
11. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to invoke the instructions stored in the memory to perform the method of any of claims 1 to 9.
12. A computer readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the method of any of claims 1 to 9.
CN201911415910.5A 2019-12-31 2019-12-31 Image processing method and device, electronic equipment and storage medium Active CN111192218B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911415910.5A CN111192218B (en) 2019-12-31 2019-12-31 Image processing method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN111192218A CN111192218A (en) 2020-05-22
CN111192218B (en) 2023-11-24

Family

ID=70709725

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911415910.5A Active CN111192218B (en) 2019-12-31 2019-12-31 Image processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111192218B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112200889A (en) * 2020-10-30 2021-01-08 上海商汤智能科技有限公司 Sample image generation method, sample image processing method, intelligent driving control method and device
CN112837251B (en) * 2020-12-31 2024-04-02 维沃移动通信有限公司 Image processing method and device
CN113034389B (en) * 2021-03-17 2023-07-25 武汉联影智融医疗科技有限公司 Image processing method, device, computer equipment and storage medium
CN113689361B (en) * 2021-10-27 2022-02-22 深圳市慧鲤科技有限公司 Image processing method and device, electronic equipment and storage medium
CN118115486B (en) * 2024-04-01 2024-08-30 山东仕达思生物产业有限公司 Light-transmitting target depth of field superposition method and device for removing halation interference

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10209420A (en) * 1997-01-17 1998-08-07 Sony Corp Manufacture of solid-state imaging element
CN107231525A (en) * 2017-05-31 2017-10-03 深圳可思美科技有限公司 The dark angle self-adapting compensation method of image and device
CN108229353A (en) * 2017-12-21 2018-06-29 深圳市商汤科技有限公司 Sorting technique and device, electronic equipment, storage medium, the program of human body image
CN108932696A (en) * 2017-05-26 2018-12-04 杭州海康威视数字技术股份有限公司 The Halation inhibition method and device of signal lamp
CN108961170A (en) * 2017-05-24 2018-12-07 阿里巴巴集团控股有限公司 Image processing method, device and system
US10382712B1 (en) * 2018-08-01 2019-08-13 Qualcomm Incorporated Automatic removal of lens flares from images

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5498335B2 (en) * 2010-09-27 2014-05-21 富士フイルム株式会社 Imaging apparatus and endoscope diagnostic apparatus
US10298864B2 (en) * 2016-06-10 2019-05-21 Apple Inc. Mismatched foreign light detection and mitigation in the image fusion of a two-camera system


Also Published As

Publication number Publication date
CN111192218A (en) 2020-05-22

Similar Documents

Publication Publication Date Title
CN110348537B (en) Image processing method and device, electronic equipment and storage medium
CN110378976B (en) Image processing method and device, electronic equipment and storage medium
CN111192218B (en) Image processing method and device, electronic equipment and storage medium
US20210118112A1 (en) Image processing method and device, and storage medium
CN110889469B (en) Image processing method and device, electronic equipment and storage medium
CN111462268B (en) Image reconstruction method and device, electronic equipment and storage medium
CN112001321B (en) Network training method, pedestrian re-identification method, device, electronic equipment and storage medium
CN109784255B (en) Neural network training method and device and recognition method and device
CN111553864B (en) Image restoration method and device, electronic equipment and storage medium
CN112465843A (en) Image segmentation method and device, electronic equipment and storage medium
CN110569777B (en) Image processing method and device, electronic device and storage medium
CN111340731B (en) Image processing method and device, electronic equipment and storage medium
CN110633700B (en) Video processing method and device, electronic equipment and storage medium
CN111523346B (en) Image recognition method and device, electronic equipment and storage medium
CN113139947B (en) Image processing method and device, electronic equipment and storage medium
CN110619610B (en) Image processing method and device
CN109934240B (en) Feature updating method and device, electronic equipment and storage medium
CN110633715B (en) Image processing method, network training method and device and electronic equipment
CN111583142A (en) Image noise reduction method and device, electronic equipment and storage medium
CN111369482B (en) Image processing method and device, electronic equipment and storage medium
CN110415258B (en) Image processing method and device, electronic equipment and storage medium
CN113689361B (en) Image processing method and device, electronic equipment and storage medium
CN109840890B (en) Image processing method and device, electronic equipment and storage medium
CN112598676B (en) Image segmentation method and device, electronic equipment and storage medium
CN112102300B (en) Counting method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant