CN111192218A - Image processing method and device, electronic equipment and storage medium


Info

Publication number
CN111192218A
Authority
CN
China
Prior art keywords
image
processed
pixel
mask
value
Prior art date
Legal status
Granted
Application number
CN201911415910.5A
Other languages
Chinese (zh)
Other versions
CN111192218B (en)
Inventor
叶鑫
王昊然
王甜甜
严琼
Current Assignee
Shenzhen Sensetime Technology Co Ltd
Original Assignee
Shenzhen Sensetime Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Sensetime Technology Co Ltd filed Critical Shenzhen Sensetime Technology Co Ltd
Priority to CN201911415910.5A
Publication of CN111192218A
Application granted
Publication of CN111192218B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T5/77 Retouching; Inpainting; Scratch removal
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging


Abstract

The present disclosure relates to an image processing method and apparatus, an electronic device, and a storage medium. The method includes: obtaining a first image according to an image to be processed; obtaining a mask image of the image to be processed according to the pixel value of each pixel point of the image to be processed; and fusing the image to be processed and the first image according to the mask image to obtain a halo-removed image. Embodiments of the present disclosure can improve the precision of halo removal.

Description

Image processing method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an image processing method and apparatus, an electronic device, and a storage medium.
Background
With the development of science and technology, terminal devices such as mobile phones can serve as photographing devices in place of cameras.
Night scenes usually contain many artificial light sources, such as street lamps. Light emitted by these sources is scattered multiple times between the optical surfaces of the imaging system, producing noticeable halos near light-source regions in the final image; such halos impair both the aesthetics of the image and the faithful restoration of scene detail.
To meet users' demand for night photography, many manufacturers provide a night-scene mode on terminal devices such as mobile phones. In this mode, images can be processed by a simple iterative algorithm or a deep-learning neural network to remove halos, but the removal effect is often unsatisfactory.
Disclosure of Invention
The present disclosure provides a technical solution for removing halos from an image.
According to an aspect of the present disclosure, there is provided an image processing method including:
obtaining a first image according to the image to be processed;
obtaining a mask image of the image to be processed according to the pixel value of each pixel point of the image to be processed;
and fusing the image to be processed and the first image according to the mask image to obtain a halo-removed image.
In one possible implementation, the mask image comprises a luminance mask image; the obtaining of the mask image of the image to be processed according to the pixel values of the pixels of the image to be processed includes:
determining a luminance mask value of each pixel point according to the pixel value of each pixel point in the image to be processed, wherein the luminance mask value of a pixel point is negatively correlated with its pixel difference value, and the pixel difference value represents the difference between the pixel value of the pixel point and the intermediate pixel value of the image to be processed;
and generating a luminance mask image of the image to be processed according to the luminance mask value of each pixel point.
In one possible implementation, the mask image includes a distance mask image; the obtaining of the mask image of the image to be processed according to the pixel values of the pixels of the image to be processed includes:
determining highlight pixel points in the image to be processed;
determining a distance mask value of the pixel point according to the distance from any pixel point in the image to be processed to the highlight pixel point;
and generating a distance mask image of the image to be processed according to the distance mask value of each pixel point.
In a possible implementation manner, the determining a distance mask value of a pixel point according to a distance from any pixel point in the image to be processed to a highlight pixel point includes:
and determining a distance mask value of the pixel point according to the distance from any pixel point in the image to be processed to a first highlight pixel point, wherein the first highlight pixel point is the highlight pixel point closest to the pixel point.
In a possible implementation manner, the determining highlight pixel points in the image to be processed includes:
down-sampling the image to be processed to obtain a down-sampled image;
determining the highlight pixel points in the down-sampled image;
the generating of the distance mask image of the image to be processed according to the distance mask value of each pixel point includes:
generating a down-sampling distance mask image according to the distance mask value of each pixel point in the down-sampling image;
and performing up-sampling on the down-sampling distance mask image to obtain a distance mask image of the image to be processed.
In a possible implementation manner, the upsampling the downsampled distance mask image to obtain a distance mask image of the image to be processed includes:
obtaining a corresponding gray image according to the down-sampling image;
guiding the down-sampling distance mask image to filter through the gray level image to obtain a filtered down-sampling distance mask image;
and performing up-sampling on the filtered down-sampling distance mask image to obtain a distance mask image of the image to be processed.
In a possible implementation manner, the obtaining a first image according to an image to be processed includes:
performing halo extraction on the image to be processed to obtain a first image;
the fusing the image to be processed and the first image according to the mask image to obtain a halo-removed image, comprising:
determining a first mask value corresponding to any pixel point in an image to be processed in the mask image;
and obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the pixel value of the pixel point in the first image, and the first mask value corresponding to the pixel point.
In a possible implementation manner, the obtaining a first image according to an image to be processed includes:
performing halo extraction on the image to be processed to obtain a halo image;
and performing halo removing treatment on the image to be processed according to the halo image to obtain the first image.
In a possible implementation manner, the fusing the image to be processed and the first image according to the mask image to obtain a halo-removed image includes:
aiming at any pixel point in the image to be processed, determining a first mask value corresponding to the pixel point in the mask image;
determining a second mask value from the first mask value;
determining a first pixel value according to the first mask value and the pixel value of the pixel point in the first image, and determining a second pixel value according to the second mask value and the pixel value of the pixel point in the image to be processed;
and obtaining the halo-removed image according to the first pixel value and the second pixel value.
According to an aspect of the present disclosure, there is provided an image processing apparatus including:
the first processing module is used for obtaining a first image according to the image to be processed;
the second processing module is used for obtaining a mask image of the image to be processed according to the pixel value of each pixel point of the image to be processed;
and the fusion module is used for fusing the image to be processed and the first image according to the mask image to obtain a halo-removed image.
In one possible implementation, the mask image comprises a luminance mask image; the second processing module is further configured to:
determining a luminance mask value of each pixel point according to the pixel value of each pixel point in the image to be processed, wherein the luminance mask value of a pixel point is negatively correlated with its pixel difference value, and the pixel difference value represents the difference between the pixel value of the pixel point and the intermediate pixel value of the image to be processed;
and generating a luminance mask image of the image to be processed according to the luminance mask value of each pixel point.
In one possible implementation, the mask image includes a distance mask image; the second processing module is further configured to:
determining highlight pixel points in the image to be processed;
determining a distance mask value of the pixel point according to the distance from any pixel point in the image to be processed to the highlight pixel point;
and generating a distance mask image of the image to be processed according to the distance mask value of each pixel point.
In a possible implementation manner, the second processing module is further configured to:
and determining a distance mask value of the pixel point according to the distance from any pixel point in the image to be processed to a first highlight pixel point, wherein the first highlight pixel point is the highlight pixel point closest to the pixel point.
In a possible implementation manner, the second processing module is further configured to:
down-sampling the image to be processed to obtain a down-sampled image;
determining the highlight pixel points in the down-sampled image;
the generating of the distance mask image of the image to be processed according to the distance mask value of each pixel point includes:
generating a down-sampling distance mask image according to the distance mask value of each pixel point in the down-sampling image;
and performing up-sampling on the down-sampling distance mask image to obtain a distance mask image of the image to be processed.
In a possible implementation manner, the second processing module is further configured to:
obtaining a corresponding gray image according to the down-sampling image;
guiding the down-sampling distance mask image to filter through the gray level image to obtain a filtered down-sampling distance mask image;
and performing up-sampling on the filtered down-sampling distance mask image to obtain a distance mask image of the image to be processed.
In a possible implementation manner, the first processing module is further configured to:
performing halo extraction on the image to be processed to obtain a first image;
the fusion module is further configured to:
determining a first mask value corresponding to any pixel point in an image to be processed in the mask image;
and obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the pixel value of the pixel point in the first image, and the first mask value corresponding to the pixel point.
In a possible implementation manner, the first processing module is further configured to:
performing halo extraction on the image to be processed to obtain a halo image;
and performing halo removing treatment on the image to be processed according to the halo image to obtain the first image.
In a possible implementation manner, the fusion module is further configured to:
aiming at any pixel point in the image to be processed, determining a first mask value corresponding to the pixel point in the mask image;
determining a second mask value from the first mask value;
determining a first pixel value according to the first mask value and the pixel value of the pixel point in the first image, and determining a second pixel value according to the second mask value and the pixel value of the pixel point in the image to be processed;
and obtaining the halo-removed image according to the first pixel value and the second pixel value.
According to an aspect of the present disclosure, there is provided an electronic device including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to invoke the memory-stored instructions to perform the above-described method.
According to an aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the above-described method.
Therefore, according to the image processing method and apparatus, the electronic device, and the storage medium described above, a mask image of the image to be processed is obtained from the pixel value of each pixel point in the image to be processed, and the mask image then guides the fusion of the image to be processed with the first image obtained from it, yielding a halo-removed image with improved precision.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure. Other features and aspects of the present disclosure will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure.
FIG. 1 shows a flow diagram of an image processing method according to an embodiment of the present disclosure;
FIG. 2 shows a schematic diagram of an image processing method according to an embodiment of the present disclosure;
FIG. 3 shows a schematic diagram of an image processing method according to an embodiment of the present disclosure;
FIG. 4 shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure;
FIG. 5 shows a block diagram of an electronic device 800 in accordance with an embodiment of the disclosure;
FIG. 6 shows a block diagram of an electronic device 1900 according to an embodiment of the disclosure.
Detailed Description
Various exemplary embodiments, features and aspects of the present disclosure will be described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers can indicate functionally identical or similar elements. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used exclusively herein to mean "serving as an example, embodiment, or illustration. Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
The term "and/or" herein is merely an association describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality, for example, including at least one of A, B, C, and may mean including any one or more elements selected from the group consisting of A, B and C.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a better understanding of the present disclosure. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, elements and circuits that are well known to those skilled in the art have not been described in detail so as not to obscure the present disclosure.
Fig. 1 shows a flowchart of an image processing method according to an embodiment of the present disclosure. The method may be performed by an electronic device such as a terminal device or a server. The terminal device may be a User Equipment (UE), a mobile device, a user terminal, a cellular phone, a cordless phone, a Personal Digital Assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, or the like, and the method may be implemented by a processor calling computer-readable instructions stored in a memory. Alternatively, the method may be performed by a server.
As shown in Fig. 1, the image processing method may include:
in step S11, a first image is obtained from the image to be processed.
For example, the image to be processed may be subjected to halo removal by a simple iterative algorithm or by a neural network pre-trained for halo removal, so as to obtain a first image; in this case the first image is the result of a preliminary halo removal. Alternatively, halo extraction may be performed on the image to be processed by a network pre-trained for halo extraction, so as to obtain a first image; in this case the first image is the halo image corresponding to the image to be processed.
In step S12, a mask image of the image to be processed is obtained according to the pixel values of the pixels of the image to be processed.
For example, the pixel values of the pixel points in the image to be processed may be obtained (including the pixel value of each color channel; for a three-channel RGB (Red, Green, Blue) image, the pixel values of the three channels may be obtained separately). A mask value for each pixel point may then be determined from its pixel value, and the mask image of the image to be processed obtained from these mask values. When the first image is the result of a preliminary halo removal, the mask values represent the weights of the pixel values of the image to be processed and of the first image when the two are fused into the halo-removed image; when the first image is a halo image, the mask values represent the weights of the pixel values of the first image during that fusion.
In a possible implementation manner, the mask image may include a luminance mask image and/or a distance mask image. The luminance mask image identifies the relationship between the pixel value of each pixel point in the image to be processed and the intermediate pixel value, where the intermediate pixel value is the value midway between the upper and lower pixel-value limits of the image to be processed. The distance mask image identifies the distance relationship between each pixel point in the image to be processed and the highlight pixel points, where a highlight pixel point is a pixel point whose pixel value in any color channel is greater than a pixel threshold.
For example, a luminance mask image may be determined from the pixel value of each pixel point in the image to be processed. The luminance mask value of any pixel point identifies how far its pixel value is from the intermediate pixel value of the image to be processed (the average of the upper and lower pixel-value limits allowed by the image format may be taken as the intermediate pixel value). For example, the closer a pixel value is to the intermediate pixel value, the larger the corresponding luminance mask value (the maximum may be set to 1). For any pixel point and any color channel, the luminance mask value may be taken as the weight of that channel's pixel value in the first image when the image to be processed and the first image are fused into the halo-removed image, and (1 - luminance mask value) as the weight of that channel's pixel value in the image to be processed.
A distance mask image may be determined from the pixel value and spatial position of each pixel point in the image to be processed. The distance mask value of any pixel point identifies its distance relationship to the highlight pixel points (after normalizing the pixel values of the image to be processed to [0, 1], a pixel point whose value in any color channel exceeds a pixel threshold, e.g., 0.99, is a highlight pixel point). For example, the closer a pixel point is to a highlight pixel point, the smaller its distance mask value (the minimum may be set to 0). The distance mask value of any pixel point may be taken as the weight of its pixel value in the image to be processed when the image to be processed and the first image are fused into the halo-removed image, and (1 - distance mask value) as the weight of its pixel value in the first image.
In step S13, the image to be processed and the first image are fused according to the mask image, so as to obtain a halo-removed image.
For example, when the first image is the result of a preliminary halo removal, the weights of the pixel values of the image to be processed and of the first image may be determined from the mask image, and the two images fused accordingly to obtain the halo-removed image (for example, for any pixel point, the sum of its weighted pixel value in the image to be processed and its weighted pixel value in the first image is taken as its pixel value in the halo-removed image). Alternatively, when the first image is a halo image, the weight of each pixel point of the first image may be determined from the mask image and the fusion performed accordingly (for example, for any pixel point, the difference between its pixel value in the image to be processed and its weighted pixel value in the first image is taken as its pixel value in the halo-removed image).
Thus, according to the image processing method provided by the present disclosure, a mask image of the image to be processed is obtained from the pixel value of each pixel point, and the mask image then guides the fusion of the image to be processed with the first image obtained from it, yielding a halo-removed image with improved precision.
In a possible implementation manner, the obtaining the mask image of the image to be processed according to the pixel values of the pixels of the image to be processed includes:
determining a luminance mask value of each pixel point according to the pixel value of each pixel point in the image to be processed, wherein the luminance mask value of a pixel point is negatively correlated with its pixel difference value, and the pixel difference value represents the difference between the pixel value of the pixel point and the intermediate pixel value of the image to be processed;
and generating a luminance mask image of the image to be processed according to the luminance mask value of each pixel point.
For example, the upper limit and the lower limit of the pixel values allowed by the image to be processed may be determined according to the image format to which the image to be processed belongs, for example: for an RGB image with an upper limit of 255 and a lower limit of 0, the intermediate pixel value can be determined to be 127.5.
The luminance mask value of each pixel point may be determined from its pixel value in the image to be processed, where the luminance mask value corresponding to the upper and lower pixel-value limits is a first value and the luminance mask value corresponding to the intermediate pixel value is a second value; as a pixel value moves away from the intermediate pixel value toward either limit, its luminance mask value changes from the second value toward the first value.
In a possible implementation manner, the first value may be set to 0 and the second value to 1, so that the luminance mask values in the luminance mask image lie in [0, 1]. After the luminance mask value of each pixel point is determined, the luminance mask image of the image to be processed can be generated from these values.
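As an illustration, this linear ramp can be written in a few lines of NumPy. The following is a minimal sketch under the stated assumptions (first value 0, second value 1, per-channel operation on an 8-bit image); the function name and the exact linear mapping are illustrative, not mandated by the embodiment.

```python
import numpy as np

def luminance_mask(img: np.ndarray, lower: float = 0.0, upper: float = 255.0) -> np.ndarray:
    """Per-channel luminance mask: 1 at the intermediate pixel value,
    falling linearly to 0 at the upper and lower pixel-value limits."""
    mid = (upper + lower) / 2.0                  # e.g. 127.5 for an 8-bit RGB image
    diff = np.abs(img.astype(np.float32) - mid)  # the "pixel difference value"
    return 1.0 - diff / (mid - lower)            # negatively correlated with the difference
```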
The luminance mask image can therefore guide the fusion of the image to be processed and the first image. Since the luminance mask image identifies how far each pixel value is from the intermediate pixel value, and prior knowledge indicates that halo regions mostly lie in mid-brightness areas, the larger the difference between a pixel value and the intermediate pixel value, the lower the probability that the pixel lies in a halo region, i.e., the smaller its halo. The halo-removal strength of each pixel point can thus be set by its luminance mask value, for example by using it to weight the pixel values of the first image and of the image to be processed during fusion, and then computing each pixel value of the halo-removed image from those weights. This alleviates mis-processing and dead-black dark regions during halo removal and improves its precision.
In a possible implementation manner, the obtaining the mask image of the image to be processed according to the pixel values of the pixels of the image to be processed includes:
determining highlight pixel points in the image to be processed;
determining a distance mask value of the pixel point according to the distance from any pixel point in the image to be processed to the highlight pixel point;
and generating a distance mask image of the image to be processed according to the distance mask value of each pixel point.
For example, after the pixel values of the image to be processed are normalized, any pixel point whose value in any color channel exceeds a pixel threshold (which may be set to 0.99, for example) is determined to be a highlight pixel point. For any pixel point, its distance mask value may be determined from its distance to the highlight pixel points, and the distance mask image of the image to be processed generated from the distance mask values of all pixel points.
In a possible implementation manner, the determining a distance mask value of a pixel point according to a distance from any pixel point in the image to be processed to a highlight pixel point may include:
and determining a distance mask value of the pixel point according to the distance from any pixel point in the image to be processed to a first highlight pixel point, wherein the first highlight pixel point is the highlight pixel point closest to the pixel point.
For example, the distance from the pixel point to the nearest first highlight pixel point may be determined, and the distance mask value of the pixel point determined from that distance (the color channels of a pixel point share the same distance mask value). For example, distances greater than a preset distance threshold may be clamped to the threshold, the distances of all pixel points normalized to [0, 1], and the normalized distance taken as the distance mask value of each pixel point; the distance mask image is then generated from these values.
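One possible realization of this step uses a Euclidean distance transform. The sketch below assumes an 8-bit multi-channel image; the distance threshold max_dist and the function name are illustrative, not values fixed by the embodiment.

```python
import numpy as np
from scipy import ndimage

def distance_mask(img: np.ndarray, pixel_threshold: float = 0.99,
                  max_dist: float = 64.0) -> np.ndarray:
    """Distance mask: 0 at highlight pixel points, growing with the distance
    to the nearest highlight pixel point, clamped and normalized to [0, 1]."""
    norm = img.astype(np.float32) / 255.0              # normalize pixel values to [0, 1]
    highlight = (norm > pixel_threshold).any(axis=-1)  # highlight if any channel exceeds threshold
    # Euclidean distance from each pixel to the nearest highlight pixel
    dist = ndimage.distance_transform_edt(~highlight)
    return np.minimum(dist, max_dist) / max_dist       # one mask value shared by all channels
```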
In a possible implementation manner, the determining highlight pixel points in the image to be processed may include:
down-sampling the image to be processed to obtain a down-sampled image;
determining the highlight pixel points in the down-sampled image.
The generating of the distance mask image of the image to be processed according to the distance mask value of each pixel point may include:
generating a down-sampling distance mask image according to the distance mask value of each pixel point in the down-sampling image;
and performing up-sampling on the down-sampling distance mask image to obtain a distance mask image of the image to be processed.
For example, the image to be processed may be down-sampled to obtain a down-sampled image, e.g., by area averaging; the present disclosure does not limit the down-sampling method. The distance mask value of each pixel point in the down-sampled image is determined, a down-sampled distance mask image is generated from these values, and the down-sampled distance mask image is up-sampled to obtain a distance mask image of the same size as the image to be processed. This improves the efficiency of image processing.
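For instance, area-average down-sampling is available in OpenCV; in the snippet below, img is assumed to hold the image to be processed as a NumPy array, and the 4x factor is only an example.

```python
import cv2

# Down-sample the image to be processed by area averaging (factor is illustrative)
down_img = cv2.resize(img, None, fx=0.25, fy=0.25, interpolation=cv2.INTER_AREA)
```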
In a possible implementation manner, the upsampling the downsampled distance mask image to obtain a distance mask image of the image to be processed may include:
obtaining a corresponding gray image according to the down-sampling image;
guiding the down-sampling distance mask image to filter through the gray level image to obtain a filtered down-sampling distance mask image;
and performing up-sampling on the filtered down-sampling distance mask image to obtain a distance mask image of the image to be processed.
For example, a corresponding grayscale image can be obtained from the down-sampled image, and the down-sampled distance mask image filtered under the guidance of the grayscale image (e.g., bilateral filtering, guided filtering, or a domain-transform recursive filter) to obtain a filtered down-sampled distance mask image. The filtered down-sampled distance mask image adheres better to the edges of the corresponding content in the image to be processed, avoiding unnatural artifacts. The filtered down-sampled distance mask image is then up-sampled to obtain a distance mask image of the same size as the image to be processed.
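One way to realize this, sketched below, uses the guided filter from OpenCV's ximgproc contrib module; the radius and eps parameters are illustrative, and any of the edge-preserving filters named above could be substituted.

```python
import cv2
import numpy as np

def upsample_distance_mask(down_img: np.ndarray, down_mask: np.ndarray,
                           full_size: tuple) -> np.ndarray:
    """Guide-filter the low-resolution distance mask with the grayscale
    down-sampled image, then up-sample it to the size of the image to be
    processed. full_size is (width, height)."""
    gray = cv2.cvtColor(down_img, cv2.COLOR_BGR2GRAY).astype(np.float32) / 255.0
    filtered = cv2.ximgproc.guidedFilter(guide=gray,
                                         src=down_mask.astype(np.float32),
                                         radius=8, eps=1e-3)
    return cv2.resize(filtered, full_size, interpolation=cv2.INTER_LINEAR)
```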
The distance mask image can therefore guide the fusion of the image to be processed and the first image. Since the distance mask image identifies the distance between each pixel point and the highlight pixel points, and prior knowledge indicates that the halo at a pixel point decreases as its distance from the light source increases, the halo-removal strength of each pixel point can be set by its distance mask value. For example, when fusing the image to be processed and the first image into the halo-removed image, the weights of the pixel values of the first image and of the image to be processed may be determined from the distance mask values, and each pixel value of the halo-removed image computed from those weights. This alleviates mis-processing of regions far from the light source and unnatural transitions around the light source, improving the precision of halo removal.
In a possible implementation manner, the obtaining the first image according to the image to be processed may include:
performing halo extraction on the image to be processed to obtain a first image;
the fusing the to-be-processed image and the first image according to the mask image to obtain a halo-removed image may include:
determining a first mask value corresponding to any pixel point in an image to be processed in the mask image;
and obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the pixel value of the pixel point in the first image, and the first mask value corresponding to the pixel point.
For example, a pre-trained neural network for halo extraction may be used to perform halo extraction on an image to be processed, and a halo map caused by a light source is extracted, where the halo map may be used as the first image.
In a possible implementation manner, the obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the pixel value of the pixel point in the first image, and the first mask value corresponding to the pixel point may include:
obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the pixel value of the pixel point in the first image, and a first luminance mask value corresponding to the pixel point.
For example, the first luminance mask value corresponding to any pixel point in the luminance mask image may be determined, and the difference between the pixel value of the pixel point in the image to be processed and the product of its pixel value in the first image and the first luminance mask value taken as its pixel value in the halo-removed image, so as to obtain the halo-removed image.
For example, suppose pixel A corresponds to per-channel values (first luminance mask value 1, first luminance mask value 2, first luminance mask value 3) in the luminance mask image, has pixel values (a, b, c) in the image to be processed, and has pixel values (a11, b11, c11) in the first image. Then the per-channel values of pixel A in the halo-removed image are (a21, b21, c21), where a21 = a - first luminance mask value 1 × a11, b21 = b - first luminance mask value 2 × b11, and c21 = c - first luminance mask value 3 × c11.
Thus, the closer a pixel value is to the intermediate pixel value, the stronger the halo removal and the larger the contribution of the pixel value in the first image; conversely, the closer a pixel value is to the upper or lower limit, the weaker the halo removal and the smaller that contribution. The halo-removal strength of a pixel point can therefore be determined by its pixel value in the image to be processed, alleviating mis-processing and dead-black dark regions during halo removal and improving its precision.
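In NumPy form this per-channel subtraction is a single expression. The sketch below assumes the mask and halo arrays broadcast against the image; the final clipping to the valid range is an implementation detail added here, not part of the embodiment.

```python
import numpy as np

def remove_halo_luminance(img: np.ndarray, halo: np.ndarray,
                          lum_mask: np.ndarray) -> np.ndarray:
    """Halo-removed image = image to be processed - luminance mask * halo map,
    computed per pixel and per channel (arrays must broadcast together)."""
    out = img.astype(np.float32) - lum_mask * halo.astype(np.float32)
    return np.clip(out, 0.0, 255.0).astype(np.uint8)
```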
In a possible implementation manner, the obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the pixel value of the pixel point in the first image, and the first mask value corresponding to the pixel point may include:
obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the pixel value of the pixel point in the first image, and a first distance mask value corresponding to the pixel point.
For example, the first distance mask value corresponding to any pixel point in the distance mask image may be determined, and the difference between the pixel value of the pixel point in the image to be processed and the product of its pixel value in the first image and (1 - first distance mask value) taken as its pixel value in the halo-removed image, so as to obtain the halo-removed image.
For example, suppose pixel A corresponds to per-channel values (first distance mask value 1, first distance mask value 2, first distance mask value 3) in the distance mask image, has pixel values (a, b, c) in the image to be processed, and has pixel values (a11, b11, c11) in the first image. Then the per-channel values of pixel A in the halo-removed image are (a21, b21, c21), where a21 = a - (1 - first distance mask value 1) × a11, b21 = b - (1 - first distance mask value 2) × b11, and c21 = c - (1 - first distance mask value 3) × c11.
Thus, the closer a pixel point is to a highlight pixel point, the stronger the halo removal and the larger the contribution of its pixel value in the first image; conversely, the farther a pixel point is from the highlight pixel points, the weaker the halo removal and the smaller that contribution. The halo-removal strength of a pixel point can therefore be determined by its distance to the light source (highlight pixels), alleviating mis-processing of regions far from the light source and unnatural transitions around the light source, and improving the precision of halo removal.
In a possible implementation manner, the mask image may include both a luminance mask image and a distance mask image, and the first mask value may include a first luminance mask value of the pixel point in the luminance mask image and a first distance mask value in the distance mask image. The obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the pixel value of the pixel point in the first image, and the first mask value corresponding to the pixel point may include:
obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the pixel value of the pixel point in the first image, and the first luminance mask value and first distance mask value corresponding to the pixel point.
For example, the first luminance mask value of any pixel point in the luminance mask image and its first distance mask value in the distance mask image may be determined, and the weight of the pixel point in the first image determined from the two, for example as the product of the first luminance mask value and (1 - first distance mask value). The difference between the pixel value of the pixel point in the image to be processed and the product of its pixel value in the first image and this weight is then taken as its pixel value in the halo-removed image, so as to obtain the halo-removed image.
For example, suppose pixel A corresponds to per-channel values (first luminance mask value 1, first luminance mask value 2, first luminance mask value 3) in the luminance mask image and (first distance mask value 1, first distance mask value 2, first distance mask value 3) in the distance mask image, has pixel values (a, b, c) in the image to be processed, and has pixel values (a11, b11, c11) in the first image. Then the values of pixel A in the halo-removed image are (a21, b21, c21), where a21 = a - first luminance mask value 1 × (1 - first distance mask value 1) × a11, b21 = b - first luminance mask value 2 × (1 - first distance mask value 2) × b11, and c21 = c - first luminance mask value 3 × (1 - first distance mask value 3) × c11.
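A sketch of this combined weighting, under the same broadcasting assumptions as the previous snippet:

```python
import numpy as np

def remove_halo_combined(img: np.ndarray, halo: np.ndarray,
                         lum_mask: np.ndarray, dist_mask: np.ndarray) -> np.ndarray:
    """Weight of the halo map = luminance mask * (1 - distance mask), so halo
    removal is strongest for mid-brightness pixels near a highlight pixel."""
    weight = lum_mask * (1.0 - dist_mask)
    out = img.astype(np.float32) - weight * halo.astype(np.float32)
    return np.clip(out, 0.0, 255.0).astype(np.uint8)
```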
Thus, a halo-removed image obtained from both the distance mask image and the luminance mask image alleviates mis-processing and dead-black dark regions during halo removal, as well as mis-processing of regions far from the light source and unnatural transitions around the light source, improving the precision of halo removal.
In a possible implementation manner, the performing a halo removal process on the image to be processed to obtain a first image may include:
performing halo extraction on the image to be processed to obtain a halo image;
and obtaining the first image according to the image to be processed and the halo image.
For example, the image to be processed can be regarded as composed of the first image (the expected halo-free image) and the halo map: the pixel value of any pixel point in the image to be processed is the sum of its pixel value in the first image and its pixel value in the halo map. A neural network pre-trained for halo extraction may be used to extract the halo map caused by the light source, and the halo map subtracted from the image to be processed to obtain the first image, i.e., the pixel value of any pixel point in the first image is the difference between its pixel value in the image to be processed and its pixel value in the halo map. For example, for an RGB image to be processed in which a certain pixel point has pixel value (a, b, c), pixel value (a1, b1, c1) in the first image, and pixel value (a2, b2, c2) in the halo image, we have a1 = a - a2, b1 = b - b2, and c1 = c - c2.
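In array form this decomposition is a single subtraction; img and halo are assumed to be NumPy arrays already defined, and the float cast (which avoids uint8 wrap-around) is an implementation detail, not part of the embodiment.

```python
import numpy as np

# first image = image to be processed - halo map, per pixel and per channel
first_img = img.astype(np.float32) - halo.astype(np.float32)
```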
In a possible implementation manner, the fusing the to-be-processed image and the first image according to the mask image to obtain a halo-removed image may include:
aiming at any pixel point in the image to be processed, determining a first mask value corresponding to the pixel point in the mask image;
determining a second mask value from the first mask value;
determining a first pixel value according to the first mask value and the pixel value of the pixel point in the first image, and determining a second pixel value according to the second mask value and the pixel value of the pixel point in the image to be processed;
and obtaining the halo-removed image according to the first pixel value and the second pixel value.
For example, the mask image may include a luminance mask image, and the first mask value may be a first luminance mask value corresponding to a pixel point in the luminance mask image, and the second mask value may be a second luminance mask value determined according to the first luminance mask value.
And determining a first pixel value according to the first luminance mask value and the pixel value of the pixel point in the first image, determining a second pixel value according to the second luminance mask value and the pixel value of the pixel point in the image to be processed, and obtaining the halo-removed image according to the first pixel value and the second pixel value.
For example, the first luminance mask value corresponding to any pixel point in the luminance mask image may be determined, and the second luminance mask value set to (1 - first luminance mask value). A first pixel value may be determined from the first luminance mask value and the pixel value of the pixel point in the first image, for example as their product. A second pixel value may be determined from the second luminance mask value and the pixel value of the pixel point in the image to be processed, for example as their product. The sum of the first pixel value and the second pixel value is taken as the pixel value of the pixel point in the halo-removed image, so as to obtain the halo-removed image.
For example, suppose pixel A corresponds to per-channel values (first luminance mask value 1, first luminance mask value 2, first luminance mask value 3) in the luminance mask image, has pixel values (a, b, c) in the image to be processed, and has pixel values (a1, b1, c1) in the first image. Then the per-channel values of pixel A in the halo-removed image are (a2, b2, c2), where a2 = first luminance mask value 1 × a1 + (1 - first luminance mask value 1) × a, b2 = first luminance mask value 2 × b1 + (1 - first luminance mask value 2) × b, and c2 = first luminance mask value 3 × c1 + (1 - first luminance mask value 3) × c.
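This per-channel blend is a standard linear interpolation; a one-line sketch, assuming lum_mask, first_img, and img are float NumPy arrays that broadcast together:

```python
# halo-removed = lum_mask * first_img + (1 - lum_mask) * img, per channel
out = lum_mask * first_img + (1.0 - lum_mask) * img
```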
Thus, the closer a pixel value is to the intermediate pixel value, the stronger the halo removal and the larger the contribution of the pixel value in the first image; conversely, the closer a pixel value is to the upper or lower limit, the weaker the halo removal and the smaller that contribution. The halo-removal strength of a pixel point can therefore be determined by its pixel value in the image to be processed, alleviating mis-processing and dead-black dark regions during halo removal and improving its precision.
For example, the mask image may include a distance mask image, the first mask value may be a first distance mask value corresponding to a pixel point in the distance mask image, and the second mask value may be a second distance mask value determined according to the first distance mask value.
And determining a first pixel value according to the first distance mask value and the pixel value of the pixel point in the image to be processed, determining a second pixel value according to the second distance mask value and the pixel value of the pixel point in the first image, and obtaining the halo-removed image according to the first pixel value and the second pixel value.
For example, the first distance mask value corresponding to any pixel point in the distance mask image may be determined, and the second distance mask value set to (1 - first distance mask value). A first pixel value may be determined from the first distance mask value and the pixel value of the pixel point in the image to be processed, for example as their product. A second pixel value may be determined from the second distance mask value and the pixel value of the pixel point in the first image, for example as their product. The sum of the first pixel value and the second pixel value is taken as the pixel value of the pixel point in the halo-removed image, so as to obtain the halo-removed image.
For example, suppose pixel A corresponds to a first distance mask value in the distance mask image, has pixel values (a, b, c) in the image to be processed, and has pixel values (a1, b1, c1) in the first image. Then the values of pixel A in the halo-removed image are (a2, b2, c2), where a2 = first distance mask value × a + (1 - first distance mask value) × a1, b2 = first distance mask value × b + (1 - first distance mask value) × b1, and c2 = first distance mask value × c + (1 - first distance mask value) × c1.
Thus, the closer a pixel point is to a highlight pixel point, the stronger the halo removal and the larger the contribution of its pixel value in the first image; conversely, the farther a pixel point is from the highlight pixel points, the weaker the halo removal and the smaller that contribution. The halo-removal strength of a pixel point can therefore be determined by its distance to the light source (highlight pixels), alleviating mis-processing of regions far from the light source and unnatural transitions around the light source, and improving the precision of halo removal.
For example, when the mask image includes both a luminance mask image and a distance mask image, obtaining the halo-removed image according to the first pixel value and the second pixel value may include:
obtaining a second image according to the first pixel value and the second pixel value;
determining a first distance mask value corresponding to the pixel point in the distance mask image;
determining a second distance mask value from the first distance mask value;
determining a third pixel value according to the first distance mask value and the pixel value of the pixel point in the image to be processed, and determining a fourth pixel value according to the second distance mask value and the pixel value of the pixel point in the second image;
and obtaining the halo-removed image according to the third pixel value and the fourth pixel value.
Referring to Fig. 2, the halo-removed image may be obtained by combining the luminance mask image and the distance mask image. The luminance mask image may first be used to fuse the image to be processed and the first image (for the fusion process, refer to the foregoing embodiment; details are not repeated here), obtaining a second image as a transition image. The image to be processed and the second image may then be fused through the distance mask image: the product of the first distance mask value and the pixel value of the pixel point in the image to be processed is taken as the third pixel value, the product of the second distance mask value and the pixel value of the pixel point in the second image as the fourth pixel value, and the sum of the third and fourth pixel values as the pixel value of the pixel point in the halo-removed image, so as to obtain the halo-removed image.
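A sketch of this two-stage fusion in the Fig. 2 order, assuming float NumPy arrays in a common (broadcastable) shape; the intermediate second image is kept explicit for clarity, and the function name is illustrative.

```python
import numpy as np

def fuse_luminance_then_distance(img: np.ndarray, first_img: np.ndarray,
                                 lum_mask: np.ndarray, dist_mask: np.ndarray) -> np.ndarray:
    """First blend with the luminance mask, then blend the transition image
    with the image to be processed using the distance mask."""
    second = lum_mask * first_img + (1.0 - lum_mask) * img  # transition (second) image
    return dist_mask * img + (1.0 - dist_mask) * second     # halo-removed image
```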
Thus, obtaining the halo-removed image by combining the luminance mask image with the distance mask image alleviates mis-processing and dead-black dark regions during halo removal, as well as mis-processing of regions far from the light source and unnatural transitions around the light source.
For example, when the mask image includes both a distance mask image and a luminance mask image, obtaining the halo-removed image according to the first pixel value and the second pixel value may include:
obtaining a third image according to the first pixel value and the second pixel value;
determining a first luminance mask value corresponding to the pixel point in the luminance mask image;
determining a second luminance mask value from the first luminance mask value;
determining a fifth pixel value according to the first luminance mask value and the pixel value of the pixel point in the third image, and determining a sixth pixel value according to the second luminance mask value and the pixel value of the pixel point in the image to be processed;
and obtaining the halo-removed image according to the fifth pixel value and the sixth pixel value.
Referring to Fig. 3, the distance mask image may be combined with the luminance mask image to obtain the halo-removed image. The distance mask image may first be used to fuse the image to be processed and the first image (for the fusion process, refer to the foregoing embodiment; details are not repeated here), obtaining a third image as a transition image. The image to be processed and the third image may then be fused through the luminance mask image: the product of the first luminance mask value and the pixel value of the pixel point in the third image is taken as the fifth pixel value, the product of the second luminance mask value and the pixel value of the pixel point in the image to be processed as the sixth pixel value, and the sum of the fifth and sixth pixel values as the pixel value of the pixel point in the halo-removed image, so as to obtain the halo-removed image.
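The reverse order (Fig. 3) simply swaps the two stages; same assumptions as the previous sketch:

```python
def fuse_distance_then_luminance(img, first_img, lum_mask, dist_mask):
    """First blend with the distance mask, then blend the transition image
    with the image to be processed using the luminance mask."""
    third = dist_mask * img + (1.0 - dist_mask) * first_img  # transition (third) image
    return lum_mask * third + (1.0 - lum_mask) * img         # halo-removed image
```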
In this way, combining the distance mask image with the luminance mask image to obtain the halo-removed image likewise alleviates erroneous processing and crushed blacks in dark regions during halo removal, as well as erroneous processing of regions far from the light source and unnatural transitions around the light source.
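Under the same assumptions, swapping the order of the two masks gives this variant; fuse and the arrays are as in the previous sketch:

```python
# Stage 1: distance-guided fusion yields the third (transition) image.
third_img = fuse(first_img, src, dist_mask)

# Stage 2: luminance-guided fusion; fifth pixel value = lum_mask * third_img,
# sixth pixel value = (1 - lum_mask) * src, summed per pixel.
dehalo = fuse(third_img, src, lum_mask)
```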
It is understood that the above-mentioned method embodiments of the present disclosure may be combined with one another to form combined embodiments without departing from the underlying principles and logic; owing to space limitations, the details are not repeated in the present disclosure. Those skilled in the art will appreciate that, in the above methods of the specific embodiments, the specific order of execution of the steps should be determined by their functions and possible inherent logic.
In addition, the present disclosure also provides an image processing apparatus, an electronic device, a computer-readable storage medium, and a program, each of which can be used to implement any one of the image processing methods provided by the present disclosure; for the corresponding technical solutions and descriptions, refer to the corresponding parts of the methods section, which are not repeated here for brevity.
Fig. 4 illustrates a block diagram of an image processing apparatus according to an embodiment of the present disclosure. As illustrated in fig. 4, the apparatus includes:
a first processing module 401, configured to obtain a first image according to an image to be processed;
a second processing module 402, configured to obtain a mask image of the image to be processed according to the pixel value of each pixel point of the image to be processed;
and a fusion module 403, configured to fuse the image to be processed and the first image according to the mask image to obtain a halo-removed image.
Therefore, the image processing apparatus provided by the present disclosure obtains the mask image of the image to be processed according to the pixel value of each pixel point in the image to be processed, and then uses the mask image to guide the fusion of the image to be processed with the first image obtained from the image to be processed, so as to obtain the halo-removed image, thereby improving the precision of halo removal.
In one possible implementation, the mask image may include a luminance mask image; the second processing module may be further configured to:
determining a luminance mask value of each pixel point according to the pixel value of each pixel point in the image to be processed, wherein the luminance mask value of a pixel point is negatively correlated with a pixel difference value, and the pixel difference value represents the difference between the pixel value of the pixel point and the middle pixel value of the image to be processed;
and generating a luminance mask image of the image to be processed according to the luminance mask value of each pixel point.
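A minimal sketch of one such negatively correlated mapping follows; the linear form and the middle pixel value of 0.5 (for values in [0, 1]) are assumptions, since the disclosure only requires that the mask value decrease as the pixel difference value grows:

```python
import numpy as np

def luminance_mask(img, mid=0.5):
    """Luminance mask: the farther a pixel departs from the middle pixel
    value, the smaller its mask value (values assumed in [0, 1])."""
    diff = np.abs(img - mid)                    # pixel difference value
    return 1.0 - diff / max(diff.max(), 1e-8)   # linear negative correlation
```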
In one possible implementation, the mask image may include a distance mask image; the second processing module may be further configured to:
determining highlight pixel points in the image to be processed;
determining a distance mask value of the pixel point according to the distance from any pixel point in the image to be processed to the highlight pixel point;
and generating a distance mask image of the image to be processed according to the distance mask value of each pixel point.
In a possible implementation manner, the second processing module may be further configured to:
and determining a distance mask value of the pixel point according to the distance from any pixel point in the image to be processed to a first highlight pixel point, wherein the first highlight pixel point is the highlight pixel point closest to the pixel point.
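One way to realize the distance to the nearest highlight pixel is a distance transform; in the sketch below, the highlight threshold of 250 and the normalization to [0, 1] are our assumptions, with OpenCV's cv2.distanceTransform used as a convenient stand-in:

```python
import cv2
import numpy as np

def distance_mask(gray, thresh=250):
    """gray: uint8 grayscale image. Returns each pixel's L2 distance to the
    nearest highlight pixel, normalized to [0, 1]."""
    non_highlight = (gray < thresh).astype(np.uint8)
    # distanceTransform assigns every nonzero pixel its distance to the
    # nearest zero (here: highlight) pixel.
    dist = cv2.distanceTransform(non_highlight, cv2.DIST_L2, 5)
    return dist / dist.max() if dist.max() > 0 else dist
```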
In a possible implementation manner, the second processing module may be further configured to:
down-sampling the image to be processed to obtain a down-sampled image;
determining the highlight pixel points in the down-sampled image;
the generating of the distance mask image of the image to be processed according to the distance mask value of each pixel point includes:
generating a down-sampling distance mask image according to the distance mask value of each pixel point in the down-sampling image;
and performing up-sampling on the down-sampling distance mask image to obtain a distance mask image of the image to be processed.
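Computed at reduced resolution, the distance transform becomes much cheaper; in the illustrative sketch below, the 1/4 scale factor is an arbitrary choice, distance_mask is the sketch from above, and gray stands in for a grayscale copy of the image to be processed:

```python
import cv2

small = cv2.resize(gray, None, fx=0.25, fy=0.25, interpolation=cv2.INTER_AREA)
small_mask = distance_mask(small)
dist_mask = cv2.resize(small_mask, (gray.shape[1], gray.shape[0]),
                       interpolation=cv2.INTER_LINEAR)
```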
In a possible implementation manner, the second processing module may be further configured to:
obtaining a corresponding gray image according to the down-sampling image;
guiding the down-sampling distance mask image to filter through the gray level image to obtain a filtered down-sampling distance mask image;
and performing up-sampling on the filtered down-sampling distance mask image to obtain a distance mask image of the image to be processed.
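The guided-filtering step might be sketched as follows; cv2.ximgproc.guidedFilter ships with opencv-contrib-python, and the radius and eps values are placeholders rather than values taken from the disclosure:

```python
import cv2
import numpy as np

# Guide the downsampled distance mask with the downsampled gray image.
guide = small.astype(np.float32) / 255.0
filtered = cv2.ximgproc.guidedFilter(guide=guide,
                                     src=small_mask.astype(np.float32),
                                     radius=8, eps=1e-3)
dist_mask = cv2.resize(filtered, (gray.shape[1], gray.shape[0]),
                       interpolation=cv2.INTER_LINEAR)
```

Using the gray image rather than the mask itself as the guide keeps the filtered (and later upsampled) mask aligned with edges in the image content.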
In a possible implementation manner, the first processing module may be further configured to:
performing halo extraction on the image to be processed to obtain a first image;
the fusion module may be further configured to:
determining, in the mask image, a first mask value corresponding to any pixel point in the image to be processed;
and obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the pixel value of the pixel point in the first image, and the first mask value corresponding to the pixel point.
In a possible implementation manner, the first processing module may be further configured to:
performing halo extraction on the image to be processed to obtain a halo image;
and performing halo removing treatment on the image to be processed according to the halo image to obtain the first image.
In a possible implementation manner, the fusion module may be further configured to:
for any pixel point in the image to be processed, determining a first mask value corresponding to the pixel point in the mask image;
determining a second mask value from the first mask value;
determining a first pixel value according to the first mask value and the pixel value of the pixel point in the first image, and determining a second pixel value according to the second mask value and the pixel value of the pixel point in the image to be processed;
and obtaining the halo-removed image according to the first pixel value and the second pixel value.
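In the notation of the first sketch above, this single-mask rule is the same fuse helper: the first pixel value is mask * first_img, the second pixel value is (1 - mask) * src, and their per-pixel sum is the halo-removed image:

```python
dehalo = fuse(first_img, src, mask)  # mask: luminance or distance mask in [0, 1]
```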
In some embodiments, the functions of, or the modules included in, the apparatus provided in the embodiments of the present disclosure may be used to execute the methods described in the above method embodiments; for their specific implementation, refer to the description of the above method embodiments, which is not repeated here for brevity.
Embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the above-mentioned method. The computer readable storage medium may be a non-volatile computer readable storage medium.
An embodiment of the present disclosure further provides an electronic device, including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to invoke the memory-stored instructions to perform the above-described method.
The embodiments of the present disclosure also provide a computer program product, which includes computer readable code, and when the computer readable code runs on a device, a processor in the device executes instructions for implementing the image processing method provided in any one of the above embodiments.
The embodiments of the present disclosure also provide another computer program product for storing computer readable instructions, which when executed cause a computer to perform the operations of the image processing method provided in any of the above embodiments.
The electronic device may be provided as a terminal, server, or other form of device.
Fig. 5 illustrates a block diagram of an electronic device 800 in accordance with an embodiment of the disclosure. For example, the electronic device 800 may be a terminal such as a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, or a personal digital assistant.
Referring to fig. 5, electronic device 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the electronic device 800. Examples of such data include instructions for any application or method operating on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 806 provides power to the various components of the electronic device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the electronic device 800.
The multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gesture actions on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the electronic device 800 is in an operation mode, such as a shooting mode or a video mode. Each of the front camera and the rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the electronic device 800. For example, the sensor assembly 814 may detect an open/closed state of the electronic device 800 and the relative positioning of components, such as the display and keypad of the electronic device 800. The sensor assembly 814 may also detect a change in the position of the electronic device 800 or of a component of the electronic device 800, the presence or absence of user contact with the electronic device 800, the orientation or acceleration/deceleration of the electronic device 800, and a change in the temperature of the electronic device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium, such as the memory 804, is also provided that includes computer program instructions executable by the processor 820 of the electronic device 800 to perform the above-described methods.
Fig. 6 illustrates a block diagram of an electronic device 1900 in accordance with an embodiment of the disclosure. For example, the electronic device 1900 may be provided as a server. Referring to fig. 6, electronic device 1900 includes a processing component 1922 further including one or more processors and memory resources, represented by memory 1932, for storing instructions, e.g., applications, executable by processing component 1922. The application programs stored in memory 1932 may include one or more modules that each correspond to a set of instructions. Further, the processing component 1922 is configured to execute instructions to perform the above-described method.
The electronic device 1900 may also include a power component 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input/output (I/O) interface 1958. The electronic device 1900 may operate based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer readable storage medium, such as the memory 1932, is also provided that includes computer program instructions executable by the processing component 1922 of the electronic device 1900 to perform the above-described methods.
The present disclosure may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical encoding device such as punch cards or raised structures in a groove having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein are not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), may execute the computer-readable program instructions by utilizing state information of the computer-readable program instructions to personalize the electronic circuitry, thereby implementing aspects of the present disclosure.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The computer program product may be embodied in hardware, software, or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium; in another alternative embodiment, the computer program product is embodied as a software product, such as a Software Development Kit (SDK).
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (12)

1. An image processing method, comprising:
obtaining a first image according to the image to be processed;
obtaining a mask image of the image to be processed according to the pixel value of each pixel point of the image to be processed;
and fusing the image to be processed and the first image according to the mask image to obtain a halo-removed image.
2. The method of claim 1, wherein the mask image comprises a luminance mask image; the obtaining of the mask image of the image to be processed according to the pixel values of the pixels of the image to be processed includes:
determining a luminance mask value of each pixel point according to a pixel value of each pixel point in the image to be processed, wherein the luminance mask value of the pixel point is negatively correlated with a pixel difference value, and the pixel difference value represents a difference between the pixel value of the pixel point and a middle pixel value of the image to be processed;
and generating a luminance mask image of the image to be processed according to the luminance mask value of each pixel point.
3. The method according to claim 1 or 2, wherein the mask image comprises a distance mask image; the obtaining of the mask image of the image to be processed according to the pixel values of the pixels of the image to be processed includes:
determining highlight pixel points in the image to be processed;
determining a distance mask value of the pixel point according to the distance from any pixel point in the image to be processed to the highlight pixel point;
and generating a distance mask image of the image to be processed according to the distance mask value of each pixel point.
4. The method of claim 3, wherein determining the distance mask value of the pixel according to the distance from any pixel in the image to be processed to a highlight pixel comprises:
and determining a distance mask value of the pixel point according to the distance from any pixel point in the image to be processed to a first highlight pixel point, wherein the first highlight pixel point is the highlight pixel point closest to the pixel point.
5. The method according to claim 3 or 4, wherein the determining highlight pixel points in the image to be processed comprises:
down-sampling the image to be processed to obtain a down-sampled image;
determining the highlight pixel points in the down-sampled image;
the generating of the distance mask image of the image to be processed according to the distance mask value of each pixel point includes:
generating a down-sampling distance mask image according to the distance mask value of each pixel point in the down-sampling image;
and performing up-sampling on the down-sampling distance mask image to obtain a distance mask image of the image to be processed.
6. The method according to claim 5, wherein the upsampling the downsampled distance mask image to obtain a distance mask image of the image to be processed comprises:
obtaining a corresponding gray image according to the down-sampling image;
guiding the down-sampling distance mask image to filter through the gray level image to obtain a filtered down-sampling distance mask image;
and performing up-sampling on the filtered down-sampling distance mask image to obtain a distance mask image of the image to be processed.
7. The method according to any one of claims 1 to 6, wherein the deriving the first image from the image to be processed comprises:
performing halo extraction on the image to be processed to obtain a first image;
the fusing the image to be processed and the first image according to the mask image to obtain a halo-removed image, comprising:
determining, in the mask image, a first mask value corresponding to any pixel point in the image to be processed;
and obtaining the halo-removed image according to the pixel value of the pixel point in the image to be processed, the pixel value of the pixel point in the first image, and the first mask value corresponding to the pixel point.
8. The method according to any one of claims 1 to 6, wherein the deriving the first image from the image to be processed comprises:
performing halo extraction on the image to be processed to obtain a halo image;
and performing halo removing treatment on the image to be processed according to the halo image to obtain the first image.
9. The method according to claim 8, wherein the fusing the image to be processed and the first image according to the mask image to obtain a halo-removed image comprises:
for any pixel point in the image to be processed, determining a first mask value corresponding to the pixel point in the mask image;
determining a second mask value from the first mask value;
determining a first pixel value according to the first mask value and the pixel value of the pixel point in the first image, and determining a second pixel value according to the second mask value and the pixel value of the pixel point in the image to be processed;
and obtaining the halo-removed image according to the first pixel value and the second pixel value.
10. An image processing apparatus characterized by comprising:
the first processing module is used for obtaining a first image according to the image to be processed;
the second processing module is used for obtaining a mask image of the image to be processed according to the pixel value of each pixel point of the image to be processed;
and the fusion module is used for fusing the image to be processed and the first image according to the mask image to obtain a halo-removed image.
11. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to invoke the memory-stored instructions to perform the method of any of claims 1 to 9.
12. A computer readable storage medium having computer program instructions stored thereon, which when executed by a processor implement the method of any one of claims 1 to 9.
CN201911415910.5A 2019-12-31 2019-12-31 Image processing method and device, electronic equipment and storage medium Active CN111192218B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911415910.5A CN111192218B (en) 2019-12-31 2019-12-31 Image processing method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911415910.5A CN111192218B (en) 2019-12-31 2019-12-31 Image processing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111192218A true CN111192218A (en) 2020-05-22
CN111192218B CN111192218B (en) 2023-11-24

Family

ID=70709725

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911415910.5A Active CN111192218B (en) 2019-12-31 2019-12-31 Image processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111192218B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10209420A (en) * 1997-01-17 1998-08-07 Sony Corp Manufacture of solid-state imaging element
US20120075445A1 (en) * 2010-09-27 2012-03-29 Fujifilm Corporation Imaging device and endoscopic diagnosis system
US20170359534A1 (en) * 2016-06-10 2017-12-14 Apple Inc. Mismatched Foreign Light Detection And Mitigation In The Image Fusion Of A Two-Camera System
CN108961170A (en) * 2017-05-24 2018-12-07 阿里巴巴集团控股有限公司 Image processing method, device and system
CN108932696A (en) * 2017-05-26 2018-12-04 杭州海康威视数字技术股份有限公司 The Halation inhibition method and device of signal lamp
CN107231525A (en) * 2017-05-31 2017-10-03 深圳可思美科技有限公司 The dark angle self-adapting compensation method of image and device
CN108229353A (en) * 2017-12-21 2018-06-29 深圳市商汤科技有限公司 Sorting technique and device, electronic equipment, storage medium, the program of human body image
US10382712B1 (en) * 2018-08-01 2019-08-13 Qualcomm Incorporated Automatic removal of lens flares from images

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112837251A (en) * 2020-12-31 2021-05-25 维沃移动通信有限公司 Image processing method and device
CN112837251B (en) * 2020-12-31 2024-04-02 维沃移动通信有限公司 Image processing method and device
CN113034389A (en) * 2021-03-17 2021-06-25 武汉联影智融医疗科技有限公司 Image processing method, image processing device, computer equipment and storage medium
CN113034389B (en) * 2021-03-17 2023-07-25 武汉联影智融医疗科技有限公司 Image processing method, device, computer equipment and storage medium
CN113689361A (en) * 2021-10-27 2021-11-23 深圳市慧鲤科技有限公司 Image processing method and device, electronic equipment and storage medium
CN113689361B (en) * 2021-10-27 2022-02-22 深圳市慧鲤科技有限公司 Image processing method and device, electronic equipment and storage medium
WO2023071167A1 (en) * 2021-10-27 2023-05-04 深圳市慧鲤科技有限公司 Image processing method and apparatus, and electronic device, storage medium and program product

Also Published As

Publication number Publication date
CN111192218B (en) 2023-11-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant