CN113034509A - Image processing method and device - Google Patents

Image processing method and device

Info

Publication number
CN113034509A
Authority
CN
China
Prior art keywords
distance
color
image
reference point
pixel point
Prior art date
Legal status
Pending
Application number
CN202110216499.XA
Other languages
Chinese (zh)
Inventor
穆晶
章佳杰
戴宇荣
于冰
Current Assignee
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd filed Critical Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202110216499.XA
Publication of CN113034509A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G06T7/90 Determination of colour characteristics
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details
    • G06T2207/20112 Image segmentation details
    • G06T2207/20132 Image cropping

Abstract

The present disclosure relates to an image processing method and apparatus. The image processing method comprises the following steps: acquiring a color reference point on an image; calculating, in a preset space, the color distance between each pixel point of the image and the color reference point; and matting the image based on the color distance. With the image processing method and apparatus, matting can be performed in real time on a mobile terminal.

Description

Image processing method and device
Technical Field
The present disclosure relates to the field of video technology. More particularly, the present disclosure relates to an image processing method and apparatus.
Background
Image processing techniques are widely used across industries and play an important role in fields such as film and television production, live-streaming platforms, and augmented reality. Green-screen matting separates the foreground from the green screen by estimating its alpha transparency, and many different techniques have been developed. However, most existing matting techniques involve complex, computationally heavy calculations and are therefore slow, and many algorithms rely on expensive graphics-card support; these techniques are thus unsuitable for mobile scenarios that require real-time matting.
Disclosure of Invention
An exemplary embodiment of the present disclosure provides an image processing method and apparatus to solve at least the problems of image processing in the related art, although it may not solve any of the problems described above.
According to an exemplary embodiment of the present disclosure, there is provided an image processing method including: acquiring a color reference point on an image; calculating a color distance between each pixel point of the image and a color reference point in a preset space; matting the image based on the color distance.
Optionally, the preset space may include at least one of an HSV space and a YCbCr space.
Optionally, when the preset space is an HSV space, the step of calculating the color distance between each pixel point of the image and the color reference point in the preset space may include: converting the image to the HSV space; calculating the hue distance between the hue of each pixel point and the hue of the color reference point, and calculating the saturation distance between the saturation of each pixel point and the saturation of the color reference point; and calculating, based on the hue distance and the saturation distance, an HSV color distance between each pixel point and the color reference point as the color distance between each pixel point and the color reference point.
Optionally, the step of calculating the HSV color distance between each pixel point and the color reference point based on the hue distance and the saturation distance may include: determining a shadow region of the image; calculating the brightness distance between the brightness of each pixel point in the shadow region of the image and the brightness of the color reference point; and calculating, based on the hue distance, the saturation distance, and the brightness distance, the HSV color distance between each pixel point of the shadow region of the image and the color reference point.
Optionally, the step of calculating the HSV color distance between each pixel point of the shadow region of the image and the color reference point based on the hue distance, the saturation distance, and the brightness distance may include: performing weighted calculation on the hue distance, the saturation distance, and the brightness distance to obtain the HSV color distance between each pixel point of the shadow region of the image and the color reference point.
Optionally, when the preset space is a YCbCr space, the step of calculating the color distance between each pixel point of the image and the color reference point in the preset space may include: converting the image to the YCbCr space; calculating the blue distance between the blue concentration offset of each pixel point and the blue concentration offset of the color reference point, and calculating the red distance between the red concentration offset of each pixel point and the red concentration offset of the color reference point; and calculating, based on the blue distance and the red distance, a YCbCr color distance between each pixel point and the color reference point as the color distance between each pixel point and the color reference point.
Optionally, the step of calculating the YCbCr color distance between each pixel point and the color reference point based on the blue distance and the red distance may include: performing weighted calculation on the blue distance and the red distance to obtain the YCbCr color distance between each pixel point and the color reference point.
Optionally, when the preset space comprises both the HSV space and the YCbCr space, the step of calculating the color distance between each pixel point of the image and the color reference point in the preset space may include: calculating an HSV color distance between each pixel point of the image and the color reference point in the HSV space, and calculating a YCbCr color distance between each pixel point of the image and the color reference point in the YCbCr space; and calculating the color distance between each pixel point of the image and the color reference point based on the HSV color distance and the YCbCr color distance.
Optionally, the step of matting the image based on the color distance may comprise: calculating the transparency of each pixel point of the image through a preset Gaussian function based on the color distance; and matting the image based on the transparency of each pixel point.
Optionally, the image processing method may further include: determining the edge region of the matted image; and performing guided filtering on the transparency of each pixel point in the edge region of the matted image, with the image before matting serving as the guide image.
Optionally, the image processing method may further include: removing color spill from the matted image based on the color of the color reference point.
Optionally, the step of removing color spill from the matted image based on the color of the color reference point may include: determining the pixel points in the matted image that have the same color as the color reference point; and reducing the saturation of the determined pixel points in the HSV space.
Optionally, the step of determining the pixel points in the matted image that have the same color as the color reference point may include: determining, in the HSV space and the YCbCr space respectively, the pixel points in the matted image that have the same color as the color reference point; and determining the pixel points found to have the same color as the color reference point in both the HSV space and the YCbCr space as the pixel points having the same color as the color reference point.
According to an exemplary embodiment of the present disclosure, there is provided an image processing apparatus including: a reference point acquisition unit configured to acquire a color reference point on an image; a distance calculation unit configured to calculate a color distance between each pixel point of the image and a color reference point in a preset space; and an image matting unit configured to matte the image based on the color distance.
Optionally, the preset space may include at least one of an HSV space and a YCbCr space.
Optionally, the distance calculation unit may be configured to: converting the image to the HSV space; calculating the hue distance between the hue of each pixel point and the hue of the color reference point, and calculating the saturation distance between the saturation of each pixel point and the saturation of the color reference point; and calculating, based on the hue distance and the saturation distance, an HSV color distance between each pixel point and the color reference point as the color distance between each pixel point and the color reference point.
Optionally, the distance calculation unit may be configured to: determining a shadow region of the image; calculating the brightness distance between the brightness of each pixel point in the shadow region of the image and the brightness of the color reference point; and calculating, based on the hue distance, the saturation distance, and the brightness distance, the HSV color distance between each pixel point of the shadow region of the image and the color reference point.
Optionally, the distance calculation unit may be configured to: performing weighted calculation on the hue distance, the saturation distance, and the brightness distance to obtain the HSV color distance between each pixel point of the shadow region of the image and the color reference point.
Optionally, the distance calculation unit may be configured to: converting the image to the YCbCr space; calculating the blue distance between the blue concentration offset of each pixel point and the blue concentration offset of the color reference point, and calculating the red distance between the red concentration offset of each pixel point and the red concentration offset of the color reference point; and calculating, based on the blue distance and the red distance, a YCbCr color distance between each pixel point and the color reference point as the color distance between each pixel point and the color reference point.
Optionally, the distance calculation unit may be configured to: performing weighted calculation on the blue distance and the red distance to obtain the YCbCr color distance between each pixel point and the color reference point.
Optionally, the distance calculation unit may be configured to: calculating an HSV color distance between each pixel point of the image and the color reference point in the HSV space, and calculating a YCbCr color distance between each pixel point of the image and the color reference point in the YCbCr space; and calculating the color distance between each pixel point of the image and the color reference point based on the HSV color distance and the YCbCr color distance.
Optionally, the image matting unit may be configured to: calculating the transparency of each pixel point of the image through a preset Gaussian function based on the color distance; and matting the image based on the transparency of each pixel point.
Optionally, the image processing apparatus may further include a guided filtering unit configured to: determining the edge region of the matted image; and performing guided filtering on the transparency of each pixel point in the edge region of the matted image, with the image before matting serving as the guide image.
Optionally, the image processing apparatus may further include a spill removal unit configured to: removing color spill from the matted image based on the color of the color reference point.
Optionally, the spill removal unit may be configured to: determining the pixel points in the matted image that have the same color as the color reference point; and reducing the saturation of the determined pixel points in the HSV space.
Optionally, the spill removal unit may be configured to: determining, in the HSV space and the YCbCr space respectively, the pixel points in the matted image that have the same color as the color reference point; and determining the pixel points found to have the same color as the color reference point in both the HSV space and the YCbCr space as the pixel points having the same color as the color reference point.
According to an exemplary embodiment of the present disclosure, there is provided an electronic apparatus including: a processor; a memory for storing the processor-executable instructions; wherein the processor is configured to execute the instructions to implement an image processing method according to an exemplary embodiment of the present disclosure.
According to an exemplary embodiment of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor of an electronic device, causes the electronic device to execute an image processing method according to an exemplary embodiment of the present disclosure.
According to an exemplary embodiment of the present disclosure, a computer program product is provided, comprising computer programs/instructions which, when executed by a processor, implement an image processing method according to an exemplary embodiment of the present disclosure.
The technical scheme provided by the embodiments of the present disclosure brings at least the following beneficial effects:
1. a large amount of manual interaction is avoided;
2. the matting effect is better, and the method is suitable for removing the different color spills produced under different illumination;
3. the degree of shadow retention is adjustable;
4. edge burrs and unevenness produced by matting are reduced;
5. green spill is effectively eliminated;
6. matting can be performed in real time on a mobile terminal.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
Fig. 1 illustrates a flowchart of an image processing method according to an exemplary embodiment of the present disclosure.
Fig. 2 illustrates a block diagram of an image processing apparatus according to an exemplary embodiment of the present disclosure.
Fig. 3 illustrates a block diagram of an image processing apparatus according to another exemplary embodiment of the present disclosure.
Fig. 4 is a block diagram of an electronic device 400 according to an example embodiment of the present disclosure.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The embodiments described in the following examples do not represent all embodiments consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Herein, the expression "at least one of the items" covers three parallel cases: "any one of the items", "a combination of any plurality of the items", and "all of the items". For example, "including at least one of A and B" covers three parallel cases: (1) including A; (2) including B; (3) including A and B. Likewise, "performing at least one of step one and step two" covers three parallel cases: (1) performing step one; (2) performing step two; (3) performing step one and step two.
Commonly used real-time image processing techniques mainly include color difference keying, luma keying, and chroma keying. Color difference keying uses the differences among the R, G, and B channels to solve for opacity; it is very fast, but its matting quality is not ideal. Because the RGB space cannot separate green well, luma keying instead uses the luminance information of the image and applies a soft-threshold operation to solve for opacity: soft thresholding is performed on the luminance component L in the HLS space, yielding an alpha that varies smoothly from 0 to 1. However, this technique considers only the luminance information of the image and ignores its color information, so the result is unsatisfactory for most images. Chroma keying uses the color information of the image to solve for opacity: the color image is converted into the HSV space and soft-threshold segmentation is performed on the hue component (H). In practice, under the influence of environmental factors such as illumination, color cannot be fully distinguished by the hue component alone, and the opacity must be solved by weighting the three channels hue (H), saturation (S), and brightness (V). Because it is fast and produces a better result, chroma keying is widely applied in real-time scenarios.
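To make the soft-threshold idea concrete, the following is a minimal numpy sketch of luma and chroma keying (an illustration, not code from the patent; the lo/hi thresholds, the smoothstep ramp, and the 120-degree key hue are assumptions):

    import numpy as np

    def smoothstep(lo, hi, x):
        # Soft threshold: 0 below lo, 1 above hi, smooth ramp in between.
        t = np.clip((x - lo) / (hi - lo), 0.0, 1.0)
        return t * t * (3.0 - 2.0 * t)

    def luma_key(lightness, lo=0.2, hi=0.4):
        # Luma keying: alpha ramps gently with the L component alone.
        return smoothstep(lo, hi, lightness)

    def chroma_key(hue_deg, key_hue=120.0, lo=20.0, hi=60.0):
        # Chroma keying: alpha grows with the circular hue distance
        # between each pixel and the key colour.
        d = np.abs(hue_deg - key_hue)
        d = np.minimum(d, 360.0 - d)
        return smoothstep(lo, hi, d)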
In the related art, the real-time matting apparatus mainly includes: a matting module, an edge module, and a color module.
First, the matting module takes a green-screen image as input, using the matting-range parameter Average to control the matting threshold, the intensity parameter sensitivity to control the matting strength, and the compensation parameter percentage to control color compensation. The core algorithm used by the real-time matting device is: c.a = clamp(((1 - clamp(c.g - (c.r + c.b)/2, 0, 1)) - Average) * tan(((45 + 44 * sensitivity)/180) * 3.14159) + Average, 0, 1). Here, c.r, c.g, c.b, and c.a are the RGBA channels of the image, and clamp(float value, float min, float max) limits value to the range [min, max].
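Rewritten as runnable Python, the related-art key looks roughly like this (a sketch; the formula above is garbled in the source text, so the reconstructed grouping, the (c.r + c.b)/2 term, and the default parameter values are assumptions):

    import numpy as np

    def related_art_alpha(c, average=0.5, sensitivity=0.5):
        # c: float image in [0, 1] with shape (H, W, 3), channels R, G, B.
        r, g, b = c[..., 0], c[..., 1], c[..., 2]
        # How much greener each pixel is than its red/blue average.
        greenness = np.clip(g - (r + b) / 2.0, 0.0, 1.0)
        # sensitivity steepens the transition around the Average threshold.
        slope = np.tan((45.0 + 44.0 * sensitivity) / 180.0 * np.pi)
        return np.clip((1.0 - greenness - average) * slope + average, 0.0, 1.0)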
Next, the edge module takes the foreground image produced by the matting module as input and blurs its edges. The algorithm controls the overall blurring of the image according to the edge-blur degree and blurs iteratively several times; an edge-contraction strength intensity is introduced and controlled through the number of edge overlays. The unblurred original image is subtracted to obtain the blurred edge, the edges are overlaid to obtain the complete edge, and the edge image is subtracted from the original image to remove the black fringe.
Finally, a color module addresses the green spill problem: if the intensity of a pixel's green channel is greater than the average of its red and blue channels, the green value is replaced with that average.
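That rule fits in a few lines (a sketch assuming a float RGB image in [0, 1]); note that, as criticized below, yellow pixels also satisfy the g > (r + b)/2 test:

    import numpy as np

    def suppress_green(c):
        # Clamp the green channel to the mean of red and blue wherever
        # it exceeds that mean (modifies the image in place).
        r, g, b = c[..., 0], c[..., 1], c[..., 2]
        c[..., 1] = np.minimum(g, (r + b) / 2.0)
        return c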
The matting module performs thresholding in the RGB color space; however, the RGB space cannot distinguish colors well, so green is removed incompletely while other colors are removed by mistake, and this becomes especially serious when the green screen is disturbed by ambient light.
The edge module blurs the image, and because the edge cannot be located accurately, background or foreground content is blurred as well, reducing the resolution of the image and harming its visual quality. Moreover, the edge module blurs over multiple iterations, which increases the complexity of the algorithm and adds extra time overhead.
To remove green spill, the color module replaces excessive green values with the average of the red and blue channels. Since the hue of yellow is close to that of green, the green-channel intensity of a yellow pixel is also greater than the mean of its red and blue channels in the RGB space, which means yellow pixels are also modified by the spill removal, leaving a color cast in the resulting foreground image.
Hereinafter, an image processing method and apparatus according to an exemplary embodiment of the present disclosure will be described in detail with reference to fig. 1 to 4.
Fig. 1 illustrates a flowchart of an image processing method according to an exemplary embodiment of the present disclosure. The image processing method in fig. 1 can be applied to matting an image to be matted (for example, but not limited to, a green-screen image).
Referring to fig. 1, in step S101, a color reference point on an image is acquired. Here, the image is an image to be scratched (for example, but not limited to, a green screen image).
In one example, a user may select a point in an image (e.g., without limitation, a green-screen image) as the color reference point to be removed, so that matting is performed based on the user-selected color reference point. In another example, a point in an image (e.g., without limitation, a green-screen image) can be automatically determined as the color reference point to be removed, without requiring the user to select it; this enables automatic matting without user manipulation and avoids a large amount of manual interaction. The color reference point may be determined automatically by methods such as, but not limited to, machine learning.
In step S102, a color distance between each pixel point of the image and the color reference point is calculated in a preset space.
In an exemplary embodiment of the present disclosure, the preset space may include at least one of an HSV (hue, saturation, brightness) space and a YCbCr (luminance, blue concentration offset, red concentration offset) space. In addition, the preset space may further include other image spaces, which is not limited by the present disclosure.
In an exemplary embodiment of the present disclosure, when the preset space is the HSV space, the color distance between each pixel point of the image and the color reference point may be calculated as follows: the image is first converted into the HSV space; the hue distance between the hue of each pixel point and the hue of the color reference point is calculated, as is the saturation distance between the saturation of each pixel point and the saturation of the color reference point; then the HSV color distance between each pixel point and the color reference point is calculated based on the hue distance and the saturation distance and taken as the color distance between each pixel point and the color reference point, thereby obtaining the color distance in the HSV space.
In an exemplary embodiment of the present disclosure, when calculating the HSV color distance between each pixel point and the color reference point based on the hue distance and the saturation distance, the shadow region of the image may be first determined, the luminance distance between the luminance of each pixel point of the shadow region of the image and the luminance of the color reference point may be calculated, and then the HSV color distance between each pixel point and the color reference point may be calculated based on the hue distance, the saturation distance, and the luminance distance, thereby improving the accuracy of the color distance.
In an exemplary embodiment of the present disclosure, when calculating the HSV color distance between each pixel point and the color reference point based on the hue distance, the saturation distance, and the brightness distance, the hue distance, the saturation distance, and the brightness distance may be weighted to obtain the HSV color distance between each pixel point and the color reference point, thereby improving the accuracy of the color distance.
The present disclosure also considers the problem of matting the shadow portion of an image. In an exemplary embodiment of the present disclosure, the shadow portion of the image may be processed separately and the degree of shadow preservation may be controlled, thereby making the shadow retention adjustable. For the shadow portion, the present disclosure introduces the brightness information of the image: while computing the distances of hue (H) and saturation (S) in HSV, it also weights in the distance of the brightness value (V) to obtain dis. The distance may likewise be mapped to the transparency alpha using, for example, but not limited to, a Gaussian-style function: α = exp(μ · (dis - show)), where μ is a scaling factor and show is a user-adjustable parameter; the degree of shadow retention can be controlled by adjusting show.
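A numpy/OpenCV sketch of this HSV distance, including the shadow-weighted brightness term (the polar-coordinate formula, the weight w_v, and the μ/show defaults are assumptions; ref_bgr is a uint8 array holding the reference colour):

    import numpy as np
    import cv2

    def hsv_distance(img_bgr, ref_bgr, w_v=0.5, shadow_mask=None):
        # OpenCV HSV on uint8 input: H in [0, 180), S and V in [0, 255].
        hsv = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
        ref = cv2.cvtColor(ref_bgr.reshape(1, 1, 3),
                           cv2.COLOR_BGR2HSV).astype(np.float32)[0, 0]
        h = hsv[..., 0] * np.pi / 90.0      # map [0, 180) to [0, 2*pi)
        s = hsv[..., 1] / 255.0
        h0, s0 = ref[0] * np.pi / 90.0, ref[1] / 255.0
        # Polar-coordinate distance in the hue/saturation plane
        # (law of cosines with hue as angle and saturation as radius).
        dis = np.sqrt(s * s + s0 * s0 - 2.0 * s * s0 * np.cos(h - h0))
        if shadow_mask is not None:
            # For shadow pixels, weight in the brightness (V) distance.
            dv = np.abs(hsv[..., 2] - ref[2]) / 255.0
            dis = np.where(shadow_mask, dis + w_v * dv, dis)
        return dis

    def shadow_alpha(dis, mu=8.0, show=0.3):
        # alpha = exp(mu * (dis - show)), clipped to [0, 1]; raising
        # `show` keeps more of the shadow in the matte.
        return np.clip(np.exp(mu * (dis - show)), 0.0, 1.0)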
In an exemplary embodiment of the present disclosure, when the preset space is the YCbCr space, the color distance between each pixel point of the image and the color reference point may be calculated as follows: the image is first converted into the YCbCr space; the blue distance between the blue concentration offset of each pixel point and that of the color reference point is calculated, as is the red distance between the red concentration offset of each pixel point and that of the color reference point; then the YCbCr color distance between each pixel point and the color reference point is calculated based on the blue distance and the red distance and taken as the color distance between each pixel point and the color reference point, thereby obtaining the color distance in the YCbCr space.
In an exemplary embodiment of the present disclosure, when the YCbCr color distance between each pixel point and the color reference point is calculated based on the blue distance and the red distance, the blue distance and the red distance may be weighted to obtain the YCbCr color distance between each pixel point and the color reference point, thereby improving the accuracy of the color distance.
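A matching sketch of the weighted YCbCr distance (the equal default weights are an assumption; note that OpenCV orders the channels Y, Cr, Cb):

    import numpy as np
    import cv2

    def ycbcr_distance(img_bgr, ref_bgr, w_cb=0.5, w_cr=0.5):
        # Weighted distance of the Cb/Cr chroma offsets to the reference.
        ycc = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2YCrCb).astype(np.float32)
        ref = cv2.cvtColor(ref_bgr.reshape(1, 1, 3),
                           cv2.COLOR_BGR2YCrCb).astype(np.float32)[0, 0]
        d_cr = np.abs(ycc[..., 1] - ref[1]) / 255.0
        d_cb = np.abs(ycc[..., 2] - ref[2]) / 255.0
        return w_cb * d_cb + w_cr * d_cr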
In an exemplary embodiment of the present disclosure, when the preset space is the HSV space and the YCbCr space, and the color distance between each pixel point of the image and the color reference point is calculated in the preset space, the HSV color distance between each pixel point of the image and the color reference point may be first calculated in the HSV space, and the YCbCr color distance between each pixel point of the image and the color reference point may be calculated in the YCbCr space, and then the color distance between each pixel point of the image and the color reference point may be calculated based on the HSV color distance and the YCbCr color distance, thereby improving the accuracy of the color distance.
Due to the influence of ambient light, the color of the green-screen background is not uniform, which means that many parts of the green background differ considerably from the color point input by the user, and greens with large saturation and brightness differences are difficult to distinguish in the RGB space. The present disclosure therefore abandons the related-art approach of processing the color information of the image only in the RGB space. For the distance calculation, the method may convert the green-screen image to be matted into the HSV and YCbCr color spaces, calculate the polar-coordinate distance of hue (H) and saturation (S) in the HSV space and weight it to obtain a color distance disH, calculate the distances of the blue concentration offset (Cb) and the red concentration offset (Cr) in the YCbCr space and weight them to obtain a color distance disY, and take the product of the distances from the two color spaces as the final color distance. To achieve a smooth transition between the background and foreground regions, the present disclosure may map the color distance to the transparency α using, for example, but not limited to, a Gaussian-style function: α = exp(γ · (disH · disY - intensity)), where γ is a scaling factor and intensity is a user-adjustable parameter; the matting strength can be controlled by adjusting intensity.
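Putting the two distances together (reusing the hsv_distance and ycbcr_distance sketches above; the γ and intensity defaults are assumptions chosen only to illustrate the mapping):

    import numpy as np

    def matte_alpha(img_bgr, ref_bgr, gamma=30.0, intensity=0.02,
                    shadow_mask=None):
        # Product of the HSV and YCbCr distances, mapped to transparency.
        dis_h = hsv_distance(img_bgr, ref_bgr, shadow_mask=shadow_mask)
        dis_y = ycbcr_distance(img_bgr, ref_bgr)
        return np.clip(np.exp(gamma * (dis_h * dis_y - intensity)), 0.0, 1.0)

    # Usage: composite the matted foreground over a new background.
    # alpha = matte_alpha(frame, np.array([60, 255, 60], np.uint8))
    # out = alpha[..., None] * frame + (1 - alpha[..., None]) * background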
In step S103, the image is matted based on the color distance. In this way, matting can be performed in real time on a mobile terminal.
In an exemplary embodiment of the present disclosure, when the image is subjected to matting based on the color distance, the transparency of each pixel point of the image may be first calculated through a preset gaussian function based on the color distance, and then the image is subjected to matting based on the transparency of each pixel point, so that the matting effect is improved.
The alpha channel of the matted image is affected by illumination. If the ambient light is soft, the edge transition is smooth and free of jumps. In reality, however, because of lighting angles the edge of the foreground is not smooth and exhibits irregular jumps, producing unnatural transitions during image compositing. To solve this problem, in an exemplary embodiment of the present disclosure, the edge region of the matted image may first be determined, and then, with the pre-matting image as the guide image, guided filtering may be applied to the transparency of each pixel point in the edge region of the matted image, reducing the edge burrs and unevenness produced by matting.
Specifically, the present disclosure may blur the alpha channel. Compared with processing in the RGB space, processing the alpha channel effectively avoids blurring the background region of the image and reduces the time complexity of the algorithm. The present disclosure adopts a guided filtering algorithm for the blurring; because guided filtering is edge-preserving, it achieves the blurring effect while avoiding the unclear edges caused by excessive blur. Specifically, for each pixel location, the present disclosure may, for example but not limited to, compute statistics within its local window, i.e., the mean, variance, and covariance, and use these statistics as the filter weight coefficients.
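A compact guided-filter sketch over the alpha matte (the classic box-filter formulation; the radius and eps defaults are assumptions, and in practice the filtered values would replace alpha only inside the detected edge band):

    import numpy as np
    import cv2

    def guided_filter_alpha(guide_gray, alpha, radius=8, eps=1e-3):
        # guide_gray: the pre-matting image as float32 grayscale in [0, 1];
        # alpha: the float32 matte to be smoothed while keeping edges.
        r = 2 * radius + 1
        box = lambda x: cv2.blur(x, (r, r))
        mean_i, mean_a = box(guide_gray), box(alpha)
        var_i = box(guide_gray * guide_gray) - mean_i * mean_i
        cov_ia = box(guide_gray * alpha) - mean_i * mean_a
        a = cov_ia / (var_i + eps)   # local linear model: alpha ~ a*I + b
        b = mean_a - a * mean_i
        return box(a) * guide_gray + box(b)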
In an exemplary embodiment of the present disclosure, the color of the color reference point can also be used to remove color spill from the matted image, further improving the matting effect.
In an exemplary embodiment of the present disclosure, when removing color spill from the matted image based on the color of the color reference point, the pixel points in the matted image having the same color as the color reference point may first be determined, and the saturation of the determined pixel points may then be reduced in the HSV space. This approach is suitable for removing the different color spills produced under different illumination and improves the spill removal effect.
In an exemplary embodiment of the present disclosure, when determining the pixel points in the matted image having the same color as the color reference point, those pixel points may first be determined in the HSV space and in the YCbCr space respectively, and the pixel points found to have the same color as the color reference point in both spaces may then be taken as the pixel points having the same color as the color reference point, improving the matting effect.
Because of background reflections during shooting, the foreground in front of a green screen tends to take on a green cast; that is, color spill occurs. To address the color spill problem, the present disclosure reduces the green saturation (S) of the green regions in the image, effectively eliminating the green spill. Specifically, the green-channel value of an image pixel may first be computed; if it is greater than the average of the red and blue channels while the hue (H) value in the HSV color space falls within the green interval, the current pixel may be determined to be green.
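A sketch of this joint RGB/HSV spill test and desaturation (the green hue interval [35, 85] on OpenCV's 0-180 hue scale and the 0.5 saturation scale are assumptions):

    import numpy as np
    import cv2

    def remove_green_spill(img_bgr, hue_lo=35, hue_hi=85, s_scale=0.5):
        b = img_bgr[..., 0].astype(np.float32)
        g = img_bgr[..., 1].astype(np.float32)
        r = img_bgr[..., 2].astype(np.float32)
        hsv = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2HSV)
        # Green test: G above the R/B mean AND hue inside the green band.
        green = (g > (r + b) / 2.0) & \
                (hsv[..., 0] >= hue_lo) & (hsv[..., 0] <= hue_hi)
        hsv = hsv.astype(np.float32)
        # Reduce saturation only on the detected spill pixels.
        hsv[..., 1] = np.where(green, hsv[..., 1] * s_scale, hsv[..., 1])
        return cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)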
The image processing method according to the exemplary embodiment of the present disclosure has been described above in conjunction with fig. 1. Hereinafter, an image processing apparatus and units thereof according to an exemplary embodiment of the present disclosure will be described with reference to fig. 2 and 3.
Fig. 2 illustrates a block diagram of an image processing apparatus according to an exemplary embodiment of the present disclosure.
Referring to fig. 2, the image processing apparatus includes a reference point acquisition unit 21, a distance calculation unit 22, and an image cutout unit 23.
The reference point acquisition unit 21 is configured to acquire a color reference point on an image.
The distance calculation unit 22 is configured to calculate a color distance between each pixel point of the image and the color reference point in a preset space.
In an exemplary embodiment of the present disclosure, the preset space may include at least one of an HSV space and a YCbCr space.
In an exemplary embodiment of the present disclosure, the distance calculation unit 22 may be configured to: converting the image to HSV space; calculating the hue distance between the hue of each pixel point and the hue of the color reference point, and calculating the saturation distance between the saturation of each pixel point and the saturation of the color reference point; calculating an HSV color distance between each pixel point and the color reference point based on the hue distance and the saturation distance as a color distance between each pixel point and the color reference point.
In an exemplary embodiment of the present disclosure, the distance calculation unit 22 may be configured to: determining a shadow area of the image; calculating the brightness distance between the brightness of each pixel point in the shadow area of the image and the brightness of the color reference point; an HSV color distance between each pixel point of a shadow region of the image and the color reference point is calculated based on the hue distance, the saturation distance, and the brightness distance.
In an exemplary embodiment of the present disclosure, the distance calculation unit 22 may be configured to: performing weighted calculation on the hue distance, the saturation distance, and the brightness distance to obtain the HSV color distance between each pixel point of the shadow region of the image and the color reference point.
In an exemplary embodiment of the present disclosure, the distance calculation unit 22 may be configured to: converting the image to YCbCr space; calculating a blue distance between the blue concentration offset of each pixel point and the blue concentration offset of the color reference point, and calculating a red distance between the red concentration offset of each pixel point and the red concentration offset of the color reference point; the YCbCr color distance between each pixel point and the color reference point is calculated based on the blue distance and the red distance as the color distance between each pixel point and the color reference point.
In an exemplary embodiment of the present disclosure, the distance calculation unit 22 may be configured to: performing weighted calculation on the blue distance and the red distance to obtain the YCbCr color distance between each pixel point and the color reference point.
In an exemplary embodiment of the present disclosure, the distance calculation unit 22 may be configured to: calculating the HSV color distance between each pixel point of the image and the color reference point in the HSV space, and calculating the YCbCr color distance between each pixel point of the image and the color reference point in the YCbCr space; and calculating the color distance between each pixel point of the image and the color reference point based on the HSV color distance and the YCbCr color distance.
The image matting unit 23 is configured to matte the image based on the color distance.
In an exemplary embodiment of the present disclosure, the image matting unit 23 may be configured to: calculating the transparency of each pixel point of the image through a preset Gaussian function based on the color distance; and matting the image based on the transparency of each pixel point.
Fig. 3 illustrates a block diagram of an image processing apparatus according to another exemplary embodiment of the present disclosure.
Referring to fig. 3, the image processing apparatus includes a reference point acquisition unit 31, a distance calculation unit 32, an image matting unit 33, a guided filtering unit 34, and a spill removal unit 35.
The reference point acquisition unit 31 is configured to acquire a color reference point on an image.
The distance calculation unit 32 is configured to calculate a color distance between each pixel point of the image and the color reference point in a preset space.
In an exemplary embodiment of the present disclosure, the preset space includes at least one of an HSV space and a YCbCr space.
In an exemplary embodiment of the present disclosure, the distance calculation unit 32 may be configured to: converting the image to HSV space; calculating the hue distance between the hue of each pixel point and the hue of the color reference point, and calculating the saturation distance between the saturation of each pixel point and the saturation of the color reference point; calculating an HSV color distance between each pixel point and the color reference point based on the hue distance and the saturation distance as a color distance between each pixel point and the color reference point.
In an exemplary embodiment of the present disclosure, the distance calculation unit 32 may be configured to: determining a shadow area of the image; calculating the brightness distance between the brightness of each pixel point in the shadow area of the image and the brightness of the color reference point; an HSV color distance between each pixel point of a shadow region of the image and the color reference point is calculated based on the hue distance, the saturation distance, and the brightness distance.
In an exemplary embodiment of the present disclosure, the distance calculation unit 32 may be configured to: performing weighted calculation on the hue distance, the saturation distance, and the brightness distance to obtain the HSV color distance between each pixel point of the shadow region of the image and the color reference point.
In an exemplary embodiment of the present disclosure, the distance calculation unit 32 may be configured to: converting the image to YCbCr space; calculating a blue distance between the blue concentration offset of each pixel point and the blue concentration offset of the color reference point, and calculating a red distance between the red concentration offset of each pixel point and the red concentration offset of the color reference point; the YCbCr color distance between each pixel point and the color reference point is calculated based on the blue distance and the red distance as the color distance between each pixel point and the color reference point.
In an exemplary embodiment of the present disclosure, the distance calculation unit 32 may be configured to: performing weighted calculation on the blue distance and the red distance to obtain the YCbCr color distance between each pixel point and the color reference point.
In an exemplary embodiment of the present disclosure, the distance calculation unit 32 may be configured to: calculating the HSV color distance between each pixel point of the image and the color reference point in the HSV space, and calculating the YCbCr color distance between each pixel point of the image and the color reference point in the YCbCr space; and calculating the color distance between each pixel point of the image and the color reference point based on the HSV color distance and the YCbCr color distance.
The image matting unit 33 is configured to matte the image based on the color distance.
In an exemplary embodiment of the present disclosure, the image matting unit 33 may be configured to: calculating the transparency of each pixel point of the image through a preset Gaussian function based on the color distance; and matting the image based on the transparency of each pixel point.
In an exemplary embodiment of the present disclosure, the reference point acquisition unit 31, the distance calculation unit 32, and the image matting unit 33 may be separate, independent units, or may be included in one unit (for example, a matting unit); the present disclosure is not limited in this respect.
The guided filtering unit 34 is configured to determine the edge region of the matted image and to perform guided filtering on the transparency of each pixel point in the edge region of the matted image, with the pre-matting image as the guide image.
The spill removal unit 35 is configured to remove color spill from the matted image based on the color of the color reference point.
In an exemplary embodiment of the present disclosure, the spill removal unit 35 may be configured to: determining the pixel points in the matted image that have the same color as the color reference point; and reducing the saturation of the determined pixel points in the HSV space.
In an exemplary embodiment of the present disclosure, the spill removal unit 35 may be configured to: determining, in the HSV space and the YCbCr space respectively, the pixel points in the matted image that have the same color as the color reference point; and determining the pixel points found to have the same color as the color reference point in both the HSV space and the YCbCr space as the pixel points having the same color as the color reference point.
With regard to the apparatus in the above-described embodiment, the specific manner in which each unit performs the operation has been described in detail in the embodiment related to the method, and will not be described in detail here.
The image processing apparatus according to the exemplary embodiment of the present disclosure has been described above in conjunction with fig. 2 and 3. Next, an electronic device according to an exemplary embodiment of the present disclosure is described with reference to fig. 4.
Fig. 4 is a block diagram of an electronic device 400 according to an example embodiment of the present disclosure.
Referring to fig. 4, the electronic device 400 comprises at least one memory 401 and at least one processor 402, the at least one memory 401 having stored therein a set of computer-executable instructions that, when executed by the at least one processor 402, perform a method of image processing according to an exemplary embodiment of the present disclosure.
In an exemplary embodiment of the present disclosure, the electronic device 400 may be a PC, a tablet device, a personal digital assistant, a smartphone, or another device capable of executing the above set of instructions. The electronic device 400 need not be a single electronic device; it can be any collection of devices or circuits that can execute the above instructions (or instruction sets) individually or jointly. The electronic device 400 may also be part of an integrated control system or a system manager, or may be configured as a portable electronic device that interfaces with local or remote systems (e.g., via wireless transmission).
In the electronic device 400, the processor 402 may include a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a programmable logic device, a special purpose processor system, a microcontroller, or a microprocessor. By way of example, and not limitation, processors may also include analog processors, digital processors, microprocessors, multi-core processors, processor arrays, network processors, and the like.
The processor 402 may execute instructions or code stored in the memory 401, wherein the memory 401 may also store data. The instructions and data may also be transmitted or received over a network via a network interface device, which may employ any known transmission protocol.
The memory 401 may be integrated with the processor 402, for example, by having RAM or flash memory disposed within an integrated circuit microprocessor or the like. Further, memory 401 may comprise a stand-alone device, such as an external disk drive, storage array, or any other storage device usable by a database system. The memory 401 and the processor 402 may be operatively coupled or may communicate with each other, such as through I/O ports, network connections, etc., so that the processor 402 can read files stored in the memory.
In addition, the electronic device 400 may also include a video display (such as a liquid crystal display) and a user interaction interface (such as a keyboard, mouse, touch input device, etc.). All components of electronic device 400 may be connected to each other via a bus and/or a network.
There is also provided, in accordance with an exemplary embodiment of the present disclosure, a computer-readable storage medium, such as the memory 401, comprising instructions executable by the processor 402 of the electronic device 400 to perform the above-described method. Alternatively, the computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
According to an exemplary embodiment of the present disclosure, a computer program product may also be provided, which comprises computer programs/instructions, which when executed by a processor, implement the method of image processing according to an exemplary embodiment of the present disclosure.
The image processing method and apparatus according to the exemplary embodiment of the present disclosure have been described above with reference to fig. 1 to 4. However, it should be understood that: the image processing apparatus and units thereof shown in fig. 2 and 3 may be respectively configured as software, hardware, firmware, or any combination thereof to perform a specific function, the electronic device shown in fig. 4 is not limited to include the above-shown components, but some components may be added or deleted as needed, and the above components may also be combined.
With the image processing method and apparatus according to the present disclosure, the color reference point on the image is first acquired, the color distance between each pixel point of the image and the color reference point is calculated in the preset space, and the image is then matted based on the color distance, so that matting can be performed in real time on a mobile terminal. In addition, since most parameters of the image processing method and apparatus are fixed values, a large amount of manual interaction can be avoided; the color distance can be calculated using the HSV and YCbCr spaces of the color image, so the matting effect is better and the different color spills produced under different illumination can be removed; the shadow in the image can be processed separately, making the degree of shadow retention adjustable; the transparency can be filtered, reducing the edge burrs and unevenness produced by matting; and green can be identified by combining the RGB and HSV spaces and its saturation reduced, effectively eliminating green spill.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An image processing method, comprising:
acquiring a color reference point on an image;
calculating a color distance between each pixel point of the image and a color reference point in a preset space;
matting the image based on the color distance.
2. The image processing method according to claim 1, wherein the preset space includes at least one of an HSV (hue, saturation, brightness) space and a YCbCr (luminance, blue concentration offset, red concentration offset) space.
3. The image processing method according to claim 2, wherein when the preset space is an HSV space, the step of calculating a color distance between each pixel point of the image and a color reference point in the preset space includes:
converting the image to HSV space;
calculating the hue distance between the hue of each pixel point and the hue of the color reference point, and calculating the saturation distance between the saturation of each pixel point and the saturation of the color reference point;
calculating an HSV color distance between each pixel point and the color reference point based on the hue distance and the saturation distance as a color distance between each pixel point and the color reference point.
4. The image processing method according to claim 3, wherein the step of calculating the HSV color distance between each pixel point and the color reference point based on the hue distance and the saturation distance comprises:
determining a shadow region of the image;
calculating the brightness distance between the brightness of each pixel point in the shadow area of the image and the brightness of the color reference point;
calculating an HSV color distance between each pixel point of a shadow region of the image and a color reference point based on the hue distance, the saturation distance, and the brightness distance.
5. The image processing method according to claim 4, wherein the step of calculating the HSV color distance between each pixel point of the shadow region of the image and the color reference point based on the hue distance, the saturation distance, and the brightness distance comprises:
performing weighted calculation on the hue distance, the saturation distance, and the brightness distance to obtain the HSV color distance between each pixel point of the shadow region of the image and the color reference point.
6. The image processing method according to claim 2, wherein when the preset space is an YCbCr space, the step of calculating a color distance between each pixel point of the image and a color reference point in the preset space comprises:
converting the image to YCbCr space;
calculating a blue distance between the blue concentration offset of each pixel point and the blue concentration offset of the color reference point, and calculating a red distance between the red concentration offset of each pixel point and the red concentration offset of the color reference point;
the YCbCr color distance between each pixel point and the color reference point is calculated based on the blue distance and the red distance as the color distance between each pixel point and the color reference point.
7. The image processing method of claim 6, wherein the step of calculating the YCbCr color distance between each pixel point and the color reference point based on the blue distance and the red distance comprises:
performing weighted calculation on the blue distance and the red distance to obtain the YCbCr color distance between each pixel point and the color reference point.
8. An image processing apparatus characterized by comprising:
a reference point acquisition unit configured to acquire a color reference point on an image;
a distance calculation unit configured to calculate a color distance between each pixel point of the image and a color reference point in a preset space; and
an image matting unit configured to matte the image based on the color distance.
9. An electronic device/server, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the image processing method of any one of claims 1 to 7.
10. A computer-readable storage medium storing a computer program, which, when executed by a processor of an electronic device, causes the electronic device to perform the image processing method according to any one of claims 1 to 7.
CN202110216499.XA 2021-02-26 2021-02-26 Image processing method and device Pending CN113034509A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110216499.XA 2021-02-26 2021-02-26 Image processing method and device


Publications (1)

Publication Number Publication Date
CN113034509A (en) 2021-06-25

Family

ID=76461711

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110216499.XA Pending CN113034509A (en) 2021-02-26 2021-02-26 Image processing method and device

Country Status (1)

Country Link
CN (1) CN113034509A (en)



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016127883A1 (en) * 2015-02-12 2016-08-18 阿里巴巴集团控股有限公司 Image area detection method and device
CN111277772A (en) * 2020-03-09 2020-06-12 北京文香信息技术有限公司 Matting method, device, equipment and storage medium
CN112132852A (en) * 2020-08-28 2020-12-25 稿定(厦门)科技有限公司 Automatic image matting method and device based on multi-background color statistics
CN112233195A (en) * 2020-10-15 2021-01-15 北京达佳互联信息技术有限公司 Color matching method, device, electronic equipment and storage medium
CN112330692A (en) * 2020-11-11 2021-02-05 北京文香信息技术有限公司 Matting method, device, equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Liang Gangming, "New Morphological Operators Based on Structural Hypergraphs", China Master's Theses Full-Text Database, Information Science and Technology, no. 02, page 4 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113409221A (en) * 2021-06-30 2021-09-17 深圳市斯博科技有限公司 Image color matting method, system, computer equipment and storage medium
CN113409221B (en) * 2021-06-30 2023-12-12 深圳万兴软件有限公司 Image color matting method, system, computer equipment and storage medium
CN113793395A (en) * 2021-09-15 2021-12-14 湖南快乐阳光互动娱乐传媒有限公司 Key color extraction method and device
WO2024001360A1 (en) * 2022-06-28 2024-01-04 北京字跳网络技术有限公司 Green screen matting method and apparatus, and electronic device
CN117221504A (en) * 2023-11-07 2023-12-12 北京医百科技有限公司 Video matting method and device
CN117221504B (en) * 2023-11-07 2024-01-23 北京医百科技有限公司 Video matting method and device

Similar Documents

Publication Publication Date Title
CN113034509A (en) Image processing method and device
US8525847B2 (en) Enhancing images using known characteristics of image subjects
CN107451969A (en) Image processing method, device, mobile terminal and computer-readable recording medium
CN109862389B (en) Video processing method, device, server and storage medium
Peng et al. Image haze removal using airlight white correction, local light filter, and aerial perspective prior
CN107871303B (en) Image processing method and device
WO2014170886A1 (en) System and method for online processing of video images in real time
JP2003271971A (en) Method for real-time discrimination and compensation for illuminance change in digital color image signal
CN113297937B (en) Image processing method, device, equipment and medium
CN110996174B (en) Video image quality enhancement method and related equipment thereof
CN108961299B (en) Foreground image obtaining method and device
CN109325918B (en) Image processing method and device and computer storage medium
CN107424137B (en) Text enhancement method and device, computer device and readable storage medium
CN109949248B (en) Method, apparatus, device and medium for modifying color of vehicle in image
CN107256539B (en) Image sharpening method based on local contrast
CN112819721A (en) Method and system for reducing noise of image color noise
CN116309152A (en) Detail enhancement method, system, equipment and storage medium for low-illumination image
CN110796689B (en) Video processing method, electronic device and storage medium
CN111970501A (en) Pure color scene AE color processing method and device, electronic equipment and storage medium
CN116468636A (en) Low-illumination enhancement method, device, electronic equipment and readable storage medium
CN107277369B (en) Image processing method, device, computer readable storage medium and computer equipment
CN111738949B (en) Image brightness adjusting method and device, electronic equipment and storage medium
CN115239578A (en) Image processing method and device, computer readable storage medium and terminal equipment
CN114511580A (en) Image processing method, device, equipment and storage medium
JP2009010636A (en) Adaption histogram equalization method, and adaption histogram equalization apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination