CN111223108A - Method and system based on backdrop matting and fusion - Google Patents


Info

Publication number
CN111223108A
CN111223108A (application CN201911413016.4A)
Authority
CN
China
Prior art keywords
color
point
background
image
foreground
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911413016.4A
Other languages
Chinese (zh)
Inventor
邓海峰
曹康文
李欢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Yingzhuo Information Technology Co ltd
Original Assignee
Shanghai Yingzhuo Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Yingzhuo Information Technology Co ltd filed Critical Shanghai Yingzhuo Information Technology Co ltd
Priority to CN201911413016.4A priority Critical patent/CN111223108A/en
Publication of CN111223108A publication Critical patent/CN111223108A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS; G06: COMPUTING; CALCULATING OR COUNTING; G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL; H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • G06T 7/11: Region-based segmentation (G06T 7/00 Image analysis; G06T 7/10 Segmentation; Edge detection)
    • G06T 5/70: Denoising; Smoothing (G06T 5/00 Image enhancement or restoration)
    • G06T 7/187: Segmentation; Edge detection involving region growing, region merging or connected component labelling
    • G06T 7/194: Segmentation; Edge detection involving foreground-background segmentation
    • H04N 5/272: Means for inserting a foreground image in a background image, i.e. inlay, outlay (H04N 5/222 Studio circuitry; H04N 5/262 Studio circuits, e.g. for mixing or special effects)
    • G06T 2207/10004: Still image; Photographic image (G06T 2207/10 Image acquisition modality)
    • G06T 2207/10016: Video; Image sequence
    • G06T 2207/20036: Morphological image processing (G06T 2207/20 Special algorithmic details)
    • G06T 2207/20076: Probabilistic image processing
    • G06T 2207/20132: Image cropping (G06T 2207/20112 Image segmentation details)
    • G06T 2207/20221: Image fusion; Image merging (G06T 2207/20212 Image combination)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a method and a system based on backdrop matting and fusion, comprising the following steps: a background color obtaining step: acquiring the color of the background in an image or video; an initial alpha value determination step: determining an initial alpha value for each pixel by computing the distance and angle in RGB space between the pixel color and the background color, to obtain an alpha image; a binarization step: performing binarization processing on the alpha image according to set alpha thresholds; an optimization step: refining the binarized alpha image with morphological operations to obtain a trimap; a shared sample point step: executing a shared-sample-point algorithm on the trimap to obtain the final foreground colors and alpha values.

Description

Method and system based on backdrop matting and fusion
Technical Field
The invention relates to the technical field of image processing, in particular to a backdrop matting and fusion-based method and system.
Background
Matting refers to the technique of accurately extracting foreground objects from an image or video sequence. It is a key technology in the field of visual effects and is widely applied in image editing, film production and related fields. However, because the matting problem is under-constrained, additional constraints must be imposed to solve it; in film and television production, a blue or green screen is therefore usually adopted as the shooting background to reduce the difficulty of the problem.
Patent document CN 107180238A discloses an image previewing apparatus and method for an intelligent terminal, which start an intelligent terminal shooting module, capture a target image containing a target person, recognize the face of the target person in the target image, obtain the position information of the target person, lock the person outline according to the conventional human body proportion or preset person figure parameters, replace the background outside the person outline with a green curtain, extract the person image inside the person outline, and send the person image to a holographic projection module included in the intelligent terminal or connected to the intelligent terminal, where the holographic projection module projects a stereoscopic image of the target person around the target person.
However, conventional matting techniques are generally inaccurate: in areas where background and foreground are hard to distinguish (the identification process yields background regions, foreground regions, and unidentifiable unknown regions), the foreground is easily lost or background colors bleed in, and matting efficiency is poor.
Disclosure of Invention
In view of the shortcomings in the prior art, it is an object of the present invention to provide a backdrop matting and fusion based method and system.
The method based on backdrop matting and fusion provided by the invention comprises the following steps:
a background color obtaining step: acquiring the color of the background in an image or video;
an initial alpha value determination step: determining an initial alpha value for each pixel by computing the distance and angle in RGB space between the pixel color and the background color, to obtain an alpha image;
a binarization step: performing binarization processing on the alpha image according to set alpha thresholds;
an optimization step: refining the binarized alpha image with morphological operations to obtain a trimap;
a shared sample point step: executing a shared-sample-point algorithm on the trimap to obtain the final foreground colors and alpha values.
Preferably, the background color acquiring step is performed in either of two ways:
A. computing histograms of the three RGB channels of the image or video and selecting the most probable value of each channel as the background color value;
B. directly acquiring a background color value selected by the user.
Preferably, the initial alpha value determining step includes:
computing the distance d_i between the pixel color C_i of the ith pixel and the background color C_b in RGB space, d_i = ||C_i - C_b||;
computing the angle between the pixel color C_i of the ith pixel and the background color C_b, cos θ_i = (C_i · C_b) / (||C_i|| ||C_b||);
determining the initial alpha value α_i of the ith pixel from d_i and cos θ_i.
Preferably, the optimizing step comprises:
A. performing one opening operation on the foreground region and the background region respectively, to connect locally discontinuous regions;
B. performing an erosion operation of size r_c on the dilated foreground region and background region respectively.
Preferably, the sharing sample point step includes, for each point p in the unknown region:
A. along 4 rays of equal angular spacing starting from the point p, finding the foreground point and the background point closest to p in each direction; the foreground points form the set F^p = {f_1^p, …, f_4^p} and the background points form the set B^p = {b_1^p, …, b_4^p}; for each sample s ∈ F^p ∪ B^p, computing the energy E_p(s) of the path from p to s, which accumulates the color difference between points along the path weighted by the distance between them, and from these energies obtaining the probability PF_p that p belongs to the foreground region T_f, where E_p(b) is the energy from a background point b to the point p and E_p(f) is the energy from the point p to a foreground point f;
B. for any pair (f_i^p, b_j^p):
- computing the color distortion N_p(f_i^p, b_j^p) and the alpha value that minimizes it, where C(f_i^p) is the color of the point f_i^p and C(b_j^p) is the color of the point b_j^p;
- computing the inconsistency parameter A_p between PF_p and this minimizing alpha value;
- integrating the color distortion, the inconsistency parameter and the positional relationship into an objective function g_p(f_i^p, b_j^p), where N_p is the color distortion, A_p is the inconsistency parameter, and D_p is the straight-line distance from f_i^p (respectively b_j^p) to the point p;
C. selecting the color pair (f^p, b^p) that minimizes g_p as the foreground color and the background color of p;
D. for all q ∈ T_unknown in the 10×10 neighborhood of p (T_unknown being the unknown region), taking the color pairs of q, computing their color distortion relative to the point p, and selecting the color pair with the minimum distortion as the new color pair of p;
E. applying Gaussian smoothing to all color pairs and the corresponding transparencies, and outputting the final foreground colors and alpha values.
The invention provides a backdrop matting and fusion-based system, which comprises:
a background color acquisition module: acquiring the color of the background in an image or video;
an initial alpha value determination module: determining an initial alpha value for each pixel by computing the distance and angle in RGB space between the pixel color and the background color, to obtain an alpha image;
a binarization module: performing binarization processing on the alpha image according to set alpha thresholds;
an optimization module: refining the binarized alpha image with morphological operations to obtain a trimap;
a shared sample point module: executing a shared-sample-point algorithm on the trimap to obtain the final foreground colors and alpha values.
Preferably, the background color obtaining module operates in either of two ways:
A. computing histograms of the three RGB channels of the image or video and selecting the most probable value of each channel as the background color value;
B. directly acquiring a background color value selected by the user.
Preferably, the initial alpha value determining module computes:
the distance d_i between the pixel color C_i of the ith pixel and the background color C_b in RGB space, d_i = ||C_i - C_b||;
the angle between the pixel color C_i of the ith pixel and the background color C_b, cos θ_i = (C_i · C_b) / (||C_i|| ||C_b||);
and the initial alpha value α_i of the ith pixel, determined from d_i and cos θ_i.
Preferably, the optimization module comprises:
A. performing one opening operation on the foreground region and the background region respectively, to connect locally discontinuous regions;
B. performing an erosion operation of size r_c on the dilated foreground region and background region respectively.
Preferably, the shared sample point module processes each point p in the unknown region as follows:
A. along 4 rays of equal angular spacing starting from the point p, finding the foreground point and the background point closest to p in each direction; the foreground points form the set F^p = {f_1^p, …, f_4^p} and the background points form the set B^p = {b_1^p, …, b_4^p}; for each sample s ∈ F^p ∪ B^p, computing the energy E_p(s) of the path from p to s, which accumulates the color difference between points along the path weighted by the distance between them, and from these energies obtaining the probability PF_p that p belongs to the foreground region T_f, where E_p(b) is the energy from a background point b to the point p and E_p(f) is the energy from the point p to a foreground point f;
B. for any pair (f_i^p, b_j^p):
- computing the color distortion N_p(f_i^p, b_j^p) and the alpha value that minimizes it, where C(f_i^p) is the color of the point f_i^p and C(b_j^p) is the color of the point b_j^p;
- computing the inconsistency parameter A_p between PF_p and this minimizing alpha value;
- integrating the color distortion, the inconsistency parameter and the positional relationship into an objective function g_p(f_i^p, b_j^p), where N_p is the color distortion, A_p is the inconsistency parameter, and D_p is the straight-line distance from f_i^p (respectively b_j^p) to the point p;
C. selecting the color pair (f^p, b^p) that minimizes g_p as the foreground color and the background color of p;
D. for all q ∈ T_unknown in the 10×10 neighborhood of p (T_unknown being the unknown region), taking the color pairs of q, computing their color distortion relative to the point p, and selecting the color pair with the minimum distortion as the new color pair of p;
E. applying Gaussian smoothing to all color pairs and the corresponding transparencies, and outputting the final foreground colors and alpha values.
Compared with the prior art, the invention has the following beneficial effects:
1. Through color difference, binarization, and morphological dilation and erosion, an accurate trimap is extracted, achieving automatic matting.
2. By using a shared-sample matting algorithm to estimate each unknown-region pixel from its surrounding pixels, fine edge structures (such as hair) are preserved and mixing of background color at the edges is avoided.
3. By porting the algorithm to GLSL so that it runs directly on the GPU, its efficiency is greatly improved and real-time matting is ensured.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a flow chart of the operation of the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the invention, but do not limit the invention in any way. It should be noted that various changes and modifications can be made by those skilled in the art without departing from the spirit of the invention, all of which fall within the scope of the present invention.
In the present invention, the area covered by the green screen is the background region T_b; the area not covered by the green screen is the foreground region T_f; and areas that cannot be distinguished form the unknown region T_unknown.
As shown in fig. 1, the backdrop matting and fusion-based method of the present invention includes:
S1, background color obtaining step: obtain the color of the background in the image or video, in either of two ways:
A. computing histograms of the three RGB channels of the image or video and selecting the most probable value of each channel as the background color value;
B. directly acquiring a background color value selected by the user.
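As a concrete illustration of option A, the per-channel histogram mode can be computed as follows. This is a minimal sketch, assuming 8-bit RGB frames; the function and variable names are our own, not from the patent:

```python
import numpy as np

def estimate_background_color(image):
    """Estimate the backdrop color as the per-channel histogram mode:
    build a 256-bin histogram for each of the R, G, B channels and take
    the most frequent value in each channel as the background color."""
    image = np.asarray(image, dtype=np.uint8)
    return tuple(
        int(np.bincount(image[..., c].ravel(), minlength=256).argmax())
        for c in range(3)
    )

# A mostly-green synthetic frame: the mode of each channel recovers the backdrop.
frame = np.zeros((64, 64, 3), dtype=np.uint8)
frame[...] = (30, 200, 40)              # green-screen background
frame[20:40, 20:40] = (180, 120, 90)    # a small foreground patch
print(estimate_background_color(frame))  # -> (30, 200, 40)
```

Because the green screen dominates the frame, the most frequent value in each channel coincides with the backdrop color even without segmenting the foreground first.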
S2, initial alpha value determining step: determine an initial alpha value for each pixel by computing the distance and angle in RGB space between the pixel color and the background color, to obtain an alpha image. The initial alpha value determining step includes:
computing the distance d_i between the pixel color C_i of the ith pixel and the background color C_b in RGB space, d_i = ||C_i - C_b||;
computing the angle between the pixel color C_i of the ith pixel and the background color C_b, cos θ_i = (C_i · C_b) / (||C_i|| ||C_b||);
determining the initial alpha value α_i of the ith pixel from d_i and cos θ_i.
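The distance and angle tests can be sketched as below. Note the patent's exact alpha equation is rendered only as an image in the source, so the way d_i and cos θ_i are combined here, and the d_max constant, are illustrative assumptions:

```python
import numpy as np

def initial_alpha(pixel, background, d_max=120.0):
    """Distance/angle test in RGB space (step S2).

    d_max and the combination of distance and angle are assumptions;
    the patent's published alpha formula is not reproduced in its text.
    """
    pixel = np.asarray(pixel, dtype=float)
    bg = np.asarray(background, dtype=float)
    d = np.linalg.norm(pixel - bg)                      # RGB-space distance
    cos_t = float(pixel @ bg) / (np.linalg.norm(pixel) * np.linalg.norm(bg) + 1e-8)
    sin_t = np.sqrt(max(0.0, 1.0 - cos_t ** 2))         # angular dissimilarity
    # Assumed combination: a large distance or a large hue angle relative to
    # the backdrop pushes the pixel towards foreground (alpha -> 1).
    return float(np.clip(max(d / d_max, sin_t), 0.0, 1.0))

bg = (30, 200, 40)
print(initial_alpha(bg, bg))              # ~0: a pure backdrop pixel
print(initial_alpha((180, 120, 90), bg))  # 1.0: a clearly non-backdrop pixel
```

The angle term catches pixels that are dim but hue-shifted from the backdrop, which a pure Euclidean distance would under-weight.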
S3, binarization step: perform binarization processing on the alpha image according to the set alpha thresholds.
Select thresholds α_low and α_high:
if α_i <= α_low, set α_i = 0;
if α_i >= α_high, set α_i = 1;
if α_low < α_i < α_high, set α_i = 0.5.
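The three-way thresholding can be sketched directly; the threshold values 0.2 and 0.8 below are illustrative defaults, not values given by the patent:

```python
import numpy as np

def threshold_alpha(alpha, a_low=0.2, a_high=0.8):
    """Quantize the alpha image into background (0), foreground (1)
    and unknown (0.5) using the two thresholds described above."""
    alpha = np.asarray(alpha, dtype=float)
    out = np.full_like(alpha, 0.5)   # everything starts as unknown
    out[alpha <= a_low] = 0.0        # confidently background
    out[alpha >= a_high] = 1.0       # confidently foreground
    return out

print(threshold_alpha([0.05, 0.5, 0.95]))  # -> [0.  0.5 1. ]
```

The 0.5 band is exactly the unknown region T_unknown that the later optimization and shared-sample steps operate on.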
s4, optimizing: and (5) utilizing the alpha image after the morphological optimization binarization processing to obtain a trisection image. The optimization steps comprise:
A. respectively executing one-time opening operation (namely expansion operation, expansion of the foreground area and the background area) on the foreground area and the background area, and connecting the locally discontinuous areas;
B. executing the expansion region r with the size of the foreground region and the background region respectivelycThe etching operation of (1).
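A pure-NumPy sketch of this opening-then-erosion refinement is below. The 3×3 structuring element and the erosion radius r are illustrative; the patent's radius r_c is not given numerically:

```python
import numpy as np

def dilate(mask):
    """3x3 binary dilation via shifted copies (no OpenCV dependency)."""
    p = np.pad(mask, 1)
    h, w = mask.shape
    out = np.zeros_like(mask)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out |= p[dy:dy + h, dx:dx + w]
    return out

def erode(mask):
    """3x3 binary erosion as the complement of dilating the complement."""
    return ~dilate(~mask)

def make_trimap(fg, bg, r=2):
    """Open each region to bridge small gaps, erode it by r pixels, and
    mark the leftover band as unknown: 0 = background, 0.5 = unknown, 1 = foreground."""
    def opening(m):
        return dilate(erode(m))
    fg, bg = opening(fg), opening(bg)
    for _ in range(r):
        fg, bg = erode(fg), erode(bg)
    tri = np.full(fg.shape, 0.5)
    tri[fg] = 1.0
    tri[bg] = 0.0
    return tri

fg = np.zeros((12, 12), dtype=bool)
fg[3:9, 3:9] = True
tri = make_trimap(fg, ~fg, r=2)
print(tri[6, 6], tri[0, 0], tri[3, 3])  # -> 1.0 0.0 0.5
```

Eroding both regions deliberately leaves an unknown band a few pixels wide around the boundary, which is where the shared-sample step then resolves the true alpha.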
S5, shared sample point step: execute the shared-sample-point algorithm on the trimap to obtain the final foreground colors and alpha values. The step processes each point p in the unknown region as follows:
A. along 4 rays of equal angular spacing starting from the point p, find the foreground point and the background point closest to p in each direction; the foreground points form the set F^p = {f_1^p, …, f_4^p} and the background points form the set B^p = {b_1^p, …, b_4^p}; for each sample s ∈ F^p ∪ B^p, compute the energy E_p(s) of the path from p to s, which accumulates the color difference between points along the path weighted by the distance between them, and from these energies obtain the probability PF_p that p belongs to the foreground region T_f, where E_p(b) is the energy from a background point b to the point p and E_p(f) is the energy from the point p to a foreground point f;
B. for any pair (f_i^p, b_j^p):
- compute the color distortion N_p(f_i^p, b_j^p) and the alpha value that minimizes it, where C(f_i^p) is the color of the point f_i^p and C(b_j^p) is the color of the point b_j^p;
- compute the inconsistency parameter A_p between PF_p and this minimizing alpha value;
- integrate the color distortion, the inconsistency parameter and the positional relationship into an objective function g_p(f_i^p, b_j^p), where N_p is the color distortion, A_p is the inconsistency parameter, and D_p is the straight-line distance from f_i^p (respectively b_j^p) to the point p;
C. select the color pair (f^p, b^p) that minimizes g_p as the foreground color and the background color of p;
D. for all q ∈ T_unknown in the 10×10 neighborhood of p (T_unknown being the unknown region), take the color pairs of q, compute their color distortion relative to the point p, and select the color pair with the minimum distortion as the new color pair of p;
E. apply Gaussian smoothing to all color pairs and the corresponding transparencies, and output the final foreground colors and alpha values.
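Step A's ray search can be sketched as follows. This is a simplified illustration: practical shared-sampling implementations randomize the starting angle per pixel, and the energy and objective terms E_p, N_p, A_p, D_p, whose equations appear only as images in the source, are omitted here:

```python
import numpy as np

def nearest_samples(trimap, p, n_rays=4):
    """Walk n_rays equiangular rays out of point p and record the first
    foreground (trimap == 1.0) and first background (trimap == 0.0)
    pixel encountered on each ray."""
    h, w = trimap.shape
    py, px = p
    fg, bg = [], []
    for k in range(n_rays):
        ang = 2.0 * np.pi * k / n_rays
        dy, dx = np.sin(ang), np.cos(ang)
        found_f = found_b = False
        for t in range(1, max(h, w)):
            y, x = int(round(py + t * dy)), int(round(px + t * dx))
            if not (0 <= y < h and 0 <= x < w):
                break  # ray left the image
            if not found_f and trimap[y, x] == 1.0:
                fg.append((y, x)); found_f = True
            if not found_b and trimap[y, x] == 0.0:
                bg.append((y, x)); found_b = True
            if found_f and found_b:
                break
    return fg, bg

tri = np.full((9, 9), 0.5)
tri[:, :3] = 0.0   # background on the left
tri[:, 6:] = 1.0   # foreground on the right
print(nearest_samples(tri, (4, 4)))  # -> ([(4, 6)], [(4, 2)])
```

Each unknown pixel thus gathers at most n_rays candidate foreground and background samples, which the objective g_p then scores to pick the best (f^p, b^p) pair.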
On the basis of the backdrop matting and fusion-based method, the invention also provides a backdrop matting and fusion-based system, which comprises:
a background color acquisition module: acquiring the color of the background in an image or video;
an initial alpha value determination module: determining an initial alpha value for each pixel by computing the distance and angle in RGB space between the pixel color and the background color, to obtain an alpha image;
a binarization module: performing binarization processing on the alpha image according to set alpha thresholds;
an optimization module: refining the binarized alpha image with morphological operations to obtain a trimap;
a shared sample point module: executing a shared-sample-point algorithm on the trimap to obtain the final foreground colors and alpha values.
Those skilled in the art will appreciate that, in addition to implementing the system and its various devices, modules, units provided by the present invention as pure computer readable program code, the system and its various devices, modules, units provided by the present invention can be fully implemented by logically programming method steps in the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Therefore, the system and various devices, modules and units thereof provided by the invention can be regarded as a hardware component, and the devices, modules and units included in the system for realizing various functions can also be regarded as structures in the hardware component; means, modules, units for performing the various functions may also be regarded as structures within both software modules and hardware components for performing the method.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes or modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention. The embodiments and features of the embodiments of the present application may be combined with each other arbitrarily without conflict.

Claims (10)

1. A backdrop matting and fusion-based method, characterized by comprising the following steps:
a background color obtaining step: acquiring the color of the background in an image or video;
an initial alpha value determination step: determining an initial alpha value for each pixel by computing the distance and angle in RGB space between the pixel color and the background color, to obtain an alpha image;
a binarization step: performing binarization processing on the alpha image according to set alpha thresholds;
an optimization step: refining the binarized alpha image with morphological operations to obtain a trimap;
a shared sample point step: executing a shared-sample-point algorithm on the trimap to obtain the final foreground colors and alpha values.
2. The backdrop matting and fusion-based method according to claim 1, wherein the background color obtaining step is performed in either of two ways:
A. computing histograms of the three RGB channels of the image or video and selecting the most probable value of each channel as the background color value;
B. directly acquiring a background color value selected by the user.
3. The backdrop matting and fusion-based method according to claim 1, wherein the initial alpha value determining step comprises:
computing the distance d_i between the pixel color C_i of the ith pixel and the background color C_b in RGB space, d_i = ||C_i - C_b||;
computing the angle between the pixel color C_i of the ith pixel and the background color C_b, cos θ_i = (C_i · C_b) / (||C_i|| ||C_b||);
determining the initial alpha value α_i of the ith pixel from d_i and cos θ_i.
4. The backdrop matting and fusion-based method according to claim 1, wherein the optimizing step comprises:
A. performing one opening operation on the foreground region and the background region respectively, to connect locally discontinuous regions;
B. performing an erosion operation of size r_c on the dilated foreground region and background region respectively.
5. The backdrop matting and fusion-based method according to claim 1, wherein the shared sample point step comprises, for each point p in the unknown region:
A. along 4 rays of equal angular spacing starting from the point p, finding the foreground point and the background point closest to p in each direction; the foreground points form the set F^p = {f_1^p, …, f_4^p} and the background points form the set B^p = {b_1^p, …, b_4^p}; for each sample s ∈ F^p ∪ B^p, computing the energy E_p(s) of the path from p to s, which accumulates the color difference between points along the path weighted by the distance between them, and from these energies obtaining the probability PF_p that p belongs to the foreground region T_f, where E_p(b) is the energy from a background point b to the point p and E_p(f) is the energy from the point p to a foreground point f;
B. for any pair (f_i^p, b_j^p):
- computing the color distortion N_p(f_i^p, b_j^p) and the alpha value that minimizes it, where C(f_i^p) is the color of the point f_i^p and C(b_j^p) is the color of the point b_j^p;
- computing the inconsistency parameter A_p between PF_p and this minimizing alpha value;
- integrating the color distortion, the inconsistency parameter and the positional relationship into an objective function g_p(f_i^p, b_j^p), where N_p is the color distortion, A_p is the inconsistency parameter, and D_p is the straight-line distance from f_i^p (respectively b_j^p) to the point p;
C. selecting the color pair (f^p, b^p) that minimizes g_p as the foreground color and the background color of p;
D. for all q ∈ T_unknown in the 10×10 neighborhood of p (T_unknown being the unknown region), taking the color pairs of q, computing their color distortion relative to the point p, and selecting the color pair with the minimum distortion as the new color pair of p;
E. applying Gaussian smoothing to all color pairs and the corresponding transparencies, and outputting the final foreground colors and alpha values.
6. A backdrop matting and fusion-based system, characterized by comprising:
a background color acquisition module: acquiring the color of the background in an image or video;
an initial alpha value determination module: determining an initial alpha value for each pixel by computing the distance and angle in RGB space between the pixel color and the background color, to obtain an alpha image;
a binarization module: performing binarization processing on the alpha image according to set alpha thresholds;
an optimization module: refining the binarized alpha image with morphological operations to obtain a trimap;
a shared sample point module: executing a shared-sample-point algorithm on the trimap to obtain the final foreground colors and alpha values.
7. The backdrop matting and fusion-based system according to claim 6, wherein the background color acquisition module operates in either of two ways:
A. computing histograms of the three RGB channels of the image or video and selecting the most probable value of each channel as the background color value;
B. directly acquiring a background color value selected by the user.
8. The backdrop matting and fusion-based system according to claim 6, wherein the initial alpha value determination module computes:
the distance d_i between the pixel color C_i of the ith pixel and the background color C_b in RGB space, d_i = ||C_i - C_b||;
the angle between the pixel color C_i of the ith pixel and the background color C_b, cos θ_i = (C_i · C_b) / (||C_i|| ||C_b||);
and the initial alpha value α_i of the ith pixel, determined from d_i and cos θ_i.
9. The backdrop matting and fusion-based system according to claim 6, wherein the optimization module comprises:
A. performing one opening operation on the foreground region and the background region respectively, to connect locally discontinuous regions;
B. performing an erosion operation of size r_c on the dilated foreground region and background region respectively.
10. The backdrop matting and fusion-based system according to claim 6, wherein the shared sample point module performs, for each point p in the unknown region:
A. finding, along 4 equiangular rays starting from point p, the foreground point and the background point closest to p in each direction, the foreground points forming a set F^p = {f_1^p, ..., f_4^p} and the background points forming a set B^p = {b_1^p, ..., b_4^p}; for each sample s in F^p ∪ B^p, computing an energy function E_p(s) from p to s, based on the color difference between the two points and the spatial distance between the two points, and obtaining the probability PF_p that p belongs to the foreground region T_f:
PF_p = E_p(b) / (E_p(f) + E_p(b)),
where E_p(b) is the energy function from background point b to point p and E_p(f) is the energy function from point p to foreground point f;
B. for any pair (f_i^p, b_j^p):
- computing the color distortion
N_p(f_i^p, b_j^p) = ||C_p - (α̂_p C(f_i^p) + (1 - α̂_p) C(b_j^p))||,
where α̂_p is the alpha value that minimizes N_p, C(f_i^p) is the color of point f_i^p and C(b_j^p) is the color of point b_j^p;
- computing the inconsistency parameter between PF_p and α̂_p:
A_p(f_i^p, b_j^p) = PF_p + (1 - 2 PF_p) α̂_p;
- integrating the color distortion, the inconsistency parameter and the positional relationship into an objective function g_p(f_i^p, b_j^p), where g_p is the objective function, N_p is the color distortion, A_p is the inconsistency parameter, and D_p is the straight-line distance between f_i^p (respectively b_j^p) and point p;
C. selecting the color pair (f_i^p, b_j^p) that minimizes g_p as the foreground color and the background color, respectively;
D. for all q ∈ T_unknown in the 10x10 neighborhood of p, where T_unknown is the unknown region, computing the color distortion of the color pair (f^q, b^q) of q relative to point p, and selecting the color pair with the minimum distortion as the new color pair of p;
E. performing Gaussian smoothing on all color pairs and corresponding transparencies, and outputting the final foreground colors and alpha values.
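For the color distortion step of claim 10, under the standard compositing model C_p = α F + (1 - α) B the alpha minimizing the distortion for a candidate pair has a closed form; the sketch below assumes that model (the helper names are illustrative, not from the patent):

```python
import numpy as np

def pair_alpha(c_p, c_f, c_b):
    """Alpha that best explains pixel color c_p as a mix of candidate
    foreground color c_f and background color c_b, clamped to [0, 1]."""
    c_p, c_f, c_b = (np.asarray(x, dtype=np.float64)
                     for x in (c_p, c_f, c_b))
    denom = np.dot(c_f - c_b, c_f - c_b)
    if denom < 1e-12:          # degenerate pair: identical colors
        return 1.0
    return float(np.clip(np.dot(c_p - c_b, c_f - c_b) / denom, 0.0, 1.0))

def color_distortion(c_p, c_f, c_b):
    """N_p: distance between c_p and its best reconstruction from the pair."""
    a = pair_alpha(c_p, c_f, c_b)
    recon = a * np.asarray(c_f, float) + (1 - a) * np.asarray(c_b, float)
    return float(np.linalg.norm(np.asarray(c_p, float) - recon))

# A pixel exactly halfway between foreground and background colors:
print(pair_alpha((125, 115, 35), (220, 30, 30), (30, 200, 40)))  # -> 0.5
```

A small distortion means the pair explains the pixel well; selecting the pair minimizing the combined objective, then smoothing the pairs and alphas over neighbors, is what steps C through E describe.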
CN201911413016.4A 2019-12-31 2019-12-31 Method and system based on backdrop matting and fusion Pending CN111223108A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911413016.4A CN111223108A (en) 2019-12-31 2019-12-31 Method and system based on backdrop matting and fusion


Publications (1)

Publication Number Publication Date
CN111223108A true CN111223108A (en) 2020-06-02

Family

ID=70828050

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911413016.4A Pending CN111223108A (en) 2019-12-31 2019-12-31 Method and system based on backdrop matting and fusion

Country Status (1)

Country Link
CN (1) CN111223108A (en)


Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8175384B1 (en) * 2008-03-17 2012-05-08 Adobe Systems Incorporated Method and apparatus for discriminative alpha matting
CN103473780A (en) * 2013-09-22 2013-12-25 广州市幸福网络技术有限公司 Portrait background cutout method
JP2014071666A (en) * 2012-09-28 2014-04-21 Dainippon Printing Co Ltd Image processor, image processing method and program
CN103942794A (en) * 2014-04-16 2014-07-23 南京大学 Image collaborative cutout method based on confidence level
CN104200470A (en) * 2014-08-29 2014-12-10 电子科技大学 Blue screen image-matting method
CN204154528U (en) * 2014-05-16 2015-02-11 金陵科技学院 The sampling box device of acid sludge water colour Intelligent Measurement in a kind of sulfuric acid chemical industry
US20150117779A1 (en) * 2013-10-30 2015-04-30 Thomson Licensing Method and apparatus for alpha matting
CN104899877A (en) * 2015-05-20 2015-09-09 中国科学院西安光学精密机械研究所 Image foreground extraction method based on super-pixels and fast three-division graph
CN106504241A (en) * 2016-10-25 2017-03-15 西安交通大学 A kind of apparatus and method of checking colors automatically
CN106530309A (en) * 2016-10-24 2017-03-22 成都品果科技有限公司 Video matting method and system based on mobile platform
CN106952270A (en) * 2017-03-01 2017-07-14 湖南大学 A kind of quickly stingy drawing method of uniform background image
CN107133964A (en) * 2017-06-01 2017-09-05 江苏火米互动科技有限公司 A kind of stingy image space method based on Kinect
CN107730528A (en) * 2017-10-28 2018-02-23 天津大学 A kind of interactive image segmentation and fusion method based on grabcut algorithms
US9922681B2 (en) * 2013-02-20 2018-03-20 Intel Corporation Techniques for adding interactive features to videos
CN110298861A (en) * 2019-07-04 2019-10-01 大连理工大学 A kind of quick three-dimensional image partition method based on shared sampling
CN110400323A (en) * 2019-07-30 2019-11-01 上海艾麒信息科技有限公司 It is a kind of to scratch drawing system, method and device automatically


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
EDUARDO S. L. GASTAL et al.: "Shared Sampling for Real-Time Alpha Matting", Eurographics 2010 *
MENG Rui et al.: "Adaptive clothing target matting based on OneCut and the shared matting algorithm", Intelligent Computer and Applications *
YANG Zhenya et al.: "Vector-angle distance color difference formula in RGB color space", Computer Engineering and Applications *
YANG Zhenya et al.: "A new RGB color difference metric formula", Journal of Computer Applications *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112785511A (en) * 2020-06-30 2021-05-11 青岛经济技术开发区海尔热水器有限公司 Image anti-aliasing processing method and electrical equipment
CN112164012A (en) * 2020-10-14 2021-01-01 上海影卓信息科技有限公司 Method and system for realizing portrait color relief effect
CN112164013A (en) * 2020-10-14 2021-01-01 上海影卓信息科技有限公司 Portrait reloading method, system and medium
CN112164013B (en) * 2020-10-14 2023-04-18 上海影卓信息科技有限公司 Portrait reloading method, system and medium
CN112164012B (en) * 2020-10-14 2023-05-12 上海影卓信息科技有限公司 Method and system for realizing portrait color relief effect
CN112308866A (en) * 2020-11-04 2021-02-02 Oppo广东移动通信有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN112308866B (en) * 2020-11-04 2024-02-09 Oppo广东移动通信有限公司 Image processing method, device, electronic equipment and storage medium
CN113077408A (en) * 2021-03-29 2021-07-06 维沃移动通信有限公司 Fusion coefficient determination method and device, electronic equipment and storage medium
CN113077408B (en) * 2021-03-29 2024-05-24 维沃移动通信有限公司 Fusion coefficient determination method and device, electronic equipment and storage medium
WO2024001360A1 (en) * 2022-06-28 2024-01-04 北京字跳网络技术有限公司 Green screen matting method and apparatus, and electronic device

Similar Documents

Publication Publication Date Title
CN111223108A (en) Method and system based on backdrop matting and fusion
CN107862698B (en) Light field foreground segmentation method and device based on K mean cluster
US10574905B2 (en) System and methods for depth regularization and semiautomatic interactive matting using RGB-D images
US10303983B2 (en) Image recognition apparatus, image recognition method, and recording medium
Jung Efficient background subtraction and shadow removal for monochromatic video sequences
Davis et al. Background-subtraction in thermal imagery using contour saliency
CN108537782B (en) Building image matching and fusing method based on contour extraction
CN102271254B (en) Depth image preprocessing method
Ghazali et al. An innovative face detection based on skin color segmentation
Xu et al. Automatic building rooftop extraction from aerial images via hierarchical RGB-D priors
CN103198319B (en) For the blurred picture Angular Point Extracting Method under the wellbore environment of mine
EP2849425A1 (en) Color video processing system and method, and corresponding computer program
CN107066963B (en) A kind of adaptive people counting method
CN104966266A (en) Method and system to automatically blur body part
CN111460964A (en) Moving target detection method under low-illumination condition of radio and television transmission machine room
CN111161219B (en) Robust monocular vision SLAM method suitable for shadow environment
CN108961258B (en) Foreground image obtaining method and device
CN105022992A (en) Automatic identification method of vehicle license plate
CN110458012B (en) Multi-angle face recognition method and device, storage medium and terminal
CN106203447B (en) Foreground target extraction method based on pixel inheritance
van de Wouw et al. Hierarchical 2.5-d scene alignment for change detection with large viewpoint differences
Sekhar et al. An object-based splicing forgery detection using multiple noise features
Lertchuwongsa et al. Mixed color/level lines and their stereo-matching with a modified hausdorff distance
CN108171236A (en) A kind of LED characters automatic positioning method
Nam et al. Flash shadow detection and removal in stereo photography

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200602