CN111223108A - Method and system based on backdrop matting and fusion - Google Patents
- Publication number
- CN111223108A CN111223108A CN201911413016.4A CN201911413016A CN111223108A CN 111223108 A CN111223108 A CN 111223108A CN 201911413016 A CN201911413016 A CN 201911413016A CN 111223108 A CN111223108 A CN 111223108A
- Authority
- CN
- China
- Prior art keywords
- color
- point
- background
- image
- foreground
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/187—Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/194—Segmentation; Edge detection involving foreground-background segmentation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20036—Morphological image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20076—Probabilistic image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20112—Image segmentation details
- G06T2207/20132—Image cropping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Abstract
The invention provides a method and a system based on backdrop matting and fusion, comprising the following steps: a background color obtaining step: acquiring the color of the background in an image or a video; an initial alpha value determination step: determining the initial alpha value of each pixel by calculating the distance and the included angle between the pixel color and the background color in RGB space, to obtain an alpha image; a binarization step: performing binarization processing on the alpha image according to set alpha thresholds; an optimization step: applying morphological optimization to the binarized alpha image to obtain a trimap; a shared sample point step: executing a shared sample point algorithm on the obtained trimap to obtain the final foreground colors and alpha values.
Description
Technical Field
The invention relates to the technical field of image processing, in particular to a backdrop matting and fusion-based method and system.
Background
Matting refers to the technique of accurately extracting foreground objects from an image or a video sequence. It is a key technology in the field of visual effects and is widely applied to image editing, film production and other fields. However, because the matting problem is under-constrained, additional constraints must be added when solving it; hence, in film and television production, a blue or green screen is usually adopted as the shooting background to reduce the difficulty of the problem.
Patent document CN 107180238A discloses an image previewing apparatus and method for an intelligent terminal. The method starts the shooting module of the intelligent terminal, captures a target image containing a target person, recognizes the face of the target person in the target image to obtain the position information of the target person, locks the person outline according to conventional human body proportions or preset figure parameters, replaces the background outside the person outline with a green screen, extracts the person image inside the person outline, and sends the person image to a holographic projection module included in or connected to the intelligent terminal, which projects a stereoscopic image of the target person around the target person.
However, conventional matting techniques are often inaccurate: in areas where the background and the foreground are difficult to distinguish (the identification process produces a background region, a foreground region, and an unknown region that cannot be classified), the foreground is easily lost or mixed into the background, and the matting efficiency is poor.
Disclosure of Invention
In view of the shortcomings in the prior art, it is an object of the present invention to provide a backdrop matting and fusion based method and system.
The method based on backdrop matting and fusion provided by the invention comprises the following steps:
a background color obtaining step: acquiring the color of a background in an image or a video;
an initial alpha value determination step: determining the initial alpha value of each pixel by calculating the distance and the included angle between the pixel color and the background color in RGB space, to obtain an alpha image;
a binarization step: performing binarization processing on the alpha image according to set alpha thresholds;
an optimization step: morphologically optimizing the binarized alpha image to obtain a trimap;
a shared sample point step: executing a shared sample point algorithm on the obtained trimap to obtain the final foreground colors and alpha values.
Preferably, the background color acquiring step includes two kinds:
A. counting the histograms of the three RGB channels of the image or video respectively, and selecting the background color value with the maximum probability;
B. directly acquiring the background color value selected by the user.
Preferably, the initial alpha value determining step includes:
pixel color C of ith pixeliAnd background color CbDistance d ofi:
Pixel color C of ith pixeliAnd background color CbAngle of (cos θ)i:
Initial alpha value α for the ith pixeli:
preferably, the optimizing step comprises:
A. performing an opening operation on the foreground region and the background region respectively, to connect locally discontinuous regions;
B. performing, on the foreground region and the background region respectively, an erosion operation with an expansion region of size r_c.
Preferably, the sharing sample point step includes:
for point p in each unknown region:
A. along four equiangular rays starting from point p, finding in each direction the foreground point and the background point closest to p, the foreground points forming one sample set and the background points forming another; for each sample point s in these sets, computing the energy function E_p(s) from p to s and obtaining the probability that p belongs to the foreground region T_f, wherein the energy function depends on the color difference between the two points and on the distance between the two points, E_p(b) is the energy function from point b to point p, and E_p(f) is the energy function from point p to point f;
B. calculating the color distortion and selecting the sample pair that minimizes it, the quantities involved being the color of the foreground sample point f_i^p and the color of the paired background sample point;
C. combining the color distortion, the inconsistency parameter and the positional relationship into an objective function, in which N_p is the color distortion term, A_p is the inconsistency parameter, and D_p is the straight-line distance from f_i^p (and from the paired background sample point) to point p;
D. for every q ∈ T_unknown in the 10×10 neighborhood of p, wherein T_unknown is the unknown region, calculating the color distortion of the color pair of q with respect to point p, and selecting the color pair with the minimum distortion as the new color pair of p;
E. performing Gaussian smoothing on all color pairs and the corresponding transparencies, and outputting the final foreground colors and alpha values.
The invention provides a backdrop matting and fusion-based system, which comprises:
a background color acquisition module: acquiring the color of a background in an image or a video;
an initial alpha value determination module: determining the initial alpha value of each pixel by calculating the distance and the included angle between the pixel color and the background color in RGB space, to obtain an alpha image;
a binarization module: performing binarization processing on the alpha image according to set alpha thresholds;
an optimization module: morphologically optimizing the binarized alpha image to obtain a trimap;
a shared sample point module: executing a shared sample point algorithm on the obtained trimap to obtain the final foreground colors and alpha values.
Preferably, the background color obtaining module includes two types:
A. counting the histograms of the three RGB channels of the image or video respectively, and selecting the background color value with the maximum probability;
B. directly acquiring the background color value selected by the user.
Preferably, the initial alpha value determining module comprises:
pixel color C of ith pixeliAnd background color CbDistance d ofi:
Pixel color C of ith pixeliAnd background color CbAngle of (cos θ)i:
Initial alpha value α for the ith pixeli:
preferably, the optimization module comprises:
A. performing an opening operation on the foreground region and the background region respectively, to connect locally discontinuous regions;
B. performing, on the foreground region and the background region respectively, an erosion operation with an expansion region of size r_c.
Preferably, the shared sample point module comprises:
for point p in each unknown region:
A. along four equiangular rays starting from point p, finding in each direction the foreground point and the background point closest to p, the foreground points forming one sample set and the background points forming another; for each sample point s in these sets, computing the energy function E_p(s) from p to s and obtaining the probability that p belongs to the foreground region T_f, wherein the energy function depends on the color difference between the two points and on the distance between the two points, E_p(b) is the energy function from point b to point p, and E_p(f) is the energy function from point p to point f;
B. calculating the color distortion and selecting the sample pair that minimizes it, the quantities involved being the color of the foreground sample point f_i^p and the color of the paired background sample point;
C. combining the color distortion, the inconsistency parameter and the positional relationship into an objective function, in which N_p is the color distortion term, A_p is the inconsistency parameter, and D_p is the straight-line distance from f_i^p (and from the paired background sample point) to point p;
D. for every q ∈ T_unknown in the 10×10 neighborhood of p, wherein T_unknown is the unknown region, calculating the color distortion of the color pair of q with respect to point p, and selecting the color pair with the minimum distortion as the new color pair of p;
E. performing Gaussian smoothing on all color pairs and the corresponding transparencies, and outputting the final foreground colors and alpha values.
Compared with the prior art, the invention has the following beneficial effects:
1. Through color difference, binarization, and morphological dilation and erosion, an accurate trimap is extracted, achieving automatic matting.
2. By using the shared sample point algorithm to associate each pixel of the unknown region with its surrounding pixels, fine edge structures (such as hair) are preserved and mixing of the background color into the edge is avoided.
3. By implementing the algorithm in GLSL and running it directly on the GPU, the efficiency of the algorithm is greatly improved and real-time matting is ensured.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a flow chart of the operation of the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the invention, but are not intended to limit the invention in any way. It should be noted that various changes and modifications, which would be obvious to those skilled in the art, can be made without departing from the spirit of the invention, and all of them fall within the scope of the present invention.
In the present invention, the area covered by the green screen is the background region T_b; the area without the green screen is the foreground region T_f; and the indistinguishable area is the unknown region T_unknown.
As shown in FIG. 1, the backdrop matting and fusion-based method provided by the present invention includes:
s1, background color obtaining step: the color of the background in the image or video is obtained. The background color acquisition step includes two kinds:
A. counting the histograms of the three RGB channels of the image or video respectively, and selecting the background color value with the maximum probability;
B. directly acquiring the background color value selected by the user.
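As an illustration of mode A, a minimal sketch is given below. It assumes that "the background color value with the maximum probability" is taken as the per-channel histogram mode of an 8-bit RGB image; the patent text does not spell out this detail, and the function names are hypothetical.

```python
import numpy as np

def estimate_backdrop_color(image_rgb: np.ndarray) -> np.ndarray:
    """Mode A: estimate the backdrop color as the per-channel histogram mode.

    image_rgb: H x W x 3 uint8 array.  Returns a length-3 uint8 array (R, G, B).
    """
    color = np.empty(3, dtype=np.uint8)
    for c in range(3):
        hist = np.bincount(image_rgb[..., c].reshape(-1), minlength=256)
        color[c] = np.argmax(hist)  # most frequent value of this channel
    return color

def backdrop_color_from_user(selected_rgb) -> np.ndarray:
    """Mode B: take the background color picked by the user directly."""
    return np.asarray(selected_rgb, dtype=np.uint8)
```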
S2, initial alpha value determining step: the initial alpha value of each pixel is determined by calculating the distance and the included angle between the pixel color and the background color in RGB space, yielding an alpha image. The initial alpha value determining step includes:
pixel color C of ith pixeliAnd background color CbDistance d ofi:
Pixel color C of ith pixeliAnd background color CbAngle of (cos θ)i:
Initial alpha value α for the ith pixeli:
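The formulas for d_i, cos θ_i and α_i appear as equations in the original publication and are not reproduced in this text. The sketch below therefore only illustrates the general idea — the Euclidean RGB distance to the background color and the cosine of the angle between the two color vectors, combined through hypothetical gains k_dist and k_angle — and should not be read as the patent's exact formulas.

```python
import numpy as np

def initial_alpha(image_rgb: np.ndarray, bg_color: np.ndarray,
                  k_dist: float = 1.0 / 255.0, k_angle: float = 1.0) -> np.ndarray:
    """Rough initial alpha from RGB distance and angle to the backdrop color.

    The way distance and angle are combined here is an assumption; the patent's
    own formulas for d_i, cos(theta_i) and alpha_i are not reproduced.
    """
    pixels = image_rgb.astype(np.float32)
    bg = bg_color.astype(np.float32)

    d = np.linalg.norm(pixels - bg, axis=-1)                 # RGB-space distance d_i
    dot = np.einsum('hwc,c->hw', pixels, bg)
    norms = np.linalg.norm(pixels, axis=-1) * np.linalg.norm(bg) + 1e-6
    cos_theta = dot / norms                                  # cos(theta_i)

    # larger distance and larger angle to the backdrop color -> more foreground
    alpha = np.clip(k_dist * d * (1.0 + k_angle * (1.0 - cos_theta)), 0.0, 1.0)
    return alpha
```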
S3, binarization step: binarization processing is performed on the alpha image according to the set alpha thresholds.
Thresholds α_low and α_high are selected, and:
if α_i <= α_low, then α_i = 0;
if α_i >= α_high, then α_i = 1;
if α_low < α_i < α_high, then α_i = 0.5.
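A direct reading of this thresholding rule is sketched below; the values of α_low and α_high are illustrative, since the text only states that the thresholds are set.

```python
import numpy as np

def threshold_alpha(alpha: np.ndarray,
                    alpha_low: float = 0.1, alpha_high: float = 0.9) -> np.ndarray:
    """Map alpha to {0, 0.5, 1}: background, unknown, foreground."""
    out = np.full_like(alpha, 0.5, dtype=np.float32)  # undecided -> unknown
    out[alpha <= alpha_low] = 0.0                     # confident background
    out[alpha >= alpha_high] = 1.0                    # confident foreground
    return out
```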
S4, optimization step: the binarized alpha image is morphologically optimized to obtain a trimap. The optimization step comprises:
A. performing an opening operation on the foreground region and the background region respectively (that is, a dilation operation that expands the foreground and background regions), to connect locally discontinuous regions;
B. performing, on the foreground region and the background region respectively, an erosion operation with an expansion region of size r_c.
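A possible realization of this step is sketched below with OpenCV. The kernel shapes and the opening size are assumptions; the patent only names the operations and the region size r_c.

```python
import cv2
import numpy as np

def build_trimap(alpha3: np.ndarray, r_c: int = 5) -> np.ndarray:
    """Build a trimap from the three-level alpha map of the previous step.

    Opening on the foreground/background masks closes small gaps; the erosion
    of size r_c then widens the band left between them, which becomes the
    unknown region of the trimap.
    """
    fg = (alpha3 == 1.0).astype(np.uint8)
    bg = (alpha3 == 0.0).astype(np.uint8)

    open_kernel = np.ones((3, 3), np.uint8)       # assumed opening kernel
    erode_kernel = np.ones((r_c, r_c), np.uint8)  # erosion of region size r_c

    fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, open_kernel)
    bg = cv2.morphologyEx(bg, cv2.MORPH_OPEN, open_kernel)
    fg = cv2.erode(fg, erode_kernel)
    bg = cv2.erode(bg, erode_kernel)

    trimap = np.full(alpha3.shape, 0.5, dtype=np.float32)  # unknown by default
    trimap[fg == 1] = 1.0
    trimap[bg == 1] = 0.0
    return trimap
```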
S5, shared sample point step: a shared sample point algorithm is executed on the obtained trimap to obtain the final foreground colors and alpha values. The shared sample point step comprises the following sub-steps:
for point p in each unknown region:
A. along four equiangular rays starting from point p, finding in each direction the foreground point and the background point closest to p, the foreground points forming one sample set and the background points forming another; for each sample point s in these sets, computing the energy function E_p(s) from p to s and obtaining the probability that p belongs to the foreground region T_f, wherein the energy function depends on the color difference between the two points and on the distance between the two points, E_p(b) is the energy function from point b to point p, and E_p(f) is the energy function from point p to point f;
B. calculating the color distortion and selecting the sample pair that minimizes it, the quantities involved being the color of the foreground sample point f_i^p and the color of the paired background sample point;
C. combining the color distortion, the inconsistency parameter and the positional relationship into an objective function, in which N_p is the color distortion term, A_p is the inconsistency parameter, and D_p is the straight-line distance from f_i^p (and from the paired background sample point) to point p;
D. for every q ∈ T_unknown in the 10×10 neighborhood of p, wherein T_unknown is the unknown region, calculating the color distortion of the color pair of q with respect to point p, and selecting the color pair with the minimum distortion as the new color pair of p;
E. performing Gaussian smoothing on all color pairs and the corresponding transparencies, and outputting the final foreground colors and alpha values.
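For orientation, the gather stage of a shared-sampling style matte pass might look like the sketch below. It is not the patent's exact formulation: the energy function E_p, the inconsistency parameter A_p, the distance term D_p, the 10×10 neighborhood refinement and the final Gaussian smoothing are omitted, and the sample pair is chosen by color distortion alone. The general approach follows the shared sampling idea of Gastal and Oliveira cited in the non-patent references; the function names are hypothetical, and the patent itself runs this stage in GLSL on the GPU rather than in Python.

```python
import numpy as np

def shared_sampling_alpha(image: np.ndarray, trimap: np.ndarray,
                          n_rays: int = 4, max_steps: int = 300) -> np.ndarray:
    """Simplified gather pass: for each unknown pixel, walk along equiangular
    rays to the nearest known foreground/background samples, keep the pair
    whose blend best explains the pixel color, and derive alpha by projection.
    """
    h, w = trimap.shape
    img = image.astype(np.float32)
    alpha = trimap.astype(np.float32).copy()
    angles = np.linspace(0.0, 2.0 * np.pi, n_rays, endpoint=False)

    def first_hit(y, x, dy, dx, value):
        # walk one ray until a pixel with the requested trimap value is found
        for step in range(1, max_steps):
            yy, xx = int(round(y + dy * step)), int(round(x + dx * step))
            if not (0 <= yy < h and 0 <= xx < w):
                return None
            if trimap[yy, xx] == value:
                return img[yy, xx]
        return None

    for y, x in zip(*np.where(trimap == 0.5)):
        c = img[y, x]
        fgs = [first_hit(y, x, np.sin(a), np.cos(a), 1.0) for a in angles]
        bgs = [first_hit(y, x, np.sin(a), np.cos(a), 0.0) for a in angles]
        best, best_alpha = np.inf, alpha[y, x]
        for f in (s for s in fgs if s is not None):
            for b in (s for s in bgs if s is not None):
                fb = f - b
                a = float(np.clip(np.dot(c - b, fb) / (np.dot(fb, fb) + 1e-6), 0.0, 1.0))
                # color distortion: how well a*F + (1-a)*B reproduces the pixel
                distortion = float(np.linalg.norm(c - (a * f + (1.0 - a) * b)))
                if distortion < best:
                    best, best_alpha = distortion, a
        alpha[y, x] = best_alpha
    return alpha
```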
On the basis of the backdrop matting and fusion-based method, the invention also provides a backdrop matting and fusion-based system, which comprises:
a background color acquisition module: acquiring the color of a background in an image or a video;
an initial alpha value determination module: determining the initial alpha value of each pixel by calculating the distance and the included angle between the pixel color and the background color in RGB space, to obtain an alpha image;
a binarization module: performing binarization processing on the alpha image according to set alpha thresholds;
an optimization module: morphologically optimizing the binarized alpha image to obtain a trimap;
a shared sample point module: executing a shared sample point algorithm on the obtained trimap to obtain the final foreground colors and alpha values.
Those skilled in the art will appreciate that, in addition to being implemented as pure computer-readable program code, the system and its various devices, modules and units provided by the present invention can be implemented entirely by logically programming the method steps in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Therefore, the system and its various devices, modules and units can be regarded as a hardware component; the devices, modules and units included in it for realizing various functions can be regarded as structures within the hardware component; and the means, modules and units for performing the various functions can also be regarded both as software modules implementing the method and as structures within the hardware component.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes or modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention. The embodiments and features of the embodiments of the present application may be combined with each other arbitrarily without conflict.
Claims (10)
1. A backdrop matting and fusion-based method is characterized by comprising the following steps:
a background color obtaining step: acquiring the color of a background in an image or a video;
an initial alpha value determination step: determining the initial alpha value of each pixel by calculating the distance and the included angle between the pixel color and the background color in RGB space, to obtain an alpha image;
a binarization step: performing binarization processing on the alpha image according to set alpha thresholds;
an optimization step: morphologically optimizing the binarized alpha image to obtain a trimap;
a shared sample point step: executing a shared sample point algorithm on the obtained trimap to obtain the final foreground colors and alpha values.
2. The backdrop matting and fusion-based method according to claim 1, wherein the background color obtaining step includes two kinds:
A. counting the histograms of the three RGB channels of the image or video respectively, and selecting the background color value with the maximum probability;
B. directly acquiring the background color value selected by the user.
3. The backdrop matting and fusion-based method according to claim 1, wherein the initial alpha value determining step comprises:
pixel color C of ith pixeliAnd background color CbDistance d ofi:
Pixel color C of ith pixeliAnd background color CbAngle of (cos θ)i:
Initial alpha value α for the ith pixeli:
4. the backdrop matting and fusion-based method according to claim 1, wherein the optimizing step comprises:
A. performing an opening operation on the foreground region and the background region respectively, to connect locally discontinuous regions;
B. performing, on the foreground region and the background region respectively, an erosion operation with an expansion region of size r_c.
5. The backdrop matting and fusion-based method according to claim 1, wherein the sharing sample point step comprises:
for point p in each unknown region:
A. along four equiangular rays starting from point p, finding in each direction the foreground point and the background point closest to p, the foreground points forming one sample set and the background points forming another; for each sample point s in these sets, computing the energy function E_p(s) from p to s and obtaining the probability that p belongs to the foreground region T_f, wherein the energy function depends on the color difference between the two points and on the distance between the two points, E_p(b) is the energy function from point b to point p, and E_p(f) is the energy function from point p to point f;
B. calculating the color distortion and selecting the sample pair that minimizes it, the quantities involved being the color of the foreground sample point f_i^p and the color of the paired background sample point;
C. combining the color distortion, the inconsistency parameter and the positional relationship into an objective function, in which N_p is the color distortion term, A_p is the inconsistency parameter, and D_p is the straight-line distance from f_i^p (and from the paired background sample point) to point p;
D. for every q ∈ T_unknown in the 10×10 neighborhood of p, wherein T_unknown is the unknown region, calculating the color distortion of the color pair of q with respect to point p, and selecting the color pair with the minimum distortion as the new color pair of p;
E. performing Gaussian smoothing on all color pairs and the corresponding transparencies, and outputting the final foreground colors and alpha values.
6. A backdrop matting and fusion-based system, comprising:
a background color acquisition module: acquiring the color of a background in an image or a video;
an initial alpha value determination module: determining the initial alpha value of each pixel by calculating the distance and the included angle between the pixel color and the background color in RGB space, to obtain an alpha image;
a binarization module: performing binarization processing on the alpha image according to set alpha thresholds;
an optimization module: morphologically optimizing the binarized alpha image to obtain a trimap;
a shared sample point module: executing a shared sample point algorithm on the obtained trimap to obtain the final foreground colors and alpha values.
7. The backdrop matting and fusion-based system according to claim 6, wherein the background color acquisition module includes two types:
A. counting the histograms of the three RGB channels of the image or video respectively, and selecting the background color value with the maximum probability;
B. directly acquiring the background color value selected by the user.
8. The backdrop matting and fusion based system according to claim 6 wherein the initial alpha value determination module comprises:
pixel color C of ith pixeliAnd background color CbDistance d ofi:
Pixel color C of ith pixeliAnd background color CbAngle of (cos θ)i:
Initial alpha value α for the ith pixeli:
9. the backdrop matting and fusion-based system according to claim 6, wherein the optimization module comprises:
A. performing an opening operation on the foreground region and the background region respectively, to connect locally discontinuous regions;
B. performing, on the foreground region and the background region respectively, an erosion operation with an expansion region of size r_c.
10. The backdrop matting and fusion-based system according to claim 6 wherein the shared sample point module comprises:
for point p in each unknown region:
A. along four equiangular rays starting from point p, finding in each direction the foreground point and the background point closest to p, the foreground points forming one sample set and the background points forming another; for each sample point s in these sets, computing the energy function E_p(s) from p to s and obtaining the probability that p belongs to the foreground region T_f, wherein the energy function depends on the color difference between the two points and on the distance between the two points, E_p(b) is the energy function from point b to point p, and E_p(f) is the energy function from point p to point f;
B. calculating the color distortion and selecting the sample pair that minimizes it, the quantities involved being the color of the foreground sample point f_i^p and the color of the paired background sample point;
C. combining the color distortion, the inconsistency parameter and the positional relationship into an objective function, in which N_p is the color distortion term, A_p is the inconsistency parameter, and D_p is the straight-line distance from f_i^p (and from the paired background sample point) to point p;
D. for every q ∈ T_unknown in the 10×10 neighborhood of p, wherein T_unknown is the unknown region, calculating the color distortion of the color pair of q with respect to point p, and selecting the color pair with the minimum distortion as the new color pair of p;
E. performing Gaussian smoothing on all color pairs and the corresponding transparencies, and outputting the final foreground colors and alpha values.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911413016.4A CN111223108A (en) | 2019-12-31 | 2019-12-31 | Method and system based on backdrop matting and fusion |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911413016.4A CN111223108A (en) | 2019-12-31 | 2019-12-31 | Method and system based on backdrop matting and fusion |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111223108A true CN111223108A (en) | 2020-06-02 |
Family
ID=70828050
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911413016.4A Pending CN111223108A (en) | 2019-12-31 | 2019-12-31 | Method and system based on backdrop matting and fusion |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111223108A (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112164012A (en) * | 2020-10-14 | 2021-01-01 | 上海影卓信息科技有限公司 | Method and system for realizing portrait color relief effect |
CN112164013A (en) * | 2020-10-14 | 2021-01-01 | 上海影卓信息科技有限公司 | Portrait reloading method, system and medium |
CN112308866A (en) * | 2020-11-04 | 2021-02-02 | Oppo广东移动通信有限公司 | Image processing method, image processing device, electronic equipment and storage medium |
CN112785511A (en) * | 2020-06-30 | 2021-05-11 | 青岛经济技术开发区海尔热水器有限公司 | Image anti-aliasing processing method and electrical equipment |
CN113077408A (en) * | 2021-03-29 | 2021-07-06 | 维沃移动通信有限公司 | Fusion coefficient determination method and device, electronic equipment and storage medium |
WO2024001360A1 (en) * | 2022-06-28 | 2024-01-04 | 北京字跳网络技术有限公司 | Green screen matting method and apparatus, and electronic device |
- 2019-12-31 CN CN201911413016.4A patent/CN111223108A/en active Pending
Patent Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8175384B1 (en) * | 2008-03-17 | 2012-05-08 | Adobe Systems Incorporated | Method and apparatus for discriminative alpha matting |
JP2014071666A (en) * | 2012-09-28 | 2014-04-21 | Dainippon Printing Co Ltd | Image processor, image processing method and program |
US9922681B2 (en) * | 2013-02-20 | 2018-03-20 | Intel Corporation | Techniques for adding interactive features to videos |
CN103473780A (en) * | 2013-09-22 | 2013-12-25 | 广州市幸福网络技术有限公司 | Portrait background cutout method |
US20150117779A1 (en) * | 2013-10-30 | 2015-04-30 | Thomson Licensing | Method and apparatus for alpha matting |
CN103942794A (en) * | 2014-04-16 | 2014-07-23 | 南京大学 | Image collaborative cutout method based on confidence level |
CN204154528U (en) * | 2014-05-16 | 2015-02-11 | 金陵科技学院 | The sampling box device of acid sludge water colour Intelligent Measurement in a kind of sulfuric acid chemical industry |
CN104200470A (en) * | 2014-08-29 | 2014-12-10 | 电子科技大学 | Blue screen image-matting method |
CN104899877A (en) * | 2015-05-20 | 2015-09-09 | 中国科学院西安光学精密机械研究所 | Image foreground extraction method based on super-pixels and fast three-division graph |
CN106530309A (en) * | 2016-10-24 | 2017-03-22 | 成都品果科技有限公司 | Video matting method and system based on mobile platform |
CN106504241A (en) * | 2016-10-25 | 2017-03-15 | 西安交通大学 | A kind of apparatus and method of checking colors automatically |
CN106952270A (en) * | 2017-03-01 | 2017-07-14 | 湖南大学 | A kind of quickly stingy drawing method of uniform background image |
CN107133964A (en) * | 2017-06-01 | 2017-09-05 | 江苏火米互动科技有限公司 | A kind of stingy image space method based on Kinect |
CN107730528A (en) * | 2017-10-28 | 2018-02-23 | 天津大学 | A kind of interactive image segmentation and fusion method based on grabcut algorithms |
CN110298861A (en) * | 2019-07-04 | 2019-10-01 | 大连理工大学 | A kind of quick three-dimensional image partition method based on shared sampling |
CN110400323A (en) * | 2019-07-30 | 2019-11-01 | 上海艾麒信息科技有限公司 | It is a kind of to scratch drawing system, method and device automatically |
Non-Patent Citations (4)
Title |
---|
EDUARDO S. L. GASTAL 等: "Shared Sampling for Real-Time Alpha Matting", 《EUROGRAPHICS 2010》 * |
孟蕊 等: "基于OneCut和共享抠图算法的自适应衣物目标抠取", 《智能计算机与应用》 * |
杨振亚 等: "RGB 颜色空间的矢量-角度距离色差公式", 《计算机工程与应用》 * |
杨振亚 等: "一种新的RGB色差度量公式", 《计算机应用》 * |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112785511A (en) * | 2020-06-30 | 2021-05-11 | 青岛经济技术开发区海尔热水器有限公司 | Image anti-aliasing processing method and electrical equipment |
CN112164012A (en) * | 2020-10-14 | 2021-01-01 | 上海影卓信息科技有限公司 | Method and system for realizing portrait color relief effect |
CN112164013A (en) * | 2020-10-14 | 2021-01-01 | 上海影卓信息科技有限公司 | Portrait reloading method, system and medium |
CN112164013B (en) * | 2020-10-14 | 2023-04-18 | 上海影卓信息科技有限公司 | Portrait reloading method, system and medium |
CN112164012B (en) * | 2020-10-14 | 2023-05-12 | 上海影卓信息科技有限公司 | Method and system for realizing portrait color relief effect |
CN112308866A (en) * | 2020-11-04 | 2021-02-02 | Oppo广东移动通信有限公司 | Image processing method, image processing device, electronic equipment and storage medium |
CN112308866B (en) * | 2020-11-04 | 2024-02-09 | Oppo广东移动通信有限公司 | Image processing method, device, electronic equipment and storage medium |
CN113077408A (en) * | 2021-03-29 | 2021-07-06 | 维沃移动通信有限公司 | Fusion coefficient determination method and device, electronic equipment and storage medium |
CN113077408B (en) * | 2021-03-29 | 2024-05-24 | 维沃移动通信有限公司 | Fusion coefficient determination method and device, electronic equipment and storage medium |
WO2024001360A1 (en) * | 2022-06-28 | 2024-01-04 | 北京字跳网络技术有限公司 | Green screen matting method and apparatus, and electronic device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111223108A (en) | Method and system based on backdrop matting and fusion | |
CN107862698B (en) | Light field foreground segmentation method and device based on K mean cluster | |
US10574905B2 (en) | System and methods for depth regularization and semiautomatic interactive matting using RGB-D images | |
US10303983B2 (en) | Image recognition apparatus, image recognition method, and recording medium | |
Jung | Efficient background subtraction and shadow removal for monochromatic video sequences | |
Davis et al. | Background-subtraction in thermal imagery using contour saliency | |
CN108537782B (en) | Building image matching and fusing method based on contour extraction | |
CN102271254B (en) | Depth image preprocessing method | |
Ghazali et al. | An innovative face detection based on skin color segmentation | |
Xu et al. | Automatic building rooftop extraction from aerial images via hierarchical RGB-D priors | |
CN103198319B (en) | For the blurred picture Angular Point Extracting Method under the wellbore environment of mine | |
EP2849425A1 (en) | Color video processing system and method, and corresponding computer program | |
CN107066963B (en) | A kind of adaptive people counting method | |
CN104966266A (en) | Method and system to automatically blur body part | |
CN111460964A (en) | Moving target detection method under low-illumination condition of radio and television transmission machine room | |
CN111161219B (en) | Robust monocular vision SLAM method suitable for shadow environment | |
CN108961258B (en) | Foreground image obtaining method and device | |
CN105022992A (en) | Automatic identification method of vehicle license plate | |
CN110458012B (en) | Multi-angle face recognition method and device, storage medium and terminal | |
CN106203447B (en) | Foreground target extraction method based on pixel inheritance | |
van de Wouw et al. | Hierarchical 2.5-d scene alignment for change detection with large viewpoint differences | |
Sekhar et al. | An object-based splicing forgery detection using multiple noise features | |
Lertchuwongsa et al. | Mixed color/level lines and their stereo-matching with a modified hausdorff distance | |
CN108171236A (en) | A kind of LED characters automatic positioning method | |
Nam et al. | Flash shadow detection and removal in stereo photography |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20200602 |