CN103578089A - Depth map filtering method and system based on local binary pattern operator guidance - Google Patents

Depth map filtering method and system based on local binary pattern operator guidance

Info

Publication number
CN103578089A
Authority
CN
China
Prior art keywords
pixel point
current pixel
vicinity points
local binary
binary patterns
Prior art date
Legal status
Pending
Application number
CN201310192294.8A
Other languages
Chinese (zh)
Inventor
胡瑞敏
钟睿
刘璐
王中元
Current Assignee
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date
Filing date
Publication date
Application filed by Wuhan University WHU
Priority to CN201310192294.8A
Publication of CN103578089A
Legal status: Pending

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a depth map filtering method and system guided by the local binary pattern (LBP) operator. First, the local binary pattern operator of each pixel neighboring the current pixel is computed in the corresponding color image. Next, based on these operators, the neighboring pixels that belong to the same object as the current pixel are identified. Finally, only those same-object neighboring pixels are used as reference pixels in the depth map filtering, yielding the filtered depth value of the current pixel. When the filtering method is added to the H.264/MVC coding framework as an in-loop filter, it effectively improves depth map coding efficiency and the quality of synthesized virtual views.

Description

Depth map filtering method and system guided by the local binary pattern operator
Technical field
The invention belongs to the field of image filtering, and in particular to depth map filtering for the multi-view-plus-depth video format in 3D video. It specifically relates to a depth map filtering method and system guided by the local binary pattern operator.
Background technology
In recent years, 3D film and television have brought viewers a stronger sense of depth immersion and stereoscopic perception, and have also introduced interaction with the user. The stereoscopic impression can be obtained from multi-view video (MVV) captured synchronously from different viewpoints of the same scene, but the shortcoming of MVV-based 3D video is that it is difficult to provide free interaction with the user. How to synthesize virtual views using depth-image-based virtual view rendering is therefore a current research hotspot.
P. Merkle observed that depth maps have a distinctive data structure: they consist of a large number of homogeneous regions separated by object edges [1]. In a common hybrid coding framework such as H.264/AVC, quantization and block-based coding therefore cause severe artifacts along object edges in the depth map. Because the depth map carries the geometric information of the 3D scene, depth errors translate into geometric distortion in the subsequently synthesized virtual views. Compared with errors at pixels inside homogeneous regions, errors at object-edge pixels degrade the quality of the synthesized virtual view far more severely [2]. Preserving sharp object edges is therefore extremely important when coding depth maps.
To address depth map coding, K. Oh et al. [3] added a loop filter to the H.264 framework to denoise the coded depth map. By constraining geometric distance, image similarity, and occurrence frequency, this loop filter operates on the surrounding neighboring pixels to recover the noisy depth values, effectively improving depth map coding efficiency and the quality of synthesized virtual views. However, corrupted pixels in the depth map may also reduce the accuracy of the reconstructed pixels, so the lower the bit rate, the worse the filtering effect of this method.
Based on the idea of bilateral filtering, Liu et al. [4] proposed a trilateral filtering method that additionally considers the structural similarity between the color image and the corresponding depth map when determining the weighting coefficients. This filtering method effectively removes the coding artifacts introduced in the depth map during encoding and decoding, and obtains synthesized views of better quality at the same bit rate. However, when only a few of the surrounding neighboring pixels are similar to the current pixel, the filtering result becomes unstable. To solve this problem and obtain a stable filtering output, the filter needs to eliminate the influence of the dissimilar surrounding neighboring pixels.
Summary of the invention
To address the problems of the prior art, the present invention, building on the idea of bilateral filtering, proposes a depth map filtering method and system guided by the local binary pattern operator. Under the guidance of the local binary pattern operator, the method can effectively determine whether a neighboring pixel belongs to the same object as the current pixel, and uses only the neighboring pixels belonging to the same object as reference pixels for filtering, thereby obtaining a stable filtering result.
To solve the above technical problems, the present invention adopts the following technical solutions:
One, a depth map filtering method guided by the local binary pattern operator, comprising the steps of:
Step 1: for the color image corresponding to the depth image, obtaining the local binary pattern operator of the neighboring pixels of the current pixel;
Step 2: based on the local binary pattern operators of the neighboring pixels, identifying the neighboring pixels that belong to the same object as the current pixel, and updating the local binary pattern operators of the neighboring pixels;
Step 3: obtaining the spatial-domain influence factor from the geometric distance between each neighboring pixel and the current pixel;
Step 4: obtaining the image-domain influence factor from the similarity of depth intensity between each neighboring pixel and the current pixel in the depth image;
Step 5: obtaining the filtering result of the current pixel, using the updated local binary pattern operators of the neighboring pixels, the spatial-domain influence factor, and the image-domain influence factor as constraint factors of the combined filter weighting coefficients.
The above step 1 further comprises the sub-step of:
in the color image corresponding to the depth image, obtaining the luminance value of each neighboring pixel of the current pixel, taking the difference between the luminance value of each neighboring pixel and that of the current pixel, and taking the binary sign of each difference, thereby obtaining the local binary pattern operator of the neighboring pixels.
The above step 2 further comprises the sub-steps of:
2-1: dividing the neighboring pixels into two groups according to their local binary pattern operator values;
2-2: judging, according to the logical distances from the two groups of neighboring pixels to the current pixel, whether a neighboring pixel belongs to the same object as the current pixel;
2-3: updating the local binary pattern operator of the neighboring pixels belonging to the same object as the current pixel to 1, and updating the local binary pattern operator of the neighboring pixels not belonging to the same object to 0.
The spatial-domain influence factor obtained in step 3 is G_Q(||p-c||), where G_Q is a Gaussian function and ||p-c|| is the geometric distance from a neighboring pixel p of the current pixel c to the current pixel c.
The image-domain influence factor obtained in step 4 is G_S(d_p - d_c), where G_S is a Gaussian function and (d_p - d_c) is the similarity between the depth intensity d_p of the neighboring pixel p and the depth intensity d_c of the current pixel c in the depth image.
The depth map filtering result G_c of the current pixel obtained in step 5 is:

$$G_c = \frac{1}{F_c}\sum_{p \in W}\left[G_S(d_p - d_c)\cdot G_Q(\|p-c\|)\cdot LBP_{P,R}^{uni}(g_p - g_c)\cdot d_p\right]$$

where:

W is the set of all neighboring pixels of the current pixel c;

p is a neighboring pixel of the current pixel c;

G_S(d_p - d_c) is the image-domain influence factor, G_S is a Gaussian function, and (d_p - d_c) is the similarity between the depth intensity d_p of the neighboring pixel p and the depth intensity d_c of the current pixel c in the depth image;

G_Q(||p-c||) is the spatial-domain influence factor, G_Q is a Gaussian function, and ||p-c|| is the geometric distance from the neighboring pixel p to the current pixel c;

LBP_{P,R}^{uni}(g_p - g_c) is the updated local binary pattern operator of the neighboring pixel p, and g_p and g_c are respectively the luminance values of the neighboring pixel p and the current pixel c in the color image corresponding to the depth image;

$$F_c = \sum_{p \in W}\left[G_S(d_p - d_c)\cdot G_Q(\|p-c\|)\cdot LBP_{P,R}^{uni}(g_p - g_c)\right]$$

is the sum of the weighting coefficients.
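For illustration only, the following minimal NumPy sketch evaluates this weighted sum from per-neighbor arrays; the function name and the array-based interface are our own, and it assumes the factors over the window W have already been computed and that at least one weight is non-zero.

```python
import numpy as np

def weighted_depth(g_s, g_q, lbp_uni, d_p):
    """G_c = sum(G_S * G_Q * LBP_uni * d_p) / sum(G_S * G_Q * LBP_uni), summed over the window W."""
    w = np.asarray(g_s, dtype=float) * np.asarray(g_q, dtype=float) * np.asarray(lbp_uni, dtype=float)
    # Normalize by F_c, the sum of the weighting coefficients (assumes at least one non-zero weight).
    return float(np.sum(w * np.asarray(d_p, dtype=float)) / np.sum(w))
```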
Two, a depth map filtering system guided by the local binary pattern operator, comprising:
a local binary pattern operator acquisition module, configured to obtain, in the color image corresponding to the depth image, the local binary pattern operator of the neighboring pixels of the current pixel;
a neighboring pixel judging module, configured to identify, based on the local binary pattern operators of the neighboring pixels, the neighboring pixels belonging to the same object as the current pixel, and to update the local binary pattern operators of the neighboring pixels;
a spatial-domain influence factor acquisition module, configured to obtain the spatial-domain influence factor from the geometric distance between each neighboring pixel and the current pixel;
an image-domain influence factor acquisition module, configured to obtain the image-domain influence factor from the similarity of depth intensity between each neighboring pixel and the current pixel in the depth image;
a filtering module, configured to obtain the filtering result of the current pixel using the updated local binary pattern operators of the neighboring pixels, the spatial-domain influence factor, and the image-domain influence factor as constraint factors of the combined filter weighting coefficients.
The above neighboring pixel judging module further comprises the sub-modules:
a neighboring pixel dividing module, configured to divide the neighboring pixels into two groups according to their local binary pattern operator values;
a judging module, configured to judge, according to the logical distances from the two groups of neighboring pixels to the current pixel, whether a neighboring pixel belongs to the same object as the current pixel;
a local binary pattern operator updating module, configured to update the local binary pattern operator of the neighboring pixels belonging to the same object as the current pixel to 1, and to update the local binary pattern operator of the neighboring pixels not belonging to the same object to 0.
The spatial-domain influence factor obtained by the above spatial-domain influence factor acquisition module is G_Q(||p-c||), where G_Q is a Gaussian function and ||p-c|| is the geometric distance from a neighboring pixel p of the current pixel c to the current pixel c.
The above image-domain influence factor is G_S(d_p - d_c), where G_S is a Gaussian function and (d_p - d_c) is the similarity between the depth intensity d_p of the neighboring pixel p and the depth intensity d_c of the current pixel c in the depth image.
The filtering result G_c of the current pixel obtained by the above filtering module is:

$$G_c = \frac{1}{F_c}\sum_{p \in W}\left[G_S(d_p - d_c)\cdot G_Q(\|p-c\|)\cdot LBP_{P,R}^{uni}(g_p - g_c)\cdot d_p\right]$$

where:

W is the set of all neighboring pixels of the current pixel c;

p is a neighboring pixel of the current pixel c;

G_S(d_p - d_c) is the image-domain influence factor, G_S is a Gaussian function, and (d_p - d_c) is the similarity between the depth intensity d_p of the neighboring pixel p and the depth intensity d_c of the current pixel c in the depth image;

G_Q(||p-c||) is the spatial-domain influence factor, G_Q is a Gaussian function, and ||p-c|| is the geometric distance from the neighboring pixel p to the current pixel c;

LBP_{P,R}^{uni}(g_p - g_c) is the updated local binary pattern operator of the neighboring pixel p, and g_p and g_c are respectively the luminance values of the neighboring pixel p and the current pixel c in the color image corresponding to the depth image;

$$F_c = \sum_{p \in W}\left[G_S(d_p - d_c)\cdot G_Q(\|p-c\|)\cdot LBP_{P,R}^{uni}(g_p - g_c)\right]$$

is the sum of the weighting coefficients.
Compared with the prior art, the present invention has the following features and beneficial effects:
Building on the idea of bilateral filtering, the present invention proposes a depth map filtering method guided by the local binary pattern operator. The local binary pattern (LBP) operator was first proposed by Timo Ojala [5] as a robust descriptor of micro-texture; it describes the neighboring pixels that are uniformly distributed on a circle around a pixel in the spatial domain. The LBP operator can accurately represent various structural features, such as edges, corners, sharp points, and flat regions.
The input pixels selected by the depth map filter should be neighboring pixels that belong to the same object as the current pixel, and the local binary pattern operator of the color image corresponding to the depth image can be used to judge whether a neighboring pixel belongs to the same object as the current pixel. In addition, the LBP operator is robust to monotonic gray-scale changes and quantization operations [6], so slight quantization errors in the depth map are not enough to affect the accuracy with which the LBP describes structural information. Therefore, using the local binary pattern operator of the color image to control the filter's input reference pixels is more robust than computing the filter weighting coefficients from the luminance values of the color image, as Liu et al. [4] do.
Existing depth map filtering methods cannot distinguish whether a neighboring pixel belongs to the same object as the current pixel, so neighboring pixels not belonging to the same object are also used as inputs to the weighted filtering, which makes the filtering result unstable. In contrast, under the guidance of the local binary pattern operator, the present invention can effectively determine whether a neighboring pixel belongs to the same object as the current pixel, and uses only the neighboring pixels belonging to the same object as reference pixels for filtering, thereby obtaining stable and accurate filtering results.
When the depth map filtering method of the present invention is added to the H.264/MVC coding framework as an in-loop filter, it removes noise from the depth map and increases depth video coding efficiency, and can further improve the subjective and objective quality of the virtual view images synthesized from multi-view-plus-depth 3D video.
Embodiment
The invention discloses a depth map filtering method guided by the local binary pattern operator. The basic principle is: obtain the local binary pattern operator of the color image corresponding to the depth map; under the guidance of this operator, effectively determine whether each neighboring pixel of the current pixel belongs to the same object as the current pixel; and use only the neighboring pixels belonging to the same object as the reference pixels for depth map filtering.
The method of the invention is further illustrated below with reference to a specific embodiment. The concrete steps are as follows:
Step 1: for the color image corresponding to the depth image, obtain the local binary pattern operator LBP_{P,R} of the neighboring pixels of the current pixel.

The local binary pattern operator LBP operates on P neighboring pixels in a circularly symmetric region of radius R centered at the current pixel; in this embodiment it is therefore denoted LBP_{P,R}. The circle of radius R centered at the current pixel is the neighborhood window of the current pixel. The radius R controls the spatial extent of the local binary pattern operator, and the number of neighboring pixels P controls the angular quantization of the operator in space [5].
Denote the luminance value of a neighboring pixel in the color image corresponding to the depth image as g_p, where p indexes the p-th neighboring pixel, p = 0, 1, ..., P-1, and P is the number of neighboring pixels. Subtracting the luminance value g_c of the current pixel from the luminance value g_p of each neighboring pixel gives the luminance difference (g_p - g_c). The joint distribution of the luminance values of the current pixel and its neighboring pixels in the color image is given by formula (1):

$$T = T(g_c, g_0, g_1, \ldots, g_{P-1}) \qquad (1)$$

Since (g_p - g_c) is independent of g_c, this joint distribution can be approximated by the joint distribution of the luminance differences between the neighboring pixels and the current pixel c [7], formula (2):

$$T \approx T(g_0 - g_c,\; g_1 - g_c,\; \ldots,\; g_{P-1} - g_c) \qquad (2)$$
The sign of the luminance difference (g_p - g_c) does not change with changes in the average luminance, so the two-value symbol s(g_p - g_c) can be used in place of (g_p - g_c), as in formula (3):

$$s(g_p - g_c) = \begin{cases} 1, & g_p - g_c \ge 0 \\ 0, & g_p - g_c < 0 \end{cases} \qquad (3)$$
To describe object edges more accurately, the present invention does not perform the rotation operation of the local binary pattern operator [5]. LBP_{P,R} is equivalent to the binary joint distribution given in formulas (4) and (5):

$$LBP_{P,R}(g_p - g_c) = s(g_p - g_c) \qquad (4)$$

$$LBP_{P,R} = T\big(s(g_0 - g_c),\, s(g_1 - g_c),\, \ldots,\, s(g_{P-1} - g_c)\big) \qquad (5)$$

where p denotes the p-th neighboring pixel, p = 0, 1, ..., P-1.
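For illustration only, the following minimal Python/NumPy sketch computes the binary pattern of formulas (3)-(5) for P neighbors sampled on a circle of radius R; the function name, the nearest-pixel rounding of the sampling positions, and the absence of image-border handling are assumptions of this sketch, not part of the patent (the standard LBP definition interpolates the sample positions instead).

```python
import numpy as np

def lbp_pattern(gray, cx, cy, P=8, R=1.0):
    """Two-value symbols s(g_p - g_c), formula (3), for P neighbors on a circle of radius R around (cx, cy)."""
    g_c = float(gray[cy, cx])
    pattern = np.zeros(P, dtype=np.uint8)
    coords = []
    for p in range(P):
        angle = 2.0 * np.pi * p / P
        x = int(round(cx + R * np.cos(angle)))   # sampling position rounded to the nearest pixel
        y = int(round(cy - R * np.sin(angle)))
        coords.append((x, y))
        pattern[p] = 1 if float(gray[y, x]) - g_c >= 0 else 0
    return pattern, coords
```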
Step 2: based on the local binary pattern operator LBP_{P,R} of the neighboring pixels, judge and select the neighboring pixels.

The local binary pattern operator LBP_{P,R} of the neighboring pixels in the color image has the property of describing object edges. According to the value of LBP_{P,R}, the neighboring pixels in the neighborhood window are divided into two groups: one group whose local binary pattern operator value is 1, and another group whose value is 0.
Compute the sums of the logical distances from these two groups of neighboring pixels to the current pixel, sum_zero and sum_one:

$$sum_{one} = \sum_{p \in W}\left[LBP_{P,R}(g_p - g_c)\cdot |d_p - d_c|\right] \qquad (6)$$

$$sum_{zero} = \sum_{p \in W}\left[\left|1 - LBP_{P,R}(g_p - g_c)\right|\cdot |d_p - d_c|\right] \qquad (7)$$

where:

sum_zero is the logical distance from the neighboring pixels whose local binary pattern operator value is 0 to the current pixel;

sum_one is the logical distance from the neighboring pixels whose local binary pattern operator value is 1 to the current pixel;

LBP_{P,R}(g_p - g_c) = s(g_p - g_c) is the local binary pattern operator of the neighboring pixel p, and s(g_p - g_c) is the two-value symbol of the luminance difference (g_p - g_c) between the neighboring pixel p and the current pixel c;

W is the set of neighboring pixels within the neighborhood window of the current pixel;

p denotes the p-th neighboring pixel;

d_c is the depth intensity of the current pixel c in the depth image;

d_p is the depth intensity of the neighboring pixel p in the depth image.
The group of neighboring pixels with the smaller logical distance sum is judged to belong to the same object as the current pixel; the local binary pattern operator of the neighboring pixels in this group is updated to 1, and that of the neighboring pixels in the other group is updated to 0, as in formula (8):

$$LBP_{P,R}^{uni}(g_p - g_c) = \begin{cases} 1, & p \text{ belongs to the group with the smaller logical distance sum} \\ 0, & \text{otherwise} \end{cases} \qquad (8)$$

Using the updated local binary pattern operator LBP_{P,R}^{uni}(g_p - g_c) as a constraint factor of the combined filter weighting coefficients avoids interference from neighboring pixels that do not belong to the same object as the current pixel, and eliminates the instability of the filtering.
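As an illustrative sketch only, the following function applies the grouping rule of formulas (6)-(8); the tie-breaking toward the group whose pattern value is 1 and all names are assumptions of this sketch, not specified by the patent.

```python
import numpy as np

def update_lbp(pattern, d_neighbors, d_c):
    """Formulas (6)-(8): mark as 1 only the neighbor group with the smaller logical distance sum."""
    diff = np.abs(np.asarray(d_neighbors, dtype=float) - float(d_c))
    sum_one = float(np.sum(pattern * diff))          # neighbors whose pattern value is 1
    sum_zero = float(np.sum((1 - pattern) * diff))   # neighbors whose pattern value is 0
    if sum_one <= sum_zero:                          # group "1" lies closer in depth: same object
        return pattern.astype(np.uint8)
    return (1 - pattern).astype(np.uint8)            # otherwise group "0" is the same object
```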
Step 3: obtain the spatial-domain influence factor.

For each neighboring pixel p in the neighborhood window of the current pixel c, p = 0, 1, ..., P-1, where P is the number of neighboring pixels in the window, compute the geometric distance ||p-c|| between the neighboring pixel p and the current pixel c, and take G_Q(||p-c||) as the spatial-domain influence factor of the neighboring pixel p, where G_Q is a Gaussian function. G_Q(||p-c||) can be expressed as:

$$G_Q(\|p-c\|) = \exp\!\left(-\frac{|x_p - x_c|^2 + |y_p - y_c|^2}{\sigma_Q^2}\right) \qquad (9)$$

where:

(x_p, y_p) are the spatial coordinates of the neighboring pixel p;

(x_c, y_c) are the spatial coordinates of the current pixel c;

σ_Q is the standard deviation, an empirical factor whose optimal value was obtained through repeated experiments; in this embodiment it is set to 1.

G_Q(||p-c||) is adopted as the spatial-domain influence factor on the combined filter weighting coefficients.
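A minimal sketch of formula (9) follows; the function and parameter names are ours, with σ_Q = 1 as in this embodiment.

```python
import numpy as np

def spatial_factor(p_xy, c_xy, sigma_q=1.0):
    """Formula (9): G_Q(||p - c||) = exp(-(|x_p - x_c|^2 + |y_p - y_c|^2) / sigma_Q^2)."""
    dx, dy = p_xy[0] - c_xy[0], p_xy[1] - c_xy[1]
    return np.exp(-(dx * dx + dy * dy) / sigma_q ** 2)
```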
Step 4: obtain the image-domain influence factor.

For each neighboring pixel p in the neighborhood window of the current pixel c, p = 0, 1, ..., P-1, where P is the number of neighboring pixels in the window, compute the similarity (d_p - d_c) between the depth intensity d_p of the neighboring pixel p in the depth map and the depth intensity d_c of the current pixel c in the depth map, and take G_S(d_p - d_c) as the image-domain influence factor of the neighboring pixel p, where G_S is a Gaussian function. G_S(d_p - d_c) can be expressed as:

$$G_S(d_p - d_c) = \exp\!\left(-\frac{|d_p - d_c|^2}{\sigma_S^2}\right) \qquad (10)$$

where σ_S is the standard deviation, an empirical factor whose optimal value was obtained through repeated experiments; it is set to 0.1.

G_S(d_p - d_c) is adopted as the image-domain influence factor on the combined filter weighting coefficients.
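A minimal sketch of formula (10) follows; the names are ours, and with σ_S = 0.1 the depth intensities are presumably normalized to [0, 1] (this normalization is our assumption, not stated in the patent; with 8-bit depth values a larger σ_S would be needed).

```python
import numpy as np

def range_factor(d_p, d_c, sigma_s=0.1):
    """Formula (10): G_S(d_p - d_c) = exp(-|d_p - d_c|^2 / sigma_S^2)."""
    return np.exp(-abs(d_p - d_c) ** 2 / sigma_s ** 2)
```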
Step 5: compute the depth map filtering result.

The local binary pattern operator LBP_{P,R}^{uni}(g_p - g_c) of the neighboring pixels obtained in step 2, the spatial-domain influence factor obtained in step 3, and the image-domain influence factor obtained in step 4 together constrain the weighting coefficient of the depth intensity d_p of each neighboring pixel p in the neighborhood window.

The depth intensities d_p of the neighboring pixels are weighted and summed to obtain the filtering result G_c of the current pixel c in the depth map:

$$G_c = \frac{1}{F_c}\sum_{p \in W}\left[G_S(d_p - d_c)\cdot G_Q(\|p-c\|)\cdot LBP_{P,R}^{uni}(g_p - g_c)\cdot d_p\right] \qquad (11)$$

where:

W is the set of all neighboring pixels of the current pixel c;

G_S(d_p - d_c) is the image-domain influence factor, G_S is a Gaussian function, and (d_p - d_c) is the similarity between the depth intensity d_p of the neighboring pixel p and the depth intensity d_c of the current pixel c in the depth map;

G_Q(||p-c||) is the spatial-domain influence factor, G_Q is a Gaussian function, and ||p-c|| is the geometric distance from the neighboring pixel p to the current pixel c;

LBP_{P,R}^{uni}(g_p - g_c) is the updated local binary pattern operator of the neighboring pixel p, g_p is the luminance value of the neighboring pixel p in the color image corresponding to the depth image, and g_c is the luminance value of the current pixel c in that color image;

$$F_c = \sum_{p \in W}\left[G_S(d_p - d_c)\cdot G_Q(\|p-c\|)\cdot LBP_{P,R}^{uni}(g_p - g_c)\right]$$

is the sum of the weighting coefficients, used for normalization.
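Putting the pieces together, the following sketch filters one pixel according to formula (11). It reuses the illustrative helpers sketched above (lbp_pattern, update_lbp, spatial_factor, range_factor), which are our own names; the fallback to the original depth value when all weights vanish is also our assumption, since the patent does not specify that case.

```python
def filter_pixel(depth, gray, cx, cy, P=8, R=1.0, sigma_q=1.0, sigma_s=0.1):
    """Formula (11): LBP-guided weighted average of neighboring depth values at (cx, cy)."""
    d_c = float(depth[cy, cx])
    pattern, coords = lbp_pattern(gray, cx, cy, P, R)              # step 1: binary pattern of the neighbors
    d_neighbors = [float(depth[y, x]) for x, y in coords]
    lbp_uni = update_lbp(pattern, d_neighbors, d_c)                # step 2: keep only same-object neighbors
    num = 0.0
    f_c = 0.0                                                      # F_c, the sum of the weighting coefficients
    for (x, y), d_p, keep in zip(coords, d_neighbors, lbp_uni):
        w = range_factor(d_p, d_c, sigma_s) * \
            spatial_factor((x, y), (cx, cy), sigma_q) * float(keep)  # steps 3-4 plus the LBP constraint
        num += w * d_p
        f_c += w
    return num / f_c if f_c > 0 else d_c                           # fall back to the original depth if all weights vanish
```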
The depth map filtering method of steps 1 to 5 above is added to the H.264/MVC coding framework as an in-loop filter; by removing noise from the depth map and increasing depth video coding efficiency, it further improves the subjective and objective quality of the virtual view images synthesized from multi-view-plus-depth 3D video.
With the depth map filtering method guided by the local binary pattern (LBP) operator proposed by the present invention, the quality of the newly synthesized virtual views and the depth map coding efficiency gain on average 0.45 dB and 0.66 dB in PSNR (peak signal-to-noise ratio), respectively, and the newly synthesized virtual view images are also greatly improved in subjective visual quality.
The filtering method of the present invention was integrated into the JMVC 6.0 software platform as an in-loop filter [9]. The depth video was encoded with QP values of 22, 27, 32, and 37, and the color video was encoded with a QP of 22. One hundred frames of both the depth video and the color video were encoded, with two temporal reference frames and inter-view reference frames between the two viewpoints. The test sequences were "Ballet" and "Breakdancer", provided by Microsoft, with a resolution of 1024 × 768. Views 0 and 2 were chosen as reference views, view 1 was the synthesized virtual view, and the virtual view was synthesized with the VSRS 2.3 software. Compared with the currently best depth filtering algorithm [4], the method of the invention improves depth map coding efficiency by 0.66 dB in PSNR on average and improves the objective quality of the newly synthesized virtual view by 0.45 dB on average.
The following references are cited herein:
[1] P. Merkle, J. B. Singla, K. Muller, and T. Wiegand, "Correlation histogram analysis of depth-enhanced 3D video coding," in Image Processing (ICIP), 2010 17th IEEE International Conference on, Sept. 2010, pp. 2605–2608.
[2] Y. Morvan, P. Merkle, and A. Smolic, "The effects of multi-view depth video compression on multi-view rendering," Signal Processing: Image Communication, vol. 24, no. 1-2, pp. 73–88, January 2009.
[3] Kwan-Jung Oh, A. Vetro, and Yo-Sung Ho, "Depth coding using a boundary reconstruction filter for 3-D video systems," Circuits and Systems for Video Technology, IEEE Transactions on, vol. 21, no. 3, pp. 350–359, March 2011.
[4] Shujie Liu, PoLin Lai, Dong Tian, and Chang Wen Chen, "New depth coding techniques with utilization of corresponding video," Broadcasting, IEEE Transactions on, vol. 57, no. 2, pp. 551–561, June 2011.
[5] T. Ojala, M. Pietikainen, and T. Maenpaa, "Multiresolution gray-scale and rotation invariant texture classification with local binary patterns," Pattern Analysis and Machine Intelligence, IEEE Transactions on, vol. 24, no. 7, pp. 971–987, July 2002.
[6] B. Widrow, I. Kollar, and Ming-Chang Liu, "Statistical theory of quantization," Instrumentation and Measurement, IEEE Transactions on, vol. 45, no. 2, pp. 353–361, April 1996.
[7] C. Dorea, P. Yin, P. Lai, A. Ortega, and C. Gomila, "Statistical theory of quantization," in Proc. of Visual Communications and Image Processing, VCIP'09, San Jose, CA, USA, Jan. 2009.
[8] Timo Ojala, Kimmo Valkealahti, Erkki Oja, and Matti Pietikäinen, "Texture discrimination with multidimensional distributions of signed gray-level differences," Pattern Recognition, vol. 34, no. 3, pp. 727–739, 2001.

Claims (10)

1. A depth map filtering method guided by the local binary pattern operator, characterized in that it comprises the steps of:
step 1: for the color image corresponding to the depth image, obtaining the local binary pattern operator of the neighboring pixels of the current pixel;
step 2: based on the local binary pattern operators of the neighboring pixels, identifying the neighboring pixels that belong to the same object as the current pixel, and updating the local binary pattern operators of the neighboring pixels;
step 3: obtaining the spatial-domain influence factor from the geometric distance between each neighboring pixel and the current pixel;
step 4: obtaining the image-domain influence factor from the similarity of depth intensity between each neighboring pixel and the current pixel in the depth image;
step 5: obtaining the filtering result of the current pixel, using the updated local binary pattern operators of the neighboring pixels, the spatial-domain influence factor, and the image-domain influence factor as constraint factors of the combined filter weighting coefficients.
2. The depth map filtering method guided by the local binary pattern operator according to claim 1, characterized in that:
the above step 2 further comprises the sub-steps of:
2-1: dividing the neighboring pixels into two groups according to their local binary pattern operator values;
2-2: judging, according to the logical distances from the two groups of neighboring pixels to the current pixel, whether a neighboring pixel belongs to the same object as the current pixel;
2-3: updating the local binary pattern operator of the neighboring pixels belonging to the same object as the current pixel to 1, and updating the local binary pattern operator of the neighboring pixels not belonging to the same object to 0.
3. The depth map filtering method guided by the local binary pattern operator according to claim 1, characterized in that:
the spatial-domain influence factor in step 3 is G_Q(||p-c||), where G_Q is a Gaussian function and ||p-c|| is the geometric distance from a neighboring pixel p of the current pixel c to the current pixel c.
4. The depth map filtering method guided by the local binary pattern operator according to claim 1, characterized in that:
the image-domain influence factor in step 4 is G_S(d_p - d_c), where G_S is a Gaussian function and (d_p - d_c) is the similarity between the depth intensity d_p of the neighboring pixel p and the depth intensity d_c of the current pixel c in the depth image.
5. The depth map filtering method guided by the local binary pattern operator according to claim 1, characterized in that:
the depth map filtering result G_c of the current pixel obtained in step 5 is:

$$G_c = \frac{1}{F_c}\sum_{p \in W}\left[G_S(d_p - d_c)\cdot G_Q(\|p-c\|)\cdot LBP_{P,R}^{uni}(g_p - g_c)\cdot d_p\right]$$

where:

W is the set of all neighboring pixels of the current pixel c;

p is a neighboring pixel of the current pixel c;

G_S(d_p - d_c) is the image-domain influence factor, G_S is a Gaussian function, and (d_p - d_c) is the similarity between the depth intensity d_p of the neighboring pixel p and the depth intensity d_c of the current pixel c in the depth image;

G_Q(||p-c||) is the spatial-domain influence factor, G_Q is a Gaussian function, and ||p-c|| is the geometric distance from the neighboring pixel p to the current pixel c;

LBP_{P,R}^{uni}(g_p - g_c) is the updated local binary pattern operator of the neighboring pixel p, and g_p and g_c are respectively the luminance values of the neighboring pixel p and the current pixel c in the color image corresponding to the depth image;

$$F_c = \sum_{p \in W}\left[G_S(d_p - d_c)\cdot G_Q(\|p-c\|)\cdot LBP_{P,R}^{uni}(g_p - g_c)\right]$$

is the sum of the weighting coefficients.
6. A depth map filtering system guided by the local binary pattern operator, characterized in that it comprises:
a local binary pattern operator acquisition module, configured to obtain, in the color image corresponding to the depth image, the local binary pattern operator of the neighboring pixels of the current pixel;
a neighboring pixel judging module, configured to identify, based on the local binary pattern operators of the neighboring pixels, the neighboring pixels belonging to the same object as the current pixel, and to update the local binary pattern operators of the neighboring pixels;
a spatial-domain influence factor acquisition module, configured to obtain the spatial-domain influence factor from the geometric distance between each neighboring pixel and the current pixel;
an image-domain influence factor acquisition module, configured to obtain the image-domain influence factor from the similarity of depth intensity between each neighboring pixel and the current pixel in the depth image;
a filtering module, configured to obtain the filtering result of the current pixel using the updated local binary pattern operators of the neighboring pixels, the spatial-domain influence factor, and the image-domain influence factor as constraint factors of the combined filter weighting coefficients.
7. The depth map filtering system guided by the local binary pattern operator according to claim 6, characterized in that:
the neighboring pixel judging module further comprises the sub-modules:
a neighboring pixel dividing module, configured to divide the neighboring pixels into two groups according to their local binary pattern operator values;
a judging module, configured to judge, according to the logical distances from the two groups of neighboring pixels to the current pixel, whether a neighboring pixel belongs to the same object as the current pixel;
a local binary pattern operator updating module, configured to update the local binary pattern operator of the neighboring pixels belonging to the same object as the current pixel to 1, and to update the local binary pattern operator of the neighboring pixels not belonging to the same object to 0.
8. The depth map filtering system guided by the local binary pattern operator according to claim 6, characterized in that:
the spatial-domain influence factor obtained by the spatial-domain influence factor acquisition module is G_Q(||p-c||), where G_Q is a Gaussian function and ||p-c|| is the geometric distance from a neighboring pixel p of the current pixel c to the current pixel c.
9. The depth map filtering system guided by the local binary pattern operator according to claim 6, characterized in that:
the image-domain influence factor is G_S(d_p - d_c), where G_S is a Gaussian function and (d_p - d_c) is the similarity between the depth intensity d_p of the neighboring pixel p and the depth intensity d_c of the current pixel c in the depth image.
10. The depth map filtering system guided by the local binary pattern operator according to claim 6, characterized in that:
the filtering result G_c of the current pixel obtained by the filtering module is:

$$G_c = \frac{1}{F_c}\sum_{p \in W}\left[G_S(d_p - d_c)\cdot G_Q(\|p-c\|)\cdot LBP_{P,R}^{uni}(g_p - g_c)\cdot d_p\right]$$

where:

W is the set of all neighboring pixels of the current pixel c;

G_S(d_p - d_c) is the image-domain influence factor, G_S is a Gaussian function, and (d_p - d_c) is the similarity between the depth intensity d_p of the neighboring pixel p and the depth intensity d_c of the current pixel c in the depth map;

G_Q(||p-c||) is the spatial-domain influence factor, G_Q is a Gaussian function, and ||p-c|| is the geometric distance from the neighboring pixel p to the current pixel c;

LBP_{P,R}^{uni}(g_p - g_c) is the updated local binary pattern operator of the neighboring pixel p, g_p is the luminance value of the neighboring pixel p in the color image corresponding to the depth image, and g_c is the luminance value of the current pixel c in that color image;

$$F_c = \sum_{p \in W}\left[G_S(d_p - d_c)\cdot G_Q(\|p-c\|)\cdot LBP_{P,R}^{uni}(g_p - g_c)\right]$$

is the sum of the weighting coefficients.
CN201310192294.8A 2013-05-22 2013-05-22 Depth map filtering method and system based on local binary pattern operator guidance Pending CN103578089A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310192294.8A CN103578089A (en) 2013-05-22 2013-05-22 Depth map filtering method and system based on local binary pattern operator guidance

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310192294.8A CN103578089A (en) 2013-05-22 2013-05-22 Depth map filtering method and system based on local binary pattern operator guidance

Publications (1)

Publication Number Publication Date
CN103578089A true CN103578089A (en) 2014-02-12

Family

ID=50049815

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310192294.8A Pending CN103578089A (en) 2013-05-22 2013-05-22 Depth map filtering method and system based on local binary pattern operator guidance

Country Status (1)

Country Link
CN (1) CN103578089A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108596194A (en) * 2018-05-03 2018-09-28 武汉科技大学 Image coding method using Gaussian-weighted local binary patterns
CN110910326A (en) * 2019-11-22 2020-03-24 上海商汤智能科技有限公司 Image processing method and device, processor, electronic device and storage medium


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120213419A1 (en) * 2011-02-22 2012-08-23 Postech Academy-Industry Foundation Pattern recognition method and apparatus using local binary pattern codes, and recording medium thereof
CN102799870A (en) * 2012-07-13 2012-11-28 复旦大学 Single-training sample face recognition method based on blocking consistency LBP (Local Binary Pattern) and sparse coding
CN102915544A (en) * 2012-09-20 2013-02-06 武汉大学 Video image motion target extracting method based on pattern detection and color segmentation

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
RUI ZHONG et al.: "LBP-GUIDED DEPTH IMAGE FILTER", Data Compression Conference *
RUI ZHONG et al.: "LBP-GUIDED DEPTH IMAGE FILTER", Data Compression Conference, 22 March 2013 (2013-03-22) *
赖新平 et al.: "A new noise-reduction method based on anisotropic diffusion" (基于各向异性扩散的降噪新方法), Journal of Shanghai University (Natural Science Edition) *
黄非非: "Research on face recognition based on LBP" (基于LBP的人脸识别研究), China Masters' Theses Full-text Database, Information Science and Technology *


Similar Documents

Publication Publication Date Title
CN101937578B Method for rendering virtual-view color images
CN103236082B Accurate three-dimensional reconstruction method for two-dimensional video capturing a static scene
US8780172B2 Depth and video co-processing
CN101287143B Method for converting planar video to stereoscopic video based on real-time human-machine dialogue
CN101516040A Video matching method, device and system
CN104217208A Target detection method and device
CN107392950A Cross-scale cost aggregation stereo matching method based on weak texture detection
CN111179189B Image processing method and device based on generative adversarial network (GAN), electronic equipment and storage medium
CN104850847B Image optimization system and method with automatic face-slimming function
CN108596975A Stereo matching algorithm for weak-texture regions
CN104994375A Three-dimensional image quality objective evaluation method based on three-dimensional visual saliency
CN110189294B RGB-D image saliency detection method based on depth reliability analysis
CN110349132A Fabric defect detection method based on depth information extracted with a light-field camera
CN104867135A High-precision stereo matching method based on guide-image guidance
CN105898278B Stereoscopic video saliency detection method based on binocular multi-dimensional perception characteristics
CN107240084A Single-image rain removal method and device
CN109831664B Rapid compressed stereo video quality evaluation method based on deep learning
CN103384343B Method and device for filling image holes
CN103996174A Method for performing hole repair on Kinect depth images
KR20110014067A Method and system for transformation of stereo content
CN106408513A Super-resolution reconstruction method of depth map
CN104639933A Real-time acquisition method and real-time acquisition system for depth maps of three-dimensional views
CN103248906A Method and system for acquiring depth map of binocular stereo video sequence
Kuo et al. Depth estimation from a monocular view of the outdoors
CN103679739A Virtual view generation method based on occlusion region detection

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20140212)