Summary of the Invention
The purpose of this invention is to provide a color-image three-dimensional reconstruction method based on stereo matching that can reconstruct the three-dimensional point cloud of an image quickly, accurately, and automatically.
The technical scheme adopted by the present invention is as follows: first, two color real-scene images are captured; camera calibration is performed; epipolar rectification and image transformation are carried out according to the calibration data; the matching cost and an initial disparity map are computed by initial matching, and the initial matching results are classified by reliability using a matching-cost confidence check and a left-right consistency check; the rectified left image is then color-segmented; global optimization is then performed with a selective belief-propagation algorithm to obtain the final disparity; finally, the three-dimensional point cloud is reconstructed from the calibration data and the matching result and displayed.
The method of the present invention comprises the following steps:
Step 1: Image acquisition
Two color cameras photograph the same scene simultaneously from two slightly different angles, producing two images; the image taken by the left camera is the original left image, and the image taken by the right camera is the original right image;
Step 2: Camera calibration
The two cameras are calibrated separately to establish the relation between camera-image pixel positions and scene positions, yielding the intrinsic parameter matrix A_L of the left camera, the intrinsic parameter matrix A_R of the right camera, the extrinsic parameter matrix [R_L t_L] of the left camera, and the extrinsic parameter matrix [R_R t_R] of the right camera;
Step 3: Epipolar rectification
Using the intrinsic and extrinsic camera parameters obtained in Step 2, the captured left and right images are rectified with an epipolar rectification method to obtain a parallel binocular vision model, so that matched pixels have identical ordinates; the rectified left and right images are denoted I_l and I_r respectively;
Step 4: Initial matching
Step 4.1: Determine the candidate disparity range D:
D = (d_min, d_max),
where d_min is the minimum disparity, d_min = 0, and d_max is the maximum disparity, obtained by marking matched pixel pairs between the reference image and the registered image: ten pixels {pl1, pl2, pl3, ..., pl10} are picked at random in the reference image, and for each of them an estimated matched pixel with the same ordinate and similar color information is sought in the registered image, giving ten pixels {pr1, pr2, pr3, ..., pr10} and thus ten estimated matched pairs {(pl1, pr1), (pl2, pr2), (pl3, pr3), ..., (pl10, pr10)}; for each matched pair, the difference of the abscissas of the two pixels gives one disparity value, yielding the set {d1, d2, d3, ..., d10}, and the maximum disparity is d_max = max{d1, d2, ..., d10} + 5;
Step 4.2: Adaptive-weight window algorithm
With the rectified left image I_l as the reference image and the rectified right image I_r as the registered image, the adaptive-weight window method computes the matching cost for every pixel in the reference image and produces the initial left disparity map; then, with the rectified right image I_r as the reference image and the rectified left image I_l as the registered image, the same method produces the initial right disparity map. The adaptive-weight window method is as follows:
Step 4.2.1: Weight coefficient calculation
First the reference image is denoted I_1 and the registered image I_2; then, using color and spatial information, the weight coefficient E_pq of every pixel in the neighborhood window is computed for each pixel in the two images, where p is a pixel in the reference or registered image, q is any pixel in the n × n neighborhood window centered on p, n = 35, Δpq denotes the color difference between pixels p and q in RGB space, ||p − q||_2 is the Euclidean distance between the two pixels, and α and β are constant coefficients, α = 0.1, β = 0.047;
Step 4.2.2: Matching cost calculation
Under the horizontal epipolar constraint, the matching cost C(p_1, d) is computed for every pixel in the reference image and every disparity value in the candidate disparity range, where p_1 is any pixel in the reference image with coordinates (x, y), d is any disparity value in the candidate disparity range D, and pixel p_2 is the candidate matched pixel of p_1 in the registered image corresponding to disparity d: when the reference image is the left image, the coordinates of p_2 are (x − d, y); when the reference image is the right image, the coordinates of p_2 are (x + d, y). W_p1 and W_p2 denote the n × n neighborhood windows centered on pixels p_1 and p_2 respectively; pixel q_1 is any neighborhood pixel in window W_p1, with coordinates (x_q, y_q), and pixel q_2 is the pixel in window W_p2 corresponding to q_1: when the reference image is the left image, the coordinates of q_2 are (x_q − d, y_q); when the reference image is the right image, the coordinates of q_2 are (x_q + d, y_q). E_p1q1 and E_p2q2 are the weight coefficients obtained in Step 4.2.1, and S(q_1, q_2) is the dissimilarity of the pixel pair (q_1, q_2);
Step 4.2.3: Calculate the initial disparity value
For each pixel, the disparity value d_0(p_1) with the minimum matching cost is computed, d_0(p_1) = argmin over d ∈ D of C(p_1, d), where p_1 is any pixel in the reference image, D is the candidate disparity range, d_min and d_max are the minimum and maximum disparities, and C(p_1, d) is the matching cost computed in Step 4.2.2; the minimum-cost disparity d_0(p_1) is the initial matching disparity of pixel p_1;
Step 4.2.4: Establish the initial disparity image
The initial disparity image D_0 is established as D_0(i, j) = d_0(p_ij), where i and j are respectively the abscissa and ordinate of a disparity-image pixel, p_ij is the pixel with coordinates (i, j) in the reference image, and d_0(p_ij) is the initial matching disparity of p_ij computed in Step 4.2.3;
if the reference image is the left image I_l, the initial disparity map D_0 is assigned to the initial left disparity map D_l^0; if the reference image is the right image I_r, the initial disparity map D_0 is assigned to the initial right disparity map D_r^0;
Step 5: Pixel reliability marking
Step 5.1: Matching-cost confidence check
All pixels of the left image I_l are classified by matching-cost confidence; the high-confidence set is denoted M_hc and the low-confidence set M_lc. For any pixel p_l in the left image I_l, the matching-cost confidence is r(p_l), where C_min1 is the matching cost corresponding to the initial matching disparity of p_l, i.e. the smallest matching-cost value, and C_min2 is the second-smallest matching cost; a threshold dist is set, and when r(p_l) > dist the matching result of p_l is high-confidence, p_l ∈ M_hc; otherwise it is low-confidence, p_l ∈ M_lc; the threshold dist is taken as 0.04;
Step 5.2: Left-right consistency check
For any pixel p_l in the left image with coordinates (x, y), the initial disparity of p_l is d_1, and the corresponding matched pixel p_r in the right image has coordinates (x − d_1, y); from the initial right disparity image D_r^0 obtained in Step 4, the initial disparity d_2 of pixel p_r is obtained. If d_1 = d_2, pixel p_l passes the left-right consistency check, denoted p_l ∈ M_ac; otherwise pixel p_l fails the left-right consistency check, denoted p_l ∈ M_bc, where M_ac and M_bc are respectively the set of pixels that pass and the set of pixels that fail the left-right consistency check;
Step 5.3: Pixel reliability coefficient marking
According to the results of Steps 5.1 and 5.2, each pixel in the left image is marked with a reliability coefficient Con(p_l), where p_l is any pixel in the left image and Con(p_l) is the reliability coefficient of p_l;
Step 6: Image segmentation
The left image is segmented with the Mean-Shift algorithm, and each pixel is marked with its segment S(p_l), where p_l is any pixel in the left image and S(p_l) is the label of the segment to which pixel p_l belongs;
Step 7: Global optimization
Step 7.1: Pixel smoothness cost calculation
For every pixel in the left image and each of its four neighbors (up, down, left, right), the smoothness cost J(p_l, q_l, d_p, d_q) is computed with respect to all disparity values in the range D:
J(p_l, q_l, d_p, d_q) = min{|d_p − d_q|, |d_max − d_min| / 8},
where p_l is any pixel in the left image, q_l is any of the four neighbors of pixel p_l, d_p and d_q are respectively any disparities of pixels p_l and q_l in the disparity range D, and d_max and d_min are the maximum and minimum disparities;
Step 7.2: Calculate the belief messages of the pixel nodes
Belief messages are computed iteratively; t is the iteration count, with initial value 0, and iteration stops when t = 50. Each iteration proceeds as follows: during iteration t, each pixel node in the left image computes the belief message m_t(p_l → q_l, d) that it will propagate to its neighbors during the next iteration, with respect to each disparity value in the disparity range D, where p_l is any pixel in the left image, q_l is any of the four neighbors of pixel p_l, D is the disparity range defined in Step 4.1, d is any disparity value in D, C(p_l, d) is the matching cost computed in Step 4.2.2, d_x is any disparity value in the disparity range D, J(p_l, q_l, d, d_x) is the smoothness cost obtained in Step 7.1, and m_{t−1}(q_s → p_l, d_x) is the belief message, obtained in iteration t − 1, propagated from pixel q_s to p_l for disparity d_x; when t = 1 this message is 0. q_s is any pixel, other than pixel q_l, in the selective neighborhood N_1(p_l) of pixel p_l, where the selective neighborhood N_1(p_l) is:
N_1(p_l) = {q_f | q_f ∈ N(p_l), Con(q_f) ≥ Con(p_l) and S(q_f) = S(p_l)},
in which N(p_l) is the four-neighborhood (up, down, left, right) of pixel p_l, Con(q_f) and Con(p_l) are the reliability coefficients marked in Step 5.3, and S(q_f) and S(p_l) are the segment labels of pixels q_f and p_l obtained in Step 6;
Step 7.3: For each pixel in the left image, calculate the belief b(p_l, d) with respect to every possible disparity, where p_l is any pixel in the left image, d is any disparity value in D, C(p_l, d) is the matching cost obtained in Step 4.2.2, and the belief messages obtained in the 50th iteration, propagated from the pixels p_s to p_l for disparity d, are accumulated; p_s is any pixel in N_1(p_l), where N_1(p_l) is the selective neighborhood of p_l defined in Step 7.2;
Step 7.4: Calculate the disparity image
The optimal disparity value d(p_l) of each pixel is the disparity that minimizes its belief, where p_l is any pixel in the left image, b(p_l, d) is the belief computed in Step 7.3, D is the disparity range, and d is any disparity value in the range D;
according to the optimal disparity of each pixel in the left image, the final disparity map D_out is established as D_out(x, y) = d(p_xy), where x and y are respectively the abscissa and ordinate of a pixel of the disparity image D_out, p_xy is the pixel with coordinates (x, y) in the reference image, and d(p_xy) is the optimal disparity value of p_xy;
Step 8: Reconstruct the three-dimensional information of the object
According to the camera intrinsic and extrinsic parameter matrices A_L, A_R, [R_L t_L], [R_R t_R] obtained in Step 2 and the disparity map D_out obtained in Step 7, the three-dimensional point-cloud model of the whole object is computed by the space intersection method.
Beneficial effects: compared with the prior art, the present invention has the following advantages. The adaptive-weight window algorithm computes the weight of each neighborhood pixel with respect to the pixel to be matched from its spatial and color information, avoiding the difficult adaptive-window construction process inherent in local algorithms. The traditional belief-propagation algorithm propagates belief messages between all neighboring pixels; however, the disparity continuity constraint is not satisfied between some neighboring pixels, and the initial matching results of some pixels carry no guiding significance, so the traditional algorithm contains unreasonable propagation paths, which leads to problems such as low matching accuracy and slow optimization. The present invention uses the color-segmentation and pixel-reliability classification results to guide the range and direction of belief-message propagation; this selective belief-propagation algorithm cuts off the unreasonable parts of the traditional algorithm, so that the paths of global energy optimization are improved, the computational complexity is reduced, propagation is more targeted, and the matching results of low-reliability pixels are continually revised during the iterative optimization, finally yielding a disparity map of higher matching precision. The present invention fully combines the advantages of local and global optimization algorithms, overcomes the contradiction between reconstruction accuracy and reconstruction speed in existing three-dimensional reconstruction techniques, and improves the degree of automation of the reconstruction process.
Embodiments
Specific embodiments of the present invention are described in more detail below with reference to the accompanying drawings. Visual C++ 6.0 and the OpenCV image-processing library were chosen as the programming tools, and two color images containing many disparity discontinuities and low-texture regions were taken in an indoor environment.
Fig. 1 is the overall flow chart of the present invention.
Fig. 5 is a schematic diagram of the system model and principle of the present invention. Two color CCD cameras simultaneously capture one color image each from two different angles; O_L and O_R are the optical centers of the two cameras, I_L and I_R are their imaging planes, P is a spatial object point on the object to be reconstructed, and P_L and P_R are the imaging points of object point P on the two imaging planes. The imaging points of the same spatial object point on the different camera imaging planes form a pair of matched points. Either image may be taken as the reference image, with the other as the registered image; the process of searching the registered image for the matched point corresponding to each pixel in the reference image is called stereo matching. Once the matching relation of the pixels is obtained, an inverse computation according to the system model, combined with the camera intrinsic and extrinsic parameters obtained by calibration, yields the three-dimensional spatial coordinates of the corresponding object points, thereby achieving the three-dimensional reconstruction of the image.
Fig. 6 is a schematic diagram of epipolar rectification. For a pixel p_l in the left image, the search for its matched pixel p_r need only be carried out along the epipolar line in the right image corresponding to p_l; in the parallel stereo-vision model all epipolar lines are parallel to the line O_lO_r joining the optical centers, so the stereo images have only a horizontal displacement, the search difficulty is further reduced, and corresponding points can be sought along the same image row. In practice, however, this standard model is difficult to satisfy, since the imaging planes do not lie in the same plane; epipolar rectification rotates the imaging planes so as to obtain two virtual parallel imaging planes. The initial projection matrices are rotated about the optical centers until the two focal planes are coplanar, with the baseline contained in the focal plane, yielding two new projection matrices. The epipoles are thus located at infinity, so the epipolar lines are parallel. To make the epipolar lines horizontal as well, the baseline must be parallel to the new X axis of both cameras. In addition, for a correct rectification the conjugate points must have identical ordinates, which is obtained by giving the new camera configurations identical intrinsic parameters.
The method of the present invention comprises the following steps:
Step 1: Image acquisition
Two color cameras photograph the same scene simultaneously from two slightly different angles, producing two images; the image taken by the left camera is the original left image, and the image taken by the right camera is the original right image;
Step 2: Camera calibration
The two cameras are calibrated separately to establish the relation between camera-image pixel positions and scene positions, yielding the intrinsic parameter matrix A_L of the left camera, the intrinsic parameter matrix A_R of the right camera, the extrinsic parameter matrix [R_L t_L] of the left camera, and the extrinsic parameter matrix [R_R t_R] of the right camera;
Camera calibration technology is now fairly mature; the reference "A Flexible New Technique for Camera Calibration" (Zhang Z Y, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(11): 1330-1334) proposed a calibration algorithm known as the planar-template method, which is adopted in the present invention to calibrate the two cameras separately;
Step 3: Epipolar rectification
Using the intrinsic and extrinsic camera parameters obtained in Step 2, the captured left and right images are rectified with an epipolar rectification method to obtain a parallel binocular vision model, so that matched pixels have identical ordinates; the rectified left and right images are denoted I_l and I_r respectively;
The epipolar rectification method proposed in the reference "A compact algorithm for rectification of stereo pairs" (Fusiello A, Trucco E, Verri A. Machine Vision and Applications, 2000, 12(1): 16-22) is adopted to rectify the captured left and right images, as shown in Fig. 6; if a transformed pixel coordinate corresponds to non-integer coordinates in the original image, bilinear interpolation is carried out. A parallel binocular vision model is finally obtained; the rectified images are undistorted, the error between the ordinates of a matched pixel pair is less than one pixel, and the spatial complexity of matching is greatly reduced.
Step 4: Initial matching
Step 4.1: Determine the candidate disparity range D:
D = (d_min, d_max),
where d_min is the minimum disparity, d_min = 0, and d_max is the maximum disparity, obtained by marking matched pixel pairs between the reference image and the registered image: ten pixels {pl1, pl2, pl3, ..., pl10} are picked at random in the reference image, and for each of them an estimated matched pixel with the same ordinate and similar color information is sought in the registered image, giving ten pixels {pr1, pr2, pr3, ..., pr10} and thus ten estimated matched pairs {(pl1, pr1), (pl2, pr2), (pl3, pr3), ..., (pl10, pr10)}; for each matched pair, the difference of the abscissas of the two pixels gives one disparity value, yielding the set {d1, d2, d3, ..., d10}, and the maximum disparity is d_max = max{d1, d2, ..., d10} + 5;
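The range estimation of Step 4.1 can be sketched as follows. This is an illustrative Python sketch (the embodiment itself uses Visual C++/OpenCV); the matched pixel pairs are made up for the example, and only three pairs are shown where the method uses ten.

```python
# Sketch of Step 4.1: estimate the candidate disparity range from a few
# hand-marked matched pixel pairs with equal ordinates.

def candidate_disparity_range(pairs, margin=5):
    """pairs: list of ((xl, y), (xr, y)) matched pixel coordinates;
    the disparity of each pair is the abscissa difference xl - xr."""
    disparities = [xl - xr for (xl, _), (xr, _) in pairs]
    d_min = 0                          # the method fixes the minimum disparity at 0
    d_max = max(disparities) + margin  # largest observed disparity plus a safety margin
    return d_min, d_max

# Illustrative pairs (ten would be marked in practice):
pairs = [((120, 40), (98, 40)), ((200, 75), (183, 75)), ((66, 10), (60, 10))]
print(candidate_disparity_range(pairs))  # -> (0, 27)
```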
Step 4.2: Adaptive-weight window algorithm
With the rectified left image I_l as the reference image and the rectified right image I_r as the registered image, the adaptive-weight window method computes the matching cost for every pixel in the reference image and produces the initial left disparity map; then, with the rectified right image I_r as the reference image and the rectified left image I_l as the registered image, the same method produces the initial right disparity map. The adaptive-weight window method is as follows:
Step 4.2.1: Weight coefficient calculation
First the reference image is denoted I_1 and the registered image I_2; then, using color and spatial information, the weight coefficient E_pq of every pixel in the neighborhood window is computed for each pixel in the two images,
where p is a pixel in the reference or registered image, q is any pixel in the n × n neighborhood window centered on p, n = 35, and Δpq denotes the color difference between pixels p and q in RGB space, in which c denotes the r, g or b channel of the image and I_c(p) and I_c(q) denote the color components of pixels p and q in channel c; ||p − q||_2 is the Euclidean distance between the two pixels, and α and β are constant coefficients, α = 0.1, β = 0.047;
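The weight formula itself appears only as a figure in the source, so the sketch below is an assumption: it takes the common exponential adaptive-support-weight form E_pq = exp(−(α·Δpq + β·||p − q||_2)), with the color difference Δpq taken as the sum of per-channel absolute differences. The constants α = 0.1, β = 0.047 and n = 35 are from the text; everything else is illustrative.

```python
import math

# Assumed form of the Step 4.2.1 weight (the original formula is a figure):
# E_pq = exp(-(alpha * color_diff + beta * spatial_dist)).

ALPHA, BETA, N = 0.1, 0.047, 35   # constants from the patent; window size n = 35

def weight(p, q, img):
    """p, q: (x, y) coordinates; img: mapping from coordinates to (r, g, b)."""
    cp, cq = img[p], img[q]
    delta_pq = sum(abs(a - b) for a, b in zip(cp, cq))   # color difference in RGB space
    dist = math.hypot(p[0] - q[0], p[1] - q[1])          # Euclidean distance ||p - q||_2
    return math.exp(-(ALPHA * delta_pq + BETA * dist))

img = {(0, 0): (10, 20, 30), (1, 0): (12, 22, 33)}       # two illustrative pixels
print(weight((0, 0), (1, 0), img))                       # close pixels, similar color -> weight near 0.47
```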
Step 4.2.2: Matching cost calculation
Under the horizontal epipolar constraint, the matching cost C(p_1, d) is computed for every pixel in the reference image and every disparity value in the candidate disparity range, where p_1 is any pixel in the reference image with coordinates (x, y), d is any disparity value in the candidate disparity range D, and pixel p_2 is the candidate matched pixel of p_1 in the registered image corresponding to disparity d: when the reference image is the left image, the coordinates of p_2 are (x − d, y); when the reference image is the right image, the coordinates of p_2 are (x + d, y). W_p1 and W_p2 denote the n × n neighborhood windows centered on pixels p_1 and p_2 respectively; pixel q_1 is any neighborhood pixel in window W_p1, with coordinates (x_q, y_q), and pixel q_2 is the pixel in window W_p2 corresponding to q_1: when the reference image is the left image, the coordinates of q_2 are (x_q − d, y_q); when the reference image is the right image, the coordinates of q_2 are (x_q + d, y_q). E_p1q1 and E_p2q2 are the weight coefficients obtained in Step 4.2.1, and S(q_1, q_2) is the dissimilarity of the pixel pair (q_1, q_2);
As shown in Fig. 8, q_2l is the left neighborhood pixel of q_2 and q_2r is its right neighborhood pixel; I_2(q_2), I_2(q_2l) and I_2(q_2r) are respectively the means of the three RGB channel components of pixels q_2, q_2l and q_2r in the registered image I_2. With I_max and I_min defined from these values, the dissimilarity of pixels q_1 and q_2 is:
S = max{0, I_1(q_1) − I_max, I_min − I_1(q_1)},
where I_1(q_1) is the mean of the three RGB channel components of pixel q_1 in the reference image I_1;
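The definitions of I_max and I_min are given only as a figure in the source; the sketch below assumes the Birchfield–Tomasi convention, in which they are the extrema of the registered image's intensity over q_2 and the half-pixel interpolations toward its left and right neighbors. The final max{0, ...} expression matches the S formula above; the numeric inputs are illustrative.

```python
# Sketch of the Step 4.2.2 dissimilarity S, assuming Birchfield-Tomasi-style
# half-pixel interpolation for I_max and I_min (their definition is a figure
# in the source).

def bt_dissimilarity(i1_q1, i2_q2, i2_q2l, i2_q2r):
    """All arguments are RGB-mean intensities: q_1 in the reference image,
    q_2 and its left/right neighbours q_2l, q_2r in the registered image."""
    half_l = (i2_q2 + i2_q2l) / 2.0   # interpolated value halfway toward the left neighbour
    half_r = (i2_q2 + i2_q2r) / 2.0   # interpolated value halfway toward the right neighbour
    i_max = max(i2_q2, half_l, half_r)
    i_min = min(i2_q2, half_l, half_r)
    return max(0.0, i1_q1 - i_max, i_min - i1_q1)

print(bt_dissimilarity(100.0, 96.0, 90.0, 98.0))  # -> 3.0 (100 exceeds I_max = 97 by 3)
print(bt_dissimilarity(95.0, 96.0, 90.0, 98.0))   # -> 0.0 (95 lies inside [I_min, I_max])
```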
Step 4.2.3: Calculate the initial disparity value
For each pixel, the disparity value d_0(p_1) with the minimum matching cost is computed, d_0(p_1) = argmin over d ∈ D of C(p_1, d), where p_1 is any pixel in the reference image, D is the candidate disparity range, d_min and d_max are the minimum and maximum disparities, and C(p_1, d) is the matching cost computed in Step 4.2.2; the minimum-cost disparity d_0(p_1) is the initial matching disparity of pixel p_1;
Step 4.2.4: Establish the initial disparity image
The initial disparity image D_0 is established as D_0(i, j) = d_0(p_ij), where i and j are respectively the abscissa and ordinate of a disparity-image pixel, p_ij is the pixel with coordinates (i, j) in the reference image, and d_0(p_ij) is the initial matching disparity of p_ij computed in Step 4.2.3;
if the reference image is the left image I_l, the initial disparity map D_0 is assigned to the initial left disparity map D_l^0; if the reference image is the right image I_r, the initial disparity map D_0 is assigned to the initial right disparity map D_r^0;
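Steps 4.2.3 and 4.2.4 together amount to a winner-take-all argmin over a cost volume. A minimal Python sketch, with a made-up cost volume in place of the adaptive-weight costs:

```python
# Winner-take-all selection of the initial disparity map from a cost volume
# cost[y][x][d] (Steps 4.2.3-4.2.4). The cost values below are illustrative.

def initial_disparity_map(cost):
    """cost: H x W x |D| nested lists of matching costs; returns the H x W map
    of the disparity index with minimum cost at each pixel."""
    return [[min(range(len(c)), key=c.__getitem__) for c in row] for row in cost]

cost = [[[5.0, 2.0, 7.0], [1.0, 3.0, 0.5]]]   # one row, two pixels, three candidate disparities
print(initial_disparity_map(cost))            # -> [[1, 2]]
```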
Step 5: Pixel reliability marking
Step 5.1: Matching-cost confidence check
All pixels of the left image I_l are classified by matching-cost confidence; the high-confidence set is denoted M_hc and the low-confidence set M_lc. For any pixel p_l in the left image I_l, the matching-cost confidence is r(p_l), where C_min1 is the matching cost corresponding to the initial matching disparity of p_l, i.e. the smallest matching-cost value, and C_min2 is the second-smallest matching cost; a threshold dist is set, and when r(p_l) > dist the matching result of p_l is high-confidence, p_l ∈ M_hc; otherwise it is low-confidence, p_l ∈ M_lc; the threshold dist is taken as 0.04;
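The exact expression for r(p_l) appears only as a figure in the source; the sketch below assumes the usual relative margin between the two smallest costs, r = (C_min2 − C_min1) / C_min1, compared against the threshold dist = 0.04 given in the text.

```python
# Sketch of the Step 5.1 confidence check; the form of r(p_l) is an assumption
# (relative margin of the two smallest matching costs).

DIST = 0.04   # threshold from the patent

def is_high_confidence(costs, dist=DIST):
    """costs: matching costs of one pixel over all candidate disparities."""
    c_sorted = sorted(costs)
    c_min1, c_min2 = c_sorted[0], c_sorted[1]   # smallest and second-smallest cost
    r = (c_min2 - c_min1) / c_min1              # assumed confidence measure r(p_l)
    return r > dist                             # True -> p_l in M_hc, False -> M_lc

print(is_high_confidence([2.0, 2.5, 3.0]))   # -> True  (r = 0.25, a distinct minimum)
print(is_high_confidence([2.0, 2.01, 3.0]))  # -> False (r = 0.005, ambiguous minimum)
```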
Step 5.2: Left-right consistency check
For any pixel p_l in the left image with coordinates (x, y), the initial disparity of p_l is d_1, and the corresponding matched pixel p_r in the right image has coordinates (x − d_1, y); from the initial right disparity image D_r^0 obtained in Step 4, the initial disparity d_2 of pixel p_r is obtained. If d_1 = d_2, pixel p_l passes the left-right consistency check, denoted p_l ∈ M_ac; otherwise pixel p_l fails the left-right consistency check, denoted p_l ∈ M_bc, where M_ac and M_bc are respectively the set of pixels that pass and the set of pixels that fail the left-right consistency check;
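The left-right check of Step 5.2 can be sketched per pixel as follows; the 1 × 6 disparity maps are toy data for illustration.

```python
# Sketch of Step 5.2: left-right consistency check for one pixel, given the
# initial left and right disparity maps D_l0 and D_r0.

def passes_lr_check(x, y, d_l0, d_r0):
    d1 = d_l0[y][x]
    xr = x - d1              # matched pixel p_r lies d1 columns to the left
    if xr < 0:
        return False         # matched pixel falls outside the right image
    d2 = d_r0[y][xr]
    return d1 == d2          # consistent only if both maps agree

d_l0 = [[0, 1, 1, 2, 2, 3]]  # toy initial left disparities
d_r0 = [[0, 2, 1, 2, 2, 3]]  # toy initial right disparities
print(passes_lr_check(3, 0, d_l0, d_r0))  # -> True  (d1 = 2 agrees with D_r0 at x = 1)
print(passes_lr_check(2, 0, d_l0, d_r0))  # -> False (d1 = 1, but D_r0 at x = 1 is 2)
```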
Step 5.3: Pixel reliability coefficient marking
According to the results of Steps 5.1 and 5.2, each pixel in the left image is marked with a reliability coefficient Con(p_l), where p_l is any pixel in the left image and Con(p_l) is the reliability coefficient of p_l;
Step 6: Image segmentation
The left image is segmented with the Mean-Shift algorithm, and each pixel is marked with its segment S(p_l), where p_l is any pixel in the left image and S(p_l) is the label of the segment to which pixel p_l belongs;
The parameters are set to: spatial bandwidth h_s = 7, color bandwidth h_r = 6.5, minimum region size M = 35;
Step 7: Global optimization
The traditional belief-propagation path is shown in Fig. 10: pixel p_0 receives belief from its four neighborhood pixels; solid arrows indicate the direction of propagation and dashed arrows indicate the direction of the previous round of belief propagation. Suppose pixel p_01 is an unreliable pixel; then, in the belief-propagation process, the reliability of the message M_p01→p0 coming from p_01 is also relatively low, so unreliable information is added to the matching cost of pixel p_0 during the computation, which may lead to an erroneous matching result. To address this problem, the present invention improves the traditional belief-propagation path on the basis of the pixel reliability classification, as shown in Fig. 11, where four different patterns represent the reliability of pixels: the highest reliability corresponds to coefficient 4, the next to coefficient 3, the next to coefficient 2, and the lowest reliability to coefficient 1. We stipulate that when belief is propagated between neighboring pixels, if the reliabilities of the two pixels are unequal, the direction of propagation is from the higher reliability toward the lower; if the reliabilities are equal, bidirectional propagation is adopted. Matching information thus flows from the reliable network, which is close to the true disparity, toward the unreliable network, giving the global optimization method described herein selectivity in the direction of propagation.
Disparity continuity is the prerequisite of belief propagation. As shown in Fig. 10, pixel p_02 and pixel p_0 lie on opposite sides of an object edge and their true disparities differ greatly, so the belief propagated from p_02 has no guiding significance for pixel p_0. Yet three-dimensional scenes contain many depth-discontinuous regions, in which propagating belief is inappropriate. Regions of disparity jumps are usually accompanied by color changes; based on this fact, the present invention uses the color-segmentation information to constrain the range of belief propagation, avoiding propagating belief across regions where the color jumps. As shown in Fig. 12, s_1 and s_2 represent two different segments; belief is propagated only within the same segment, and the propagation path between two pixels belonging to different segments is cut off. This segmentation-constrained algorithm effectively reduces belief propagation between neighboring pixels whose disparities differ greatly, improving the matching performance of the BP algorithm in disparity-discontinuous regions.
Step 7.1: Pixel smoothness cost calculation
For every pixel in the left image and each of its four neighbors (up, down, left, right), the smoothness cost J(p_l, q_l, d_p, d_q) is computed with respect to all disparity values in the range D:
J(p_l, q_l, d_p, d_q) = min{|d_p − d_q|, |d_max − d_min| / 8},
where p_l is any pixel in the left image, q_l is any of the four neighbors of pixel p_l, d_p and d_q are respectively any disparities of pixels p_l and q_l in the disparity range D, and d_max and d_min are the maximum and minimum disparities;
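The truncated-linear smoothness cost of Step 7.1 is directly expressible in code; the disparity values below are illustrative.

```python
# Step 7.1 as code: truncated linear smoothness cost between neighbouring pixels,
# J = min{|d_p - d_q|, |d_max - d_min| / 8}.

def smooth_cost(d_p, d_q, d_min, d_max):
    return min(abs(d_p - d_q), abs(d_max - d_min) / 8.0)

print(smooth_cost(3, 5, 0, 32))   # -> 2    (|3 - 5| is below the truncation level 4)
print(smooth_cost(1, 20, 0, 32))  # -> 4.0  (truncated at 32 / 8)
```

The truncation keeps the penalty bounded at depth discontinuities, so a large disparity jump at an object edge is not punished without limit.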
Step 7.2: Calculate the belief messages of the pixel nodes
Belief messages are computed iteratively; t is the iteration count, with initial value 0, and iteration stops when t = 50. Each iteration proceeds as follows: during iteration t, each pixel node in the left image computes the belief message m_t(p_l → q_l, d) that it will propagate to its neighbors during the next iteration, with respect to each disparity value in the disparity range D, where p_l is any pixel in the left image, q_l is any of the four neighbors of pixel p_l, D is the disparity range defined in Step 4.1, d is any disparity value in D, C(p_l, d) is the matching cost computed in Step 4.2.2, d_x is any disparity value in the disparity range D, J(p_l, q_l, d, d_x) is the smoothness cost obtained in Step 7.1, and m_{t−1}(q_s → p_l, d_x) is the belief message, obtained in iteration t − 1, propagated from pixel q_s to p_l for disparity d_x; when t = 1 this message is 0. q_s is any pixel, other than pixel q_l, in the selective neighborhood N_1(p_l) of pixel p_l, where the selective neighborhood N_1(p_l) is:
N_1(p_l) = {q_f | q_f ∈ N(p_l), Con(q_f) ≥ Con(p_l) and S(q_f) = S(p_l)},
in which N(p_l) is the four-neighborhood (up, down, left, right) of pixel p_l, Con(q_f) and Con(p_l) are the reliability coefficients marked in Step 5.3, and S(q_f) and S(p_l) are the segment labels of pixels q_f and p_l obtained in Step 6;
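The selective neighborhood N_1(p_l) defined above can be sketched directly: of the four neighbors, keep only those in the same Mean-Shift segment whose reliability coefficient is at least that of p_l. The coefficient and segment values below are illustrative.

```python
# Sketch of N_1(p_l) from Step 7.2: filter the four-neighbourhood by reliability
# coefficient and Mean-Shift segment label.

def selective_neighbourhood(p, con, seg):
    """p: (x, y); con, seg: dicts mapping pixel coordinates to the reliability
    coefficient Con and the segment label S."""
    x, y = p
    four = [(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)]
    return [q for q in four
            if q in con and con[q] >= con[p] and seg[q] == seg[p]]

con = {(1, 1): 2, (0, 1): 4, (2, 1): 1, (1, 0): 3, (1, 2): 2}   # reliability coefficients
seg = {(1, 1): 7, (0, 1): 7, (2, 1): 7, (1, 0): 5, (1, 2): 7}   # segment labels
print(selective_neighbourhood((1, 1), con, seg))  # -> [(0, 1), (1, 2)]
```

Here (2, 1) is dropped because its reliability is lower, and (1, 0) because it lies in a different segment; messages therefore flow only from equally or more reliable pixels inside the same segment, which is exactly the selectivity described above.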
Step 7.3: For each pixel in the left image, calculate the belief b(p_l, d) with respect to every possible disparity, where p_l is any pixel in the left image, d is any disparity value in D, C(p_l, d) is the matching cost obtained in Step 4.2.2, and the belief messages obtained in the 50th iteration, propagated from the pixels p_s to p_l for disparity d, are accumulated; p_s is any pixel in N_1(p_l), where N_1(p_l) is the selective neighborhood of p_l defined in Step 7.2;
Step 7.4: Calculate the disparity image
The optimal disparity value d(p_l) of each pixel is the disparity that minimizes its belief, where p_l is any pixel in the left image, b(p_l, d) is the belief computed in Step 7.3, D is the disparity range, and d is any disparity value in the range D;
according to the optimal disparity of each pixel in the left image, the final disparity map D_out is established as D_out(x, y) = d(p_xy), where x and y are respectively the abscissa and ordinate of a pixel of the disparity image D_out, p_xy is the pixel with coordinates (x, y) in the reference image, and d(p_xy) is the optimal disparity value of p_xy;
Step 8: Reconstruct the three-dimensional information of the object
According to the camera intrinsic and extrinsic parameter matrices A_L, A_R, [R_L t_L], [R_R t_R] obtained in Step 2 and the disparity map D_out obtained in Step 7, the three-dimensional point-cloud model of the whole object is computed by the space intersection method.
Fig. 13 is a schematic diagram of the space intersection method; O_L and O_R are the optical centers of the two cameras, S_L and S_R are their imaging planes, and P_L and P_R are a pair of matched points in the two captured images. The three-dimensional coordinates of an object point in space and the pixel coordinates on the imaging plane are related by a projection equation in which (u, v) denote the pixel coordinates of the imaged point of the spatial object point on the imaging plane and (X_w, Y_w, Z_w) denote the spatial coordinates of the object point; the equation represents a straight line through the camera optical center, the imaging point, and the spatial object point.
For any pixel p_l in the left image with coordinates (x, y), the matched pixel in the right image is p_r, whose coordinates are (x − D_out(x, y), y), where D_out is the optimal disparity image computed in Step 7.4. Therefore, from the coordinates of the matched pair {p_l, p_r}, the equations of the two straight lines projected through the same object point onto the pair of matched pixels on the two imaging planes can be computed, and the intersection of the two lines gives the three-dimensional spatial coordinates of the object point. Because errors exist in each of the calibration, matching, and computation processes, the two back-projected lines will very probably not intersect exactly, in which case the midpoint of their common perpendicular is taken.
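The final fallback, taking the midpoint of the common perpendicular of two skew rays, can be sketched in a few lines of vector algebra. The construction of the rays from the projection matrices is omitted here; the two rays below are illustrative, chosen to pass near the point (0, 0, 5).

```python
# Closest-approach midpoint of two back-projected rays o + t*d (the "midpoint of
# the common perpendicular" used when the rays do not intersect exactly).

def midpoint_of_common_perpendicular(o1, d1, o2, d2):
    """Solves for the closest points on each ray and averages them."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def sub(a, b): return [x - y for x, y in zip(a, b)]
    w = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b                      # zero only for parallel rays
    t1 = (b * e - c * d) / denom               # parameter of closest point on ray 1
    t2 = (a * e - b * d) / denom               # parameter of closest point on ray 2
    p1 = [o + t1 * v for o, v in zip(o1, d1)]
    p2 = [o + t2 * v for o, v in zip(o2, d2)]
    return [(x + y) / 2.0 for x, y in zip(p1, p2)]

# Two illustrative skew rays that pass near (0, 0, 5):
print(midpoint_of_common_perpendicular([0, 0, 0], [0, 0, 1],
                                       [1, 0.1, 0], [-0.2, 0, 1]))  # ~ [0.0, 0.05, 5.0]
```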