CN102184540B - Sub-pixel level stereo matching method based on scale space


Info

Publication number: CN102184540B (application CN 201110112619)
Authority: CN (China)
Prior art keywords: disparity, formula, window, sub-pixel, expression
Legal status: Expired - Fee Related (the legal status is an assumption and is not a legal conclusion)
Other versions: CN102184540A (Chinese, zh)
Inventors: 门朝光, 边继龙, 李香, 余鹏达, 尚方, 田泽宇
Assignee (current and original): Harbin Engineering University
Application filed by Harbin Engineering University; published as application CN102184540A, granted as CN102184540B

Landscapes

  • Image Processing (AREA)
  • Measurement Of Optical Distance (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a sub-pixel stereo matching method based on scale space. The method comprises the following steps: (1) build the scale space for a stereo pair; (2) determine the matching window size for each point of the current-level reference image in the scale space; (3) compute a disparity search range for each point of that level's reference image from the disparity of the previous level; (4) compute the integer-level disparity of the current level in the scale space; and (5) compute the sub-pixel disparity with a sub-pixel matching method based on binary search. The method matches stereo pairs generated under a small base-height-ratio condition; it uses a discrete scale space during matching to improve matching efficiency, reduces the adhesion ("bonding") phenomenon produced during matching with a mixed window selection strategy, and obtains the sub-pixel disparity with the binary-search-based sub-pixel matching method, which guarantees high sub-pixel precision while greatly improving the matching efficiency of the algorithm.

Description

Sub-pixel stereo matching method based on scale space
Technical field
The present invention relates to computer vision, signal processing, and stereo mapping; it is a sub-pixel stereo matching method.
Background art
Depth differences in a three-dimensional scene produce geometric displacements between views obtained from different viewpoints. Stereo matching builds on this principle: it obtains the disparity of corresponding points by matching two or more views from different viewpoints, and then computes the depth of the scene by triangulation. Stereo matching is a hot issue in computer vision and is widely applied in fields such as robot navigation, stereo mapping, photogrammetry, and remote sensing.
Stereo pairs formed under the traditional large base-height-ratio condition suffer from more occlusion, larger radiometric differences and geometric distortion, and relative-position changes produced by moving targets in the scene. These factors increase the difficulty of matching and cause a large number of mismatches.
To weaken the adverse effects of these factors on matching, stereo matching under the small base-height-ratio condition arose. The base-height ratio is the ratio of the photographic baseline to the camera height; a small base-height ratio means that, with the camera height fixed, the interval between the two exposures is short, i.e. the distance between the two imaging centers is small. Stereo pairs formed under this condition weaken the influence of motion-induced relative-position changes between objects, radiometric differences, and geometric distortion, and thus improve the accuracy of stereo matching. Depth, base-height ratio, and disparity satisfy dz = dε/(b/h): for a given disparity precision, the larger the base-height ratio, the smaller the depth error. Stereo vision has therefore mostly chosen larger base-height ratios to limit the depth error caused by limited disparity precision, but at the same time a large base-height ratio increases the difficulty of matching and causes a large number of mismatches. In 2007, J. Delon and B. Rougé published the paper "Small Baseline Stereovision" in the Journal of Mathematical Imaging and Vision, vol. 28, no. 3, which first proposed a stereo matching method based on a small base-height ratio. That method proposed an adaptive-window strategy that determines a window size for every point of the reference image, computed each point's disparity with a winner-takes-all strategy, reduced the "adhesion" phenomenon in stereo matching by centroid correction, and finally obtained the sub-pixel disparity by computing a continuous expression of the correlation coefficient in the frequency domain. The present invention differs in that it proposes a mixed window selection strategy to reduce the "adhesion" phenomenon in stereo matching, and proposes a sub-pixel matching method based on binary search to obtain the sub-pixel disparity.
Summary of the invention
The object of the present invention is to provide a sub-pixel stereo matching method based on scale space that improves matching accuracy, obtains a sub-pixel disparity of high precision, and also has low time and space complexity.
The object of the present invention is achieved as follows:
(1) Build the scale space for the stereo pair.
In this step, a scale space is built for the left and right images of the stereo pair according to the disparity range. The construction formula is:

u_N(x, y) = (1/4) · Σ_{(m,n)∈S(x,y)} u_{N−1}(m, n),
S(x, y) = {(2x, 2y), (2x+1, 2y), (2x, 2y+1), (2x+1, 2y+1)}

where u_N is the N-th-layer scale image of the scale space and u_0 (N = 0), the bottom layer, is the original image.
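The construction formula above is a 2×2 block average: each level halves the resolution of the level below. A minimal sketch in Python with plain nested lists (the function names are illustrative, not from the patent):

```python
def downsample(u):
    # One scale-space step: each pixel of u_N is the mean of the
    # corresponding 2x2 block of u_{N-1}, per the set S(x, y) above.
    h, w = len(u) // 2, len(u[0]) // 2
    return [[(u[2*y][2*x] + u[2*y+1][2*x]
              + u[2*y][2*x+1] + u[2*y+1][2*x+1]) / 4.0
             for x in range(w)]
            for y in range(h)]

def build_scale_space(u0, n_levels):
    # levels[0] is the original image u_0; levels[N] is u_N.
    levels = [u0]
    for _ in range(n_levels):
        levels.append(downsample(levels[-1]))
    return levels
```

For a 4×4 image, two levels yield a 2×2 image of block means and then a 1×1 image holding the mean of the whole picture.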
(2) Determine the matching window size for every point of the current-level reference image in the scale space.
In this step, the window size is determined from the correlation density of every point of the current-level reference image. The correlation-density formula is reproduced as an image in the original and is not recoverable here. In it, u(x) denotes the image function and u′(x) the derivative of u(x); ‖u‖_{x0} denotes the norm of the image function u(x) over the support window centered at x0. The formulas for this norm and for the inner product of u(x) with its derivative u′(x) are likewise given as images in the original, as is the formula for the matching window size. In the window-size formula, ũ denotes the reference image, g is a Gaussian function, σ_b is the noise standard deviation, λ is the matching precision, and the correlation density of ũ is computed by the correlation-density formula above.
(3) Compute the disparity search range for every point of this level's reference image from the disparity of the previous level.
A disparity search range is computed for every point of the current-level reference image. First, the initial disparity of this level is computed from the integer-level disparity of the previous level:

d_{N−1}(i, j) = 2 · d_N(m, n),
(i, j) ∈ {(2m, 2n), (2m+1, 2n), (2m, 2n+1), (2m+1, 2n+1)}

where d_N(i, j) is the integer-level disparity at position (i, j) of level N. Then the maximum disparity value of this level, d_{N,max}, is computed from the maximum disparity value d_max:

d_{N,max} = ⌈ d_max / 2^N ⌉

where ⌈x⌉ denotes the smallest integer greater than or equal to x.
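A sketch of this coarse-to-fine bookkeeping, assuming (consistently with the formulas above) that disparities double from level N to level N−1 and that the per-level search bound is d_max divided by 2^N and rounded up (function names are illustrative, not from the patent):

```python
import math

def max_disparity_at_level(d_max, n):
    # d_{N,max} = ceil(d_max / 2^N): the search range shrinks by half
    # at each coarser level.
    return math.ceil(d_max / 2 ** n)

def propagate_disparity(d_coarse):
    # d_{N-1}(i,j) = 2 * d_N(m,n) for the four children of (m,n):
    # each coarse disparity, doubled, seeds a 2x2 block one level finer.
    h, w = len(d_coarse), len(d_coarse[0])
    fine = [[0] * (2 * w) for _ in range(2 * h)]
    for m in range(h):
        for n in range(w):
            for i, j in ((2 * m, 2 * n), (2 * m + 1, 2 * n),
                         (2 * m, 2 * n + 1), (2 * m + 1, 2 * n + 1)):
                fine[i][j] = 2 * d_coarse[m][n]
    return fine
```

The doubled value only seeds the finer search; the finer level still refines it within its own range.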
(4) Compute the integer-level disparity of the current level in the scale space.
In this step, using the matching window size determined for every point in step (2) and the search range from step (3), the similarity of the matching window is computed by NCC over the corresponding disparity search range for each window position determined by the mixed window selection strategy; the optimal disparity value under each window position is obtained, and among the disparities computed under these window positions, the one with maximum similarity is chosen as the final disparity.
The NCC is computed as:

C_k(x, y, d) = Σ_{(i,j)∈W_k} u(x+i, y+j) · ũ(x+i+d, y+j) / sqrt( Σ_{(i,j)∈W_k} u(x+i, y+j)² · Σ_{(i,j)∈W_k} ũ(x+i+d, y+j)² )

where u(x, y) and ũ(x, y) are the left and right images of the stereo pair and W_k is the corresponding matching window. The disparity under each window position is:

d_k(x, y) = argmax_d C_k(x, y, d)

where d_k(x, y) is the optimal disparity of point (x, y) under the k-th matching window, and C_{k,opt}(x, y) is the corresponding optimal cost. The final disparity is:

d_opt(x, y) = d_{k*}(x, y), with k* = argmax_k C_{k,opt}(x, y)

where d_opt(x, y) is the final disparity of point (x, y).
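A minimal sketch of the NCC winner-takes-all selection over several candidate windows, in Python with nested lists (the window set and the function names are illustrative; in the patent the mixed window selection strategy supplies the actual windows). The square root in the denominator follows the standard NCC definition:

```python
import math

def ncc(u, ut, x, y, d, window):
    # C_k(x, y, d): normalised cross-correlation of the reference patch of u
    # at (x, y) with the patch of the matching image ut shifted by disparity d.
    num = den_u = den_ut = 0.0
    for (i, j) in window:
        a = u[y + j][x + i]
        b = ut[y + j][x + i + d]
        num += a * b
        den_u += a * a
        den_ut += b * b
    return num / math.sqrt(den_u * den_ut)

def best_disparity(u, ut, x, y, d_range, windows):
    # Winner-takes-all over the disparity range for every candidate window,
    # then keep the disparity whose window scored the highest similarity.
    best_c, best_d = -1.0, 0
    for w in windows:
        for d in d_range:
            c = ncc(u, ut, x, y, d, w)
            if c > best_c:
                best_c, best_d = c, d
    return best_d, best_c
```

With a matching row that is the reference row shifted by two pixels, the search returns d = 2 with similarity 1.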
(5) Compute the sub-pixel disparity with the sub-pixel matching method based on binary search.
In this step, starting from the integer-level disparity computed in the multi-scale space, binary search computes a sub-pixel disparity better than the specified precision. The binary-search formula is:

d_{A+1}(x0) = argmax_{m∈S} N_{x0}(m),
S = { d_A(x0), d_A(x0) + 1/2^A, d_A(x0) − 1/2^A }

where A ≥ 1 is the iteration count, d_A(x0) is the sub-pixel disparity of point x0 at the A-th iteration, d_1(x0) (A = 1) is the integer disparity of point x0, and N_{x0}(m) is the standard cross-correlation coefficient between the reference window centered at x0 and the matching window centered at x0 + m; its formula is reproduced as an image in the original and is not recoverable here.
To reduce computation time and improve matching efficiency while guaranteeing matching accuracy, the present invention embeds the stereo matching process in a discrete, coarse-to-fine scale space. Large-scale images in the scale space provide global information about the image, while small-scale images provide detail; the disparity found at a large scale is approximate, and the disparity found at a small scale is accurate. Stereo matching based on scale space uses the approximate disparity of the upper scale layer to guide matching at the current layer and thereby obtain that layer's disparity. The size of the support window affects matching accuracy: to obtain a reliable disparity the support window must contain some gray-level variation, so in non-textured regions the window should be as large as possible to include enough gray-level information, while in textured regions it should be small enough that the disparity is constant inside the window. The present invention therefore introduces an adaptive-window technique into the matching process. Adaptation changes only the window size, not its shape, so the support window may cross object boundaries and produce the "adhesion" phenomenon. To avoid crossing object boundaries during matching, the present invention proposes a mixed window selection strategy: it adaptively selects the window size to satisfy the required gray-level variation, and it also increases the diversity of window shapes to avoid boundary crossing, which improves matching accuracy to a certain extent.
To obtain a sub-pixel disparity, sub-pixel matching must be added after the integer-level disparity is obtained. Traditional sub-pixel matching methods mainly comprise image resampling, fitting, and phase methods. The image resampling method interpolates the matching image to obtain a high-resolution image in which each pixel position represents a sub-pixel position of the original image, and then determines the sub-pixel disparity by stereo matching; it is simple to implement and accurate, but its computational complexity is high. The fitting method uses the optimal matching cost and its left and right neighboring costs to fit a continuous expression of the cost function and takes the extremum of that function as the sub-pixel disparity position; it is simple to implement and of low complexity, but its sub-pixel accuracy is lower. The phase method obtains the displacement of an image point from frequency-domain phase information, but frequency aliasing can seriously degrade its localization accuracy. To obtain a high-precision sub-pixel disparity while reducing computational complexity, the present invention proposes a sub-pixel matching method based on binary search, which obtains a sub-pixel disparity of high precision and also has low time and space complexity.
The sub-pixel stereo matching method based on scale space of the present invention matches stereo pairs generated under the small base-height-ratio condition, adopts a discrete scale space during matching to improve matching efficiency, uses the mixed window selection strategy to reduce the "adhesion" phenomenon produced during matching, and finally uses the binary-search-based sub-pixel matching method to obtain the sub-pixel disparity; it guarantees high sub-pixel precision while greatly improving the matching efficiency of the algorithm. Compared with other matching methods, the advantages of the present invention are:
1. It uses stereo pairs produced under the small base-height-ratio condition to reduce the mismatches caused by relative-position changes between objects, radiometric differences between the images, and geometric distortion.
2. It embeds the matching process in a discrete scale space and uses the upper level's disparity to guide matching at the lower level, reducing computation time and improving matching efficiency.
3. It proposes a mixed window selection strategy combining adaptive windows with multiple windows, which greatly reduces the "adhesion" phenomenon produced during matching and improves the matching accuracy of the integer-level disparity.
4. It proposes a binary-search-based sub-pixel matching method to compensate for the error that a small base-height ratio brings to three-dimensional reconstruction; the method obtains high sub-pixel precision and also has low complexity: its time complexity is O(A) and its space complexity is O(K²).
Description of drawings
Fig. 1 is a flowchart of the sub-pixel stereo matching method based on scale space;
Fig. 2 is a schematic diagram of the multi-scale-space pyramid;
Fig. 3 is a schematic diagram of the mixed window selection strategy;
Fig. 4 is a schematic diagram of sub-pixel matching with a 3 × 3 window;
Fig. 5(a)–Fig. 5(d) show experimental results of the present invention: the first column, Fig. 5(a), shows the reference images of the stereo pairs; the second column, Fig. 5(b), the ground-truth disparity maps; the third column, Fig. 5(c), the integer-level disparity maps; and the fourth column, Fig. 5(d), the sub-pixel disparity maps.
Embodiment
The technical scheme of the present invention is further described below by example with reference to the accompanying drawings.
Fig. 1 shows the flowchart of the sub-pixel stereo matching method based on scale space; the implementation of each step is described in detail below with reference to this flowchart.
(1) Build the scale space for the stereo pair.
1. First compute the maximum level N_max of the scale space from the maximum disparity value d_max of the original stereo pair:

N_max = min{ n | d_max / 2^n < 1 }
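N_max is simply the smallest n at which the disparity carried to that level falls below one pixel; a direct transcription of the formula above (the function name is illustrative, not from the patent):

```python
def max_level(d_max):
    # N_max = min{ n : d_max / 2^n < 1 }: climb levels until the remaining
    # disparity at that level would be below one pixel.
    n = 0
    while d_max / 2 ** n >= 1:
        n += 1
    return n
```

For example, a maximum disparity of 64 pixels gives N_max = 7, since 64/2^6 = 1 does not yet satisfy the strict inequality.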
2. According to the maximum level N_max of the scale space, build scale spaces ũ_N for the reference image ũ and u_N for the matching image u. The construction formulas are:

ũ_N(x, y) = (1/4) · Σ_{(m,n)∈S(x,y)} ũ_{N−1}(m, n),
u_N(x, y) = (1/4) · Σ_{(m,n)∈S(x,y)} u_{N−1}(m, n),
S(x, y) = {(2x, 2y), (2x+1, 2y), (2x, 2y+1), (2x+1, 2y+1)}

where ũ_N is the N-th-layer reference image of the scale space, ũ_0 is the original reference image, u_N is the N-th-layer matching image, and u_0 is the original matching image. The height and width of the level-N reference and matching images are 1/2^N of those of the originals; the construction process of the scale space is shown in Fig. 2.
(2) Determine the matching window size for every point of the current-level reference image in the scale space.
1. At the current level of the scale space, first compute the Gaussian norm from the standard deviation:

‖g‖_{L²} = 1 / (2σ√π)

where σ takes the value 0.56.
2. For every point of the current-level reference image, compute the optimal window size. Computing the optimal window in fact means finding the minimum window size, within the allowed range of sizes, that satisfies an inequality; the optimal-window formula is reproduced as an image in the original and is not recoverable here. In it, ũ denotes the reference image, g the Gaussian function, σ_b the standard deviation of the Gaussian noise, and λ the matching precision; the correlation density of the reference image is likewise given by a formula reproduced as an image in the original.
(3) Compute the disparity search range for every point of this level's reference image from the disparity of the previous level.
1. Compute the initial disparity value of every point of this level from the integer-level disparity of the previous level:

d_{N−1}(i, j) = 2 · d_N(m, n),
(i, j) ∈ {(2m, 2n), (2m+1, 2n), (2m, 2n+1), (2m+1, 2n+1)}

where d_N(i, j) is the integer-level disparity at position (i, j) of layer N.
2. Compute the maximum disparity value of this level, d_{N,max}, from the maximum disparity value d_max:

d_{N,max} = ⌈ d_max / 2^N ⌉

where ⌈x⌉ denotes the smallest integer greater than or equal to x.
(4) Compute the integer-level disparity of the current level in the scale space.
Using the matching window size computed in step (2) and the disparity search range computed in step (3), for each window position determined by the mixed window selection strategy (as shown in Fig. 3), compute the similarity of the matching window by NCC over the corresponding disparity search range, obtain the optimal disparity for each window position with the winner-takes-all strategy, and choose the disparity with maximum similarity among these disparities as the final disparity.
1. For each window position, compute the matching cost from the window size and the disparity range:

C_k(x, y, d) = Σ_{(i,j)∈W_k} u(x+i, y+j) · ũ(x+i+d, y+j) / sqrt( Σ_{(i,j)∈W_k} u(x+i, y+j)² · Σ_{(i,j)∈W_k} ũ(x+i+d, y+j)² )

where u and ũ are the left and right images of the stereo pair, W_k is the corresponding matching window, and C_k(x, y, d) is the matching cost of pixel (x, y) under the k-th window when the disparity is d.
2. Compute the disparity under each window position:

d_k(x, y) = argmax_d C_k(x, y, d)

where d_k(x, y) is the optimal disparity of point (x, y) under the k-th matching window, and C_{k,opt}(x, y) is the corresponding optimal cost.
3. Compute the final disparity:

d_opt(x, y) = d_{k*}(x, y), with k* = argmax_k C_{k,opt}(x, y)

where d_opt(x, y) is the final disparity of point (x, y).
(5) Compute the sub-pixel disparity with the sub-pixel matching method based on binary search.
In this step, starting from the integer-level disparity obtained in the multi-scale space, binary search computes a sub-pixel disparity better than the specified precision.
1. First use linear interpolation to compute the gray values at the floating-point positions of the matching window and of its left and right neighboring windows. The left and right neighboring windows are obtained by translating all pixels of the matching window by 1/2^A (A is the iteration count) to the left and to the right respectively; these three windows are hereafter called the middle, left-shifted, and right-shifted matching windows. The gray value at a floating-point position of the matching window is:

ũ(x) = (x − [x]) · ( ũ([x] + 1) − ũ([x]) ) + ũ([x])

where [·] denotes rounding down to an integer.
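The interpolation formula above is ordinary linear interpolation between the two integer samples bracketing x; a direct transcription (the function name and the border guard are mine):

```python
import math

def interp(row, x):
    # u~(x) = (x - [x]) * (u~([x]+1) - u~([x])) + u~([x]), with [x] = floor(x).
    k = math.floor(x)
    if k + 1 >= len(row):          # guard at the right image border
        return row[k]
    return (x - k) * (row[k + 1] - row[k]) + row[k]
```

At integer positions the formula returns the sample itself, so the three candidate windows agree with direct lookup whenever their shift happens to be whole.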
2. Compute the similarity of the reference window with the three matching windows using the NCC similarity measure; its formula is reproduced as an image in the original and is not recoverable here. In it, m ∈ {d_A(x0), d_A(x0) + 1/2^A, d_A(x0) − 1/2^A}, where A ≥ 1 is the iteration count, d_A(x0) is the sub-pixel disparity of point x0 at the A-th iteration, d_1(x0) (A = 1) is the integer-level disparity of point x0, the reference window is centered at point x0, and the matching window is centered at x0 + m.
3. Determine the optimal sub-pixel disparity from the three similarity values so obtained:

d_{A+1}(x0) = argmax_{m∈S} N_{x0}(m),
S = { d_A(x0), d_A(x0) + 1/2^A, d_A(x0) − 1/2^A }

The optimal sub-pixel disparity so obtained is taken as the new center point and the iterative search continues until the given precision is reached. Fig. 4 shows two iterations of sub-pixel matching with a 3 × 3 window.
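The whole refinement loop can be sketched in one dimension, combining the interpolation of step 1 with an NCC similarity (the test signal, the window half-width, and all function names are illustrative assumptions, not from the patent):

```python
import math

def interp(row, x):
    # Linear interpolation at a floating-point position, as in step 1.
    k = math.floor(x)
    if k + 1 >= len(row):
        return row[k]
    return (x - k) * (row[k + 1] - row[k]) + row[k]

def ncc_1d(u, ut, x0, m, half):
    # Similarity N_{x0}(m) of the reference window around x0 with the
    # interpolated matching window shifted by the candidate disparity m.
    num = du = dt = 0.0
    for i in range(-half, half + 1):
        a = u[x0 + i]
        b = interp(ut, x0 + i + m)
        num += a * b
        du += a * a
        dt += b * b
    return num / math.sqrt(du * dt)

def refine(u, ut, x0, d_int, half=2, iters=5):
    # Binary search: at iteration A test {d, d + 1/2^A, d - 1/2^A},
    # keep the best candidate, and halve the step; precision 1/2^iters.
    d = float(d_int)
    for a in range(1, iters + 1):
        step = 1.0 / 2 ** a
        d = max((d, d + step, d - step),
                key=lambda m: ncc_1d(u, ut, x0, m, half))
    return d
```

On a smooth test signal shifted by 1.25 pixels, starting from the integer disparity 1, the loop converges to approximately 1.25.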
When the image sampling frequency satisfies the sampling theorem, binary search can reach very high sub-pixel accuracy; in practical stereo matching the sampling frequency often does not satisfy the sampling theorem, and the attainable sub-pixel accuracy is limited. Experiments verify that the method reaches 1/20-pixel sub-pixel accuracy, i.e., A = 5.
Compared with the resampling sub-pixel matching method, the proposed sub-pixel method is clearly better in both time and space complexity. When the sub-pixel precision is 1/2^A, the space complexity of the resampling method is S1 = K² × 2^A, while that of binary search is S2 = 3 × K², where K is the support window size. In time, binary search reduces exponential complexity to linear: the time complexity of the resampling method is T = O(2^A), while that of binary search is T = O(A).
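A quick check of the space figures above for an illustrative window size K = 3 and precision A = 5 (the example numbers are mine, not from the patent):

```python
def resample_space(K, A):
    # Resampling stores the matching window at 2^A interpolated positions:
    # S1 = K^2 * 2^A cells.
    return K * K * 2 ** A

def binary_search_space(K):
    # Binary search only ever holds the middle, left-shifted, and
    # right-shifted windows: S2 = 3 * K^2 cells.
    return 3 * K * K
```

For K = 3 and A = 5 this gives 288 versus 27 cells, roughly a tenfold saving that grows exponentially with A while the binary-search footprint stays constant.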

Claims (2)

1. A sub-pixel stereo matching method based on scale space, characterized by:
(1) building a scale space for a stereo pair;
(2) determining the matching window size for every point of the current-level reference image in the scale space;
1. at the current level of the scale space, first computing the Gaussian norm from the standard deviation:

‖g‖_{L²} = 1 / (2σ√π)

where σ takes the value 0.56;
2. computing the optimal window size for every point of the current-level reference image; the optimal-window formula is reproduced as an image in the original, in which ũ denotes the reference image, g the Gaussian function, σ_b the standard deviation of the Gaussian noise, and λ the matching precision; the correlation density of the reference image is given by a formula also reproduced as an image, in which u(x) denotes the image function, u′(x) the derivative of u(x), and ‖u‖_{x0} the norm of the image function u(x) over the support window centered at x0; the formulas for this norm, for the inner product of u(x) with its derivative u′(x), and for the matching window size are likewise reproduced as images in the original;
(3) computing the disparity search range for every point of this level's reference image from the disparity of the previous level;
1. computing the initial disparity value of every point of this level from the integer-level disparity of the previous level:

d_{N−1}(i, j) = 2 · d_N(m, n),
(i, j) ∈ {(2m, 2n), (2m+1, 2n), (2m, 2n+1), (2m+1, 2n+1)}

where d_N(i, j) is the integer-level disparity at position (i, j) of layer N;
2. computing the maximum disparity value of this level, d_{N,max}, from the maximum disparity value d_max:

d_{N,max} = ⌈ d_max / 2^N ⌉

where ⌈x⌉ denotes the smallest integer greater than or equal to x;
(4) computing the integer-level disparity of the current level in the scale space;
using the matching window size determined for every point in step (2) and the search range obtained in step (3), computing by NCC the similarity of the matching window over the corresponding disparity search range for each window position determined by the mixed window selection strategy, obtaining the optimal disparity value under each window position, and choosing the disparity with maximum similarity among the disparities computed under these window positions as the final disparity, which specifically comprises:
1. for each window position, computing the matching cost from the window size and the disparity range:

C_k(x, y, d) = Σ_{(i,j)∈W_k} u(x+i, y+j) · ũ(x+i+d, y+j) / sqrt( Σ_{(i,j)∈W_k} u(x+i, y+j)² · Σ_{(i,j)∈W_k} ũ(x+i+d, y+j)² )

where u and ũ are the left and right images of the stereo pair, W_k is the corresponding matching window, and C_k(x, y, d) is the matching cost of pixel (x, y) under the k-th window when the disparity is d;
2. computing the disparity under each window position:

d_k(x, y) = argmax_d C_k(x, y, d)

where d_k(x, y) is the optimal disparity of point (x, y) under the k-th matching window and C_{k,opt}(x, y) is the corresponding optimal cost;
3. computing the final disparity:

d_opt(x, y) = d_{k*}(x, y), with k* = argmax_k C_{k,opt}(x, y)

where d_opt(x, y) is the final disparity of point (x, y);
(5) computing the sub-pixel disparity with the sub-pixel matching method based on binary search;
starting from the integer-level disparity computed in the multi-scale space, computing by binary search a sub-pixel disparity better than the specified precision;
1. first using linear interpolation to compute the gray values at the floating-point positions of the matching window and of its left and right neighboring windows, the left and right neighboring windows being obtained by translating all pixels of the matching window by 1/2^A to the left and to the right respectively, where A is the iteration count; these three windows are called the middle, left-shifted, and right-shifted matching windows, and the gray value at a floating-point position of the matching window is:

ũ(x) = (x − [x]) · ( ũ([x] + 1) − ũ([x]) ) + ũ([x])

where [·] denotes rounding down to an integer;
2. computing the similarity of the reference window with the three matching windows; its formula is reproduced as an image in the original, in which m ∈ {d_A(x0), d_A(x0) + 1/2^A, d_A(x0) − 1/2^A}, A ≥ 1 is the iteration count, d_A(x0) is the sub-pixel disparity of point x0 at the A-th iteration, d_1(x0) (A = 1) is the integer-level disparity of point x0, the reference window is centered at point x0, and the matching window is centered at x0 + m;
3. determining the optimal sub-pixel disparity from the three similarity values so obtained:

d_{A+1}(x0) = argmax_{m∈S} N_{x0}(m),
S = { d_A(x0), d_A(x0) + 1/2^A, d_A(x0) − 1/2^A }.
2. The scale-space-based sub-pixel stereo matching method according to claim 1, characterized in that in the step of constructing the scale space for the stereo pair, the scale space is constructed by the following formula:
u_N(x, y) = (1/4) Σ_{(m,n) ∈ S(x,y)} u_{N-1}(m, n),
S(x,y)={(2x,2y),(2x+1,2y),(2x,2y+1),(2x+1,2y+1)}
In the formula, u_N denotes the N-th layer scale image in the scale space; u_0 (the N = 0, bottom layer) is the original image.
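The construction in claim 2 is a pyramid of 2x2 block averages. A minimal sketch, assuming a NumPy array image (the name `build_scale_space` is ours):

```python
import numpy as np

def build_scale_space(image, levels):
    # Each level N is the 2x2 block average of level N-1, per the claim:
    # u_N(x, y) = (1/4) * sum of u_{N-1} over
    #             {(2x,2y), (2x+1,2y), (2x,2y+1), (2x+1,2y+1)},
    # with u_0 the original (bottom-layer) image.
    pyramid = [np.asarray(image, dtype=float)]
    for _ in range(levels):
        u = pyramid[-1]
        h, w = (u.shape[0] // 2) * 2, (u.shape[1] // 2) * 2  # trim odd edges
        u = u[:h, :w]
        coarser = (u[0::2, 0::2] + u[1::2, 0::2]
                   + u[0::2, 1::2] + u[1::2, 1::2]) / 4.0
        pyramid.append(coarser)
    return pyramid

# A 4x4 image collapses to a 2x2 level of block means, then to one mean.
levels = build_scale_space(np.arange(16).reshape(4, 4), 2)
print(levels[1])  # 2x2 block means: 2.5, 4.5 / 10.5, 12.5
print(levels[2])  # single value 7.5
```

Each level halves both dimensions, so matching at level N searches a parallax range 1/2^N the size of the original, which is what makes the coarse-to-fine search in step (3) cheap.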
CN 201110112619 2011-05-03 2011-05-03 Sub-pixel level stereo matching method based on scale space Expired - Fee Related CN102184540B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201110112619 CN102184540B (en) 2011-05-03 2011-05-03 Sub-pixel level stereo matching method based on scale space


Publications (2)

Publication Number Publication Date
CN102184540A CN102184540A (en) 2011-09-14
CN102184540B true CN102184540B (en) 2013-03-20

Family

ID=44570710

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201110112619 Expired - Fee Related CN102184540B (en) 2011-05-03 2011-05-03 Sub-pixel level stereo matching method based on scale space

Country Status (1)

Country Link
CN (1) CN102184540B (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103679680B * 2012-08-31 2016-12-21 富士通株式会社 Stereo matching method and system
CN102903111B * 2012-09-27 2015-09-30 哈尔滨工程大学 Stereo matching algorithm for large low-texture areas based on image segmentation
CN103310446B * 2013-06-08 2016-05-04 哈尔滨工程大学 Stereo matching method based on iterative guided filtering
CN105740867B * 2016-01-26 2018-11-09 哈尔滨理工大学 Selection method for the shape and scale of image texture windows
CN107481271B * 2017-07-25 2020-08-11 成都通甲优博科技有限责任公司 Stereo matching method, system and mobile terminal
CN108387184A * 2018-03-20 2018-08-10 山西和信基业科技股份有限公司 Method and system for measuring bridge deflection
CN109029379B * 2018-06-08 2020-10-20 北京空间机电研究所 High-precision three-dimensional mapping method with a small base-height ratio
CN110111339B * 2019-04-28 2023-08-15 航天智造(上海)科技有限责任公司 Method for extracting target regions from fringe images
CN110211053B * 2019-04-28 2023-09-15 航天智造(上海)科技有限责任公司 Fast and accurate phase matching method for three-dimensional measurement
CN110223326B * 2019-06-18 2021-04-30 无锡英斯特微电子有限公司 Method and system for determining sub-pixel regions in a photoelectric navigation system
CN110942102B * 2019-12-03 2022-04-01 武汉大学 Probabilistic relaxation epipolar matching method and system
CN111414931B * 2019-12-31 2023-04-25 杭州电子科技大学 Multi-branch multi-scale small-target detection method based on image depth
CN112700504B * 2020-12-30 2024-02-20 南京理工大学智能计算成像研究院有限公司 Parallax measurement method for multi-view telecentric cameras
CN115797439A * 2022-11-11 2023-03-14 中国消防救援学院 Flame spatial localization system and method based on binocular vision

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0686942A2 (en) * 1994-06-07 1995-12-13 Matsushita Electric Industrial Co., Ltd. Stereo matching method and disparity measuring method
WO2008054109A1 (en) * 2006-11-03 2008-05-08 Postech Academy-Industry Foundation Message propagation-based stereo image matching system
CN101720047A (en) * 2009-11-03 2010-06-02 上海大学 Method for acquiring range image by stereo matching of multi-aperture photographing based on color segmentation
CN101908230A (en) * 2010-07-23 2010-12-08 东南大学 Regional depth edge detection and binocular stereo matching-based three-dimensional reconstruction method
CN101924953A (en) * 2010-09-03 2010-12-22 南京农业大学 Simple matching method based on datum point
CN102026013A (en) * 2010-12-18 2011-04-20 浙江大学 Stereo video matching method based on affine transformation


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Julie Delon et al., "Small Baseline Stereovision", Journal of Mathematical Imaging and Vision, Jul. 2007, Vol. 28, No. 3, pp. 209-223. *

Also Published As

Publication number Publication date
CN102184540A (en) 2011-09-14

Similar Documents

Publication Publication Date Title
CN102184540B (en) Sub-pixel level stereo matching method based on scale space
Wang et al. Mvdepthnet: Real-time multiview depth estimation neural network
Alismail et al. Photometric bundle adjustment for vision-based slam
CN110009674B (en) Monocular image depth of field real-time calculation method based on unsupervised depth learning
Liu et al. Continuous depth estimation for multi-view stereo
CN104867135B High-precision stereo matching method guided by a guide image
CN108010081B (en) RGB-D visual odometer method based on Census transformation and local graph optimization
CN106780631B (en) Robot closed-loop detection method based on deep learning
CN102665086B (en) Method for obtaining parallax by using region-based local stereo matching
CN109579825B (en) Robot positioning system and method based on binocular vision and convolutional neural network
CN104156957B (en) Stable and high-efficiency high-resolution stereo matching method
Chen et al. Transforming a 3-d lidar point cloud into a 2-d dense depth map through a parameter self-adaptive framework
CN103123727A (en) Method and device for simultaneous positioning and map building
CN104869387A (en) Method for acquiring binocular image maximum parallax based on optical flow method
CN108305277B (en) Heterogeneous image matching method based on straight line segments
CN102036094B (en) Stereo matching method based on digital fractional delay technology
CN103093479A (en) Target positioning method based on binocular vision
CN102073874A Multi-image stereo matching method with geometric constraints for spaceflight three-line-array CCD cameras
Tao et al. Stereo priori RCNN based car detection on point level for autonomous driving
CN102903111B Stereo matching algorithm for large low-texture areas based on image segmentation
CN102740096A (en) Space-time combination based dynamic scene stereo video matching method
CN104200453A (en) Parallax image correcting method based on image segmentation and credibility
Shivakumar et al. Real time dense depth estimation by fusing stereo with sparse depth measurements
CN108876861B (en) Stereo matching method for extraterrestrial celestial body patrolling device
Zhao et al. Fpp-slam: indoor simultaneous localization and mapping based on fringe projection profilometry

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130320

Termination date: 20190503

CF01 Termination of patent right due to non-payment of annual fee