CN106530337A - Non local stereopair dense matching method based on image gray scale guiding - Google Patents


Info

Publication number
CN106530337A
Authority
CN
China
Prior art keywords
pixel
cost
image
parallax
formula
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610933540.4A
Other languages
Chinese (zh)
Inventor
黄旭
周刚
陆正武
樊海波
蔡刚山
范超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan engineering science and technology research institute
Original Assignee
Wuhan engineering science and technology research institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan engineering science and technology research institute
Priority to CN201610933540.4A
Publication of CN106530337A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination

Abstract

The invention relates to a non-local dense matching method for stereo pairs guided by image grayscale. The method comprises the following steps: cost computation, in which an improved HOG operator is used as the cost measure, the cost between corresponding pixels is computed as a description of their similarity, and a cost matrix is built; image-grayscale-guided cost aggregation, which yields a stable aggregation result; disparity computation, in which an initial disparity map is obtained with the WTA strategy, mismatched and occluded points are rejected, and a refined disparity map is obtained; and finally generation of a dense, high-precision three-dimensional point cloud from the disparity map. The method fully considers the grayscale characteristics of edges and therefore achieves relatively high matching accuracy at disparity edges; it uses an eight-direction iterative cost aggregation scheme to increase matching robustness in poorly textured regions; and it can quickly produce a dense, high-precision three-dimensional point cloud. It therefore has good application prospects in aerospace photogrammetry, low-altitude photogrammetry, close-range photogrammetry and autonomous driving of unmanned vehicles.

Description

Non-local dense stereo-pair matching method guided by image grayscale
Technical field
The present invention relates to an image matching method, and more particularly to a non-local dense stereo-pair matching method guided by image grayscale.
Background technology
Dense matching of stereo pairs is a long-standing research hotspot in photogrammetry and computer vision. Its main task is to find, pixel by pixel, the corresponding points between two images. Dense matching has wide application prospects in DEM/DSM production, three-dimensional reconstruction of digital cities, image rendering, autonomous driving, robot navigation, augmented reality, virtual reality and other fields. To date, most dense stereo matching algorithms can be summarized in four steps: 1. cost computation; 2. cost aggregation; 3. disparity computation; 4. disparity refinement.
Although dense matching technology has matured over several decades of development, there is still considerable room for research in cost computation and in cost aggregation over texture-less regions. Existing methods include local matching with bilateral filtering, semi-global dense matching (SGM), and global dense matching based on graph cuts or belief propagation, but none of these algorithms balances matching accuracy and matching robustness well. Local matching algorithms are simple and fast and achieve good accuracy at disparity edges, but their overall robustness is poor. The semi-global algorithm SGM is likewise fast and performs well in richly textured regions, but it is not robust in poorly textured regions. Global matching algorithms are robust, but tend to over-smooth disparity edges. A new dense matching algorithm is therefore needed that combines matching accuracy with matching robustness and generates dense three-dimensional point clouds.
Summary of the invention
The present invention mainly addresses the problems that traditional dense matching methods have low matching accuracy at disparity edges and that the overall matching result is not robust. A dense stereo-pair matching method guided by image grayscale is proposed: an improved HOG operator is used to describe the similarity between corresponding pixels and to build a cost matrix; cost propagation is constrained by a designed quadratic function, and non-local cost aggregation is performed iteratively in eight directions to obtain a stable aggregation result; finally, the initial disparity map is refined and a dense, high-precision three-dimensional point cloud is generated.
The above technical problem is mainly solved by the following technical solution:
A non-local dense stereo-pair matching method guided by image grayscale, characterized by comprising the following steps:
Step 1, building the cost matrix: an improved HOG operator is used as the similarity measure between corresponding points, and a cost matrix is built, wherein the improved HOG operator is based on the following definitions:
Within a window range (typically a window of 5x5 or 7x7 pixels), the radiometric distortion between the two images of the stereo pair is assumed to vary linearly, i.e. the following equation holds:
g_r(p_r) = c · g_l(p_l) + t   (1)
where p_l and p_r denote corresponding points in the left and right images; when the stereo pair is an epipolar image pair, the relation p_rx = p_lx - d holds, where d denotes the disparity; g_l and g_r denote the gray values of the corresponding points in the left and right images; c and t are the coefficients of the linear equation, c being the scale factor and t the offset factor;
Computing gradients on both images eliminates the offset factor t within the local small window; the Sobel operator can be used to compute the image gradients:
Gx_r(p_r) = c · Gx_l(p_l),  Gy_r(p_r) = c · Gy_l(p_l)   (2)
where Gx_l(p_l) and Gx_r(p_r) denote the horizontal gray gradients at points p_l and p_r respectively, and Gy_l(p_l) and Gy_r(p_r) denote the vertical gray gradients at points p_l and p_r respectively;
To further eliminate the scale factor c, the direction angle of the gradient is computed from formula (2):
θ_r(p_r) = θ_l(p_l),  θ_r(p_r) = arctan( Gy_r(p_r) / Gx_r(p_r) ),  θ_l(p_l) = arctan( Gy_l(p_l) / Gx_l(p_l) )   (3)
where θ_l(p_l) and θ_r(p_r) denote the gradient direction angles at points p_l and p_r respectively; taking the quadrant into account, the value range is [0°, 360°); formula (3) shows that the gradient direction is well invariant to linear radiometric distortion;
A window of size WxW is opened, centered at any pixel p in the image, and serves as a basic HOG description unit (cell); the value range of the gradient direction is divided into 12 intervals, each covering 30°; the initial count of each interval is set to 0; the gradient direction angles of all pixels in the unit are counted: the interval into which the gradient direction of each pixel falls is determined and the count of that interval is incremented; after traversing all pixels in the unit, the count of each interval is divided by the total number of pixels in the window; from the statistics of the intervals, a 12-dimensional vector is constructed as the feature descriptor of pixel p, as shown in formula (4);
V_HOG(p) = (b_0, b_1, b_2, b_3, b_4, b_5, b_6, b_7, b_8, b_9, b_10, b_11)^T   (4)
where b_i (i = 0-11) denotes the value of each interval and V_HOG(p) denotes the feature descriptor of pixel p;
Step 2, image-grayscale-guided non-local cost aggregation: according to the grayscale information of the image, a quadratic function is designed to constrain cost propagation, and non-local cost aggregation is performed iteratively in eight directions to build a reliable cost aggregation matrix;
Step 3, three-dimensional point cloud generation: an initial disparity map is generated from the cost aggregation matrix, gross errors are eliminated by left-right consistency checking, the disparity map is refined, and a dense, high-precision three-dimensional point cloud is generated.
In the above non-local dense stereo-pair matching method guided by image grayscale, the concrete method of building the cost matrix in step 1 is as follows: the distance between the HOG feature vectors of corresponding points is taken as the cost, based on the following formula:
C_HOG(p_l, d) = || V_HOG^l(p_l) - V_HOG^r(epl(p_l, d)) ||   (5)
where epl(p_l, d) denotes the corresponding point of p_l under disparity d, i.e. p_r = epl(p_l, d); C_HOG(p_l, d) denotes the cost between points p_l and p_r computed with HOG; V_HOG^l(p_l) denotes the HOG feature descriptor of the left image point p_l; and V_HOG^r(p_r) denotes the feature descriptor of the right image point p_r.
In the above non-local dense stereo-pair matching method guided by image grayscale, the image-grayscale-guided non-local cost aggregation in step 2 is as follows:
It is assumed that pixels in the eight-neighborhood of a center pixel whose absolute gray difference from the center is within 15 gray levels should satisfy a disparity consistency condition; under this assumption, the cost transfer equation shown in formula (6) is obtained:
L_r(p, d) = C_HOG(p, d) + T · min{ L_r(p-1, d), L_r(p-1, d-1) + P_1, L_r(p-1, d+1) + P_1, min_k L_r(p-1, k) + P_2 }   (6)
where L_r(p, d) denotes the aggregated cost of pixel p for disparity d along the current path; r denotes the direction of the path; C_HOG(p, d) denotes the HOG cost of the current pixel p for disparity d; p-1 denotes the previous pixel of p on the current path; P_1 and P_2 are penalty terms: in a gray-homogeneous region, a small penalty P_1 is applied when the disparity of adjacent pixels changes smoothly, and a larger penalty P_2 is applied when the disparity of adjacent pixels jumps. The advantage of this is that the disparity of a pixel is allowed to change flexibly while, as far as possible, keeping the disparities of adjacent pixels in a gray-homogeneous region consistent. Because smoothly varying disparity is the more common case in gray-homogeneous regions, P_2 is set much larger than P_1. The parameter T expresses how close pixel p and pixel p-1 are in gray level/color and is used to constrain cost propagation between neighboring pixels.
In the above non-local dense stereo-pair matching method guided by image grayscale, in the step of building the cost aggregation matrix, the constraint on cost propagation between neighboring pixels is expressed by a quadratic function, specifically a quadratic-function-based kernel whose slope is increasing over the range [0, 2σ], as shown in formula (7):
T(p+1, p) = a · Δg^2 + 1  if Δg ≤ 2σ;  T(p+1, p) = T_G(p, p+1)  if Δg > 2σ;  Δg = | g(p+1) - g(p) |;  a = (e^(-2) - 1) / (4σ^2)   (7)
where T_G denotes the T value computed with the Gaussian function, a denotes the quadratic coefficient, and σ is the smoothing factor.
In the above non-local dense stereo-pair matching method guided by image grayscale, in the step of building the cost aggregation matrix, the cost aggregation result is closely related to the aggregation paths; specifically, a cost aggregation strategy over 8 directions is used, which includes not only the horizontal and vertical directions but also the 4 diagonal directions. The algorithm requires two iterations; in the first iteration, the aggregation results of the 8 directions are summed, as shown in formula (8):
S(p, d) = C_HOG(p, d) + Σ_{r=1}^{8} ( L_r(p, d) - C_HOG(p, d) )   (8)
where L denotes the aggregation result computed according to formula (6); r denotes the scanline direction, i.e. the 0°, 45°, 90°, 135°, 180°, 225°, 270° and 315° directions; and S denotes the total cost aggregation result of the 8 scanline directions;
After the first iteration, each pixel has paths only along its scanline directions and has no path connection to pixels off those scanlines; therefore the aggregation result S of the first iteration is taken as the new cost and used in the second iteration, giving:
L_r^2(p, d) = S(p, d) + T · min{ L_r^2(p-1, d), L_r^2(p-1, d-1) + P_1, L_r^2(p-1, d+1) + P_1, min_k L_r^2(p-1, k) + P_2 }   (9)
where L_r^2(p, d) denotes the aggregation result along scanline direction r in the second iteration;
Finally, the aggregation results of the 8 scanline directions are summed, as shown in formula (10); after the two iterations, every pixel of the image has a path, connected or non-connected, to every other pixel of the whole image, so that non-local cost aggregation is realized and the reliability of the aggregation is strengthened:
S^2(p, d) = S(p, d) + Σ_{r=1}^{8} ( L_r^2(p, d) - S(p, d) )   (10)
where S^2 denotes the total cost aggregation result of the 8 directions in the second iteration.
In the above non-local dense stereo-pair matching method guided by image grayscale, in step 3, the three-dimensional point cloud is generated as follows:
According to the cost aggregation matrix obtained in step 2, the disparity of each pixel is computed with the WTA strategy, as shown in formula (11):
d(p) = argmin_d S^2(p, d)   (11)
where d(p) denotes the disparity of pixel p on the disparity map, i.e. the disparity d with the minimum S^2 within the disparity range; after the disparities of all pixels of the image are computed with the WTA strategy, the initial disparity map is obtained;
With the left image as the reference image, the initial disparity map of the left image is obtained; with the right image as the reference image, the initial disparity map of the right image is obtained; mismatched points are then rejected by left-right consistency checking, and the refined disparity map is obtained, as shown in formula (12):
d(p) = d(p_l)  if | d(p_l) - d(p_r) | ≤ 1;  d(p) = Invalid  if | d(p_l) - d(p_r) | > 1   (12)
where d(p_l) denotes the disparity of pixel p in the left image, d(p_r) denotes the disparity of pixel p in the right image, and Invalid denotes an invalid disparity; if the disparities of the same pixel in the left and right images differ by more than 1 pixel, the pixel is regarded as a mismatched point and rejected;
According to the refined disparity map, the corresponding point of each valid pixel is obtained; from the coordinates of the corresponding points and the interior and exterior orientation elements of the images, the corresponding three-dimensional point cloud is generated, as shown in formula (13):
[u_i, v_i, w_i]^T = R_i · [x_i, y_i, -f]^T  (i = 1, 2);
B_u = X_S2 - X_S1,  B_v = Y_S2 - Y_S1,  B_w = Z_S2 - Z_S1;
N_1 = (B_u·w_2 - B_w·u_2) / (u_1·w_2 - u_2·w_1),  N_2 = (B_u·w_1 - B_w·u_1) / (u_1·w_2 - u_2·w_1);
[U_i, V_i, W_i]^T = N_i · [u_i, v_i, w_i]^T  (i = 1, 2);
X = X_S1 + U_1 = X_S2 + U_2,  Y = Y_S1 + V_1 = Y_S2 + V_2,  Z = Z_S1 + W_1 = Z_S2 + W_2   (13)
where X, Y, Z denote the three-dimensional coordinates of the object point; x_i, y_i (i = 1, 2) denote the coordinates of the corresponding image points; f denotes the focal length; R_i (i = 1, 2) denote the rotation matrices; and X_Si, Y_Si, Z_Si (i = 1, 2) denote the exterior orientation line elements.
Therefore, the invention has the following advantages: accurate and reliable matching results can still be obtained when radiometric distortion exists between the images; the grayscale-guided cost propagation ensures matching accuracy at disparity edges; the eight-direction iterative non-local cost aggregation ensures matching robustness in poorly textured regions; and a dense, high-precision three-dimensional point cloud is finally generated. The invention has good application prospects in fields such as space photogrammetry, aerial photogrammetry, low-altitude photogrammetry, close-range photogrammetry, three-dimensional reconstruction of digital cities, and autonomous driving of unmanned vehicles.
Description of the drawings
Fig. 1a shows the cost computation window;
Fig. 1b shows the gradient orientation histogram;
Fig. 2 shows the eight-direction iterative non-local cost aggregation paths.
Specific embodiment
The technical solution of the present invention is described in further detail below with reference to an embodiment and the accompanying drawings.
Embodiment:
The technical solution provided by the present invention is as follows: from an epipolar stereo pair, costs are computed with the improved HOG operator and the cost matrix is built; according to the image grayscale information, a quadratic function is designed to constrain cost propagation, and an eight-direction iterative non-local cost aggregation strategy is adopted to obtain a reliable aggregation result; disparities are computed with the WTA strategy, mismatches are rejected by left-right consistency checking, and a dense, high-precision three-dimensional point cloud is generated. The method comprises the following steps:
Step 1. Build the HOG cost matrix.
The cost is the measure that describes the similarity between corresponding points. A good cost should accurately describe the gray-level characteristics of a pixel and its neighborhood. The present invention uses the HOG operator as the cost for dense matching. If the full HOG feature were computed for every pixel of the image, the amount of computation would be enormous; moreover, the standard HOG operator is not invariant to linear radiometric distortion. The present invention therefore improves the HOG operator so that the cost computation is accelerated and the improved HOG feature is invariant to linear radiometric distortion.
The radiometry of an image is affected by factors such as the sensor itself, the acquisition conditions and the illumination. In actual production, complex radiometric distortion exists between the two images of a stereo pair. It is assumed here that, within a very small window, the radiometric distortion between the stereo pair varies linearly, i.e. the following equation holds:
g_r(p_r) = c · g_l(p_l) + t
where p_l and p_r denote corresponding points in the left and right images; when the stereo pair is an epipolar image pair, the relation p_rx = p_lx - d holds, where d denotes the disparity; g_l and g_r denote the gray values of the corresponding points; c and t are the coefficients of the linear equation, c being the scale factor and t the offset factor.
Computing gradients on both images eliminates the offset factor t within the local small window; the Sobel operator can be used to compute the image gradients:
Gx_r(p_r) = c · Gx_l(p_l),  Gy_r(p_r) = c · Gy_l(p_l)
where Gx_l(p_l) and Gx_r(p_r) denote the horizontal gray gradients at p_l and p_r, and Gy_l(p_l) and Gy_r(p_r) denote the vertical gray gradients at p_l and p_r.
First, the left image is taken as the reference image, and the horizontal and vertical gradient maps of the left and right images are computed with the Sobel operator. From the gradient maps of the left and right images, the gradient direction map is computed with the following formulas, giving the gradient direction of each pixel:
θ_r(p_r) = θ_l(p_l)
θ_r(p_r) = arctan( Gy_r(p_r) / Gx_r(p_r) )
θ_l(p_l) = arctan( Gy_l(p_l) / Gx_l(p_l) )
where p_l denotes a pixel of the left image; p_r denotes a pixel of the right image; θ denotes the gradient direction; Gx denotes the horizontal gradient; and Gy denotes the vertical gradient.
A window of size WxW (W typically 5-9) is opened, centered at any pixel p in the image, and serves as a basic HOG description unit (cell). The value range of the gradient direction is divided into 12 intervals, each covering 30°, and the initial count of each interval is set to 0. The gradient direction angles of all pixels in the unit are counted: the interval into which the gradient direction of each pixel falls is determined and the count of that interval is incremented, as shown in Fig. 1. Fig. 1a shows the description unit; the arrows indicate the gradient direction of each pixel, and the background color of a pixel corresponds one-to-one to an interval. Fig. 1b shows the intervals of the unit; the number in each interval is its count. After traversing all pixels in the unit, the count of each interval is divided by the total number of pixels in the window. From the statistics of the intervals, a 12-dimensional vector is constructed as the feature descriptor of pixel p, as shown below:
V_HOG(p) = (b_0, b_1, b_2, b_3, b_4, b_5, b_6, b_7, b_8, b_9, b_10, b_11)^T
where b_i (i = 0-11) denotes the value of each interval and V_HOG(p) denotes the feature descriptor of pixel p.
Compared with the traditional HOG algorithm, the present invention ignores the gradient magnitude when building the HOG descriptor, so that the constructed feature descriptor is invariant to linear radiometric distortion. In addition, the traditional HOG algorithm combines several basic description units (cells) into a block to form a larger feature descriptor. Since the cost aggregation stage of dense matching already gathers the costs of neighboring pixels, and in order to avoid repeated computation, the present invention describes the pixel feature only at the cell level. Compared with the traditional HOG algorithm, the computation is therefore greatly accelerated.
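The following is a minimal illustrative sketch (not part of the patent text) of the cell-level, magnitude-free HOG descriptor described above, written in Python with NumPy/SciPy; the function name, the default window size and the use of scipy.ndimage are assumptions made for illustration only.

```python
import numpy as np
from scipy.ndimage import sobel, uniform_filter

def improved_hog_descriptors(image, win=7, bins=12):
    """Per-pixel 12-bin gradient-orientation histogram; gradient magnitude is ignored.

    Each pixel gets a `bins`-dimensional descriptor: the fraction of pixels in its
    win x win cell whose gradient direction falls into each 30-degree interval.
    Returns an array of shape (H, W, bins)."""
    img = image.astype(np.float64)
    gx = sobel(img, axis=1)                               # horizontal gradient Gx
    gy = sobel(img, axis=0)                               # vertical gradient Gy
    theta = np.degrees(np.arctan2(gy, gx)) % 360.0        # direction angle in [0, 360)
    bin_idx = np.minimum((theta / (360.0 / bins)).astype(int), bins - 1)
    desc = np.empty(img.shape + (bins,), dtype=np.float64)
    for b in range(bins):
        # fraction of window pixels whose orientation falls into interval b
        desc[..., b] = uniform_filter((bin_idx == b).astype(np.float64), size=win)
    return desc
```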
The HOG feature descriptor of a left-image pixel p_l can thus be obtained, and likewise the HOG feature descriptor of the right-image pixel corresponding to disparity d. The present invention takes the distance between the HOG feature vectors of corresponding points as the cost measure, as shown below:
C_HOG(p_l, d) = || V_HOG^l(p_l) - V_HOG^r(epl(p_l, d)) ||
where epl(p_l, d) denotes the corresponding point of p_l under disparity d, i.e. p_r = epl(p_l, d); C_HOG(p_l, d) denotes the cost between p_l and p_r computed with HOG; V_HOG^l(p_l) denotes the HOG feature descriptor of the left image point p_l; and V_HOG^r(p_r) denotes the feature descriptor of the right image point p_r.
According to the above formula, the HOG cost of every pixel p_l of the left image for every disparity d can be computed. Assuming the image width is W, the height is H and the disparity range is DR, a cost matrix of size WxHxDR is built to store the HOG costs of all pixels of the left image.
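A sketch of how such a cost volume could be filled from the descriptors above, assuming rectified epipolar images so that the corresponding point of (x, y) under disparity d is (x - d, y); padding columns that fall outside the right image with a large constant cost is an implementation choice, not taken from the patent.

```python
import numpy as np

def build_hog_cost_volume(desc_l, desc_r, disp_range):
    """Cost volume C[y, x, d] = || V_l(x, y) - V_r(x - d, y) || (Euclidean distance)."""
    H, W, _ = desc_l.shape
    cost = np.full((H, W, disp_range), 1e6, dtype=np.float64)  # large cost where x - d < 0
    for d in range(disp_range):
        if d == 0:
            cost[:, :, 0] = np.linalg.norm(desc_l - desc_r, axis=2)
        else:
            cost[:, d:, d] = np.linalg.norm(desc_l[:, d:, :] - desc_r[:, :-d, :], axis=2)
    return cost
```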
Step 2. Eight-direction iterative non-local cost aggregation.
After the cost matrix is built, costs are propagated along scanline directions. The scanline directions include the horizontal, vertical and diagonal directions, as shown by the green scanlines in Fig. 2. The present invention assumes that pixels whose gray values are close and whose distance is small should, as far as possible, satisfy a disparity consistency condition. Under this assumption, the cost transfer equation is as follows:
L_r(p, d) = C_HOG(p, d) + T · min{ L_r(p-1, d), L_r(p-1, d-1) + P_1, L_r(p-1, d+1) + P_1, min_k L_r(p-1, k) + P_2 }
where L_r(p, d) denotes the aggregated cost of pixel p for disparity d along the current path; r denotes the direction of the path; C_HOG(p, d) denotes the HOG cost of the current pixel p for disparity d; p-1 denotes the previous pixel of p on the current path; P_1 and P_2 are penalty terms, and P_2 is generally set much larger than P_1; the parameter T expresses how close pixel p and pixel p-1 are in gray level/color and constrains cost propagation between neighboring pixels. The present invention expresses T with a quadratic function.
Traditional dense matching algorithms compute the parameter T with a Gaussian kernel. The Gaussian kernel is a decreasing function whose slope decreases from large to small, which means that a small change of the gray difference near 0 produces the largest drop in the Gaussian value T. On an image, even within a gray-homogeneous region, the gray values of the pixels inside the region cannot be exactly equal. What is desired is that, when the gray difference is not 0 but still small, a large value of T is obtained, so that cost propagation within the homogeneous region is strengthened. Traditional algorithms take a large smoothing factor σ (σ = 15-25) to alleviate this, but a large σ also yields large weights when cost is propagated between pixels of non-homogeneous regions. The key to this problem is that the slope of the Gaussian kernel decreases. Therefore, the present invention designs a kernel based on a quadratic function whose slope is increasing over the range [0, 2σ], as shown below:
T(p+1, p) = a · Δg^2 + 1  if Δg ≤ 2σ;  T(p+1, p) = T_G(p, p+1)  if Δg > 2σ;  Δg = | g(p+1) - g(p) |;  a = (e^(-2) - 1) / (4σ^2)
where T_G denotes the T value computed with the Gaussian function, a denotes the quadratic coefficient, and σ is the smoothing factor, typically set to 5.
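A sketch of the propagation weight T of formula (7): a quadratic branch for gray differences up to 2σ, chosen so that it meets the Gaussian value exactly at 2σ, and the Gaussian itself beyond that; σ = 5 follows the text, while the function name and the standalone form are illustrative assumptions.

```python
import numpy as np

def propagation_weight(g1, g2, sigma=5.0):
    """Weight T constraining cost propagation between two neighboring pixels.

    For |dg| <= 2*sigma a quadratic a*dg^2 + 1 is used, with a chosen so that the
    curve meets the Gaussian exp(-dg^2 / (2*sigma^2)) exactly at dg = 2*sigma;
    for larger gray differences the Gaussian value itself is used."""
    dg = abs(float(g1) - float(g2))
    a = (np.exp(-2.0) - 1.0) / (4.0 * sigma ** 2)
    if dg <= 2.0 * sigma:
        return a * dg ** 2 + 1.0
    return np.exp(-dg ** 2 / (2.0 * sigma ** 2))
```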
The cost aggregation result is closely related to the aggregation paths. The present invention defines a path all of whose pixels, from start to end, belong to the same region as a connected path; otherwise the path is non-connected. A good aggregation path should cover the whole homogeneous region as far as possible, i.e. the paths between pixels within a gray-homogeneous region should be connected. Most matching algorithms define the paths with horizontal/vertical scanlines, but using only horizontal and vertical scanlines easily leaves pixels within a gray-homogeneous region without connected paths.
The aggregation result of a single scanline is unstable and tends to produce streaking artifacts. To solve this problem, the present invention proposes a non-local cost aggregation strategy over 8 directions, which includes not only the horizontal/vertical directions but also the 4 diagonal directions. The aggregation results of the eight directions are summed to strengthen the robustness of the aggregation. The algorithm requires two iterations; in the first iteration, the aggregation results of the 8 directions are summed, as shown below:
S(p, d) = C_HOG(p, d) + Σ_{r=1}^{8} ( L_r(p, d) - C_HOG(p, d) )
where L denotes the aggregation result computed along a single scanline direction; r denotes the scanline direction, i.e. the 0°, 45°, 90°, 135°, 180°, 225°, 270° and 315° directions; and S denotes the total aggregation result of the 8 scanline directions.
After the first iteration, each pixel has paths only along its scanline directions and has no path connection to pixels off those scanlines; that is, each pixel is influenced only by the pixels on its scanlines and not by the pixels outside them. Such an aggregation is still essentially local, and the result is not robust, especially in poorly textured regions. Therefore, to further strengthen the robustness of the aggregation, the aggregation result S of the first iteration is taken as the new cost and used in the second iteration, in which cost is again propagated along each scanline:
L_r^2(p, d) = S(p, d) + T · min{ L_r^2(p-1, d), L_r^2(p-1, d-1) + P_1, L_r^2(p-1, d+1) + P_1, min_k L_r^2(p-1, k) + P_2 }
where L_r^2(p, d) denotes the aggregation result along scanline direction r in the second iteration.
Finally, the aggregation results of the 8 scanline directions are summed, as shown below. After the two iterations, every pixel of the image has a path, connected or non-connected, to every other pixel of the whole image, so that non-local cost aggregation is achieved and the reliability of the aggregation is strengthened, as shown in Fig. 2. Fig. 2 illustrates the paths that carry cost from pixel p_1 to pixel p_3; the circles represent image pixels, the dashed lines represent the 8 scanline directions of pixel p_1, and the solid lines represent the 8 scanline directions of pixel p_3. The intersections of these two groups of scanlines, drawn as black circles, define the cost propagation paths, and the arrows indicate the direction of propagation. It can be seen from Fig. 2 that the 8-direction non-local aggregation strategy provides multiple paths for propagating cost from p_1 to p_3: horizontal paths, vertical paths and diagonal paths.
S^2(p, d) = S(p, d) + Σ_{r=1}^{8} ( L_r^2(p, d) - S(p, d) )
where S^2 denotes the total non-local aggregation result of the 8 directions in the second iteration.
For each pixel p and each disparity d there is an aggregated cost S^2, so traversing the whole image yields a large number of aggregation results. Assuming the image width is W, the height is H and the disparity range is DR, an aggregation matrix of size WxHxDR is built to store the aggregated costs S^2 of all pixels of the left image.
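An illustrative (deliberately simple, unoptimized) sketch of the two-iteration, eight-direction aggregation of formulas (6)-(10), combining the HOG cost volume and the quadratic kernel sketched above; the penalty values P1 and P2 and the border handling are placeholder assumptions, since the patent does not fix them.

```python
import numpy as np

# the 8 scanline directions: 0, 45, 90, 135, 180, 225, 270 and 315 degrees
DIRS = [(0, 1), (1, 1), (1, 0), (1, -1), (0, -1), (-1, -1), (-1, 0), (-1, 1)]

def aggregate_direction(cost, gray, dr, dc, P1=10.0, P2=120.0, sigma=5.0):
    """One scanline pass of the cost transfer equation along direction (dr, dc)."""
    H, W, D = cost.shape
    L = cost.copy()
    rows = range(H) if dr >= 0 else range(H - 1, -1, -1)   # visit p-1 before p
    cols = range(W) if dc >= 0 else range(W - 1, -1, -1)
    a = (np.exp(-2.0) - 1.0) / (4.0 * sigma ** 2)
    for y in rows:
        for x in cols:
            py, px = y - dr, x - dc
            if not (0 <= py < H and 0 <= px < W):
                continue                                   # path starts here: keep the raw cost
            prev = L[py, px, :]
            dg = abs(float(gray[y, x]) - float(gray[py, px]))
            T = a * dg ** 2 + 1.0 if dg <= 2 * sigma else np.exp(-dg ** 2 / (2 * sigma ** 2))
            up = np.empty_like(prev); up[1:] = prev[:-1]; up[0] = np.inf    # L(p-1, d-1)
            dn = np.empty_like(prev); dn[:-1] = prev[1:]; dn[-1] = np.inf   # L(p-1, d+1)
            best = np.minimum.reduce([prev, up + P1, dn + P1,
                                      np.full_like(prev, prev.min() + P2)])
            L[y, x, :] = cost[y, x, :] + T * best
    return L

def eight_direction_aggregation(cost, gray):
    """Two iterations of 8-direction aggregation: S from the raw costs, S2 from S."""
    S = cost + sum(aggregate_direction(cost, gray, dr, dc) - cost for dr, dc in DIRS)
    S2 = S + sum(aggregate_direction(S, gray, dr, dc) - S for dr, dc in DIRS)
    return S2
```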
Step 3. Generate the three-dimensional point cloud.
Within the disparity range, each pixel of the reference image corresponds to several aggregated costs. Using the WTA (Winner Takes All) strategy, the disparity corresponding to the minimum of these aggregated costs is taken as the final disparity of the pixel, as shown below:
d(p) = argmin_d S^2(p, d)
where d(p) denotes the disparity of pixel p on the disparity map, i.e. the disparity d with the minimum S^2 within the disparity range.
After the disparities of all pixels of the image have been computed with the WTA strategy, the initial disparity map is obtained. Because of occlusions, image radiometry and other factors, mismatches are inevitable and must be detected and rejected by suitable means. The present invention detects and rejects mismatches with the left-right consistency method. First, with the left image as the reference image, the initial disparity map of the left image is obtained; then, with the right image as the reference image, the initial disparity map of the right image is obtained according to step 1 and step 2. If a match is correct, the disparities of the left-image pixel and of its corresponding right-image pixel should be consistent; otherwise they will be inconsistent. Mismatched points can therefore be rejected with the left-right consistency check, and the refined disparity map is obtained as follows:
d(p) = d(p_l)  if | d(p_l) - d(p_r) | ≤ 1;  d(p) = Invalid  if | d(p_l) - d(p_r) | > 1
where d(p_l) denotes the disparity of pixel p in the left image, d(p_r) denotes the disparity of pixel p in the right image, and Invalid denotes an invalid disparity. If the disparities of the same pixel in the left and right images differ by more than 1 pixel, the pixel is regarded as a mismatched point and rejected.
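A sketch of the WTA disparity selection and the left-right consistency check of formula (12); marking invalid disparities with -1 is an implementation choice, not specified in the patent.

```python
import numpy as np

def wta_disparity(S2):
    """Winner-takes-all: per pixel, the disparity with the minimum aggregated cost."""
    return np.argmin(S2, axis=2)

def left_right_check(disp_l, disp_r, tol=1, invalid=-1):
    """Invalidate pixels whose left and right disparities differ by more than `tol`."""
    H, W = disp_l.shape
    xs = np.tile(np.arange(W), (H, 1))
    xr = np.clip(xs - disp_l, 0, W - 1)              # column of the corresponding right pixel
    d_r = np.take_along_axis(disp_r, xr, axis=1)     # right-image disparity at that pixel
    return np.where(np.abs(disp_l - d_r) <= tol, disp_l, invalid)
```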
From the refined disparity map, the corresponding point of each valid pixel is obtained, as shown below:
x_r = x_l - d,  y_r = y_l
where (x_l, y_l) and (x_r, y_r) denote the coordinates of the corresponding image points.
From the coordinates of the corresponding points and the interior and exterior orientation elements of the images, the corresponding three-dimensional point is generated by forward intersection, as shown below:
[u_i, v_i, w_i]^T = R_i · [x_i, y_i, -f]^T  (i = 1, 2);
B_u = X_S2 - X_S1,  B_v = Y_S2 - Y_S1,  B_w = Z_S2 - Z_S1;
N_1 = (B_u·w_2 - B_w·u_2) / (u_1·w_2 - u_2·w_1),  N_2 = (B_u·w_1 - B_w·u_1) / (u_1·w_2 - u_2·w_1);
[U_i, V_i, W_i]^T = N_i · [u_i, v_i, w_i]^T  (i = 1, 2);
X = X_S1 + U_1 = X_S2 + U_2,  Y = Y_S1 + V_1 = Y_S2 + V_2,  Z = Z_S1 + W_1 = Z_S2 + W_2
where X, Y, Z denote the three-dimensional coordinates of the object point; x_i, y_i (i = 1, 2) denote the coordinates of the corresponding image points; f denotes the focal length; R_i (i = 1, 2) denote the rotation matrices; and X_Si, Y_Si, Z_Si (i = 1, 2) denote the exterior orientation line elements.
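A sketch of the point-projection-coefficient forward intersection of formula (13), assuming image coordinates already reduced to the principal point and expressed in the same unit as the focal length; the function and argument names are illustrative assumptions.

```python
import numpy as np

def forward_intersection(x1, y1, x2, y2, f, R1, R2, S1, S2):
    """Intersect one pair of corresponding image points into an object-space point.

    x1, y1 / x2, y2: image coordinates on the left/right photo; f: focal length;
    R1, R2: 3x3 rotation matrices; S1, S2: projection centers (Xs, Ys, Zs)."""
    uvw1 = R1 @ np.array([x1, y1, -f], dtype=float)     # image-space auxiliary coordinates
    uvw2 = R2 @ np.array([x2, y2, -f], dtype=float)
    Bu, Bv, Bw = np.asarray(S2, dtype=float) - np.asarray(S1, dtype=float)  # baseline
    denom = uvw1[0] * uvw2[2] - uvw2[0] * uvw1[2]       # u1*w2 - u2*w1
    N1 = (Bu * uvw2[2] - Bw * uvw2[0]) / denom          # projection coefficient N1
    return np.asarray(S1, dtype=float) + N1 * uvw1      # X, Y, Z of the object point
```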
The specific embodiment described herein is merely illustrative of the spirit of the present invention. Those skilled in the art to which the invention belongs may make various modifications, additions or substitutions to the described embodiment in a similar manner without departing from the spirit of the invention or exceeding the scope defined by the appended claims.

Claims (5)

1. A non-local dense stereo-pair matching method guided by image grayscale, characterized by comprising the following steps:
Step 1, building the cost matrix: an improved HOG operator is used as the similarity measure between corresponding points, and a cost matrix is built, wherein the improved HOG operator is based on the following definitions:
within a window range, the radiometric distortion between the two images of the stereo pair is assumed to vary linearly, i.e. the following equation holds:
g_r(p_r) = c · g_l(p_l) + t   (1)
where p_l and p_r denote corresponding points in the left and right images; when the stereo pair is an epipolar image pair, the relation p_rx = p_lx - d holds, where d denotes the disparity; g_l and g_r denote the gray values of the corresponding points in the left and right images; c and t are the coefficients of the linear equation, c being the scale factor and t the offset factor;
computing gradients on both images eliminates the offset factor t within the local small window; the Sobel operator can be used to compute the image gradients:
Gx_r(p_r) = c · Gx_l(p_l),  Gy_r(p_r) = c · Gy_l(p_l)   (2)
where Gx_l(p_l) and Gx_r(p_r) denote the horizontal gray gradients at points p_l and p_r respectively, and Gy_l(p_l) and Gy_r(p_r) denote the vertical gray gradients at points p_l and p_r respectively;
to further eliminate the scale factor c, the direction angle of the gradient is computed from formula (2):
θ_r(p_r) = θ_l(p_l),  θ_r(p_r) = arctan( Gy_r(p_r) / Gx_r(p_r) ),  θ_l(p_l) = arctan( Gy_l(p_l) / Gx_l(p_l) )   (3)
where θ_l(p_l) and θ_r(p_r) denote the gradient direction angles at points p_l and p_r respectively; taking the quadrant into account, the value range is [0°, 360°); formula (3) shows that the gradient direction is well invariant to linear radiometric distortion;
a window of size WxW is opened, centered at any pixel p in the image, and serves as a basic HOG description unit (cell); the value range of the gradient direction is divided into 12 intervals, each covering 30°; the initial count of each interval is set to 0; the gradient direction angles of all pixels in the unit are counted: the interval into which the gradient direction of each pixel falls is determined and the count of that interval is incremented; after traversing all pixels in the unit, the count of each interval is divided by the total number of pixels in the window; from the statistics of the intervals, a 12-dimensional vector is constructed as the feature descriptor of pixel p, as shown in formula (4);
V_HOG(p) = (b_0, b_1, b_2, b_3, b_4, b_5, b_6, b_7, b_8, b_9, b_10, b_11)^T   (4)
where b_i (i = 0-11) denotes the value of each interval and V_HOG(p) denotes the feature descriptor of pixel p;
Step 2, image-grayscale-guided non-local cost aggregation: according to the grayscale information of the image, a quadratic function is designed to constrain cost propagation, and non-local cost aggregation is performed iteratively in eight directions to build a reliable cost aggregation matrix;
Step 3, three-dimensional point cloud generation: an initial disparity map is generated from the cost aggregation matrix, gross errors are eliminated by left-right consistency checking, the disparity map is refined, and a dense, high-precision three-dimensional point cloud is generated.
2. The non-local dense stereo-pair matching method guided by image grayscale according to claim 1, characterized in that the concrete method of building the cost matrix in step 1 is as follows: the distance between the HOG feature vectors of corresponding points is taken as the cost, based on the following formula:
C_HOG(p_l, d) = || V_HOG^l(p_l) - V_HOG^r(epl(p_l, d)) ||   (5)
where epl(p_l, d) denotes the corresponding point of p_l under disparity d, i.e. p_r = epl(p_l, d); C_HOG(p_l, d) denotes the cost between points p_l and p_r computed with HOG; V_HOG^l(p_l) denotes the HOG feature descriptor of the left image point p_l; and V_HOG^r(p_r) denotes the feature descriptor of the right image point p_r.
3. The dense stereo-pair matching method guided by image grayscale according to claim 1, characterized in that, in step 2, the image-grayscale-guided non-local cost aggregation is as follows:
it is assumed that pixels in the eight-neighborhood of a center pixel whose absolute gray difference from the center is within 15 gray levels should satisfy a disparity consistency condition; under this assumption, the cost transfer equation shown in formula (6) is obtained:
L_r(p, d) = C_HOG(p, d) + T · min{ L_r(p-1, d), L_r(p-1, d-1) + P_1, L_r(p-1, d+1) + P_1, min_k L_r(p-1, k) + P_2 }   (6)
where L_r(p, d) denotes the aggregated cost of pixel p for disparity d along the current path; r denotes the direction of the path; C_HOG(p, d) denotes the HOG cost of the current pixel p for disparity d; p-1 denotes the previous pixel of p on the current path; P_1 and P_2 are penalty terms: in a gray-homogeneous region, a small penalty P_1 is applied when the disparity of adjacent pixels changes smoothly, and a larger penalty P_2 is applied when the disparity of adjacent pixels jumps; the advantage of this is that the disparity of a pixel is allowed to change flexibly while, as far as possible, keeping the disparities of adjacent pixels in a gray-homogeneous region consistent; because smoothly varying disparity is the more common case in gray-homogeneous regions, P_2 is set much larger than P_1; the parameter T expresses how close pixel p and pixel p-1 are in gray level/color and is used to constrain cost propagation between neighboring pixels.
4. The non-local dense stereo-pair matching method guided by image grayscale according to claim 2, characterized in that, in the step of building the cost aggregation matrix, the constraint on cost propagation between neighboring pixels is expressed by a quadratic function, specifically a quadratic-function-based kernel whose slope is increasing over the range [0, 2σ], as shown in formula (7);
T(p+1, p) = a · Δg^2 + 1  if Δg ≤ 2σ;  T(p+1, p) = T_G(p, p+1)  if Δg > 2σ;  Δg = | g(p+1) - g(p) |;  a = (e^(-2) - 1) / (4σ^2)   (7)
where T_G denotes the T value computed with the Gaussian function, a denotes the quadratic coefficient, and σ is the smoothing factor.
5. The non-local dense stereo-pair matching method guided by image grayscale according to claim 2, characterized in that, in the step of building the cost aggregation matrix, the cost aggregation result is closely related to the aggregation paths; specifically, a cost aggregation strategy over 8 directions is used, which includes not only the horizontal and vertical directions but also the 4 diagonal directions; the algorithm requires two iterations; in the first iteration, the aggregation results of the 8 directions are summed, as shown in formula (8);
S(p, d) = C_HOG(p, d) + Σ_{r=1}^{8} ( L_r(p, d) - C_HOG(p, d) )   (8)
where L denotes the aggregation result computed according to formula (6); r denotes the scanline direction, i.e. the 0°, 45°, 90°, 135°, 180°, 225°, 270° and 315° directions; and S denotes the total cost aggregation result of the 8 scanline directions;
after the first iteration, each pixel has paths only along its scanline directions and has no path connection to pixels off those scanlines; therefore the aggregation result S of the first iteration is taken as the new cost and used in the second iteration, giving:
L_r^2(p, d) = S(p, d) + T · min{ L_r^2(p-1, d), L_r^2(p-1, d-1) + P_1, L_r^2(p-1, d+1) + P_1, min_k L_r^2(p-1, k) + P_2 }   (9)
where L_r^2(p, d) denotes the aggregation result along scanline direction r in the second iteration;
finally, the aggregation results of the 8 scanline directions are summed, as shown in formula (10); after the two iterations, every pixel of the image has a path, connected or non-connected, to every other pixel of the whole image, so that non-local cost aggregation is realized and the reliability of the aggregation is strengthened;
S^2(p, d) = S(p, d) + Σ_{r=1}^{8} ( L_r^2(p, d) - S(p, d) )   (10)
where S^2 denotes the total cost aggregation result of the 8 directions in the second iteration; and wherein, in step 3, the three-dimensional point cloud is generated as follows:
according to the cost aggregation matrix obtained in step 2, the disparity of each pixel is computed with the WTA strategy, as shown in formula (11):
d(p) = argmin_d S^2(p, d)   (11)
where d(p) denotes the disparity of pixel p on the disparity map, i.e. the disparity d with the minimum S^2 within the disparity range; after the disparities of all pixels of the image are computed with the WTA strategy, the initial disparity map is obtained;
with the left image as the reference image, the initial disparity map of the left image is obtained; with the right image as the reference image, the initial disparity map of the right image is obtained; mismatched points are rejected by left-right consistency checking, and the refined disparity map is obtained, as shown in formula (12):
d(p) = d(p_l)  if | d(p_l) - d(p_r) | ≤ 1;  d(p) = Invalid  if | d(p_l) - d(p_r) | > 1   (12)
where d(p_l) denotes the disparity of pixel p in the left image, d(p_r) denotes the disparity of pixel p in the right image, and Invalid denotes an invalid disparity; if the disparities of the same pixel in the left and right images differ by more than 1 pixel, the pixel is regarded as a mismatched point and rejected;
from the refined disparity map, the corresponding point of each valid pixel is obtained; from the coordinates of the corresponding points and the interior and exterior orientation elements of the images, the corresponding three-dimensional point cloud is generated, as shown in formula (13):
[u_i, v_i, w_i]^T = R_i · [x_i, y_i, -f]^T  (i = 1, 2);
B_u = X_S2 - X_S1,  B_v = Y_S2 - Y_S1,  B_w = Z_S2 - Z_S1;
N_1 = (B_u·w_2 - B_w·u_2) / (u_1·w_2 - u_2·w_1),  N_2 = (B_u·w_1 - B_w·u_1) / (u_1·w_2 - u_2·w_1);
[U_i, V_i, W_i]^T = N_i · [u_i, v_i, w_i]^T  (i = 1, 2);
X = X_S1 + U_1 = X_S2 + U_2,  Y = Y_S1 + V_1 = Y_S2 + V_2,  Z = Z_S1 + W_1 = Z_S2 + W_2   (13)
where X, Y, Z denote the three-dimensional coordinates of the object point; x_i, y_i (i = 1, 2) denote the coordinates of the corresponding image points; f denotes the focal length; R_i (i = 1, 2) denote the rotation matrices; and X_Si, Y_Si, Z_Si (i = 1, 2) denote the exterior orientation line elements.
CN201610933540.4A 2016-10-31 2016-10-31 Non local stereopair dense matching method based on image gray scale guiding Pending CN106530337A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610933540.4A CN106530337A (en) 2016-10-31 2016-10-31 Non local stereopair dense matching method based on image gray scale guiding


Publications (1)

Publication Number Publication Date
CN106530337A true CN106530337A (en) 2017-03-22

Family

ID=58291597

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610933540.4A Pending CN106530337A (en) 2016-10-31 2016-10-31 Non local stereopair dense matching method based on image gray scale guiding

Country Status (1)

Country Link
CN (1) CN106530337A (en)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160125258A1 (en) * 2014-01-14 2016-05-05 Microsoft Technology Licensing, Llc Stereo image processing using contours
CN103971366A (en) * 2014-04-18 2014-08-06 天津大学 Stereoscopic matching method based on double-weight aggregation
CN104850850A (en) * 2015-04-05 2015-08-19 中国传媒大学 Binocular stereoscopic vision image feature extraction method combining shape and color
CN105160702A (en) * 2015-08-20 2015-12-16 武汉大学 Stereoscopic image dense matching method and system based on LiDAR point cloud assistance
CN105205808A (en) * 2015-08-20 2015-12-30 武汉大学 Multi-vision image dense coupling fusion method and system based on multiple characteristics and multiple constraints
CN105466399A (en) * 2016-01-11 2016-04-06 中测新图(北京)遥感技术有限责任公司 Quick semi-global dense matching method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XU HUANG, YONGJUN ZHANG, ZHAOXI YUE: "Image-Guided Non-Local Dense Matching with Three-Steps Optimization", 《REMOTE SENSING & SPATIAL INFORMA》 *

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107170000A (en) * 2017-04-18 2017-09-15 武汉市工程科学技术研究院 The stereopsis dense Stereo Matching method optimized based on global block
CN107170000B (en) * 2017-04-18 2019-09-10 武汉市工程科学技术研究院 Stereopsis dense Stereo Matching method based on the optimization of global block
CN107240149A (en) * 2017-06-14 2017-10-10 广东工业大学 Object dimensional model building method based on image procossing
CN107506782B (en) * 2017-07-06 2020-04-17 武汉市工程科学技术研究院 Dense matching method based on confidence weight bilateral filtering
CN107506782A (en) * 2017-07-06 2017-12-22 武汉市工程科学技术研究院 Dense Stereo Matching method based on confidence weight bilateral filtering
CN107578429A (en) * 2017-07-14 2018-01-12 武汉市工程科学技术研究院 Stereopsis dense Stereo Matching method based on Dynamic Programming and global cost cumulative path
CN108288286A (en) * 2018-01-17 2018-07-17 视缘(上海)智能科技有限公司 A kind of half global solid matching method preferential based on surface orientation
CN108830798A (en) * 2018-04-23 2018-11-16 西安电子科技大学 Improved image denoising method based on propagation filter
CN110060283B (en) * 2019-04-17 2020-10-30 武汉大学 Multi-measure semi-global dense matching method
CN110060283A (en) * 2019-04-17 2019-07-26 武汉大学 It is a kind of to estimate half global dense Stereo Matching algorithm more
CN111462195A (en) * 2020-04-09 2020-07-28 武汉大学 Irregular angle direction cost aggregation path determination method based on mainline constraint
CN111462195B (en) * 2020-04-09 2022-06-07 武汉大学 Irregular angle direction cost aggregation path determination method based on dominant line constraint
CN112149674A (en) * 2020-09-02 2020-12-29 珠海格力电器股份有限公司 Image processing method and device
CN112163622A (en) * 2020-09-30 2021-01-01 山东建筑大学 Overall situation and local fusion constrained line segment feature matching method for aviation wide-baseline stereopair
CN112163622B (en) * 2020-09-30 2022-07-05 山东建筑大学 Global and local fusion constrained aviation wide-baseline stereopair line segment matching method
CN112435267A (en) * 2020-11-17 2021-03-02 哈尔滨工程大学 Disparity map calculation method for high-resolution urban satellite stereo image
CN112435267B (en) * 2020-11-17 2023-03-17 哈尔滨工程大学 Disparity map calculation method for high-resolution urban satellite stereo image
CN113554102A (en) * 2021-07-27 2021-10-26 高小翎 Aviation image DSM matching method for cost calculation dynamic programming
CN116957524A (en) * 2023-09-21 2023-10-27 青岛阿斯顿工程技术转移有限公司 Talent information intelligent management method and system in technology transfer process
CN116957524B (en) * 2023-09-21 2024-01-05 青岛阿斯顿工程技术转移有限公司 Talent information intelligent management method and system in technology transfer process

Similar Documents

Publication Publication Date Title
CN106530337A (en) Non local stereopair dense matching method based on image gray scale guiding
CN104484668B (en) A kind of contour of building line drawing method of the how overlapping remote sensing image of unmanned plane
CN110473217B (en) Binocular stereo matching method based on Census transformation
CN110264567B (en) Real-time three-dimensional modeling method based on mark points
CN103106688B (en) Based on the indoor method for reconstructing three-dimensional scene of double-deck method for registering
CN102693542B (en) Image characteristic matching method
CN107679537B (en) A kind of texture-free spatial target posture algorithm for estimating based on profile point ORB characteristic matching
CN106340036A (en) Binocular stereoscopic vision-based stereo matching method
CN107063228A (en) Targeted attitude calculation method based on binocular vision
CN106485690A (en) Cloud data based on a feature and the autoregistration fusion method of optical image
CN111047695B (en) Method for extracting height spatial information and contour line of urban group
CN104574432B (en) Three-dimensional face reconstruction method and three-dimensional face reconstruction system for automatic multi-view-angle face auto-shooting image
CN107274483A (en) A kind of object dimensional model building method
CN104820991B (en) A kind of multiple soft-constraint solid matching method based on cost matrix
CN102938142A (en) Method for filling indoor light detection and ranging (LiDAR) missing data based on Kinect
CN107067431A (en) A kind of object volume computational methods based on Kinect
CN106709948A (en) Quick binocular stereo matching method based on superpixel segmentation
CN110728671B (en) Dense reconstruction method of texture-free scene based on vision
CN110738618B (en) Irregular stacking volume measuring method based on binocular camera
CN103886611A (en) Image matching method suitable for automatically detecting flight quality of aerial photography
CN107170037A (en) A kind of real-time three-dimensional point cloud method for reconstructing and system based on multiple-camera
CN111462302B (en) Multi-view human body dynamic three-dimensional reconstruction method and system based on depth coding network
CN109523595A (en) A kind of architectural engineering straight line corner angle spacing vision measuring method
CN105631844A (en) Image camera calibration method
CN112991420A (en) Stereo matching feature extraction and post-processing method for disparity map

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170322