CN106408596A - Edge-based local stereo matching method - Google Patents


Info

Publication number
CN106408596A
CN106408596A (application CN201610803492.7A)
Authority
CN
China
Prior art keywords
point
parallax
lattice point
lattice
edge
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610803492.7A
Other languages
Chinese (zh)
Other versions
CN106408596B (en)
Inventor
李宏亮
孙文龙
习自
王久圣
廖伟军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN201610803492.7A priority Critical patent/CN106408596B/en
Publication of CN106408596A publication Critical patent/CN106408596A/en
Application granted granted Critical
Publication of CN106408596B publication Critical patent/CN106408596B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/4007Interpolation-based scaling, e.g. bilinear interpolation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • G06T2207/10021Stereoscopic video; Stereoscopic image sequence

Abstract

The present invention provides an edge-based local stereo matching method. After an initial disparity map is obtained, grid-point sampling is performed on the disparity map, unstable grid points are deleted and stable grid points are retained; Canny edge points are extracted from the image; the stable edge points and stable grid points are merged into a stable point set; and the remaining points are interpolated according to the disparities of the stable points, so that an accurate dense disparity map is obtained. In the method, the gradient is used as the matching cost, a fixed-window cost aggregation scheme computes the aggregated matching cost, and the initial disparity map is obtained through a left-right consistency check. The method has the advantages of low complexity, high speed and high precision, and is suitable for applications with strict real-time requirements.

Description

Edge-based local stereo matching method
Technical field
The present invention relates to image processing techniques, and in particular to stereo matching technology.
Background technology
As an important branch of computer vision, binocular stereo vision shoots the same scene from two angles, either with two cameras or by moving or rotating a single camera to simulate a pair of eyes. A stereo matching algorithm then computes the disparity between corresponding points of the two images; combining the disparity image with the camera calibration parameters yields the three-dimensional coordinates of each point of the measured object in the scene, from which the three-dimensional structure of the scene and the depth value of each point are reconstructed. This depth value is the actual distance between the camera and the measured object. A complete stereo vision system typically consists of six parts: image acquisition, camera calibration, feature extraction, stereo matching, depth determination and interpolated reconstruction. Stereo matching is the most important and most difficult problem in binocular stereo vision: for any point of a scene, given its projection in the reference image, the corresponding point must be found in the matching image. Stereo matching is a key technology in mobile robot visual navigation, with main application fields including robot vision, autonomous vehicles and measurement.
Most current stereo matching algorithms comprise four steps: (1) cost computation, (2) cost aggregation, (3) disparity computation and optimization, and (4) disparity refinement. Broadly, stereo matching algorithms can be divided into two classes: local stereo matching algorithms and global stereo matching algorithms. A local algorithm determines the disparity of each point from the color or gray-level information inside a window, whereas a global algorithm relies on a smoothness assumption and determines the disparity values of all points simultaneously with an energy minimization technique.
A global stereo matching algorithm minimizes an overall energy consisting of a data term and a smoothness term, and can handle the mismatching that occurs in low-texture regions well. Representative global algorithms include graph cuts, dynamic programming and belief propagation, but their computational load is large, making them unsuitable for real-time systems.
A local stereo matching algorithm relies only on local information around each point: it computes the total matching cost inside a match window and selects the disparity with the minimum matching cost using a winner-take-all (WTA) strategy. A fixed small window preserves texture and edge information but produces a rather noisy disparity map; a fixed large window smooths the matching but causes foreground fattening in depth-discontinuity regions, blurring the disparity map and degrading edges. Representative local algorithms include the sum of absolute differences (SAD), adaptive-window algorithms and adaptive-weight algorithms. Although local matching is less accurate than global matching, its computational load is small, making it suitable for real-time systems.
Among traditional local stereo matching methods, the adaptive-weight algorithm uses locally adaptive support weights to compute the probability that a neighboring pixel belongs to the same region as the center pixel. Because its complexity depends on the match window size, growing quadratically with it, the adaptive-window algorithm is too slow to meet real-time requirements; fixed-window stereo matching, although fast, performs poorly in low-texture regions and is prone to foreground fattening.
At present, local stereo matching algorithms are far more suitable for real-time use than global ones. Among local algorithms, fixed-window stereo matching has low algorithmic complexity but suffers from foreground fattening and poor disparity in depth-discontinuity regions, while adaptive-window and adaptive-weight stereo matching greatly reduce the mismatching in depth-discontinuity regions but have high complexity, making them unsuitable for applications with strict real-time requirements.
Summary of the invention
The technical problem to be solved by the present invention is to provide a fast stereo matching method based on edge extraction, with low algorithmic complexity, suitable for real-time systems.
The technical scheme adopted by the present invention to solve the above technical problem is an edge-based local stereo matching method comprising the following steps:
1) compute an initial disparity map from the input left and right images with a local stereo matching algorithm;
2) sample grid points on the disparity map and delete the unstable ones to obtain stable grid points; an unstable point is a grid point whose disparity differs by more than a threshold from those of more than a set number of surrounding grid points;
3) perform Canny edge detection on the input left image to obtain edge points and reject the isolated ones to obtain stable edge points; an isolated edge point is one whose eight-neighborhood contains fewer than a set number of edge points;
4) scan every column containing stable grid points by column; all points between two stable points are column interpolation points. When there is no stable edge point between two adjacent stable grid points in a column, the disparities of the column interpolation points between them are the linear interpolation of the two stable grid-point disparities; when there is a stable edge point between two adjacent stable grid points in a column, the disparities of the column interpolation points between them are the piecewise linear interpolation of the stable grid-point and stable edge-point disparities;
5) take the stable grid points, column interpolation points and stable edge points as the stable point set and scan the image row by row; the disparity of every unstable point outside the stable point set is obtained by linear interpolation between the two nearest stable-point disparities to its left and right in the same row.
After obtaining the initial disparity map, the present invention samples grid points on the disparity map, deletes the unstable grid points and retains the stable ones, extracts Canny edge points from the image, merges the stable edge points and stable grid points into a stable point set, and interpolates the remaining points according to the disparities of these stable points, obtaining an accurate dense disparity map.
Further, the present invention uses the gradient as the matching cost, computes the matching cost with a fixed-window cost aggregation scheme, and obtains the initial disparity map through a left-right consistency check.
The beneficial effects of the invention are low complexity, high speed and high precision, making it suitable for applications with strict real-time requirements.
Brief description
Fig. 1: flow chart of the stereo matching of the present invention.
Specific embodiment
The present invention can be divided into two stages, initial disparity computation and stable-point interpolation, as shown in Fig. 1.
The initial disparity computation can be divided into the following four steps.
Based on the consistency constraint of binocular vision, a left-right consistency check is applied to the left and right disparity maps, requiring the disparity at the same coordinates in the left disparity map and the right disparity map to be identical; this effectively corrects the disparities of occluded points and yields the initial disparity map.
Step 1: input the rectified left and right images and compute the gradients of the two images.
Step 2: use the gradient to compute the matching cost of the initial disparity; take a fixed window around each pixel and aggregate the matching costs inside the window to obtain the aggregated cost of the pixel. The present embodiment uses the sum of absolute differences (SAD) of the left and right image gradients as the matching cost and a 7*7 window as the cost aggregation region, giving the aggregated cost C_Gradient(x,y,d):
C_Gradient(x,y,d) = Σ_{(i,j)∈N(x,y)} ( |∇x I_l(i,j) - ∇x I_r(i+d,j)| + |∇y I_l(i,j) - ∇y I_r(i+d,j)| ) (1)
where N(x,y) is the 7*7 window centered at pixel (x,y), ∇x is the rightward gradient, ∇y is the downward gradient, I_l(i,j) is the brightness value at coordinate (i,j) in the left image, and I_r(i+d,j) is the brightness value at coordinate (i+d,j) in the right image.
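The cost computation and fixed-window aggregation of this step can be sketched in NumPy as follows, assuming the common convention that a left-image pixel at column x matches the right-image pixel at column x - d; the function names, the integral-image box filter, and the out-of-image penalty value are illustrative choices, not taken from the patent:

```python
import numpy as np

def box_sum(a, win):
    """Sum of `a` over a win x win window centered at each pixel
    (zero padding at the borders), computed with an integral image."""
    r = win // 2
    p = np.pad(a, r).astype(np.float64)
    ii = np.zeros((p.shape[0] + 1, p.shape[1] + 1))
    ii[1:, 1:] = p.cumsum(axis=0).cumsum(axis=1)
    h, w = a.shape
    return (ii[win:win + h, win:win + w] - ii[:h, win:win + w]
            - ii[win:win + h, :w] + ii[:h, :w])

def gradient_cost_volume(left, right, max_disp, win=7):
    """Gradient-SAD matching cost, aggregated over a fixed win x win
    window, for disparities 0..max_disp; indexed as cost[d, y, x]."""
    left = left.astype(np.float64)
    right = right.astype(np.float64)
    gyl, gxl = np.gradient(left)      # downward and rightward gradients
    gyr, gxr = np.gradient(right)
    h, w = left.shape
    cost = np.empty((max_disp + 1, h, w))
    for d in range(max_disp + 1):
        # penalty where the matching column x - d falls off the image
        diff = np.full((h, w), 255.0)
        diff[:, d:] = (np.abs(gxl[:, d:] - gxr[:, :w - d])
                       + np.abs(gyl[:, d:] - gyr[:, :w - d]))
        cost[d] = box_sum(diff, win)
    return cost
```

The resulting cost volume feeds the winner-take-all selection of step 3, which takes the minimum over the disparity axis.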
Step 3: first, taking the left image as reference, find the best-matching point in the right image according to the winner-take-all (WTA, Winner-Take-All) strategy to obtain the left disparity map; then, in the same way, taking the right image as reference, find the best-matching point in the left image to obtain the right disparity map:
Compute the matching confidence of each pixel: if the confidence is below a threshold, the disparity computed at that pixel is considered reliable; otherwise it is considered unreliable. The confidence threshold is set to 0.9 here. Taking the left image as reference, match points are searched in the right image to obtain the left disparity map; likewise, taking the right image as reference, match points are searched in the left image to obtain the right disparity map. The confidence is computed as:
Confidence = SAD_min / SAD_min2 (2)
where Confidence is the confidence, SAD_min is the minimum matching cost, and SAD_min2 is the second smallest matching cost.
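Equation (2) can be applied on top of the cost volume as follows: the ratio test keeps a pixel only when its best cost is clearly below the second best. This is an illustrative sketch; the sentinel value -1 for unreliable pixels is an assumption, not specified by the patent:

```python
import numpy as np

def wta_with_confidence(cost, conf_thresh=0.9):
    """WTA disparity plus the reliability test of Eq. (2):
    Confidence = SAD_min / SAD_min2; keep the pixel only when the
    ratio is below the threshold, otherwise mark it as -1."""
    disp = cost.argmin(axis=0).astype(np.int64)    # WTA over disparity axis
    sorted_cost = np.sort(cost, axis=0)
    sad_min, sad_min2 = sorted_cost[0], sorted_cost[1]
    conf = sad_min / np.maximum(sad_min2, 1e-9)    # avoid division by zero
    disp[conf >= conf_thresh] = -1                 # unreliable match
    return disp
```

A low ratio means the winning disparity is unambiguous; a ratio near 1 means the two best candidates are nearly tied, so the match is discarded.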
Step 4: apply the left-right consistency check (LRC) to the left and right disparity maps pixel by pixel to obtain the initial disparity map.
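A straightforward sketch of the per-pixel LRC, under the convention that a left-image pixel at column x maps to right-image column x - d; the tolerance parameter and the -1 marker for rejected pixels are illustrative assumptions:

```python
import numpy as np

def left_right_check(disp_l, disp_r, tol=1):
    """Keep pixel (y, x) of the left disparity map only if the right
    map, at the matched column x - d, reports (almost) the same
    disparity; failures (occlusions, mismatches) become -1."""
    h, w = disp_l.shape
    out = disp_l.copy()
    for y in range(h):
        for x in range(w):
            d = disp_l[y, x]
            xr = x - d                 # matched column in the right image
            if d < 0 or xr < 0 or xr >= w or abs(disp_r[y, xr] - d) > tol:
                out[y, x] = -1         # occluded or inconsistent
    return out
```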
The stable-point interpolation comprises the following steps:
Step 1: compute grid points over the image with a row and column step of 5 pixels; for example, if the input image size is 320*240, the grid size is 64*48. The disparity of a grid point is the value of the disparity map at the corresponding coordinates. These grid points are first screened to find the stable ones: if a grid point differs greatly from most of the surrounding grid points, it is regarded as a mismatch caused by noise, low texture or similar reasons and treated as an unstable point; the remaining grid points are stable. Here, "differing from most of the surrounding grid points" is judged by comparing the disparity difference against a preset threshold for a preset number of neighbors. In the present embodiment, if the disparities of the grid points in the 5*5 block centered on a grid point mostly differ greatly from that of the center grid point, the center is considered an unstable grid point; otherwise it is considered a stable grid point.
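The grid sampling and stability screening of this step can be sketched as follows; the disparity-difference threshold and the strict-majority rule used to decide "mostly differ" are illustrative stand-ins for the preset threshold and count mentioned above:

```python
import numpy as np

def stable_grid_points(disp, step=5, block=5, diff_thresh=2):
    """Sample the disparity map every `step` pixels and keep a grid
    point only if a majority of the grid points in the block x block
    neighborhood around it have a similar disparity."""
    grid = disp[::step, ::step]          # grid-point disparities
    gh, gw = grid.shape
    r = block // 2
    stable = np.zeros((gh, gw), dtype=bool)
    for i in range(gh):
        for j in range(gw):
            nb = grid[max(i - r, 0):i + r + 1, max(j - r, 0):j + r + 1]
            agree = np.count_nonzero(np.abs(nb - grid[i, j]) <= diff_thresh)
            stable[i, j] = agree > nb.size // 2   # strict majority agrees
    return grid, stable
```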
Step 2: compute Canny edge points on the left image and count the edge points in the eight-neighborhood of each edge point; if there are fewer than 2, the point is considered isolated. The isolated edge points are rejected, and the remaining points are the stable edge points. The disparity of a stable edge point equals the disparity of the corresponding point of the initial disparity map. The eight-neighborhood of a point consists of its neighbors in the eight directions up, down, left, right, upper left, lower left, upper right and lower right.
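The isolated-edge rejection reduces to counting edge points in each pixel's eight-neighborhood. A pure-NumPy sketch, assuming a binary edge map has already been produced by a standard Canny implementation:

```python
import numpy as np

def reject_isolated_edges(edge_map, min_neighbors=2):
    """Drop edge points with fewer than `min_neighbors` edge points
    among their 8 neighbors (the four axial and four diagonal ones)."""
    e = edge_map.astype(np.uint8)
    p = np.pad(e, 1)
    # sum of the 8 neighbors of every pixel (center excluded)
    nb = (p[:-2, :-2] + p[:-2, 1:-1] + p[:-2, 2:] +
          p[1:-1, :-2]               + p[1:-1, 2:] +
          p[2:, :-2]  + p[2:, 1:-1]  + p[2:, 2:])
    return (e == 1) & (nb >= min_neighbors)
```

Note that with a threshold of 2, single pixels and the very endpoints of thin segments are discarded, while interior segment pixels survive.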
Step 3: scan every column containing stable grid points by column. If there is no stable edge point between two adjacent stable points in a column, all points between the two stable points are called column interpolation points, and their disparities are the linear interpolation of the two stable-point disparities. If a stable edge point is encountered between two adjacent stable grid points in a column, the points between the two stable points are likewise called column interpolation points, and their disparities are the piecewise linear interpolation of the stable grid-point and stable edge-point disparities.
The linear interpolation formula is:
d_y = d_0 + λ_y(d_1 - d_0) (3)
where the interpolation coefficient λ_y = (y - y_0)/(y_1 - y_0), d_y is the disparity of the target point at vertical coordinate y, and the vertical coordinates and disparities of the two stable points are (y_0, d_0) and (y_1, d_1).
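Equation (3) in code form; the same function serves the piecewise case by applying it separately on each side of a stable edge point, and Equation (4) of step 4 is identical with the abscissa x in place of y:

```python
def interpolate_column(y0, d0, y1, d1, y):
    """Disparity of an interpolation point at coordinate y between two
    stable points (y0, d0) and (y1, d1), per Eq. (3)."""
    lam = (y - y0) / (y1 - y0)     # interpolation coefficient lambda_y
    return d0 + lam * (d1 - d0)
```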
Step 4: combine the stable grid points, column interpolation points and stable edge points into the stable point set, and interpolate the remaining disparity values to obtain a dense disparity map. Scan the image row by row; the disparities of all pixels between two adjacent stable points in a row are obtained by linear interpolation.
The linear interpolation formula is:
d_x = d_0 + λ_x(d_1 - d_0) (4)
where the interpolation coefficient λ_x = (x - x_0)/(x_1 - x_0), d_x is the disparity of the target point at abscissa x, and the abscissas and disparities of the two stable points are (x_0, d_0) and (x_1, d_1).

Claims (2)

1. An edge-based local stereo matching method, characterized by comprising the following steps:
1) compute an initial disparity map from the input left and right images with a local stereo matching algorithm;
2) sample grid points on the disparity map and delete the unstable ones to obtain stable grid points, an unstable point being a grid point whose disparity differs by more than a threshold from those of more than a set number of surrounding grid points;
3) perform Canny edge detection on the input left image to obtain edge points and reject the isolated ones to obtain stable edge points, an isolated edge point being one whose eight-neighborhood contains fewer than a set number of edge points;
4) scan every column containing stable grid points by column, all points between two stable points being column interpolation points; when there is no stable edge point between two adjacent stable grid points in a column, the disparities of the column interpolation points between them are the linear interpolation of the two stable grid-point disparities; when there is a stable edge point between two adjacent stable grid points in a column, the disparities of the column interpolation points between them are the piecewise linear interpolation of the stable grid-point and stable edge-point disparities;
5) take the stable grid points, column interpolation points and stable edge points as the stable point set and scan the image row by row; the disparity of every unstable point outside the stable point set is obtained by linear interpolation between the two nearest stable-point disparities to its left and right in the same row.
2. The edge-based local stereo matching method of claim 1, characterized in that the initial disparity map is computed as follows:
the input left and right images are images rectified according to the binocular camera parameters; compute the gradients of the input left and right images respectively, use the gradients as the matching cost of the initial disparity, take a fixed window around each pixel, and aggregate the matching costs inside the window to obtain the aggregated cost of each pixel;
take the left image as reference and find the matching point in the right image according to the winner-take-all (WTA) strategy to obtain the left disparity map; take the right image as reference and find the matching point in the left image according to the WTA strategy to obtain the right disparity map;
apply a left-right consistency check to the left and right disparity maps, effectively correcting the disparities of occluded points, to obtain the initial disparity map.
CN201610803492.7A 2016-09-06 2016-09-06 Edge-based local stereo matching method Active CN106408596B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610803492.7A CN106408596B (en) 2016-09-06 2016-09-06 Edge-based local stereo matching method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610803492.7A CN106408596B (en) 2016-09-06 2016-09-06 Edge-based local stereo matching method

Publications (2)

Publication Number Publication Date
CN106408596A true CN106408596A (en) 2017-02-15
CN106408596B CN106408596B (en) 2019-06-21

Family

ID=57998529

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610803492.7A Active CN106408596B (en) 2016-09-06 2016-09-06 Edge-based local stereo matching method

Country Status (1)

Country Link
CN (1) CN106408596B (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102999913A (en) * 2012-11-29 2013-03-27 清华大学深圳研究生院 Local three-dimensional matching method based on credible point spreading
CN103489183A (en) * 2012-10-17 2014-01-01 北京大学深圳研究生院 Local stereo matching method based on edge segmentation and seed point
US20140003704A1 (en) * 2012-06-27 2014-01-02 Imec Taiwan Co. Imaging system and method
CN105513064A (en) * 2015-12-03 2016-04-20 浙江万里学院 Image segmentation and adaptive weighting-based stereo matching method
CN105551035A (en) * 2015-12-09 2016-05-04 深圳市华和瑞智科技有限公司 Stereoscopic vision matching method based on weak edge and texture classification


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108010075A (en) * 2017-11-03 2018-05-08 华南理工大学 A kind of sectional perspective matching process based on multiple features combining
CN108010075B (en) * 2017-11-03 2021-10-26 华南理工大学 Local stereo matching method based on multi-feature combination
CN109089100A (en) * 2018-08-13 2018-12-25 西安理工大学 A kind of synthetic method of binocular tri-dimensional video
CN109089100B (en) * 2018-08-13 2020-10-23 西安理工大学 Method for synthesizing binocular stereo video
CN110223257A (en) * 2019-06-11 2019-09-10 北京迈格威科技有限公司 Obtain method, apparatus, computer equipment and the storage medium of disparity map
CN110378915A (en) * 2019-07-24 2019-10-25 西南石油大学 A kind of climbing robot obstacle detection method based on binocular vision
CN113112529A (en) * 2021-03-08 2021-07-13 武汉市土地利用和城市空间规划研究中心 Dense matching mismatching point processing method based on region adjacent point search
CN115063619A (en) * 2022-08-18 2022-09-16 北京中科慧眼科技有限公司 Cost aggregation method and system based on binocular stereo matching algorithm
CN116943995A (en) * 2023-09-20 2023-10-27 深圳正实自动化设备有限公司 High-precision dispensing machine evaluation method and system based on data analysis
CN116943995B (en) * 2023-09-20 2023-12-08 深圳正实自动化设备有限公司 High-precision dispensing machine evaluation method and system based on data analysis

Also Published As

Publication number Publication date
CN106408596B (en) 2019-06-21

Similar Documents

Publication Publication Date Title
CN106408596B (en) Edge-based local stereo matching method
CN108648161B (en) Binocular vision obstacle detection system and method of asymmetric kernel convolution neural network
US6671399B1 (en) Fast epipolar line adjustment of stereo pairs
CN107578430B (en) Stereo matching method based on self-adaptive weight and local entropy
CN108010081B (en) RGB-D visual odometer method based on Census transformation and local graph optimization
CN104820991B (en) A kind of multiple soft-constraint solid matching method based on cost matrix
CN108596975B (en) Stereo matching algorithm for weak texture region
CN105374039B (en) Monocular image depth information method of estimation based on contour acuity
CN106210449B (en) Multi-information fusion frame rate up-conversion motion estimation method and system
CN106651897B (en) Parallax correction method based on super-pixel segmentation
CN102892021B (en) New method for synthesizing virtual viewpoint image
Hua et al. Extended guided filtering for depth map upsampling
CN102831601A (en) Three-dimensional matching method based on union similarity measure and self-adaptive support weighting
CN111105460B (en) RGB-D camera pose estimation method for three-dimensional reconstruction of indoor scene
CN104200453B (en) Parallax image correcting method based on image segmentation and credibility
CN108776989A (en) Low texture plane scene reconstruction method based on sparse SLAM frames
CN104517317A (en) Three-dimensional reconstruction method of vehicle-borne infrared images
CN101765019B (en) Stereo matching algorithm for motion blur and illumination change image
CN113744315B (en) Semi-direct vision odometer based on binocular vision
CN105787932B (en) Solid matching method based on segmentation Cross-Tree
CN110516639B (en) Real-time figure three-dimensional position calculation method based on video stream natural scene
CN108470356A (en) A kind of target object fast ranging method based on binocular vision
CN103826032A (en) Depth map post-processing method
CN104079800A (en) Shaking preventing method for video image in video surveillance
US20130083993A1 (en) Image processing device, image processing method, and program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant