CN104867133A - Quick stepped stereo matching method - Google Patents

Quick stepped stereo matching method

Info

Publication number
CN104867133A
CN104867133A (application CN201510216619.0A)
Authority
CN
China
Prior art keywords
segmented region
matching
pixel
image
disparity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510216619.0A
Other languages
Chinese (zh)
Other versions
CN104867133B (en)
Inventor
陈华
张志娟
刘刚
胡春海
刘斌
柳海潮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yanshan University
Original Assignee
Yanshan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yanshan University
Priority to CN201510216619.0A
Publication of CN104867133A
Application granted
Publication of CN104867133B
Status: Expired - Fee Related
Anticipated expiration

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

A fast stepwise stereo matching method comprises three stages: gray-scale transformation based on local texture characteristics, disparity constraint based on color image segmentation, and matching cost computation based on a fixed window. First, a gray-scale transformation guided by the local texture characteristics of the image is applied to highlight its structural and gray-scale features. The image is then segmented with the mean-shift algorithm; the matching cost of each segmented region is computed using the region of arbitrary size and shape as the support window, forming a disparity constraint based on image segmentation. Finally, stereo matching cost computation based on a fixed window is performed on the image, yielding the initial disparity map. The method improves matching accuracy and matching efficiency, reduces the amount of computation, and narrows the disparity search range.

Description

Fast stepwise stereo matching method
Technical field
The present invention relates to the fields of computer vision and digital image processing, and in particular to a fast stepwise stereo matching method.
Background technology
At present, stereo matching is both a foundation of computer vision theory and its applications and one of its more difficult problems. The goal of stereo matching is to find corresponding points between images taken from different viewpoints; it is widely used in remote sensing, medical imaging, three-dimensional reconstruction, robot vision, image measurement and many other fields, and scholars at home and abroad have studied it in depth. Scharstein and Szeliski surveyed and evaluated representative stereo matching algorithms, decomposed them into four major steps, and divided existing algorithms into two classes: local algorithms and global algorithms. In general, global algorithms achieve higher accuracy but are computationally heavy and require complicated parameter settings; local algorithms are efficient and easy to implement, but the long-standing problem of how to adaptively determine the size and shape of the support window makes it hard for them to reach high accuracy. Many representative local matching algorithms have been proposed for this problem; they fall mainly into the following three classes:
The first class selects the optimal window as the support window from several windows given in advance. Such algorithms improve matching accuracy to some extent and suppress the foreground-fattening effect, but because the candidate window sizes and shapes are fixed, they lack flexibility and cannot adapt to varied image structures; the mismatch rate remains high and disparity edges are not sharp.
The second class weights the pixels inside a support window of fixed size and shape, or adaptively chooses the size and shape of the support window. Although such algorithms greatly reduce matching ambiguity and produce disparity results comparable to global optimization, their computation is complex and expensive, so they lose the efficiency advantage of local algorithms.
The third class improves the similarity measure used for matching. Zabih proposed an algorithm that performs stereo matching with non-parametric transforms, and many scholars have since improved non-parametric-transform-based stereo matching with good results. Non-parametric transforms convert the signal robustly, tolerate noise and amplitude distortion, and balance the influence of illumination on pixel gray values, but they simply transform the raw gray values and do not exploit the cross-correlation information between pixels.
In summary, local matching algorithms struggle to balance matching accuracy and matching efficiency, so improving matching speed while guaranteeing stereo matching accuracy is of great importance.
Summary of the invention
The object of the present invention is to provide a fast stepwise stereo matching method, based on local texture characteristics and color image segmentation, that balances matching accuracy and matching efficiency.
To achieve the above object, the following technical scheme is adopted. The matching method of the present invention comprises the following steps:
(1) Input two color images of the same scene taken from different angles, taking one image as the reference image A1 and the other as the target image B1; both images are assumed to be epipolar rectified;
(2) Apply the gray-scale transformation based on local texture characteristics to the reference image A1 and the target image B1 respectively, obtaining the gray-level image pair A2 and B2;
(3) Segment the reference image A1 by color with the mean-shift algorithm. Using each segmented region of arbitrary size and shape as the support window, stereo-match the gray-level image pair A2 and B2 with the zero-mean normalized cross-correlation measure and compute the matching cost of each region. Classify each color-segmented region by its matching cost: when the region matching cost is greater than or equal to 0.5, the region is classified as a reliable matching region and labeled 1; when it is less than 0.5, the region is classified as an unreliable matching region and labeled 0. Estimate the disparity range of the pixels in each region from the classification result, forming the disparity constraint based on color image segmentation;
(4) Using the correlation-type measure normalized cross-correlation (NCC) and the distance-type measure sum of absolute differences (SAD) respectively, stereo-match the gray-level image pair A2 and B2 with a fixed window, limiting the disparity search range with the segmentation-based disparity constraint, and obtain the disparity map of the reference image A1.
Compared with the prior art, the present invention has the following advantages:
1. The gray-scale transformation based on local texture characteristics makes full use of the local texture of the image and improves matching accuracy, especially in low-texture regions.
2. The disparity constraint based on color image segmentation greatly reduces the disparity search range during stereo matching, which reduces the amount of computation and improves matching efficiency; it also improves the matching accuracy of occluded regions.
3. Even without disparity optimization or adaptive window selection, a good disparity map is obtained whether the correlation-type measure NCC or the distance-type measure SAD is chosen, so the method accommodates a wide range of matching measures (a sketch of the two measures follows this list).
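As an illustration of the two measure types named above, here is a minimal Python sketch of NCC and SAD computed on a pair of equally sized gray-level windows. The function names are illustrative; the patent itself only names the measures:

```python
import numpy as np

def ncc(a, b):
    """Correlation-type measure: normalized cross-correlation of two windows."""
    a = a.astype(np.float64).ravel()
    b = b.astype(np.float64).ravel()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return (a * b).sum() / denom if denom else 0.0

def sad(a, b):
    """Distance-type measure: sum of absolute gray-value differences."""
    return np.abs(a.astype(np.float64) - b.astype(np.float64)).sum()
```

NCC is maximized over candidate disparities, SAD is minimized; both plug into the same fixed-window search of step (4).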
Accompanying drawing explanation
Fig. 1 is the flowchart of the method of the present invention.
Fig. 2a) is the reference image A1 in Embodiment 1.
Fig. 2b) is the target image B1 in Embodiment 1.
Fig. 3a) is the gray-level image A2 obtained from the reference image A1 in Embodiment 1 after the gray-scale transformation based on local texture characteristics.
Fig. 3b) is the gray-level image A3 obtained from the reference image A1 in Embodiment 1 after an ordinary gray-scale conversion.
Fig. 3c) is the gray-level image B2 obtained from the target image B1 in Embodiment 1 after the gray-scale transformation based on local texture characteristics.
Fig. 3d) is the gray-level image B3 obtained from the target image B1 in Embodiment 1 after an ordinary gray-scale conversion.
Fig. 4 is the image obtained after color segmentation of the reference image A1 in Embodiment 1.
Fig. 5a) is the initial disparity map obtained with the distance-type measure SAD in Embodiment 1.
Fig. 5b) is the initial disparity map obtained with the correlation-type measure NCC in Embodiment 1.
Embodiment
The present invention is described further below with reference to the accompanying drawings:
The flowchart of the present invention is shown in Fig. 1. The matching method of the present invention comprises the following steps:
(1) As shown in Fig. 2, two color images of the same scene taken from different angles are acquired and input to the matching system: Fig. 2a) serves as the reference image A1 and Fig. 2b) as the target image B1; both images are epipolar rectified;
(2) The gray-scale transformation based on local texture characteristics is applied to the reference image A1 and the target image B1 respectively, obtaining the gray-level image pair A2 and B2, so that the local texture characteristics of the image are used to the fullest to highlight its structural and gray-scale features while performing the transformation. As shown in Fig. 3, Fig. 3a) is the gray-level image A2 obtained from the reference image A1 after the transformation, and Fig. 3c) is the gray-level image B2 obtained from the target image B1 after the transformation. The color images are first converted into gray-level images A3 and B3, and the gray-scale transformation based on local texture characteristics is then carried out on that basis. The concrete computation process of the transformation is as follows:
First, a transformation window W of size m × n is taken centered on the target pixel p, the local texture contrast Δm of all pixels in this window is computed, and the gray value I(p) of the center pixel is changed to I(p) + Δm; here m is the length of the window and n its width.
Then each neighborhood pixel is compared with the transformed gray value of the center pixel, all pixel gray values larger than that of the center pixel are summed, and the sum is assigned to the center pixel.
The local texture contrast Δm expresses the texture relationship between pixels. It is computed from the pixels in the window W: the gray values larger than I(p) + Δm are summed as sum1, their number being n1; the gray values smaller than I(p) + Δm are summed as sum2, their number being n2; then: [formula for Δm not reproduced in the source text]
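Since the defining formula for Δm is not reproduced in this text, the following minimal Python sketch stands it in with the window's mean absolute deviation, one common local-contrast measure; the function name and this choice of Δm are assumptions, not taken from the patent:

```python
import numpy as np

def local_texture_transform(gray, m=3, n=3):
    """Gray-scale transform guided by local texture (sketch).

    For each pixel p: compute a local contrast delta_m over the
    m x n window W, shift the center to I(p) + delta_m, then assign
    the sum of the window gray values brighter than the shifted
    center to the center pixel, as described in the text.
    """
    h, w = gray.shape
    out = gray.astype(np.float64).copy()
    rm, rn = m // 2, n // 2
    for y in range(rm, h - rm):
        for x in range(rn, w - rn):
            win = gray[y - rm:y + rm + 1, x - rn:x + rn + 1].astype(np.float64)
            delta_m = np.abs(win - win.mean()).mean()  # stand-in for the patent's delta_m
            center = float(gray[y, x]) + delta_m       # I(p) -> I(p) + delta_m
            larger = win[win > center]                 # neighbors brighter than the shifted center
            out[y, x] = larger.sum() if larger.size else center
    # rescale to 8-bit range for display (an assumption; the patent does not specify)
    out = (out - out.min()) / (out.max() - out.min() + 1e-12) * 255.0
    return out.astype(np.uint8)
```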
(3) The reference image A1 is segmented by color with the mean-shift algorithm; the segmentation result is shown in Fig. 4. Then, using each segmented region of arbitrary size and shape as the support window, the gray-level image pair A2 and B2 is stereo-matched with the zero-mean normalized cross-correlation measure and the matching cost of each region is computed. The disparity search range of the pixels in each color-segmented region is decided from the value of the region matching cost: when the region matching cost is greater than or equal to 0.5, the region is classified as a reliable matching region and labeled 1; when it is less than 0.5, the region is classified as an unreliable matching region and labeled 0. The disparity range of the pixels in each region is then estimated from the classification result, forming the disparity constraint based on color image segmentation. The concrete computation process is as follows:
A2 and B2 are respectively the reference and target images after the gray-scale transformation based on local texture characteristics. For any pair of matching regions (s_p, s_q), define the region matching cost under a hypothesized disparity d as E(s_p, s_q, d):
E(s_p, s_q, d) = C_{zncc}(p, q, d)

C_{zncc}(p, q, d) = \frac{\sum_{p \in s_p,\, q \in s_q} (p - \bar{p})(q - \bar{q})}{\sqrt{\sum_{p \in s_p} (p - \bar{p})^2 \, \sum_{q \in s_q} (q - \bar{q})^2}}
where s_p is any segmented region of the reference image after color segmentation; s_q is the corresponding region in the target image, i.e. s_p = s_q − d; d is the hypothesized disparity value; p is any pixel in region s_p; q is the corresponding pixel in region s_q, i.e. p ∈ s_p, q ∈ s_q and p = q − d; C_zncc(p, q, d) is the similarity of region s_p and its corresponding region s_q, estimated with the zero-mean normalized cross-correlation (ZNCC) measure and taken as the region matching cost E(s_p, s_q, d); p̄ is the mean of all pixel gray values in region s_p; q̄ is the mean of all pixel gray values in region s_q;
The region matching cost is maximized with a local winner-take-all (WTA) optimization to obtain the disparity value d_s, which is taken as the estimated disparity of all pixels in the region, that is:
d_s = \arg\max_{d} E(s_p, s_q, d), \quad d \in [d_{min}, d_{max}]

where d_{min} is the minimum disparity value and d_{max} the maximum disparity value;
Computing the region matching cost requires color segmentation of the reference image only. Color segmentation rests on a smooth-surface assumption: within a region of similar pixel colors the depth values are generally the same, such regions usually describe planar patches of the scene, and disparity jumps occur at region edges. Under this assumption, all pixels in the same segmented region share the same matching cost E(s_p, s_q, d) and hence the same best disparity d_s.
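As a minimal sketch of this region-wise cost and the winner-take-all step, the following Python assumes the reference image has already been segmented into an integer label image `labels` (for example via mean shift, such as OpenCV's cv2.pyrMeanShiftFiltering followed by connected components); the function and its argument names are illustrative, not from the patent:

```python
import numpy as np

def segment_wta_disparity(ref_gray, tgt_gray, labels, d_min, d_max):
    """Per-segment ZNCC matching cost with winner-take-all (sketch).

    Returns, per segment label, the best disparity d_s and the best
    region matching cost E(s_p, s_q, d_s).
    """
    ref = ref_gray.astype(np.float64)
    tgt = tgt_gray.astype(np.float64)
    w = ref.shape[1]
    best_d, best_e = {}, {}
    for lab in np.unique(labels):
        ys, xs = np.nonzero(labels == lab)     # pixels of region s_p
        best_d[lab], best_e[lab] = d_min, -1.0
        for d in range(d_min, d_max + 1):
            xs_t = xs + d                      # p = q - d, so the target pixel sits at x + d
            ok = xs_t < w
            if ok.sum() < 2:
                continue
            a = ref[ys[ok], xs[ok]]
            b = tgt[ys[ok], xs_t[ok]]
            a0, b0 = a - a.mean(), b - b.mean()
            denom = np.sqrt((a0 ** 2).sum() * (b0 ** 2).sum())
            if denom == 0:
                continue
            e = (a0 * b0).sum() / denom        # ZNCC, the region matching cost E(s_p, s_q, d)
            if e > best_e[lab]:
                best_e[lab], best_d[lab] = e, d
    return best_d, best_e
```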
The region matching cost E(s_p, s_q, d) lies between −1 and 1 and expresses the degree of similarity of two regions: the more similar the regions, the higher the matching cost, i.e. the better the match. The matching results are therefore classified according to the region matching cost, and each class is given a constraint criterion that forms a disparity constraint limiting the disparity search range of the next matching step.
The region matching cost results are classified. The classification rule is: when the region matching cost E(s_p, s_q, d) ≥ 0.5, region s_p is classified as a reliable matching region and labeled 1; when E(s_p, s_q, d) < 0.5, region s_p is classified as an unreliable matching region and labeled 0; here s_q is the region corresponding to s_p;

s_p = \begin{cases} 1, & E(s_p, s_q, d) \geq 0.5 \\ 0, & E(s_p, s_q, d) < 0.5 \end{cases}
The disparity range of the pixels in each region is estimated from the classification result to form the disparity constraint of the stepwise matching; the constraint criterion is defined as:
d(s_p) \in [\mathrm{down}, \mathrm{up}]

where down is the minimum and up the maximum of the disparity range after the disparity constraint is applied;
When s_p = 1,

\mathrm{down} = \max(0,\, d_s - 2), \quad \mathrm{up} = \min(d_{max},\, d_s + 2)
When s_p = 0,

\mathrm{down} = \max(0,\, d_s - 0.5\, d_{max}), \quad \mathrm{up} = \min(d_{max},\, d_s + 0.5\, d_{max})
where d(s_p) is the disparity value range of all pixels in region s_p.
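The classification and constraint criterion above can be turned into per-pixel search bounds. The following sketch consumes the outputs of the previous sketch; names are again illustrative:

```python
import numpy as np

def disparity_constraint(labels, best_d, best_e, d_max):
    """Per-pixel disparity search bounds from the region classification (sketch).

    Reliable regions (E >= 0.5, labeled 1) search d_s +/- 2;
    unreliable regions (labeled 0) search d_s +/- 0.5 * d_max,
    both clamped to [0, d_max].
    """
    down = np.zeros(labels.shape, dtype=np.int32)
    up = np.full(labels.shape, d_max, dtype=np.int32)
    for lab, ds in best_d.items():
        mask = labels == lab
        if best_e[lab] >= 0.5:                 # reliable matching region
            lo, hi = max(0, ds - 2), min(d_max, ds + 2)
        else:                                  # unreliable matching region
            lo = max(0, int(ds - 0.5 * d_max))
            hi = min(d_max, int(ds + 0.5 * d_max))
        down[mask], up[mask] = lo, hi
    return down, up
```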
(4) Using the correlation-type measure normalized cross-correlation (NCC) and the distance-type measure sum of absolute differences (SAD) respectively, the gray-level image pair A2 and B2 is stereo-matched with a fixed window, the disparity search range being limited by the segmentation-based disparity constraint, and the disparity map of the reference image A1 is obtained. As shown in Fig. 5, Fig. 5a) is the initial disparity map obtained with the distance-type measure SAD and Fig. 5b) the initial disparity map obtained with the correlation-type measure NCC.
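A minimal sketch of this fixed-window matching under the constraint, using SAD as the measure (NCC would substitute in the same slot); the window size and function names are assumptions:

```python
import numpy as np

def fixed_window_sad(ref_gray, tgt_gray, down, up, win=5):
    """Fixed-window SAD matching with the segmentation-based
    disparity constraint limiting the search range (sketch)."""
    ref = ref_gray.astype(np.float64)
    tgt = tgt_gray.astype(np.float64)
    h, w = ref.shape
    r = win // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(r, h - r):
        for x in range(r, w - r):
            patch = ref[y - r:y + r + 1, x - r:x + r + 1]
            best_cost, best_d = np.inf, int(down[y, x])
            for d in range(int(down[y, x]), int(up[y, x]) + 1):  # constrained search range
                xt = x + d                                       # q = p + d, as in the region matching
                if xt + r >= w:
                    break
                cand = tgt[y - r:y + r + 1, xt - r:xt + r + 1]
                cost = np.abs(patch - cand).sum()                # SAD matching cost
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```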
The above embodiment only describes a preferred mode of the present invention and does not limit its scope; without departing from the design spirit of the present invention, all variations and improvements that a person of ordinary skill in the art makes to the technical scheme of the present invention shall fall within the protection scope determined by the claims of the present invention.

Claims (4)

1. A fast stepwise stereo matching method, characterized in that the matching method comprises the following steps:
(1) Input two color images of the same scene taken from different angles, taking one image as the reference image A1 and the other as the target image B1; both images are assumed to be epipolar rectified;
(2) Apply the gray-scale transformation based on local texture characteristics to the reference image A1 and the target image B1 respectively, obtaining the gray-level image pair A2 and B2;
(3) Segment the reference image A1 by color with the mean-shift algorithm. Using each segmented region of arbitrary size and shape as the support window, stereo-match the gray-level image pair A2 and B2 with the zero-mean normalized cross-correlation measure and compute the matching cost of each region. Classify each color-segmented region by its matching cost: when the region matching cost is greater than or equal to 0.5, the region is classified as a reliable matching region and labeled 1; when it is less than 0.5, the region is classified as an unreliable matching region and labeled 0. Estimate the disparity range of the pixels in each region from the classification result, forming the disparity constraint based on color image segmentation;
(4) Using the correlation-type measure normalized cross-correlation (NCC) and the distance-type measure sum of absolute differences (SAD) respectively, stereo-match the gray-level image pair A2 and B2 with a fixed window, limiting the disparity search range with the segmentation-based disparity constraint, and obtain the disparity map of the reference image A1.
2. The fast stepwise stereo matching method according to claim 1, characterized in that in step (2) the gray-scale transformation means transforming the image using the local texture contrast, which expresses the texture relationship between pixels; the concrete computation process is as follows:
First, a transformation window W of size m × n is taken centered on the target pixel p, the local texture contrast Δm of all pixels in this window is computed, and the gray value I(p) of the center pixel is changed to I(p) + Δm; here m is the length of the window and n its width.
Then each neighborhood pixel is compared with the transformed gray value of the center pixel, all pixel gray values larger than that of the center pixel are summed, and the sum is assigned to the center pixel.
3. The fast stepwise stereo matching method according to claim 1, characterized in that the concrete computation process of step (3) is as follows:
For any pair of matching regions (s_p, s_q), define the region matching cost under a hypothesized disparity d as E(s_p, s_q, d):
E(s_p, s_q, d) = C_{zncc}(p, q, d)

C_{zncc}(p, q, d) = \frac{\sum_{p \in s_p,\, q \in s_q} (p - \bar{p})(q - \bar{q})}{\sqrt{\sum_{p \in s_p} (p - \bar{p})^2 \, \sum_{q \in s_q} (q - \bar{q})^2}}
where s_p is any segmented region of the reference image after color segmentation; s_q is the corresponding region in the target image, i.e. s_p = s_q − d; d is the hypothesized disparity value; p is any pixel in region s_p; q is the corresponding pixel in region s_q, i.e. p ∈ s_p, q ∈ s_q and p = q − d; C_zncc(p, q, d) is the similarity of region s_p and its corresponding region s_q, estimated with the zero-mean normalized cross-correlation (ZNCC) measure and taken as the region matching cost E(s_p, s_q, d); p̄ is the mean of all pixel gray values in region s_p; q̄ is the mean of all pixel gray values in region s_q;
The region matching cost is maximized with a local winner-take-all optimization to obtain the disparity value d_s, which is taken as the estimated disparity of all pixels in the region, that is:
d_s = \arg\max_{d} E(s_p, s_q, d), \quad d \in [d_{min}, d_{max}]

where d_{min} is the minimum disparity value and d_{max} the maximum disparity value;
The region matching cost results are classified. The classification rule is: when the region matching cost E(s_p, s_q, d) ≥ 0.5, region s_p is classified as a reliable matching region and labeled 1; when E(s_p, s_q, d) < 0.5, region s_p is classified as an unreliable matching region and labeled 0;

s_p = \begin{cases} 1, & E(s_p, s_q, d) \geq 0.5 \\ 0, & E(s_p, s_q, d) < 0.5 \end{cases}
The disparity range of the pixels in each region is estimated from the classification result to form the disparity constraint of the stepwise matching; the constraint criterion is defined as:

d(s_p) \in [\mathrm{down}, \mathrm{up}]

where down is the minimum and up the maximum of the disparity range after the disparity constraint is applied;
When s_p = 1,

\mathrm{down} = \max(0,\, d_s - 2), \quad \mathrm{up} = \min(d_{max},\, d_s + 2)
When s_p = 0,

\mathrm{down} = \max(0,\, d_s - 0.5\, d_{max}), \quad \mathrm{up} = \min(d_{max},\, d_s + 0.5\, d_{max})
where d(s_p) is the disparity value range of all pixels in region s_p.
4. The fast stepwise stereo matching method according to claim 2, characterized in that the local texture contrast Δm expresses the texture relationship between pixels and is computed as follows:
In the window W, the gray values larger than I(p) + Δm are summed as sum1, their number being n1; the gray values smaller than I(p) + Δm are summed as sum2, their number being n2; then: [formula for Δm not reproduced in the source text]
CN201510216619.0A 2015-04-30 2015-04-30 A fast stepwise stereo matching method Expired - Fee Related CN104867133B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510216619.0A CN104867133B (en) 2015-04-30 2015-04-30 A fast stepwise stereo matching method

Publications (2)

Publication Number Publication Date
CN104867133A true CN104867133A (en) 2015-08-26
CN104867133B CN104867133B (en) 2017-10-20

Family

ID=53912948

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510216619.0A Expired - Fee Related CN104867133B (en) 2015-04-30 2015-04-30 A fast stepwise stereo matching method

Country Status (1)

Country Link
CN (1) CN104867133B (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105574875A (en) * 2015-12-18 2016-05-11 燕山大学 Fish-eye image dense stereo algorithm based on polar curve geometry
CN106780711A (en) * 2015-11-18 2017-05-31 深圳先进技术研究院 A kind of solid matching method and device for integrated chip
CN107220997A (en) * 2017-05-22 2017-09-29 成都通甲优博科技有限责任公司 A kind of solid matching method and system
CN107884164A (en) * 2017-09-25 2018-04-06 广西电网有限责任公司电力科学研究院 A kind of breaker spring method for testing performance of NCC SAR algorithms
CN108322724A (en) * 2018-02-06 2018-07-24 上海兴芯微电子科技有限公司 Image solid matching method and binocular vision equipment
CN108513120A (en) * 2017-05-18 2018-09-07 苏州纯青智能科技有限公司 A kind of three-dimensional image matching method based on left and right sight
CN109461181A (en) * 2018-10-17 2019-03-12 北京华捷艾米科技有限公司 Depth image acquisition method and system based on pattern light
CN109840894A (en) * 2019-01-30 2019-06-04 湖北亿咖通科技有限公司 Disparity map refine method, apparatus and storage medium
CN110942102A (en) * 2019-12-03 2020-03-31 武汉大学 Probability relaxation epipolar matching method and system
CN112163622A (en) * 2020-09-30 2021-01-01 山东建筑大学 Overall situation and local fusion constrained line segment feature matching method for aviation wide-baseline stereopair
CN112765390A (en) * 2019-10-21 2021-05-07 南京深视光点科技有限公司 Stereo matching method with double search intervals
DE102022125357A1 (en) 2022-09-30 2024-04-04 Carl Zeiss Industrielle Messtechnik Gmbh Method and surface inspection system as well as computer program and computer-readable storage medium for measuring and/or inspecting surfaces of an object

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140241582A1 (en) * 2013-02-26 2014-08-28 Spinella Ip Holdings, Inc. Digital processing method and system for determination of object occlusion in an image sequence
CN104091339A (en) * 2014-07-17 2014-10-08 清华大学深圳研究生院 Rapid image three-dimensional matching method and device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140241582A1 (en) * 2013-02-26 2014-08-28 Spinella Ip Holdings, Inc. Digital processing method and system for determination of object occlusion in an image sequence
CN104091339A (en) * 2014-07-17 2014-10-08 清华大学深圳研究生院 Rapid image three-dimensional matching method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
平兆娜: "Research on Stereo Matching Algorithms Based on an Improved Non-parametric Transform Measure", China Master's Theses Full-text Database, Information Science and Technology Series *
殷虎 et al.: "A Stereo Matching Algorithm Based on Color Image Segmentation", Infrared Technology *

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106780711A (en) * 2015-11-18 2017-05-31 深圳先进技术研究院 A kind of solid matching method and device for integrated chip
CN106780711B (en) * 2015-11-18 2020-05-26 深圳先进技术研究院 Stereo matching method and device
CN105574875B (en) * 2015-12-18 2019-02-01 燕山大学 A kind of fish eye images dense stereo matching process based on polar geometry
CN105574875A (en) * 2015-12-18 2016-05-11 燕山大学 Fish-eye image dense stereo algorithm based on polar curve geometry
CN108513120A (en) * 2017-05-18 2018-09-07 苏州纯青智能科技有限公司 A kind of three-dimensional image matching method based on left and right sight
CN107220997A (en) * 2017-05-22 2017-09-29 成都通甲优博科技有限责任公司 A kind of solid matching method and system
CN107220997B (en) * 2017-05-22 2020-12-25 成都通甲优博科技有限责任公司 Stereo matching method and system
CN107884164B (en) * 2017-09-25 2019-07-26 广西电网有限责任公司电力科学研究院 A kind of breaker spring method for testing performance of NCC-SAR algorithm
CN107884164A (en) * 2017-09-25 2018-04-06 广西电网有限责任公司电力科学研究院 A kind of breaker spring method for testing performance of NCC SAR algorithms
CN108322724B (en) * 2018-02-06 2019-08-16 上海兴芯微电子科技有限公司 Image solid matching method and binocular vision equipment
CN108322724A (en) * 2018-02-06 2018-07-24 上海兴芯微电子科技有限公司 Image solid matching method and binocular vision equipment
CN109461181A (en) * 2018-10-17 2019-03-12 北京华捷艾米科技有限公司 Depth image acquisition method and system based on pattern light
CN109461181B (en) * 2018-10-17 2020-10-27 北京华捷艾米科技有限公司 Depth image acquisition method and system based on speckle structured light
CN109840894A (en) * 2019-01-30 2019-06-04 湖北亿咖通科技有限公司 Disparity map refine method, apparatus and storage medium
CN112765390A (en) * 2019-10-21 2021-05-07 南京深视光点科技有限公司 Stereo matching method with double search intervals
CN110942102A (en) * 2019-12-03 2020-03-31 武汉大学 Probability relaxation epipolar matching method and system
CN110942102B (en) * 2019-12-03 2022-04-01 武汉大学 Probability relaxation epipolar matching method and system
CN112163622A (en) * 2020-09-30 2021-01-01 山东建筑大学 Overall situation and local fusion constrained line segment feature matching method for aviation wide-baseline stereopair
CN112163622B (en) * 2020-09-30 2022-07-05 山东建筑大学 Global and local fusion constrained aviation wide-baseline stereopair line segment matching method
DE102022125357A1 (en) 2022-09-30 2024-04-04 Carl Zeiss Industrielle Messtechnik Gmbh Method and surface inspection system as well as computer program and computer-readable storage medium for measuring and/or inspecting surfaces of an object

Also Published As

Publication number Publication date
CN104867133B (en) 2017-10-20

Similar Documents

Publication Publication Date Title
CN104867133A (en) Quick stepped stereo matching method
CN102074014B (en) Stereo matching method by utilizing graph theory-based image segmentation algorithm
US9361699B2 (en) Imaging system and method
US9237326B2 (en) Imaging system and method
CN104517317A (en) Three-dimensional reconstruction method of vehicle-borne infrared images
CN104574366A (en) Extraction method of visual saliency area based on monocular depth map
CN103458261B (en) Video scene variation detection method based on stereoscopic vision
CN103077531B (en) Based on the gray scale Automatic Target Tracking method of marginal information
CN104408460A (en) A lane line detecting and tracking and detecting method
CN104517095B (en) A kind of number of people dividing method based on depth image
CN104463870A (en) Image salient region detection method
EP2757529B1 (en) Systems and methods for 3D data based navigation using descriptor vectors
CN102945551B (en) Graph theory based three-dimensional point cloud data plane extracting method
Hua et al. Extended guided filtering for depth map upsampling
Hulik et al. Fast and accurate plane segmentation in depth maps for indoor scenes
CN104036479A (en) Multi-focus image fusion method based on non-negative matrix factorization
CN102521846B (en) Time-space domain motion segmentation and motion estimation method based on three-dimensional video
CN103714549A (en) Stereo image object segmentation method based on rapid local matching
CN102903111B (en) Large area based on Iamge Segmentation low texture area Stereo Matching Algorithm
CN104156957A (en) Stable and high-efficiency high-resolution stereo matching method
CN104599286A (en) Optical flow based feature tracking method and device
CN102542541B (en) Deep image post-processing method
CN104680553A (en) Bilateral filtration-based variable kernel function target tracing method
CN103985128A (en) Three-dimensional matching method based on color intercorrelation and self-adaptive supporting weight
CN113822352A (en) Infrared dim target detection method based on multi-feature fusion

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20171020