CN109816711B - Stereo matching method adopting adaptive structure - Google Patents

Stereo matching method adopting adaptive structure

Info

Publication number
CN109816711B
Authority
CN
China
Prior art keywords
pixel
parallax
adaptive structure
pixels
boundary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910010400.3A
Other languages
Chinese (zh)
Other versions
CN109816711A (en)
Inventor
傅予力
赖凯敏
陈维翔
周玉龙
向友君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology (SCUT)
Priority to CN201910010400.3A
Publication of CN109816711A
Application granted
Publication of CN109816711B
Legal status: Active (current)
Anticipated expiration

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a stereo matching method adopting an adaptive structure, which comprises the following steps: for each pixel of the binocular vision image an extension range is calculated, and from it an adaptive structure; the disparity value of each pixel is initialized randomly, the cost value is then calculated within the adaptive structure, and the disparity is propagated and randomly searched; after k iterations of propagation and random search, a boundary decision is made wherever a propagation fails: if the disparity difference between two pixels is smaller than a threshold, no boundary is decided to exist between them, and if it is larger than the threshold, a boundary is decided to exist, and in the subsequent cost calculation and disparity propagation each pixel ignores its recorded boundary pixels; when the iterations finish, the final disparity map is obtained through a left-right consistency check. The invention calculates costs and performs disparity propagation and search within the adaptive structure, and adds a boundary decision mechanism to the later iterations, thereby improving the accuracy and robustness of stereo matching.

Description

Stereo matching method adopting adaptive structure
Technical Field
The invention relates to the technical field of computer vision and digital image processing, in particular to a boundary-decision stereo matching method based on an adaptive structure and propagation search.
Background
Stereo matching is one of the research hotspots and difficulties of computer vision, with important applications in three-dimensional reconstruction, three-dimensional environment perception, stereo navigation, automatic driving, remote-sensing image analysis and so on. In recent years more and more researchers have turned their attention to the technical problem of stereo matching. The task of stereo matching is to find the matching correspondence between pixels of the two images, i.e. to calculate the disparity value of each pixel. When the disparity value of a pixel is calculated, the boundaries between objects in the image greatly affect the accuracy of the stereo matching result, and how to improve the accuracy of disparity values near object boundaries has become one of the difficult problems of stereo matching.
In the stereo matching calculation, a cost value is usually calculated for every pixel of the binocular image at the corresponding disparity, and this cost value is expected to reflect the cost information of the real scene; how the boundary information is used becomes a key factor influencing matching accuracy. The invention therefore applies an adaptive structure to the cost calculation and embeds the boundary information of the pixels into the stereo matching calculation.
Disclosure of Invention
The invention provides a binocular stereo matching method adopting an adaptive structure, and aims to achieve higher disparity accuracy and better robustness while keeping the calculation fast, so that the method is better suited to real scenes.
The purpose of the invention is achieved by the following technical scheme:
A boundary-decision stereo matching method based on an adaptive structure and propagation search comprises the following steps:
calculating the extension range of each pixel in the binocular vision image according to a variance-related threshold, and calculating the adaptive structure of each pixel in the binocular vision image from the extension range;
randomly initializing the pixel disparity values and calculating cost values within the adaptive structure;
propagating and randomly searching disparity values within the adaptive structure according to the calculated cost values, and iterating k times, where k is a preset number of iterations;
boundary decision: whenever a propagation fails, deciding that no boundary exists between the two pixels if their disparity difference is smaller than a threshold, and that a boundary exists if it is larger than the threshold; in the subsequent propagation process each pixel removes the pixels decided to lie across a boundary from its cost-calculation range and disparity-propagation range;
obtaining the final disparity map through a left-right consistency check.
Further, the process of calculating the extension range of each pixel in the binocular vision image according to the variance-related threshold is as follows:
For each pixel of the input binocular vision image the extension ranges in the horizontal and vertical directions are calculated separately and represented by a quadruple $(h_p^-, h_p^+, v_p^-, v_p^+)$, whose four elements are, in order, the horizontal-left, horizontal-right, vertical-up and vertical-down extension ranges. Each extension range is determined by comparing the differences of the three channel values between pixels with a threshold. The horizontal-right extension $h_p^+$, for example, is calculated as

$$h_p^+ = \max_{r \in [1, L]} \Big( r \prod_{i=1}^{r} \delta(p, p_i) \Big)$$

In the above formula p and $p_i$ are image pixels, $p = (x_p, y_p)$ and $p_i = (x_p + i, y_p)$, where $x_p$ and $y_p$ are the horizontal and vertical coordinates of pixel p, L is the preset maximum extension range, r is an integer in the range [1, L], and $\delta(p_1, p_2)$ is the indicator function

$$\delta(p_1, p_2) = \begin{cases} 1, & \max_{c \in \{R, G, B\}} |I_c(p_1) - I_c(p_2)| \le \tau \\ 0, & \text{otherwise} \end{cases}$$

In the above formula $I_c(p_1)$ and $I_c(p_2)$ are the intensity values of pixels $p_1$ and $p_2$ on channel c. The threshold τ is determined by the variance values of the three channels R, G and B of the image: it is obtained by applying the round-up operator ⌈·⌉ to a quantity formed from Var(R), Var(G) and Var(B) and an integer parameter t that adjusts the size of τ, where Var(R), Var(G) and Var(B) denote the variance values of the image pixels on the R, G and B channels respectively.
The remaining extension ranges $h_p^-$, $v_p^-$ and $v_p^+$ are calculated in the same way as $h_p^+$; they differ only in the definition of $p_i$: for $h_p^-$, $p_i = (x_p - i, y_p)$; for $v_p^-$, $p_i = (x_p, y_p - i)$; for $v_p^+$, $p_i = (x_p, y_p + i)$.
Further, the process of calculating the adaptive structure of each pixel in the binocular vision image from the extension range is as follows:
The horizontal range H(p) and the vertical range V(p) of a pixel are determined from the extension-range quadruple:

$$H(p) = \{(x, y) \mid x \in [x_p - h_p^-,\; x_p + h_p^+],\ y = y_p\}$$

$$V(p) = \{(x, y) \mid x = x_p,\ y \in [y_p - v_p^-,\; y_p + v_p^+]\}$$

In the above formulas x and y are the horizontal and vertical coordinates of a pixel within the horizontal range or the vertical range. The adaptive structure U(p) of pixel p is then calculated from H(p) and V(p) as

$$U(p) = \bigcup_{q \in V(p)} H(q)$$

In the above formula q is a pixel within the vertical range V(p) of pixel p and H(q) is the horizontal range of pixel q.
Further, the process of randomly initializing the pixel disparity values and calculating cost values within the adaptive structure is as follows:
First, the disparity value of every pixel in the binocular vision image is initialized randomly;
then the cost value of each pixel is calculated within the adaptive structure from its disparity value, using the formula

$$C(p, d_p) = \frac{1}{|U(p)|} \sum_{p' \in U(p)} f(x_{p'}, y_{p'}, d_p)$$

where $d_p$ is the disparity value of pixel p, U(p) is the adaptive structure based on pixel p, |U(p)| is the number of pixels in the adaptive structure U(p), p' is a pixel in U(p), and $f(x_{p'}, y_{p'}, d_p)$ is the cost value of the pixel with horizontal and vertical coordinates $x_{p'}$ and $y_{p'}$ at disparity $d_p$.
Further, the function $f(x_{p'}, y_{p'}, d_p)$ can be computed with any of the usual cost measures, including a cost based on the sum of absolute differences, a cost based on the sum of squared differences, a cost based on normalized cross-correlation, or a cost based on mutual information.
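For illustration only, a minimal per-pixel cost f based on the sum of absolute differences (one of the options listed above) might look as follows in Python; the function name, the truncation parameter and the assumption that the match in the right image lies at column x − d are ours, not the patent's.

```python
import numpy as np

def sad_cost(left, right, x, y, d, trunc=60.0):
    """Truncated sum of absolute color differences for one pixel (a sketch of f).

    left, right: H x W x 3 arrays of a rectified stereo pair.
    (x, y): horizontal and vertical coordinates of the pixel in the left image.
    d: candidate disparity; the match in the right image is assumed at column x - d.
    """
    w = left.shape[1]
    xr = int(round(x - d))
    if xr < 0 or xr >= w:                  # match falls outside the right image
        return trunc
    diff = np.abs(left[y, x].astype(np.float64) - right[y, xr].astype(np.float64))
    return float(min(diff.sum(), trunc))   # truncation limits the effect of outliers
```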
Further, the process of propagating and randomly searching disparity values within the adaptive structure according to the calculated cost values is as follows:
a) every pixel of the binocular vision image is traversed, and the disparity value of a pixel is propagated to the neighborhood pixels within its adaptive structure:

$$d_{p'} \leftarrow \begin{cases} d_p, & C(p', d_p) < C(p', d_{p'}) \\ d_{p'}, & \text{otherwise} \end{cases}$$

In the above formula p' is a neighborhood pixel of pixel p within the adaptive structure, and $d_{p'}$ and $d_p$ are the disparity values of pixels p' and p respectively;
b) a candidate disparity set is generated for each pixel of the binocular vision image by random search, the candidate disparities being generated as

$$d_p^{(i)} = d_p + w\,\alpha^i R_i, \qquad i \in \{1, 2, \ldots, i_{\max}\}$$

In the above formula i is the index of a generated candidate disparity, $i_{\max}$ is the number of candidate disparities in the preset candidate set, $R_i$ is a random number uniformly distributed in [-1, 1], w is the maximum search range and α is a fixed proportion parameter; according to the candidate disparity set, the disparity is updated as

$$d_p \leftarrow \arg\min_{d \in \{d_p, d_p^{(1)}, \ldots, d_p^{(i_{\max})}\}} C(p, d)$$

c) the processes a) and b) are iterated k times, where k is the preset number of iterations.
Further, the boundary decision process is as follows:
After the disparity propagation process and the random disparity search process have been iterated k times, a boundary decision is made for each pixel of the binocular vision image whenever a propagation fails: if $|d_p - d_{p'}| < \mu$ there is no boundary between pixels p and p', and if $|d_p - d_{p'}| \ge \mu$ a boundary exists between pixels p and p', where μ is an empirically chosen integer and $d_{p'}$ and $d_p$ are the disparity values of pixels p' and p respectively.
Further, the process of obtaining the final disparity map through the left-right consistency check is as follows:
After the boundary decision process, the left image and the right image of the binocular vision image each output a corresponding disparity map, and the two disparity maps are matched and corrected against each other to obtain the final disparity result.
Compared with the prior art, the invention has the following advantages and effects:
1. The extension range of each pixel is determined by a variance-related threshold, which in turn determines the adaptive structure of the pixel, so the matching result is more accurate and the method is more widely applicable.
2. The adaptive structure of each pixel is combined with the disparity propagation and search mechanism, so matching is faster.
3. A boundary decision process is introduced that optimizes the matching; it yields better boundaries in the disparity map and improves the accuracy and robustness of matching.
Drawings
FIG. 1 is a flow chart of the boundary-decision stereo matching method based on an adaptive structure and propagation search disclosed by the present invention;
FIG. 2 is a flow chart of the boundary decision process of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Examples
The embodiment discloses a stereo matching method adopting an adaptive structure, which specifically comprises the following steps:
Step one: a binocular vision image is input. The binocular vision image consists of a left image and a right image; after rectification, the corresponding pixels of the two images differ only in the horizontal direction by their disparity value.
Step two: the extension range of each pixel in the binocular vision image is calculated. First, for each pixel of the input binocular vision image, the extension ranges in the horizontal and vertical directions are calculated separately and represented by a quadruple $(h_p^-, h_p^+, v_p^-, v_p^+)$, whose four elements are, in order, the horizontal-left, horizontal-right, vertical-up and vertical-down extension ranges. Each extension range is determined by comparing the differences of the three channel values between pixels with a threshold. Taking the horizontal-right extension $h_p^+$ as an example,

$$h_p^+ = \max_{r \in [1, L]} \Big( r \prod_{i=1}^{r} \delta(p, p_i) \Big)$$

In the above formula p and $p_i$ are image pixels, $p = (x_p, y_p)$ and $p_i = (x_p + i, y_p)$, where $x_p$ and $y_p$ are the horizontal and vertical coordinates of pixel p, L is the preset maximum extension range, r is an integer in the range [1, L], and $\delta(p_1, p_2)$ is the indicator function

$$\delta(p_1, p_2) = \begin{cases} 1, & \max_{c \in \{R, G, B\}} |I_c(p_1) - I_c(p_2)| \le \tau \\ 0, & \text{otherwise} \end{cases}$$

In the above formula $I_c(p_1)$ and $I_c(p_2)$ are the intensity values of pixels $p_1$ and $p_2$ on channel c. The threshold τ is determined by the variance values of the three channels R, G and B of the image: it is obtained by applying the round-up operator ⌈·⌉ to a quantity formed from Var(R), Var(G) and Var(B) and an integer parameter t that adjusts the size of τ, where Var(R), Var(G) and Var(B) denote the variance values of the image pixels on the three channels. Because the variance-related threshold τ adapts itself to the pixel values of the image, a more accurate adaptive range is obtained for each pixel.
The remaining extension ranges $h_p^-$, $v_p^-$ and $v_p^+$ are calculated in the same way as $h_p^+$; they differ only in the definition of $p_i$: for $h_p^-$, $p_i = (x_p - i, y_p)$; for $v_p^-$, $p_i = (x_p, y_p - i)$; for $v_p^+$, $p_i = (x_p, y_p + i)$.
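As a rough illustration of step two, the following Python sketch grows the four arms of a pixel under the comparison-with-τ rule described above. The exact combination of the channel variances inside variance_threshold is an assumption (the patent only states that τ is derived from Var(R), Var(G), Var(B) and an integer parameter t, rounded up), and all names and default values are illustrative.

```python
import numpy as np

def variance_threshold(img, t=3):
    """tau derived from Var(R), Var(G), Var(B) and the integer parameter t.

    The patent only states that tau is obtained from the three channel variances,
    rounded up, and adjusted by t; the combination used here is an assumption.
    """
    var_rgb = img.reshape(-1, 3).var(axis=0)          # Var(R), Var(G), Var(B)
    return int(np.ceil(np.sqrt(var_rgb.mean()) / t))  # assumed combination

def extension_ranges(img, x, y, tau, L=17):
    """(left, right, up, down) extension ranges of pixel p = (x, y).

    An arm grows while every pixel on it differs from p by at most tau on all
    three channels (the indicator delta(p, p_i) stays 1), up to the maximum L.
    """
    h, w, _ = img.shape
    p = img[y, x].astype(np.int32)

    def grow(dx, dy):
        r = 0
        for i in range(1, L + 1):
            xi, yi = x + i * dx, y + i * dy
            if not (0 <= xi < w and 0 <= yi < h):
                break
            if np.max(np.abs(img[yi, xi].astype(np.int32) - p)) > tau:
                break                                  # delta(p, p_i) = 0: stop growing
            r = i
        return r

    return grow(-1, 0), grow(1, 0), grow(0, -1), grow(0, 1)
```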
Step three: the adaptive structure of each pixel in the binocular image is calculated from the extension ranges. From the extension-range quadruple the horizontal range H(p) and the vertical range V(p) of a pixel p can be determined as follows:

$$H(p) = \{(x, y) \mid x \in [x_p - h_p^-,\; x_p + h_p^+],\ y = y_p\}$$

$$V(p) = \{(x, y) \mid x = x_p,\ y \in [y_p - v_p^-,\; y_p + v_p^+]\}$$

In the above formulas x and y are the horizontal and vertical coordinates of a pixel within the horizontal range or the vertical range. The adaptive structure U(p) of pixel p can then be calculated from H(p) and V(p) as

$$U(p) = \bigcup_{q \in V(p)} H(q)$$

In the above formula q is a pixel within the vertical range V(p) of pixel p and H(q) is the horizontal range of pixel q.
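As a sketch of step three (under the assumption that the per-pixel arms have already been computed, for instance by the extension_ranges sketch above), the adaptive structure U(p) can be enumerated as the union of the horizontal ranges H(q) over the vertical range V(p); the names are illustrative.

```python
def adaptive_structure(arms, x, y):
    """Enumerate the pixels of U(p) for p = (x, y).

    arms: H x W x 4 integer array of (left, right, up, down) extension ranges
          for every pixel, e.g. filled with extension_ranges from the sketch above.
    Returns the list of (x, y) pixels in U(p) = union of H(q) for q in V(p).
    """
    up, down = arms[y, x][2], arms[y, x][3]
    support = []
    for yq in range(y - up, y + down + 1):              # q runs over the vertical range V(p)
        left_q, right_q = arms[yq, x][0], arms[yq, x][1]
        for xq in range(x - left_q, x + right_q + 1):   # the horizontal range H(q)
            support.append((xq, yq))
    return support
```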
Step four: the disparity value of each pixel in the binocular vision image is initialized randomly. In the subsequent steps the cost value of each pixel is calculated from its disparity value within the adaptive structure:

$$C(p, d_p) = \frac{1}{|U(p)|} \sum_{p' \in U(p)} f(x_{p'}, y_{p'}, d_p)$$

where $d_p$ is the disparity value of pixel p, U(p) is the adaptive structure based on pixel p, |U(p)| is the number of pixels in the adaptive structure U(p), p' is a pixel in U(p), and $f(x_{p'}, y_{p'}, d_p)$ is the cost value of the pixel with horizontal and vertical coordinates $x_{p'}$ and $y_{p'}$ at disparity $d_p$. The function $f(x_{p'}, y_{p'}, d_p)$ can use any of the mainstream cost measures, such as the sum of absolute differences, the sum of squared differences, normalized cross-correlation, or mutual information. Pixels within the adaptive structure belong, with high probability, to the same structure in the real scene and therefore have the same disparity, so the cost of each pixel calculated within the adaptive structure is more accurate.
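A minimal sketch of the aggregated cost of step four: the mean of the per-pixel cost f over the support region U(p). The cost function and the support list are passed in (for instance the sad_cost and adaptive_structure sketches above); this interface is ours, not the patent's.

```python
import math

def aggregated_cost(cost_fn, support, d):
    """Mean matching cost of a pixel at disparity d over its adaptive structure.

    cost_fn: callable (x, y, d) -> per-pixel cost value f(x, y, d)
    support: list of (x, y) pixels of U(p), e.g. from adaptive_structure(...)
    """
    if not support:
        return math.inf
    return sum(cost_fn(x, y, d) for x, y in support) / len(support)
```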
Step five: the disparity propagation process and the random disparity search process are iterated k times over the binocular vision image, where k is the preset number of iterations.
In the disparity propagation process every pixel of the image is traversed and each pixel updates its disparity value through the propagation mechanism. The disparity value of a pixel is propagated to the neighborhood pixels within its adaptive structure; when the propagated disparity gives a neighborhood pixel a smaller cost within the adaptive structure, the disparity value of that neighborhood pixel is updated:

$$d_{p'} \leftarrow \begin{cases} d_p, & C(p', d_p) < C(p', d_{p'}) \\ d_{p'}, & \text{otherwise} \end{cases}$$

In the above formula p' is a neighborhood pixel of pixel p within the adaptive structure, and $d_{p'}$ and $d_p$ are the disparity values of pixels p' and p respectively. Propagating disparity values within the adaptive structure makes them more accurate and makes the iterations converge faster.
In the random disparity search process the disparity is updated by generating a candidate disparity set. Updating the disparity through the propagation mechanism alone easily falls into a local optimum, so random search is used to generate additional candidate disparities that form a candidate set:

$$d_p^{(i)} = d_p + w\,\alpha^i R_i, \qquad i \in \{1, 2, \ldots, i_{\max}\}$$

In the above formula i is the index of a generated candidate disparity, $i_{\max}$ is the number of candidate disparities in the preset candidate set, $R_i$ is a random number uniformly distributed in [-1, 1], w is the maximum search range and α is a fixed proportion parameter. A series of candidate disparities $d_p^{(1)}, \ldots, d_p^{(i_{\max})}$ is generated in this way, the cost of the pixel at each candidate disparity is calculated within the adaptive structure, and the cost values decide whether the disparity of the current pixel is updated:

$$d_p \leftarrow \arg\min_{d \in \{d_p, d_p^{(1)}, \ldots, d_p^{(i_{\max})}\}} C(p, d)$$

As the formula shows, when a candidate disparity gives the pixel a smaller cost value within the adaptive structure, the disparity value of the pixel is updated to that candidate disparity. Updating the disparity through random search avoids getting stuck in a local optimum and makes the disparity values more accurate.
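The following Python sketch runs one pass of step five under simplifying assumptions: disparities are kept in a dict keyed by pixel, cost_of(pixel, d) returns the aggregated cost of that pixel at disparity d, and neighbors(pixel) returns the pixels of its adaptive structure; parameter defaults are placeholders.

```python
import random

def propagate_and_search(disp, cost_of, neighbors, w=64.0, alpha=0.5, i_max=5):
    """One iteration of disparity propagation followed by random search (a sketch).

    disp:      dict mapping pixel -> current disparity value
    cost_of:   cost_of(pixel, d) -> aggregated cost of `pixel` at disparity d
    neighbors: neighbors(pixel) -> iterable of pixels in the adaptive structure U(pixel)
    """
    for p in list(disp):
        # a) propagation: offer d_p to every neighborhood pixel p' in U(p)
        for q in neighbors(p):
            if q in disp and cost_of(q, disp[p]) < cost_of(q, disp[q]):
                disp[q] = disp[p]
        # b) random search: candidates d_p + w * alpha**i * R_i with R_i ~ U[-1, 1]
        best_d, best_c = disp[p], cost_of(p, disp[p])
        for i in range(1, i_max + 1):
            cand = disp[p] + w * (alpha ** i) * random.uniform(-1.0, 1.0)
            c = cost_of(p, cand)
            if c < best_c:
                best_d, best_c = cand, c
        disp[p] = best_d
    return disp
```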
Step six: the boundary decision process. After the disparity propagation process and the random disparity search process have been iterated k times, the iterations continue. Referring to FIG. 2, during these further iterations a boundary decision is made whenever a disparity propagation fails: if $|d_p - d_{p'}| < \mu$, pixels p and p' are decided to belong to the same structure; if $|d_p - d_{p'}| \ge \mu$, a boundary is decided to exist between pixels p and p'.
Here μ is an empirically chosen integer, for example μ = 3 or μ = 4. After the boundary information has been recorded, the iterative disparity propagation and random disparity search continue. When the cost values of p and p' are calculated within the adaptive structure, each pixel removes its recorded boundary pixels from the calculation range; during propagation, each pixel likewise removes its boundary pixels from the propagation range. The boundary decision process and the iterations terminate when the number of iterations reaches m, a preset iteration count. Finding boundary pixels through the boundary process and applying them to cost calculation and disparity propagation yields better boundaries in the disparity map and improves the accuracy and robustness of matching.
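A small sketch of the bookkeeping behind step six: on a failed propagation between p and p' the disparity difference is compared with μ, and pixels recorded as boundary pixels are excluded from later cost aggregation and propagation. The data structure (a set of unordered pixel pairs) and the function names are assumptions.

```python
def record_boundary(boundaries, p, q, d_p, d_q, mu=3):
    """Mark a boundary between pixels p and q when propagation between them fails.

    boundaries: set of frozenset({p, q}) pairs already decided to be boundaries.
    mu: empirically chosen integer threshold (the text suggests 3 or 4).
    """
    if abs(d_p - d_q) >= mu:
        boundaries.add(frozenset((p, q)))          # a boundary separates p and q

def effective_support(p, support, boundaries):
    """Support pixels of U(p) with recorded boundary pixels removed."""
    return [q for q in support if frozenset((p, q)) not in boundaries]
```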
Step seven: the disparity result is output after a left-right consistency check. After the boundary decision process, the left image and the right image of the binocular vision image each output a corresponding disparity map; the two disparity maps are matched and corrected against each other, pixels on which the left and right disparity maps disagree are corrected, and the final disparity result is obtained.
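A sketch of the left-right consistency check of step seven; the tolerance and the marking of failed pixels with NaN are our choices — the patent only states that pixels on which the two disparity maps disagree are corrected.

```python
import numpy as np

def left_right_check(disp_left, disp_right, tol=1.0):
    """Invalidate left-image pixels that fail the left-right consistency test."""
    h, w = disp_left.shape
    out = disp_left.astype(np.float64).copy()
    for y in range(h):
        for x in range(w):
            xr = int(round(x - disp_left[y, x]))   # column of the match in the right image
            if xr < 0 or xr >= w or abs(disp_left[y, x] - disp_right[y, xr]) > tol:
                out[y, x] = np.nan                 # inconsistent: to be corrected/filled
    return out
```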
As the above description of the technical scheme shows, the invention determines the adaptive structure of each pixel through a variance-related threshold to obtain structural information for every pixel, propagates and randomly searches disparity values within this structure to continually update the pixel disparities, and finally introduces a boundary decision mechanism, which improves matching accuracy, improves the boundaries of the stereo matching result and makes it better suited to real scenes.
The above embodiments are preferred embodiments of the present invention, but the present invention is not limited to the above embodiments, and any other changes, modifications, substitutions, combinations, and simplifications which do not depart from the spirit and principle of the present invention should be construed as equivalents thereof, and all such changes, modifications, substitutions, combinations, and simplifications are intended to be included in the scope of the present invention.

Claims (7)

1. A boundary-decision stereo matching method based on an adaptive structure and propagation search, characterized in that the stereo matching method comprises the following steps:
calculating the extension range of each pixel in the binocular vision image according to a variance-related threshold, and calculating the adaptive structure of each pixel in the binocular vision image from the extension range, the process being as follows:
for each pixel of the input binocular vision image the extension ranges in the horizontal and vertical directions are calculated separately and represented by a quadruple $(h_p^-, h_p^+, v_p^-, v_p^+)$, whose four elements are, in order, the horizontal-left, horizontal-right, vertical-up and vertical-down extension ranges; each extension range is determined by comparing the differences of the three channel values between pixels with a threshold, where the horizontal-right extension $h_p^+$, for example, is calculated as

$$h_p^+ = \max_{r \in [1, L]} \Big( r \prod_{i=1}^{r} \delta(p, p_i) \Big)$$

in the above formula p and $p_i$ are image pixels, $p = (x_p, y_p)$ and $p_i = (x_p + i, y_p)$, where $x_p$ and $y_p$ are the horizontal and vertical coordinates of pixel p, L is the preset maximum extension range, r is an integer in the range [1, L], and $\delta(p_1, p_2)$ is the indicator function

$$\delta(p_1, p_2) = \begin{cases} 1, & \max_{c \in \{R, G, B\}} |I_c(p_1) - I_c(p_2)| \le \tau \\ 0, & \text{otherwise} \end{cases}$$

in the above formula $I_c(p_1)$ and $I_c(p_2)$ are the intensity values of pixels $p_1$ and $p_2$ on channel c; the threshold τ is determined by the variance values of the three channels R, G and B of the image: it is obtained by applying the round-up operator ⌈·⌉ to a quantity formed from Var(R), Var(G) and Var(B) and an integer parameter t that adjusts the size of τ, where Var(R), Var(G) and Var(B) denote the variance values of the image pixels on the R, G and B channels respectively;
randomly initializing the pixel disparity values and calculating cost values within the adaptive structure;
propagating and randomly searching disparity values within the adaptive structure according to the calculated cost values, and iterating k times, where k is a preset number of iterations;
boundary decision: whenever a propagation fails, deciding that no boundary exists between the two pixels if their disparity difference is smaller than a threshold, and that a boundary exists if it is larger than the threshold; in the subsequent propagation process each pixel removes the pixels decided to lie across a boundary from its cost-calculation range and disparity-propagation range;
obtaining the final disparity map through a left-right consistency check.
2. The method of claim 1, wherein the adaptive structure of each pixel in the binocular vision image is calculated from the extension range as follows:
the horizontal range H(p) and the vertical range V(p) of a pixel are determined from the extension-range quadruple:

$$H(p) = \{(x, y) \mid x \in [x_p - h_p^-,\; x_p + h_p^+],\ y = y_p\}$$

$$V(p) = \{(x, y) \mid x = x_p,\ y \in [y_p - v_p^-,\; y_p + v_p^+]\}$$

in the above formulas x and y are the horizontal and vertical coordinates of a pixel within the horizontal range or the vertical range; the adaptive structure U(p) of pixel p is then calculated from H(p) and V(p) as

$$U(p) = \bigcup_{q \in V(p)} H(q)$$

in the above formula q is a pixel within the vertical range V(p) of pixel p and H(q) is the horizontal range of pixel q.
3. The method of claim 1, wherein the process of randomly initializing the pixel disparity values and calculating cost values within the adaptive structure is as follows:
first, the disparity value of every pixel in the binocular vision image is initialized randomly;
then the cost value of each pixel is calculated within the adaptive structure from its disparity value, using the formula

$$C(p, d_p) = \frac{1}{|U(p)|} \sum_{p' \in U(p)} f(x_{p'}, y_{p'}, d_p)$$

where $d_p$ is the disparity value of pixel p, U(p) is the adaptive structure based on pixel p, |U(p)| is the number of pixels in the adaptive structure U(p), p' is a pixel in U(p), and $f(x_{p'}, y_{p'}, d_p)$ is the cost value of the pixel with horizontal and vertical coordinates $x_{p'}$ and $y_{p'}$ at disparity $d_p$.
4. The method of claim 3, wherein the function $f(x_{p'}, y_{p'}, d_p)$ is computed with one of the following cost measures: a cost based on the sum of absolute intensity differences, a cost based on the sum of squared intensity differences, a cost based on normalized cross-correlation, or a cost based on mutual information.
5. The method of claim 3, wherein the process of propagating and randomly searching disparity values within the adaptive structure according to the calculated cost values is as follows:
a) every pixel of the binocular vision image is traversed, and the disparity value of a pixel is propagated to the neighborhood pixels within its adaptive structure:

$$d_{p'} \leftarrow \begin{cases} d_p, & C(p', d_p) < C(p', d_{p'}) \\ d_{p'}, & \text{otherwise} \end{cases}$$

in the above formula p' is a neighborhood pixel of pixel p within the adaptive structure, and $d_{p'}$ and $d_p$ are the disparity values of pixels p' and p respectively;
b) a candidate disparity set is generated for each pixel of the binocular vision image by random search, the candidate disparities being generated as

$$d_p^{(i)} = d_p + w\,\alpha^i R_i, \qquad i \in \{1, 2, \ldots, i_{\max}\}$$

in the above formula i is the index of a generated candidate disparity, $i_{\max}$ is the number of candidate disparities in the preset candidate set, $R_i$ is a random number uniformly distributed in [-1, 1], w is the maximum search range and α is a fixed proportion parameter; according to the candidate disparity set, the disparity is updated as

$$d_p \leftarrow \arg\min_{d \in \{d_p, d_p^{(1)}, \ldots, d_p^{(i_{\max})}\}} C(p, d)$$

c) the processes a) and b) are iterated k times, where k is the preset number of iterations.
6. The method of claim 1, wherein the boundary decision process is as follows:
after the disparity propagation process and the random disparity search process have been iterated k times, a boundary decision is made for each pixel of the binocular vision image whenever a propagation fails: if $|d_p - d_{p'}| < \mu$ there is no boundary between pixels p and p', and if $|d_p - d_{p'}| \ge \mu$ a boundary exists between pixels p and p', where μ is an empirically chosen integer and $d_{p'}$ and $d_p$ are the disparity values of pixels p' and p respectively.
7. The method of claim 1, wherein the process of obtaining the final disparity map through the left-right consistency check is as follows:
after the boundary decision process, the left image and the right image of the binocular vision image each output a corresponding disparity map, and the two disparity maps are matched and corrected against each other to obtain the final disparity result.
CN201910010400.3A 2019-01-07 2019-01-07 Stereo matching method adopting adaptive structure Active CN109816711B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910010400.3A CN109816711B (en) 2019-01-07 2019-01-07 Stereo matching method adopting adaptive structure

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910010400.3A CN109816711B (en) 2019-01-07 2019-01-07 Stereo matching method adopting adaptive structure

Publications (2)

Publication Number Publication Date
CN109816711A CN109816711A (en) 2019-05-28
CN109816711B (en) 2020-10-27

Family

ID=66603914

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910010400.3A Active CN109816711B (en) 2019-01-07 2019-01-07 Stereo matching method adopting adaptive structure

Country Status (1)

Country Link
CN (1) CN109816711B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112765390A (en) * 2019-10-21 2021-05-07 南京深视光点科技有限公司 Stereo matching method with double search intervals

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170047780A (en) * 2015-10-23 2017-05-08 한국전자통신연구원 Low-cost calculation apparatus using the adaptive window mask and method therefor

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103226821A (en) * 2013-04-27 2013-07-31 山西大学 Stereo matching method based on disparity map pixel classification correction optimization
CN103440681A (en) * 2013-09-12 2013-12-11 浙江工业大学 Non-contact nondestructive omnibearing three-dimensional modeling method
CN104318576A (en) * 2014-11-05 2015-01-28 浙江工业大学 Super-pixel-level image global matching method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Adaptive Rank Transform for Stereo Matching; Ge Zhao et al.; International Conference on Intelligent Robotics and Applications; 2011-12-31; 95-104 *
Discontinuity-preserving stereo matching using adaptive windows (利用自适应窗口实现不连续保护立体匹配); 卢阿丽 et al.; Optics and Precision Engineering (光学 精密工程); 2009-09-30; Vol. 17, No. 9; 2328-2335 *
Stereo matching algorithm based on improved gradient and adaptive windows (基于改进梯度和自适应窗口的立体匹配算法); 祝世平 et al.; Acta Optica Sinica (光学学报); 2015-01-31; Vol. 35, No. 1; 0110003-1 to 0110003-9 *

Also Published As

Publication number Publication date
CN109816711A (en) 2019-05-28

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant