CN107392950A - Cross-scale cost aggregation stereo matching method based on weak texture detection - Google Patents
- Publication number: CN107392950A; Application number: CN201710631310.7A
- Authority
- CN
- China
- Prior art date: 2017-07-28
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
- G06T7/337—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
Abstract
The invention, a cross-scale cost aggregation stereo matching method based on weak texture detection, belongs to the field of computer vision, and in particular relates to stereo matching of weakly textured images. The method comprises the following steps: input two color images, a left image and a right image, and detect and segment weakly textured regions using the gradient information of the left image; compute the matching cost from the color and gradient information of the left and right images; on the basis of the weak-texture detection and segmentation result, perform intra-scale and cross-scale cost aggregation based on Gaussian filtering; compute the disparity with a winner-takes-all (WTA) strategy; refine the disparity using left-right consistency checking and an adaptive-weight method, and output the disparity image. The invention improves matching accuracy in weakly textured regions while preserving accuracy in textured regions, thereby obtaining a better disparity map.
Description
Technical field
The invention, a cross-scale cost aggregation stereo matching method based on weak texture detection, belongs to the field of computer vision, and in particular relates to stereo matching of weakly textured images.
Background art
Binocular stereo vision is an important branch of computer vision. Based on the parallax principle, it acquires two images of a measured object with imaging devices at different positions and recovers the object's three-dimensional geometric information by computing the positional deviation between corresponding image points. The quality of the recovered 3D information depends primarily on the accuracy of the disparity map produced by stereo matching. Stereo matching currently suffers from external factors such as uneven illumination and overexposure, as well as characteristics of the images themselves, such as occlusion, weak texture and repeated texture, that are difficult for a computer to distinguish. Although many researchers have studied stereo matching for years, matching in weakly textured regions remains a difficult point in image processing. How to improve matching accuracy in weakly textured regions while preserving accuracy in textured regions, and thereby obtain a better disparity map, is a significant problem.
Summary of the invention
The invention provides a cross-scale cost aggregation stereo matching method based on weak texture detection that improves matching accuracy in weakly textured regions while preserving accuracy in textured regions, thereby obtaining a better disparity map.
The object of the invention is achieved as follows:
A cross-scale cost aggregation stereo matching method based on weak texture detection comprises the following steps:
Step a: input two color images, a left image and a right image; using the gradient information of the left image, detect and segment weakly textured regions;
Step b: compute the matching cost from the color and gradient information of the left and right images;
Step c: on the basis of the weak-texture detection and segmentation result of step a, perform intra-scale and cross-scale cost aggregation based on Gaussian filtering;
Step d: compute the disparity with a winner-takes-all (WTA) strategy;
Step e: refine the disparity using left-right consistency checking and an adaptive-weight method, and output the disparity image.
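As a concrete illustration of step d, the winner-takes-all (WTA) selection simply picks, for every pixel, the disparity hypothesis with the lowest aggregated cost. A minimal NumPy sketch (the cost-volume layout and the function name are illustrative, not from the patent):

```python
import numpy as np

def wta_disparity(cost_volume):
    """Winner-takes-all: for each pixel, pick the disparity of minimum cost.

    cost_volume: (H, W, D) array holding the matching cost C(p, d).
    Returns an (H, W) integer disparity map.
    """
    return np.argmin(cost_volume, axis=2)

# Toy cost volume: one row, two pixels, three disparity hypotheses.
costs = np.array([[[0.9, 0.1, 0.5],
                   [0.2, 0.8, 0.05]]])
disp = wta_disparity(costs)  # the first pixel selects d = 1, the second d = 2
```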
In the described method, the weak-texture detection and segmentation of step a is performed as follows:
Compute the gradient value g(x, y) of the pixel at left-image coordinate (x, y) and compare it with a gradient threshold g_T to decide whether the pixel lies in a weakly textured region:

g(x, y) < g_T

$$g(x,y)=\frac{1}{M}\cdot\sum_{(u,v)\in N(x,y)}\big(\,|I(u,v)-I(u+1,v)|+|I(u,v)-I(u,v+1)|\,\big)$$

where N(x, y) is the window centered on pixel (x, y), M is the number of pixels in the window, and I(x, y) is the gray value of the pixel.
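A plain NumPy sketch of this gradient-threshold weak-texture test follows; the window size and the threshold value are illustrative assumptions, since the patent does not fix them:

```python
import numpy as np

def weak_texture_mask(gray, win=3, g_t=4.0):
    """Flag weakly textured pixels: windowed mean absolute gradient below g_T.

    gray: (H, W) grayscale image; win: side length of the window N(x, y),
    so M = win * win; g_t: the gradient threshold g_T.
    Returns a boolean mask, True where g(x, y) < g_T.
    """
    gray = gray.astype(float)
    # |I(u, v) - I(u+1, v)| and |I(u, v) - I(u, v+1)| via forward differences
    # (the last row/column is padded with itself, giving a zero difference).
    gx = np.abs(np.diff(gray, axis=1, append=gray[:, -1:]))
    gy = np.abs(np.diff(gray, axis=0, append=gray[-1:, :]))
    grad = gx + gy
    # Windowed mean: sum the win*win shifted copies, then divide by M.
    h, w = grad.shape
    r = win // 2
    pad = np.pad(grad, r, mode='edge')
    g = np.zeros_like(grad)
    for dy in range(win):
        for dx in range(win):
            g += pad[dy:dy + h, dx:dx + w]
    g /= win * win
    return g < g_t
```

A uniform patch is classified entirely as weak texture, while a strongly striped patch is not.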
In the described method, the matching cost of step b is computed as follows:
For the stereo color image pair, compute the matching cost C(p, d) between the left image I_L and the right image I_R:

C(p, d) = (1 - α)·C_AD(p, d) + α·(C_grad_x(p, d) + C_grad_y(p, d))

$$C_{AD}(p,d)=\min\Big(\frac{1}{3}\sum_{i=R,G,B}\big|I_{L}^{i}(p)-I_{R}^{i}(p,d)\big|,\;T_{AD}\Big)$$

C_grad_x(p, d) = min(|∇_x I_L(p) - ∇_x I_R(p, d)|, T_grad)
C_grad_y(p, d) = min(|∇_y I_L(p) - ∇_y I_R(p, d)|, T_grad)

where p is a point in the left image; i = R, G, B indexes the three channels of the color image; T_AD and T_grad are the truncation values for color and gradient respectively; ∇_x and ∇_y are the image gradient operators in the x and y directions; and α is the balance factor between the color difference and the gradient difference.
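The truncated color-plus-gradient cost can be sketched per disparity hypothesis as below. This is a NumPy sketch: the disparity shift via `np.roll` (border pixels wrap), the grayscale conversion for the gradient terms, and the default parameter values are simplifying assumptions, not from the patent.

```python
import numpy as np

def matching_cost(left, right, d, alpha=0.11, t_ad=7.0, t_grad=2.0):
    """Truncated AD + gradient cost slice C(p, d) for a single disparity d.

    left, right: (H, W, 3) RGB images. alpha, t_ad (T_AD) and t_grad (T_grad)
    play the roles in the formula above; the numeric defaults are illustrative.
    """
    left = left.astype(float)
    right_d = np.roll(right.astype(float), d, axis=1)   # I_R(p, d)
    # Color term: mean absolute difference over R, G, B, truncated at T_AD.
    c_ad = np.minimum(np.abs(left - right_d).mean(axis=2), t_ad)
    # Gradient terms on grayscale versions, truncated at T_grad.
    gl = left.mean(axis=2)
    gr = right_d.mean(axis=2)
    c_gx = np.minimum(np.abs(np.gradient(gl, axis=1) - np.gradient(gr, axis=1)), t_grad)
    c_gy = np.minimum(np.abs(np.gradient(gl, axis=0) - np.gradient(gr, axis=0)), t_grad)
    return (1 - alpha) * c_ad + alpha * (c_gx + c_gy)
```

For identical left and right images at d = 0, both the color and gradient terms vanish, so the cost is zero everywhere.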
In the described method, the cost aggregation of step c is performed as follows:

$$\tilde{C}(p,d)=\arg\min_{z}\sum_{q\in N}W(p,q)\,\|z-C(q,d)\|^{2}
\qquad\Longrightarrow\qquad
\tilde{C}(p,d)=\sum_{q\in N}W(p,q)\cdot C(q,d)$$

where C̃(p, d) is the aggregated matching cost, z is the optimization target, W is the Gaussian filter kernel, N is the neighborhood window of pixel p, and q is a neighboring pixel of p. With the scale parameter s ∈ {0, 1, ..., S} (C^0 being the matching cost at the original image scale), the aggregated costs C̃^s of the S + 1 scales are coupled by a regularization term:

$$\tilde{v}=\arg\min_{\{z^{s}\}_{s=0}^{S}}\Big(\sum_{s=0}^{S}\sum_{q^{s}\in N_{s}}W(p^{s},q^{s})\,\|z^{s}-C^{s}(q^{s},d^{s})\|^{2}+\lambda\cdot\sum_{s=1}^{S}\|z^{s}-z^{s-1}\|^{2}\Big)$$

where λ is the regularization factor. Setting the derivative of this objective to zero yields a tridiagonal linear system A·v̂ = ṽ, solved as v̂ = A^{-1}·ṽ. The final cost is then fused according to the detection result:

$$\tilde{C}_{final}(p,d)=T_{high}\cdot\sum_{q\in N}W_{1}(p,q)\,C_{1}(q,d)+T_{low}\cdot\sum_{q\in N}W_{2}(p,q)\,C_{1/2}(q,d)$$

where T_high and T_low denote the textured and weakly textured regions detected above; C_1 and C_{1/2} are the matching costs at the original image scale and the half scale respectively, each Gaussian-filtered with a window of a different size, and fusing them yields the final matching cost.
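The cross-scale regularization reduces, for each (pixel, disparity) pair, to a small tridiagonal system A·v̂ = ṽ over the per-scale aggregated costs. A minimal sketch of that solve (the function name and the example λ value are illustrative assumptions):

```python
import numpy as np

def cross_scale_fuse(costs, lam=0.3):
    """Fuse per-scale aggregated costs by solving A v_hat = v_tilde.

    costs: the S+1 aggregated cost values C~^s(p, d) for one (pixel,
    disparity) pair, from the original scale (s = 0) upward; lam is the
    regularization factor lambda. Returns the regularized values v_hat.
    """
    n = len(costs)
    A = np.zeros((n, n))
    for s in range(n):
        A[s, s] = 1 + 2 * lam          # interior rows: -lam, 1+2*lam, -lam
        if s > 0:
            A[s, s - 1] = -lam
        if s < n - 1:
            A[s, s + 1] = -lam
    A[0, 0] = A[-1, -1] = 1 + lam      # boundary rows have a single neighbor
    return np.linalg.solve(A, np.asarray(costs, dtype=float))
```

With λ = 0 the system is the identity (no coupling), and a cost that is already identical across scales is left unchanged for any λ.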
In the described method, the disparity refinement of step e is performed as follows:

|D'_L(p) - D'_R(p - D'_L(p))| < δ

D_LRC(p) = min(D'(p_L), D'(p_R))

$$D_{w}(p)=\sum_{q}WB_{pq}(I_{L})\,D_{LRC}(q)$$

$$WB_{pq}=\exp\Big(-\Big(\frac{\Delta c_{pq}}{\sigma_{c}^{2}}+\frac{\Delta s_{pq}}{\sigma_{s}^{2}}\Big)\Big)$$

where D'_L(p) and D'_R(p - D'_L(p)) are the left-image and right-image disparity values of point p in the disparity map, and δ is the LRC threshold; D'(p_L) is the disparity value of the first non-occluded point to the left of p and D'(p_R) that of the first non-occluded point to the right; WB_pq(I_L) is the weighting function of the left image; Δc_pq and Δs_pq are the color difference and spatial Euclidean distance between points p and q in the left image, and σ_c and σ_s are the corresponding adjustment parameters; D_w(p) is the filtered disparity map.
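The left-right consistency check and occlusion filling can be sketched as follows. This is a simplified per-scanline version: the subsequent adaptive-weight filtering with WB_pq is omitted, and the function name and default δ are illustrative assumptions.

```python
import numpy as np

def lrc_refine(d_left, d_right, delta=1):
    """Left-right consistency check followed by simple occlusion filling.

    A pixel passes if |D'_L(p) - D'_R(p - D'_L(p))| < delta; each failed
    pixel takes min(D'(P_L), D'(P_R)), the smaller of the nearest valid
    disparities to its left and right on the same scanline.
    """
    h, w = d_left.shape
    out = d_left.astype(float).copy()
    for y in range(h):
        valid = np.zeros(w, dtype=bool)
        for x in range(w):
            xr = x - int(d_left[y, x])                 # matching right-image column
            valid[x] = 0 <= xr < w and abs(d_left[y, x] - d_right[y, xr]) < delta
        for x in np.where(~valid)[0]:
            left_cands = [d_left[y, i] for i in range(x - 1, -1, -1) if valid[i]]
            right_cands = [d_left[y, i] for i in range(x + 1, w) if valid[i]]
            cands = left_cands[:1] + right_cands[:1]   # nearest valid on each side
            if cands:
                out[y, x] = min(cands)
    return out
```

On a perfectly consistent pair of disparity maps, every pixel either passes the check or is filled from a consistent neighbor, so the output equals the input.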
Beneficial effects:
The invention proposes a new stereo matching algorithm that chooses an appropriate matching process according to whether an image region is weakly textured, thereby improving stereo matching accuracy and obtaining a better disparity map.
On stereo image pairs processed with the algorithm of this embodiment, good results are obtained in both textured and weakly textured regions, and the mismatch rate decreases (5% lower than without the weak-texture segmentation algorithm). This shows that the algorithm of this embodiment improves matching accuracy in weakly textured regions while preserving accuracy in textured regions, yielding a better disparity map.
Brief description of the drawings
Fig. 1 is a flow chart of the cross-scale cost aggregation stereo matching method based on weak texture detection.
Fig. 2 shows the Bowling1 disparity maps.
Fig. 3 shows the Lampshade1 disparity maps.
Fig. 4 shows the Monopoly disparity maps.
Fig. 5 shows the Plastic disparity maps.
Embodiments
The specific embodiments of the invention are described in further detail below with reference to the accompanying drawings.
Embodiment 1
The cross-scale cost aggregation stereo matching method based on weak texture detection, as shown in Fig. 1, comprises the following steps:
Step a: input two color images, a left image and a right image; using the gradient information of the left image, detect and segment weakly textured regions;
Step b: compute the matching cost from the color and gradient information of the left and right images;
Step c: on the basis of the weak-texture detection and segmentation result of step a, perform intra-scale and cross-scale cost aggregation based on Gaussian filtering;
Step d: compute the disparity with a winner-takes-all (WTA) strategy;
Step e: refine the disparity using left-right consistency checking and an adaptive-weight method, and output the disparity image.
Following the above steps, four image pairs were selected for comparison, as shown in Fig. 2, Fig. 3, Fig. 4 and Fig. 5.
In Fig. 2, (a) is the original Bowling1 left image; (b) is the Bowling1 ground-truth disparity map; (c) is the Bowling1 weak-texture detection result; (d) is the final Bowling1 disparity map; (e) is the Bowling1 disparity map obtained without weak-texture detection.
In Fig. 3, (a) is the original Lampshade1 left image; (b) is the Lampshade1 ground-truth disparity map; (c) is the Lampshade1 weak-texture detection result; (d) is the final Lampshade1 disparity map; (e) is the Lampshade1 disparity map obtained without weak-texture detection.
In Fig. 4, (a) is the original Monopoly left image; (b) is the Monopoly ground-truth disparity map; (c) is the Monopoly weak-texture detection result; (d) is the final Monopoly disparity map; (e) is the Monopoly disparity map obtained without weak-texture detection.
In Fig. 5, (a) is the original Plastic left image; (b) is the Plastic ground-truth disparity map; (c) is the Plastic weak-texture detection result; (d) is the final Plastic disparity map; (e) is the Plastic disparity map obtained without weak-texture detection.
For visual comparison, the disparity maps in Fig. 2(a)-(e), Fig. 3(a)-(e), Fig. 4(a)-(e) and Fig. 5(a)-(e) are evaluated subjectively. In panels (c) of Figs. 2-5, black marks the detected weakly textured regions and white marks the textured regions. Comparing the disparity maps shows that, in weakly textured regions, the disparity maps obtained with the algorithm of this embodiment are clearly better than those obtained without weak-texture detection.
The method is also evaluated with objective indices.
Table 1 gives the mismatch rates of the four image pairs from the Middlebury dataset with obvious weakly textured regions, processed by the two algorithms.
Table 1
As can be seen from Table 1, in the test results on the stereo image pairs, the mismatch rate of the algorithm of this embodiment is 5% lower than that of the algorithm without weak-texture detection and segmentation. This shows that the algorithm of this embodiment improves matching accuracy in weakly textured regions while preserving accuracy in textured regions, yielding a better disparity map.
Embodiment 2
In the described method, the weak-texture detection and segmentation of step a is performed as follows:
Compute the gradient value g(x, y) of the pixel at left-image coordinate (x, y) and compare it with a gradient threshold g_T to decide whether the pixel lies in a weakly textured region:

g(x, y) < g_T

$$g(x,y)=\frac{1}{M}\cdot\sum_{(u,v)\in N(x,y)}\big(\,|I(u,v)-I(u+1,v)|+|I(u,v)-I(u,v+1)|\,\big)$$

where N(x, y) is the window centered on pixel (x, y), M is the number of pixels in the window, and I(x, y) is the gray value of the pixel.
Embodiment 3
In the described method, the matching cost of step b is computed as follows:
For the stereo color image pair, compute the matching cost C(p, d) between the left image I_L and the right image I_R:

C(p, d) = (1 - α)·C_AD(p, d) + α·(C_grad_x(p, d) + C_grad_y(p, d))

$$C_{AD}(p,d)=\min\Big(\frac{1}{3}\sum_{i=R,G,B}\big|I_{L}^{i}(p)-I_{R}^{i}(p,d)\big|,\;T_{AD}\Big)$$

C_grad_x(p, d) = min(|∇_x I_L(p) - ∇_x I_R(p, d)|, T_grad)
C_grad_y(p, d) = min(|∇_y I_L(p) - ∇_y I_R(p, d)|, T_grad)

where p is a point in the left image; i = R, G, B indexes the three channels of the color image; T_AD and T_grad are the truncation values for color and gradient respectively; ∇_x and ∇_y are the image gradient operators in the x and y directions; and α is the balance factor between the color difference and the gradient difference.
Embodiment 4
In the described method, the cost aggregation of step c is performed as follows:

$$\tilde{C}(p,d)=\arg\min_{z}\sum_{q\in N}W(p,q)\,\|z-C(q,d)\|^{2}
\qquad\Longrightarrow\qquad
\tilde{C}(p,d)=\sum_{q\in N}W(p,q)\cdot C(q,d)$$

where C̃(p, d) is the aggregated matching cost, z is the optimization target, W is the Gaussian filter kernel, N is the neighborhood window of pixel p, and q is a neighboring pixel of p. With the scale parameter s ∈ {0, 1, ..., S} (C^0 being the matching cost at the original image scale), the aggregated costs C̃^s of the S + 1 scales are coupled by a regularization term:

$$\tilde{v}=\arg\min_{\{z^{s}\}_{s=0}^{S}}\Big(\sum_{s=0}^{S}\sum_{q^{s}\in N_{s}}W(p^{s},q^{s})\,\|z^{s}-C^{s}(q^{s},d^{s})\|^{2}+\lambda\cdot\sum_{s=1}^{S}\|z^{s}-z^{s-1}\|^{2}\Big)$$

where λ is the regularization factor. Setting the derivative of this objective to zero yields a tridiagonal linear system A·v̂ = ṽ, solved as v̂ = A^{-1}·ṽ. The final cost is then fused according to the detection result:

$$\tilde{C}_{final}(p,d)=T_{high}\cdot\sum_{q\in N}W_{1}(p,q)\,C_{1}(q,d)+T_{low}\cdot\sum_{q\in N}W_{2}(p,q)\,C_{1/2}(q,d)$$

where T_high and T_low denote the textured and weakly textured regions detected above; C_1 and C_{1/2} are the matching costs at the original image scale and the half scale respectively, each Gaussian-filtered with a window of a different size, and fusing them yields the final matching cost.
Embodiment 5
In the described method, the disparity refinement of step e is performed as follows:

|D'_L(p) - D'_R(p - D'_L(p))| < δ

D_LRC(p) = min(D'(p_L), D'(p_R))

$$D_{w}(p)=\sum_{q}WB_{pq}(I_{L})\,D_{LRC}(q)$$

$$WB_{pq}=\exp\Big(-\Big(\frac{\Delta c_{pq}}{\sigma_{c}^{2}}+\frac{\Delta s_{pq}}{\sigma_{s}^{2}}\Big)\Big)$$

where D'_L(p) and D'_R(p - D'_L(p)) are the left-image and right-image disparity values of point p in the disparity map, and δ is the LRC threshold; D'(p_L) is the disparity value of the first non-occluded point to the left of p and D'(p_R) that of the first non-occluded point to the right; WB_pq(I_L) is the weighting function of the left image; Δc_pq and Δs_pq are the color difference and spatial Euclidean distance between points p and q in the left image, and σ_c and σ_s are the corresponding adjustment parameters; D_w(p) is the filtered disparity map.
Claims (5)
1. A cross-scale cost aggregation stereo matching method based on weak texture detection, characterized by comprising the following steps:
Step a: input two color images, a left image and a right image; using the gradient information of the left image, detect and segment weakly textured regions;
Step b: compute the matching cost from the color and gradient information of the left and right images;
Step c: on the basis of the weak-texture detection and segmentation result of step a, perform intra-scale and cross-scale cost aggregation based on Gaussian filtering;
Step d: compute the disparity with a winner-takes-all (WTA) strategy;
Step e: refine the disparity using left-right consistency checking and an adaptive-weight method, and output the disparity image.
2. The cross-scale cost aggregation stereo matching method based on weak texture detection according to claim 1, characterized in that the weak-texture detection and segmentation of step a is performed as follows:
compute the gradient value g(x, y) of the pixel at left-image coordinate (x, y) and compare it with a gradient threshold g_T to decide whether the pixel lies in a weakly textured region:

g(x, y) < g_T
$$g(x,y)=\frac{1}{M}\cdot\sum_{(u,v)\in N(x,y)}\big(\,|I(u,v)-I(u+1,v)|+|I(u,v)-I(u,v+1)|\,\big)$$
where N(x, y) is the window centered on pixel (x, y), M is the number of pixels in the window, and I(x, y) is the gray value of the pixel.
3. The cross-scale cost aggregation stereo matching method based on weak texture detection according to claim 1, characterized in that the matching cost of step b is computed as follows:
for the stereo color image pair, compute the matching cost C(p, d) between the left image I_L and the right image I_R:

C(p, d) = (1 - α)·C_AD(p, d) + α·(C_grad_x(p, d) + C_grad_y(p, d))
$$C_{AD}(p,d)=\min\Big(\frac{1}{3}\sum_{i=R,G,B}\big|I_{L}^{i}(p)-I_{R}^{i}(p,d)\big|,\;T_{AD}\Big)$$
C_grad_x(p, d) = min(|∇_x I_L(p) - ∇_x I_R(p, d)|, T_grad)
C_grad_y(p, d) = min(|∇_y I_L(p) - ∇_y I_R(p, d)|, T_grad)
where p is a point in the left image; i = R, G, B indexes the three channels of the color image; T_AD and T_grad are the truncation values for color and gradient respectively; ∇_x and ∇_y are the image gradient operators in the x and y directions; and α is the balance factor between the color difference and the gradient difference.
4. The cross-scale cost aggregation stereo matching method based on weak texture detection according to claim 1, characterized in that the cost aggregation of step c is performed as follows:
$$\tilde{C}(p,d)=\arg\min_{z}\sum_{q\in N}W(p,q)\,\|z-C(q,d)\|^{2}$$

$$\tilde{C}(p,d)=\sum_{q\in N}W(p,q)\cdot C(q,d)$$

$$\tilde{v}=\arg\min_{\{z^{s}\}_{s=0}^{S}}\sum_{s=0}^{S}\sum_{q^{s}\in N_{s}}W(p^{s},q^{s})\,\|z^{s}-C^{s}(q^{s},d^{s})\|^{2}$$
where C̃(p, d) denotes the aggregated matching cost, z is the optimization target, W is the Gaussian filter kernel, N is the neighborhood window of pixel p, and q is a neighboring pixel of p; s ∈ {0, 1, ..., S} is the scale parameter, and for s = 0, C^0 is the matching cost at the original image scale; the aggregated cost at each of the S + 1 scales is
$$\tilde{C}^{s}(p^{s},d^{s})=\sum_{q^{s}\in N_{s}}W(p^{s},q^{s})\cdot C^{s}(q^{s},d^{s})$$

$$\tilde{v}=\arg\min_{\{z^{s}\}_{s=0}^{S}}\Big(\sum_{s=0}^{S}\sum_{q^{s}\in N_{s}}W(p^{s},q^{s})\,\|z^{s}-C^{s}(q^{s},d^{s})\|^{2}+\lambda\cdot\sum_{s=1}^{S}\|z^{s}-z^{s-1}\|^{2}\Big)$$
where λ is the regularization factor. Denoting the optimization objective above by F({z^s}) and setting ∂F/∂z^s = 0 gives:
$$\begin{cases}
(1+\lambda)z^{s}-\lambda z^{s+1}=\tilde{C}^{s}(p^{s},d^{s}), & s=0\\
-\lambda z^{s-1}+(1+2\lambda)z^{s}-\lambda z^{s+1}=\tilde{C}^{s}(p^{s},d^{s}), & s=1,2,\ldots,S-1\\
-\lambda z^{s-1}+(1+\lambda)z^{s}=\tilde{C}^{s}(p^{s},d^{s}), & s=S
\end{cases}$$

In matrix form,

$$A\hat{v}=\tilde{v}$$

$$A=\begin{bmatrix}
1+\lambda & -\lambda & & & \\
-\lambda & 1+2\lambda & -\lambda & & \\
 & \ddots & \ddots & \ddots & \\
 & & -\lambda & 1+2\lambda & -\lambda\\
 & & & -\lambda & 1+\lambda
\end{bmatrix}$$

$$\hat{v}=A^{-1}\tilde{v}$$

$$\tilde{C}_{final}(p,d)=T_{high}\cdot\sum_{q\in N}W_{1}(p,q)\,C_{1}(q,d)+T_{low}\cdot\sum_{q\in N}W_{2}(p,q)\,C_{1/2}(q,d)$$
where T_high and T_low denote the textured and weakly textured regions detected above; C_1 and C_{1/2} are the matching costs at the original image scale and the half scale respectively, each Gaussian-filtered with a window of a different size, and fusing them yields the final matching cost.
5. The cross-scale cost aggregation stereo matching method based on weak texture detection according to claim 1, characterized in that the disparity refinement of step e is performed as follows:

|D'_L(p) - D'_R(p - D'_L(p))| < δ

D_LRC(p) = min(D'(p_L), D'(p_R))
$$D_{w}(p)=\sum_{q}WB_{pq}(I_{L})\,D_{LRC}(q)$$

$$WB_{pq}=\exp\Big(-\Big(\frac{\Delta c_{pq}}{\sigma_{c}^{2}}+\frac{\Delta s_{pq}}{\sigma_{s}^{2}}\Big)\Big)$$
where D'_L(p) is the left-image disparity value of point p in the disparity map, D'_R(p - D'_L(p)) is the corresponding right-image disparity value, and δ is the LRC threshold; D'(p_L) is the disparity value of the first non-occluded point to the left, and D'(p_R) is the disparity value of the first non-occluded point to the right; WB_pq(I_L) is a weighting function over the left image I_L; Δc_pq and Δs_pq are, respectively, the color difference and the spatial Euclidean distance between points p and q in the left image; σ_c² and σ_s² are the adjustment parameters for the color difference and the distance difference, respectively; D_w(p) is the filtered disparity map.
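The step-e refinement just described (left-right consistency check, filling failed pixels with the smaller of the nearest consistent disparities, then bilateral-weight filtering) can be sketched as follows. Function names and the default σ values are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def wb_weight(c_p, c_q, s_p, s_q, sigma_c=10.0, sigma_s=9.0):
    """Bilateral support weight WB_pq = exp(-(dc/sigma_c^2 + ds/sigma_s^2))."""
    dc = np.linalg.norm(np.asarray(c_p, float) - np.asarray(c_q, float))
    ds = np.linalg.norm(np.asarray(s_p, float) - np.asarray(s_q, float))
    return float(np.exp(-(dc / sigma_c ** 2 + ds / sigma_s ** 2)))

def lrc_refine(dL, dR, delta=1.0):
    """Left-right consistency check with occlusion filling (illustrative).

    A left-image pixel p passes if |dL(p) - dR(p - dL(p))| < delta.
    A failing pixel is filled with min(D'(p_L), D'(p_R)): the smaller of the
    first consistent disparities to its left and right (background bias).
    """
    H, W = dL.shape
    out = dL.astype(float).copy()
    for y in range(H):
        xr = np.arange(W) - dL[y].astype(int)   # matching column in right view
        valid = (xr >= 0) & (xr < W)
        ok = valid & (np.abs(dL[y] - dR[y, np.clip(xr, 0, W - 1)]) < delta)
        for x in np.where(~ok)[0]:
            left = next((dL[y, i] for i in range(x - 1, -1, -1) if ok[i]), None)
            right = next((dL[y, i] for i in range(x + 1, W) if ok[i]), None)
            cands = [v for v in (left, right) if v is not None]
            if cands:
                out[y, x] = min(cands)          # D_LRC(p)
    return out
```

A full implementation would then compute D_w(p) by summing WB_pq-weighted D_LRC values over a local window around each pixel, using the left image's colors for Δc_pq.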
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710631310.7A CN107392950A (en) | 2017-07-28 | 2017-07-28 | A cross-scale cost aggregation stereo matching method based on weak texture detection |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107392950A true CN107392950A (en) | 2017-11-24 |
Family
ID=60342086
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710631310.7A Pending CN107392950A (en) | 2017-07-28 | 2017-07-28 | A cross-scale cost aggregation stereo matching method based on weak texture detection |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107392950A (en) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107945222A (en) * | 2017-12-15 | 2018-04-20 | 东南大学 | A new stereo matching cost calculation and disparity post-processing method |
CN108181319A (en) * | 2017-12-12 | 2018-06-19 | 陕西三星洁净工程有限公司 | A dust accumulation detection device and method based on stereoscopic vision |
CN108510529A (en) * | 2018-03-14 | 2018-09-07 | 昆明理工大学 | A graph-cut stereo matching method based on adaptive weights |
CN108596975A (en) * | 2018-04-25 | 2018-09-28 | 华南理工大学 | A stereo matching algorithm for weak texture regions |
CN108682026A (en) * | 2018-03-22 | 2018-10-19 | 辽宁工业大学 | A binocular vision stereo matching method based on multi-matching-element fusion |
CN108765486A (en) * | 2018-05-17 | 2018-11-06 | 长春理工大学 | A stereo matching method based on a color-correlated sparse block aggregation strategy |
CN109816782A (en) * | 2019-02-03 | 2019-05-28 | 哈尔滨理工大学 | An indoor scene 3D reconstruction method based on binocular vision |
CN109887021A (en) * | 2019-01-19 | 2019-06-14 | 天津大学 | A cross-scale random walk stereo matching method |
CN109961417A (en) * | 2017-12-26 | 2019-07-02 | 广州极飞科技有限公司 | Image processing method and device, and mobile device control method |
CN111191694A (en) * | 2019-12-19 | 2020-05-22 | 浙江科技学院 | Image stereo matching method |
CN111508013A (en) * | 2020-04-21 | 2020-08-07 | 中国科学技术大学 | Stereo matching method |
CN112070694A (en) * | 2020-09-03 | 2020-12-11 | 深兰人工智能芯片研究院(江苏)有限公司 | Binocular stereo vision disparity map post-processing method and device |
WO2021018093A1 (en) * | 2019-07-31 | 2021-02-04 | 深圳市道通智能航空技术有限公司 | Stereo matching method, image processing chip, and moving carrier |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105551035A (en) * | 2015-12-09 | 2016-05-04 | 深圳市华和瑞智科技有限公司 | Stereoscopic vision matching method based on weak edge and texture classification |
CN106340036A (en) * | 2016-08-08 | 2017-01-18 | 东南大学 | Binocular stereoscopic vision-based stereo matching method |
CN106530336A (en) * | 2016-11-07 | 2017-03-22 | 湖南源信光电科技有限公司 | Stereo matching algorithm based on color information and graph-cut theory |
- 2017-07-28 CN CN201710631310.7A patent/CN107392950A/en active Pending
Non-Patent Citations (3)
Title |
---|
ZHANG Hua et al.: "Fast stereo matching based on cross-scale variable-window cost aggregation", Computer Engineering and Applications * |
CAO Xiaoqian et al.: "Stereo matching based on weak texture detection and disparity map fusion", Chinese Journal of Scientific Instrument * |
LIN Xue: "Research on stereo matching techniques in binocular stereo vision", China Master's Theses Full-text Database, Information Science and Technology * |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108181319A (en) * | 2017-12-12 | 2018-06-19 | 陕西三星洁净工程有限公司 | A dust accumulation detection device and method based on stereoscopic vision |
CN108181319B (en) * | 2017-12-12 | 2020-09-11 | 陕西三星洁净工程有限公司 | Accumulated dust detection device and method based on stereoscopic vision |
CN107945222A (en) * | 2017-12-15 | 2018-04-20 | 东南大学 | A new stereo matching cost calculation and disparity post-processing method |
CN109961417A (en) * | 2017-12-26 | 2019-07-02 | 广州极飞科技有限公司 | Image processing method and device, and mobile device control method |
CN108510529A (en) * | 2018-03-14 | 2018-09-07 | 昆明理工大学 | A graph-cut stereo matching method based on adaptive weights |
CN108682026A (en) * | 2018-03-22 | 2018-10-19 | 辽宁工业大学 | A binocular vision stereo matching method based on multi-matching-element fusion |
CN108682026B (en) * | 2018-03-22 | 2021-08-06 | 江大白 | Binocular vision stereo matching method based on multi-matching element fusion |
CN108596975A (en) * | 2018-04-25 | 2018-09-28 | 华南理工大学 | A stereo matching algorithm for weak texture regions |
CN108596975B (en) * | 2018-04-25 | 2022-03-29 | 华南理工大学 | Stereo matching algorithm for weak texture region |
CN108765486A (en) * | 2018-05-17 | 2018-11-06 | 长春理工大学 | A stereo matching method based on a color-correlated sparse block aggregation strategy |
CN109887021A (en) * | 2019-01-19 | 2019-06-14 | 天津大学 | A cross-scale random walk stereo matching method |
CN109887021B (en) * | 2019-01-19 | 2023-06-06 | 天津大学 | Cross-scale-based random walk stereo matching method |
CN109816782A (en) * | 2019-02-03 | 2019-05-28 | 哈尔滨理工大学 | An indoor scene 3D reconstruction method based on binocular vision |
WO2021018093A1 (en) * | 2019-07-31 | 2021-02-04 | 深圳市道通智能航空技术有限公司 | Stereo matching method, image processing chip, and moving carrier |
CN111191694A (en) * | 2019-12-19 | 2020-05-22 | 浙江科技学院 | Image stereo matching method |
CN111508013A (en) * | 2020-04-21 | 2020-08-07 | 中国科学技术大学 | Stereo matching method |
CN111508013B (en) * | 2020-04-21 | 2022-09-06 | 中国科学技术大学 | Stereo matching method |
CN112070694A (en) * | 2020-09-03 | 2020-12-11 | 深兰人工智能芯片研究院(江苏)有限公司 | Binocular stereo vision disparity map post-processing method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107392950A (en) | A cross-scale cost aggregation stereo matching method based on weak texture detection | |
CN108596975A (en) | A stereo matching algorithm for weak texture regions | |
CN104680496B (en) | A Kinect depth map restoration method based on color images | |
CN110473217A (en) | A binocular stereo matching method based on the Census transform | |
CN105744256B (en) | An objective stereo image quality assessment method based on graph visual saliency | |
CN107578418A (en) | An indoor scene contour detection method fusing color and depth information | |
CN110148181A (en) | A general binocular stereo matching method | |
CN107578404A (en) | A full-reference objective stereo image quality assessment method based on visual saliency feature extraction | |
CN107301642B (en) | A fully automatic foreground-background segmentation method based on binocular vision | |
CN106504276A (en) | A combined matching cost algorithm for non-local stereo matching with a joint disparity filling algorithm | |
CN107301664A (en) | An improved segmented stereo matching method based on a similarity measure function | |
CN104240204B (en) | A solar silicon wafer and cell counting method based on image processing | |
CN107578430A (en) | A stereo matching method based on adaptive weights and local entropy | |
CN104376535A (en) | A fast sample-based image inpainting method | |
CN101866497A (en) | An intelligent 3D face reconstruction method and system based on binocular stereo vision | |
CN106408513A (en) | A super-resolution reconstruction method for depth maps | |
CN103384343B (en) | A method and device for filling image holes | |
CN102800111B (en) | A color quality evaluation method for color fusion images based on color harmony | |
CN103325120A (en) | A fast adaptive support-weight binocular vision stereo matching method | |
CN106600632A (en) | An improved matching cost aggregation stereo matching algorithm | |
CN106530334B (en) | An airborne interferometric synthetic aperture radar complex image co-registration method and system | |
CN104102928B (en) | A remote sensing image classification method based on texture primitives | |
CN103177451A (en) | An adaptive window and weight stereo matching algorithm based on image edges | |
CN103955945A (en) | An adaptive color image segmentation method based on binocular disparity and active contours | |
CN107578399A (en) | A full-reference image quality assessment method based on boundary feature segmentation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 20171124 |