CN109448036A - Method and device for determining a disparity map based on a binocular image - Google Patents
- Publication number: CN109448036A (application CN201910062717.1A)
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
- G06T5/70—
Abstract
The application provides a method and device for determining a disparity map based on a binocular image, for improving the precision of the determined disparity map. The method comprises: filtering a left image to obtain gradient information of the left image; obtaining, according to the gradient information, N match windows corresponding to N pixels in the left image, where the N match windows are image regions each centered on one of the N pixels, the sizes of the N match windows are not all identical, the image feature value of each of the N match windows lies within a preset range, and N is a positive integer; performing a Census transform on the pixels in each of the N match windows to obtain N transform sequences for the N pixels in the left image; and obtaining the disparity map of the left image and the right image according to the N transform sequences of the left image and N transform sequences corresponding to N pixels of the right image.
Description
Technical field
This application relates to the field of computer vision, and in particular to a method and device for determining a disparity map based on a binocular image.
Background
A binocular image comprises a left image and a right image, i.e., two images of a measured object acquired with image capture devices from different positions. Binocular vision technology obtains a disparity map from the binocular image and then derives three-dimensional image information of the object from the disparity map. The key to determining the disparity map is finding the matching points between the left image and the right image, which is generally done by stereo matching. Stereo matching algorithms include local stereo matching algorithms.
A local stereo matching algorithm roughly proceeds as follows: determine a match window; traverse the left image and the right image in units of the match window to obtain their image features; match corresponding pixels in the left image and the right image; obtain the disparity value between the corresponding pixels; and determine the disparity map of the left image and the right image from the disparity values. The prior art generally uses a match window of fixed size, which makes it difficult to match corresponding pixels in the left image and the right image accurately, so the precision of the determined disparity map is low.
Summary of the invention
The application provides a method and device for determining a disparity map based on a binocular image, for improving the precision of the determined disparity map.
In a first aspect, a method for determining a disparity map based on a binocular image is provided, comprising:
filtering a left image to obtain gradient information of the left image, where the gradient information indicates the probability distribution of image features in the left image;
obtaining, according to the gradient information, N match windows corresponding to N pixels in the left image, where the N match windows are image regions each centered on one of the N pixels, the sizes of the N match windows are not all identical, the image feature value of each of the N match windows lies within a preset range, and N is a positive integer;
performing a Census transform on the pixels in each of the N match windows to obtain N transform sequences for the N pixels in the left image;
obtaining the disparity map of the left image and the right image according to the N transform sequences of the left image and N transform sequences corresponding to N pixels of the right image, where the disparity map indicates the distance between each pair of matched pixels of the left image and the right image, and the left image and the right image are the two corresponding images in a binocular image.
In the above scheme, the size of the match window of each pixel in the left image is determined according to the gradient information of the image, so that the image feature value of each match window lies within the preset range. This ensures that the image feature values of the N match windows in the left image are within the preset range, avoiding the problem that a match window covers too many or too few image features and therefore yields a low-precision disparity map, thereby improving the precision of the determined disparity map.
In a possible design, filtering the left image to obtain the gradient information of the left image comprises:
performing grayscale processing on the left image to obtain a grayscaled left image;
performing, according to a preset operator, noise-reduction filtering on the grayscaled left image in a first direction and a second direction to obtain a noise-reduction-filtered left image, where the first direction and the second direction are two different directions determined with the left image as reference;
obtaining the gradient information of the left image from the noise-reduction-filtered left image.
In the above scheme, noise-reduction filtering is applied to the left image in two directions, so more information of the left image can be obtained and its gradient information is more complete and accurate, which further ensures the precision of the disparity map subsequently determined for the left image and the right image.
In a possible design, obtaining, according to the gradient information of the left image, the N match windows corresponding to the N pixels in the left image comprises:
taking a first pixel of the N pixels in the left image as the center pixel and increasing the number of pixels around the first pixel to obtain an enlarged image region;
obtaining, according to the gradient information of the left image, the image feature value of the enlarged image region;
if the image feature value of the enlarged image region is within the preset range, determining the enlarged image region to be the first match window of the first pixel, thereby obtaining the N match windows corresponding to the N pixels.
In the above scheme, the image region is enlarged by successively increasing the number of pixels around the pixel taken as the center pixel; once the image feature value of the enlarged region falls within the preset range, that region becomes the match window of the pixel. This way of obtaining the match window is simple and direct.
In a possible design, performing the Census transform on the pixels in each of the N match windows to obtain the N transform sequences of the N pixels in the left image comprises:
comparing the feature value of each of P pixels in the neighborhood window of each match window with the feature value of the center pixel, where the neighborhood window is the region of the match window other than the center pixel and P is a positive integer less than N;
if the feature value of a pixel among the P pixels is greater than or equal to the feature value of the center pixel, setting the transform code of that pixel to a first value; if the feature value of a pixel among the P pixels is less than the feature value of the center pixel, setting the transform code of that pixel to a second value; thereby obtaining the transform sequence of each match window, and hence the N transform sequences of the N match windows, which are the N transform sequences of the N pixels in the left image.
In the above scheme, a way of obtaining the transform sequences of the left image is provided that determines the transform sequence of each match window quickly and directly.
In a possible design, the P pixels are the set of pixels sampled from all pixels of the neighborhood window at a preset step.
In the above scheme, only some of the pixels in the neighborhood window are used to compute the transform sequence of the match window, which reduces the computational cost relative to using all pixels of the match window.
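The stepped Census transform of one match window can be sketched as below, with the first value taken as '1' and the second as '0' as in the claims (greater than or equal to the center yields the first value). The flattened-window sampling order and the string representation are illustrative assumptions.

```python
import numpy as np

def census_sequence(window, step=1):
    """Census transform of one odd-sized match window: compare every
    `step`-th neighbourhood pixel with the centre; >= centre -> '1',
    < centre -> '0', concatenated into a transform sequence."""
    flat = window.flatten()
    centre_idx = flat.size // 2           # centre pixel of an odd-sized window
    centre = flat[centre_idx]
    neighbourhood = np.delete(flat, centre_idx)
    sampled = neighbourhood[::step]       # keep every `step`-th neighbour only
    return ''.join('1' if v >= centre else '0' for v in sampled)
```

With `step=2` only half of the neighborhood enters the sequence, which is the computational saving the design above points at.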
In a possible design, obtaining the disparity map of the left image and the right image according to the N transform sequences of the left image and the N transform sequences corresponding to the N pixels of the right image comprises:
obtaining, from the N transform sequences of the left image and the N transform sequences of the right image, K first Hamming distances of each pixel in the left image relative to K pixels of the right image, and K second Hamming distances of each pixel in the right image relative to K pixels of the left image, where K is a positive integer less than or equal to N;
obtaining the disparity map of the left image and the right image according to the K first Hamming distances and the K second Hamming distances.
In the above scheme, an XOR operation is performed between a transform sequence of the left image and the transform code strings corresponding to pixels of the right image, yielding multiple Hamming distances for each pixel. This way of obtaining Hamming distances is simple and relatively cheap to compute.
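The XOR-based Hamming distance described above can be sketched directly on the bit-string transform sequences; the helper names and the list-of-candidates interface are illustrative assumptions.

```python
def hamming_distance(seq_a, seq_b):
    """Number of differing bits between two equal-length transform
    sequences, computed by XOR-ing the bit strings and counting set bits."""
    assert len(seq_a) == len(seq_b)
    xor = int(seq_a, 2) ^ int(seq_b, 2)
    return bin(xor).count('1')

def candidate_costs(left_seq, right_seqs):
    """Matching cost of one left pixel against its K right-image candidates."""
    return [hamming_distance(left_seq, r) for r in right_seqs]
```

A smaller Hamming distance means the two windows are more alike, so these K values per pixel form the cost slice that the later aggregation step operates on.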
In a possible design, obtaining the disparity map of the left image and the right image according to the K first Hamming distances and the K second Hamming distances comprises:
establishing a three-dimensional disparity map corresponding to the left image according to the coordinates of each pixel in the left image and the K first Hamming distances, and establishing a three-dimensional disparity map corresponding to the right image according to the coordinates of each pixel in the right image and the K second Hamming distances;
aggregating the three-dimensional disparity map of the left image over a preset window to obtain at least one first aggregation value, determining the minimum of the at least one first aggregation value to be the disparity value of each pixel in the left image, and obtaining the disparity value set of the left image;
aggregating the three-dimensional disparity map of the right image over a preset window to obtain at least one second aggregation value, determining the minimum of the at least one second aggregation value to be the disparity value of each pixel in the right image, and obtaining the disparity value set of the right image;
correcting the disparity values of the left image and the right image according to the disparity value set of the left image and the disparity value set of the right image to obtain corrected disparity values;
generating the disparity map of the left image and the right image according to the corrected disparity values.
In the above scheme, aggregating the Hamming distances reduces the interference of noise in the subsequent determination of the disparity map, and the consistency correction between the left image and the right image further improves the precision of the determined disparity map. Aggregating the Hamming distances also reduces the data volume, which lowers the cost of later computation.
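The aggregation-and-minimum step can be sketched as a box filter over an H x W x K cost volume (the "three-dimensional disparity map") followed by a winner-take-all selection; the mean as aggregation function and the window size are illustrative assumptions.

```python
import numpy as np

def aggregate_and_select(cost_volume, win=3):
    """Box-aggregate an H x W x K Hamming-cost volume over a preset
    window, then pick per pixel the disparity index whose aggregated
    cost is minimal (winner-take-all)."""
    h, w, k = cost_volume.shape
    r = win // 2
    agg = np.empty_like(cost_volume, dtype=float)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            agg[y, x, :] = cost_volume[y0:y1, x0:x1, :].mean(axis=(0, 1))
    return np.argmin(agg, axis=2)     # disparity value per pixel
```

Because each pixel's cost is averaged with its neighbors before the minimum is taken, an isolated noisy cost no longer decides the disparity, which is the noise-suppression effect the text claims.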
In a possible design, correcting the disparity values of the left image and the right image according to the disparity value set of the left image and the disparity value set of the right image to obtain the corrected disparity values comprises:
fitting the disparity value set of the left image to obtain a first sub-pixel disparity value corresponding to each pixel in the left image, and fitting the disparity value set of the right image to obtain a second sub-pixel disparity value corresponding to each pixel in the right image;
if it is determined that the difference between the first sub-pixel disparity value and the second sub-pixel disparity value is less than or equal to a preset threshold, taking the average of the first sub-pixel disparity value and the second sub-pixel disparity value as the corrected disparity value of the left image and the right image.
In the above scheme, sub-pixel disparity values are obtained by fitting the disparity value sets of the left image and the right image, and a consistency correction is performed between them, yielding more accurate disparity values and hence a more precise disparity map.
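The fitting and consistency correction can be sketched as below. The patent does not name the fitting curve, so a three-point parabola through the aggregated costs around the integer minimum is assumed here, as is the handling of inconsistent pixels.

```python
def subpixel_disparity(costs, d):
    """Parabola fit through the aggregated costs at d-1, d, d+1;
    returns the fractional disparity at the parabola's minimum."""
    c0, c1, c2 = costs[d - 1], costs[d], costs[d + 1]
    denom = c0 - 2 * c1 + c2
    if denom == 0:
        return float(d)                    # flat costs: keep the integer value
    return d + (c0 - c2) / (2 * denom)

def consistency_correct(d_left, d_right, threshold=1.0):
    """Left-right consistency: if the two sub-pixel estimates agree
    within the preset threshold, the corrected value is their average."""
    if abs(d_left - d_right) <= threshold:
        return (d_left + d_right) / 2
    return None                            # inconsistent: mark as unreliable
```

Symmetric costs around the minimum leave the disparity at the integer value, while asymmetric costs shift it toward the cheaper side, giving the sub-pixel precision the design aims for.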
In a second aspect, a device for determining a disparity map based on a binocular image is provided, comprising a receiving module and a processing module, wherein:
the receiving module is configured to obtain a left image;
the processing module is configured to filter the left image to obtain gradient information of the left image, where the gradient information indicates the probability distribution of image features in the left image;
the processing module is further configured to obtain, according to the gradient information, N match windows corresponding to N pixels in the left image, where the N match windows are image regions each centered on one of the N pixels, the sizes of the N match windows are not all identical, the image feature value of each of the N match windows lies within a preset range, and N is a positive integer;
the processing module is further configured to perform a Census transform on the pixels in each of the N match windows to obtain N transform sequences for the N pixels in the left image;
the processing module is further configured to obtain the disparity map of the left image and the right image according to the N transform sequences of the left image and N transform sequences corresponding to N pixels of the right image, where the disparity map indicates the distance between each pair of matched pixels of the left image and the right image, and the left image and the right image are the two corresponding images in a binocular image.
In a possible design, the processing module is specifically configured to:
perform grayscale processing on the left image to obtain a grayscaled left image;
perform, according to a preset operator, noise-reduction filtering on the grayscaled left image in a first direction and a second direction to obtain a noise-reduction-filtered left image, where the first direction and the second direction are two different directions determined with the left image as reference;
obtain the gradient information of the left image from the noise-reduction-filtered left image.
In a possible design, the processing module is specifically configured to:
take a first pixel of the N pixels in the left image as the center pixel and increase the number of pixels around the first pixel to obtain an enlarged image region;
obtain, according to the gradient information of the left image, the image feature value of the enlarged image region;
if the image feature value of the enlarged image region is within the preset range, determine the enlarged image region to be the first match window of the first pixel, thereby obtaining the N match windows corresponding to the N pixels.
In a third aspect, a device for determining a disparity map based on a binocular image is provided, comprising:
at least one processor, and
a memory communicatively connected to the at least one processor;
wherein the memory stores instructions executable by the at least one processor, and the at least one processor implements the method of any design of the first aspect by executing the instructions stored in the memory.
In a fourth aspect, a computer-readable storage medium is provided, which stores computer instructions that, when run on a computer, cause the computer to execute the method of any design of the first aspect.
Brief description of the drawings
Fig. 1 is a flowchart of a method for determining a disparity map based on a binocular image provided by an embodiment of the application;
Fig. 2 is a schematic diagram of a preset neighborhood provided by an embodiment of the application;
Fig. 3 is a schematic diagram of a left image after grayscale processing provided by an embodiment of the application;
Fig. 4 is a schematic diagram of the left image of Fig. 3 after noise-reduction filtering;
Fig. 5 is a schematic diagram of a process for determining a match window provided by an embodiment of the application;
Fig. 6 is a schematic diagram of performing the Census transform on a match window provided by an embodiment of the application;
Fig. 7 is a schematic diagram of a three-dimensional disparity map of a left image provided by an embodiment of the application;
Fig. 8 is a schematic diagram of fitting the disparity value set of a left image provided by an embodiment of the application;
Fig. 9 is a structural diagram of a device for determining a disparity map based on a binocular image provided by an embodiment of the application;
Fig. 10 is a structural diagram of another device for determining a disparity map based on a binocular image provided by an embodiment of the application.
Detailed description of embodiments
To better understand the technical solutions provided by the embodiments of the application, they are described in detail below with reference to the accompanying drawings and specific embodiments.
To facilitate understanding of the technical solutions of the embodiments of the application, the technical terms are first explained.
1) A binocular image is two images of a measured object acquired with imaging devices from different positions, referred to as the left image and the right image, or simply the left figure and the right figure.
2) Binocular vision technology finds the correspondence between the left image and the right image of a binocular image to obtain their disparity; after the disparity map is obtained from the disparity, the depth information and three-dimensional information of the original image are obtained according to the triangulation principle.
3) Census transform (CT). The basic idea of the Census transform is: take each pixel in the image as a center pixel and define a match window around it. The center pixel is selected as the reference pixel, and the gray value of each pixel in the match window is compared with the gray value of the reference pixel: a pixel whose gray value is less than or equal to the reference gray value is labeled 0, and a pixel whose gray value is greater than the reference gray value is labeled 1. The labels are then concatenated bit by bit, producing a transform result: a transform sequence composed of 0s and 1s.
The way the prior art determines a disparity map is explained below.
At present, the disparity map is generally determined with a local stereo matching algorithm.
A local stereo matching algorithm takes the match window as the processing unit and traverses each pixel in the left image and the right image with the match window, obtaining the weighted average of the disparity values corresponding to the pixels in the match window, thereby determining the best disparity of the left image and the right image and then the disparity map.
However, the local stereo matching algorithms in the prior art usually process the left image and the right image with a match window of fixed size. If the match window is too small, each match window contains too few image features of the left image or the right image, and it is difficult to determine the matching relationship of the pixels of the left image and the right image precisely. If the match window is too large, each match window contains too many image features, and it is likewise difficult to determine the matching relationship accurately, so the accuracy of the disparity map determined in this way is low.
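To make the fixed-window limitation concrete, the following toy matcher works in the prior-art style: one fixed window size for every pixel and a winner-take-all over the disparity candidates. A plain sum-of-absolute-differences cost is used here for simplicity in place of the weighted-average scheme described above; window size, disparity range, and function names are all illustrative.

```python
import numpy as np

def fixed_window_disparity(left, right, win=3, max_disp=3):
    """Prior-art style local matcher: fixed match window, SAD cost,
    winner-take-all disparity per pixel."""
    h, w = left.shape
    r = win // 2
    disp = np.zeros((h, w), dtype=int)
    for y in range(r, h - r):
        for x in range(r, w - r):
            best_cost, best_d = None, 0
            for d in range(min(max_disp, x - r) + 1):
                lw = left[y - r:y + r + 1, x - r:x + r + 1].astype(float)
                rw = right[y - r:y + r + 1, x - d - r:x - d + r + 1].astype(float)
                cost = np.abs(lw - rw).sum()      # sum of absolute differences
                if best_cost is None or cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

On well-textured input this recovers the shift between the two images, but because `win` never adapts to the local texture, flat or over-textured regions mismatch exactly as the paragraph above describes.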
In view of this, an embodiment of the application provides a method for determining a disparity map based on a binocular image. The method can be executed by a device for determining a disparity map. The device can be realized by a server, by a computer, or by one or more field programmable gate arrays (Field Programmable Gate Array, FPGA). The server is, for example, a small server or a server cluster. Referring to Fig. 1, the flow of the method is as follows:
Step 101: filter a left image to obtain gradient information of the left image, where the gradient information indicates the probability distribution of image features in the left image.
Step 102: obtain, according to the gradient information, N match windows corresponding to N pixels in the left image, where the N match windows are image regions each centered on one of the N pixels, the sizes of the N match windows are not all identical, the image feature value of each of the N match windows lies within a preset range, and N is a positive integer.
Step 103: perform a Census transform on the pixels in each of the N match windows to obtain N transform sequences for the N pixels in the left image.
Step 104: obtain the disparity map of the left image and the right image according to the N transform sequences of the left image and N transform sequences corresponding to N pixels of the right image, where the disparity map indicates the distance between each pair of matched pixels of the left image and the right image, and the left image and the right image are the two corresponding images in a binocular image.
The process of the above method is described in detail below.
Before the device for determining a disparity map determines the disparity map, it first needs to obtain a binocular image. Ways of obtaining the binocular image are explained below.
For example, when a user needs the disparity map of a binocular image, the user can send the binocular image to the device for determining a disparity map through an auxiliary device, and the device receives the binocular image.
As another example, the device for determining a disparity map is communicatively connected to an imaging device; when the imaging device captures a binocular image, the device obtains the binocular image from the imaging device.
The imaging device is, for example, a binocular camera. The imaging device can be part of the device for determining a disparity map, or can be a device existing independently of it. For the binocular image, refer to the content discussed above; details are not repeated here.
After obtaining the binocular image, the device for determining a disparity map executes step 101, i.e., filters the left image to obtain the gradient information of the left image.
The image features include one or more of image grayscale, image edges, and image contours. The gradient information indicates the probability distribution of image features in the left image; for example, the gradient information indicates the distribution of image texture in the left image.
Filtering the left image yields a clearer left image, which makes it easier to obtain information of the left image later. There are many ways of filtering; examples are given below.
For example, filtering the left image to obtain the gradient information of the left image proceeds as follows:
perform grayscale processing on the left image to obtain a grayscaled left image;
perform, according to a preset operator, noise-reduction filtering on the grayscaled left image in a first direction and a second direction to obtain a noise-reduction-filtered left image;
obtain the gradient information of the left image from the noise-reduction-filtered left image.
In the embodiment of the application, grayscale processing is first performed on the left image to facilitate obtaining its image features. The left image can be expressed as an image function, which has a corresponding component in each direction. According to different preset operators, the image features of the pixels in the preset neighborhoods along the first direction and the second direction of the left image are filtered, and then the modulus of the left image in the first direction and the second direction is computed, thereby obtaining the gradient information corresponding to the left image. The preset operator is, for example, the Sobel operator.
The first direction and the second direction are different directions determined with the left image as reference. The first direction can be the horizontal direction of the left image, and the second direction can be the vertical direction of the left image. The preset neighborhood of a pixel is the region of the pixel area centered on that pixel excluding the center pixel itself. For example, referring to Fig. 2, with pixel 0 as the center pixel, pixels 1 to 8 constitute the preset neighborhood of pixel 0.
In the embodiment of the application, after filtering in the first direction and the second direction, the modulus of the left image is larger at edges and smaller elsewhere, so the entire left image becomes clearer.
For example, referring to Fig. 3, Fig. 3 is the image after grayscale processing of the left image, and Fig. 4 shows the grayscaled left image after noise-reduction filtering. It can be seen that the contours of the left image are clearer after noise-reduction filtering.
After executing step 101, the device for determining the disparity map executes step 102, i.e., obtains, according to the gradient information, the N match windows corresponding to the N pixels in the left mesh figure.
In the embodiment of the present application, a match window can be understood as a unit for processing the left mesh figure, and the N match windows refer to the image-regions with each pixel of the N pixels in the left mesh figure as the center pixel. That the sizes of the N match windows are not exactly the same means that there exist two match windows of different sizes among the N match windows. The image feature value is used for indicating the size of the characteristics of the image of all pixels in the match window. For the characteristics of the image, reference is made to the content discussed above, and details are not described herein again. The first pixel refers to any one pixel among the N pixels, and the first match window is the match window corresponding to the first pixel among the N match windows. The mode of executing step 102 is illustrated below.
A mode of executing step 102 is as follows:
With the first pixel of the N pixels in the left mesh figure as the center pixel, the number of pixels around the first pixel is increased, obtaining the image-region after increase;
According to the gradient information of the left mesh figure, the image feature value of the image-region after increase is obtained;
If the image feature value of the image-region after increase is within the preset range, it is determined that the image-region after increase is the first match window of the first pixel, so as to obtain the N match windows corresponding to the N pixels.
Specifically, the left mesh figure includes N pixels. With each pixel of the N pixels respectively as the central pixel point, the number of pixels around the pixel is increased, and the set of the increased pixels and the pixel is taken as the image-region after increase. According to the gradient information of the left mesh figure, the image feature value of the image-region after increase is obtained. Whether the image feature value of the image-region after increase is within the preset range is judged: if it is less than the minimum value of the preset range, the number of pixels around the pixel continues to be increased; if the image feature value of the image-region after increase is within the preset range, the image-region after the increase is just the match window corresponding to the pixel. And so on, the N match windows corresponding to the N pixels are obtained.
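The growth loop just described can be sketched as follows, under stated assumptions: the window grows as a square, the image feature value is taken as the mean gradient magnitude in the region, and the names are illustrative. The range T = [17, 23] and the 23x23 size cap follow the values given later in this section.

```python
import numpy as np

def grow_match_window(grad, y, x, t_lo=17, t_hi=23, max_half=11):
    """Grow the match window of pixel (y, x): enlarge the region
    around the center and stop once its image feature value falls
    within the preset range T.  max_half=11 caps the window at
    23x23; growth starts at 7x7 (half = 3)."""
    for half in range(3, max_half + 1):
        y0, y1 = max(0, y - half), min(grad.shape[0], y + half + 1)
        x0, x1 = max(0, x - half), min(grad.shape[1], x + half + 1)
        feature = grad[y0:y1, x0:x1].mean()   # assumed feature: mean gradient
        if t_lo <= feature <= t_hi:
            break                             # region is the match window
    return (y0, y1, x0, x1)
```

In a region of uniform gradient already inside T, the loop stops at the smallest 7x7 window; in flatter regions it keeps growing until the cap.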
In the embodiment of the present application, the match window of each pixel can be calculated according to the following formula:
W(i, j) = { n | F_n(i, j) ∈ T, n ∈ W }
Wherein, T indicates the preset range, (i, j) indicates the central pixel point, W(i, j) indicates the match window obtained for the pixel, n indicates the number of increased pixels, F_n(i, j) indicates the image feature value of the image-region after increase, and W indicates the set range of the pixels.
In the embodiment of the present application, the interval of W can be set to [7*7, 23*23], and T can be set to [17, 23].
In order to facilitate the device for determining the disparity map in increasing the number of pixels around the pixel, in the embodiment of the present application the number of pixels around the pixel can be increased according to preset rules.
The preset rules include the quantity of pixels increased around the pixel each time, and/or the direction of increasing the pixels around the pixel. For example, the direction of increasing the pixels around the pixel may be to increase pixels counterclockwise with the pixel as the center.
For example, referring to Fig. 5, the preset rules are that the number of pixels increased each time is two, and the direction of increasing pixels each time is clockwise. With pixel 0 in figure a in Fig. 5 as the center pixel, pixel 1 and pixel 2 are first increased, and the image-region after increase, as shown in figure b in Fig. 5, is composed of pixel 1 and pixel 2. The image feature value of pixel 1 and pixel 2 is calculated.
If it is determined that the image feature value of figure b is not in the preset range, pixel 3 and pixel 4 continue to be increased clockwise, and the image-region after increase, as shown in figure c in Fig. 5, is composed of pixel 1, pixel 2, pixel 3 and pixel 4. The image feature value of pixel 1, pixel 2, pixel 3 and pixel 4 is calculated.
If it is determined that the image feature value of figure c is not in the preset range, pixel 5 and pixel 6 continue to be increased clockwise, and the image-region after increase, as shown in figure d in Fig. 5, is composed of pixel 1, pixel 2, pixel 3, pixel 4, pixel 5 and pixel 6. The image feature value of pixel 1, pixel 2, pixel 3, pixel 4, pixel 5 and pixel 6 is calculated; if this image feature value is in the preset range, it is determined that the match window of pixel 0 is the image-region composed of pixel 1, pixel 2, pixel 3, pixel 4, pixel 5 and pixel 6.
After executing step 102, the device for determining the disparity map can execute step 103, i.e., carry out the statistical Census transformation on the pixels in each match window of the N match windows, obtaining the N transformation sequences of the N pixels in the left mesh figure.
Specifically, the device for determining the disparity map takes the match window corresponding to each pixel as a unit and carries out the Census transformation on the pixels in the match window, so as to obtain the transformation sequence of the pixel, and so on, so as to obtain the N transformation sequences corresponding to the N pixels in the left mesh figure. The mode of executing step 103 is illustrated below.
A mode of executing step 103 is as follows:
Compare the characteristic values of P pixels in the neighborhood window of each match window with the characteristic value of the central pixel point; the neighborhood window refers to the region in each match window other than the central pixel point, and P is a positive integer less than N;
If the characteristic value of a pixel among the P pixels is greater than or equal to the characteristic value of the central pixel point, the conversion code of that pixel is taken as the first value; if the characteristic value of a pixel among the P pixels is less than the characteristic value of the central pixel point, the conversion code of that pixel is taken as the second value, so as to obtain the conversion code string of each match window, thereby obtaining the N transformation sequences of the N match windows; the N transformation sequences of the N match windows are namely the N transformation sequences of the N pixels in the left mesh figure.
Wherein, the P pixels can be all pixels in the neighborhood window. Or, in order to reduce the computing cost of the device for determining the disparity map, the P pixels can take part of all pixels of the neighborhood window, for example the set of pixels successively spaced by a preset step-length among all pixels of the neighborhood window. The preset step-length can be understood as the pixel distance of the interval.
In the embodiment of the present application, the specific formula for determining the conversion code of the P pixels is as follows:
ξ(I_c, I_k) = 0, if I_k ≥ I_c;  ξ(I_c, I_k) = 1, if I_k < I_c
Wherein, I_c and I_k are respectively the characteristic value of the central pixel point and the characteristic value of any one pixel among the P pixels in the neighborhood window. The characteristic value of a pixel refers to the image pixel intensity, and the image pixel intensity includes the gray value of the pixel. The characteristic value of a pixel can be obtained from the gradient information of the left mesh figure; since the gradient information includes the distribution of the characteristics of the image, the characteristic value of each pixel can be determined from the characteristics of the image. The above formula takes the first value as 0 and the second value as 1 as an example.
When the preset step-length is 2, the N Census transformation sequences of the left mesh figure can be obtained through the following mapping relations:
C(u, v) = ⊗_{(u', v') ∈ W, step 2} ξ(Lrect(u, v), Lrect(u', v'))
Wherein, Lrect(u, v) indicates a certain pixel in the left mesh figure, W indicates the corresponding match window of the pixel, and ⊗ indicates the concatenation of the conversion codes into a bit string.
For example, please refer to Fig. 6. The match window corresponding to the central pixel point includes 9 pixels; taking the preset step-length as 2 as an example, the number of corresponding neighborhood pixels is 8, and the number of the P pixels is 4. Taking the gray value as the characteristic value of a pixel, the gray value of the central pixel point is 128. The gray values of the 4 pixels in the neighborhood window are compared with the gray value of the central pixel point: if the gray value of a pixel in the neighborhood window is greater than the gray value of the central pixel point, 1 is taken; if the gray value of a pixel in the neighborhood window is less than or equal to the gray value of the central pixel point, 0 is taken, so as to obtain the conversion code string 1100 of the match window.
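The Fig. 6 example can be sketched in code. The gray values of the sampled neighbors are not given in the text, so the values below are assumed for illustration; the comparison convention (greater than center takes 1, otherwise 0) and the step-2 sampling follow the example above.

```python
import numpy as np

def census_code(window, step=2):
    """Census transform of one match window: compare neighborhood
    pixels, sampled every `step` pixels, against the central pixel.
    Bit is '1' when the neighbor's gray value exceeds the center's,
    else '0' (the convention of the Fig. 6 example)."""
    h, w = window.shape
    cy, cx = h // 2, w // 2
    center = window[cy, cx]
    bits = []
    for y in range(0, h, step):
        for x in range(0, w, step):
            if (y, x) == (cy, cx):
                continue
            bits.append('1' if window[y, x] > center else '0')
    return ''.join(bits)

# Assumed 3x3 window with central gray value 128; step 2 samples
# the four corners, giving a 4-bit conversion code string.
win = np.array([[200,  50, 130],
                [ 60, 128, 140],
                [120,  90, 100]])
print(census_code(win))  # 1100
```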
After executing step 103, the device for determining the disparity map executes step 104, i.e., obtains, according to the N conversion code strings in the left mesh figure and the N transformation sequences corresponding to the N pixels of the right mesh figure, the disparity map of the left mesh figure and the right mesh figure.
For the process of obtaining the N transformation sequences corresponding to the N pixels of the right mesh figure, reference is made to the content discussed above on obtaining the N transformation sequences corresponding to the N pixels of the left mesh figure, and details are not described herein again. The disparity map is used for indicating the visible sensation distance of two matched pixels of the left mesh figure and the right mesh figure. The left mesh figure and the right mesh figure are the two corresponding figures in the binocular image. The mode of executing step 104 is illustrated below.
Mode one of executing step 104 is as follows:
From the N transformation sequences of the left mesh figure and the N transformation sequences of the right mesh figure, the first Hamming distances corresponding to the N pixels in the left mesh figure and the second Hamming distances corresponding to the N pixels in the right mesh figure are obtained;
According to the K first Hamming distances and the K second Hamming distances, the disparity map of the left mesh figure and the right mesh figure is obtained, K being a positive integer less than or equal to N.
Specifically, a transformation sequence in the left mesh figure and multiple transformation sequences in the right mesh figure are respectively subjected to the XOR operation; the number of 1s in each XOR operation result is the Hamming distance. With the XOR operation performed K times, the K first Hamming distances corresponding to the pixel in the left mesh figure are obtained, and so on, so that the K first Hamming distances corresponding to each pixel in the left mesh figure can be obtained.
Similarly, a transformation sequence in the right mesh figure and multiple transformation sequences in the left mesh figure are respectively subjected to the XOR operation; the number of 1s in each XOR operation result is the Hamming distance. With the XOR operation performed K times, the K second Hamming distances corresponding to the pixel in the right mesh figure are obtained, and so on, so that the K second Hamming distances corresponding to each pixel in the right mesh figure can be obtained.
Since there are N pixels in the left mesh figure and the right mesh figure, in order to reduce the computing cost, a maximum disparity value d_max is preset in the embodiment of the present application.
Specifically, when calculating the first Hamming distances of a pixel (u, v) of the left mesh figure, the search starts at the pixel of the right mesh figure corresponding to pixel (u, v) and proceeds until (u + d_max, v), so as to obtain the K Hamming distances corresponding to the pixel in the left mesh figure.
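The XOR-and-count computation and the bounded search described above can be sketched as follows. Storing one Census bit string per pixel in a row list is an illustrative assumption about the data layout, not the patent's storage scheme.

```python
def hamming(a, b):
    """Hamming distance between two Census bit strings:
    the number of 1s in their XOR result."""
    return bin(int(a, 2) ^ int(b, 2)).count('1')

def left_costs(left_row, right_row, u, d_max):
    """First Hamming distances for left pixel u: compare its Census
    string against right-image candidates at columns u .. u + d_max
    on the same row, per the bounded search described above."""
    hi = min(u + d_max, len(right_row) - 1)
    return [hamming(left_row[u], right_row[k]) for k in range(u, hi + 1)]

left_row = ['1100', '1010', '0110']
right_row = ['1001', '1100', '0110']
print(left_costs(left_row, right_row, 0, 2))  # [2, 0, 2]
```

The candidate with the smallest distance (here displacement 1, cost 0) would be the winning disparity before aggregation and refinement.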
With the Hamming distances characterizing the matching cost of the left mesh figure and the right mesh figure, the specific characterization formulas are as follows:
C_L(u, v, d) = Hamming(T_L(u, v), T_R(u + d, v))
C_R(u, v, d) = Hamming(T_R(u, v), T_L(u - d, v))
Wherein, C_L(u, v, d) indicates the matching cost of the left mesh figure obtained with the right mesh figure as reference, corresponding to the multiple first Hamming distances; C_R(u, v, d) indicates the matching cost of the right mesh figure obtained with the left mesh figure as reference, corresponding to the multiple second Hamming distances; and T_L and T_R indicate the transformation sequences of the left mesh figure and the right mesh figure.
After the K first Hamming distances corresponding to each pixel in the left mesh figure are obtained, the disparity corresponding to the smallest first Hamming distance can be chosen as the parallax value of the left mesh figure and the right mesh figure, so as to obtain the disparity map of the left mesh figure and the right mesh figure according to the parallax values.
Or, in order to improve the accuracy of the obtained parallax value, please refer to Fig. 7: in the embodiment of the present application, the coordinates of each pixel in the left mesh figure and the K first Hamming distances are set in a coordinate system, so as to obtain the three-dimensional parallax figure corresponding to the left mesh figure. The coordinates of each pixel in the right mesh figure and the K second Hamming distances are set in the same kind of coordinate system, so as to obtain the three-dimensional parallax figure corresponding to the right mesh figure.
In order to reduce the influence of noise and improve the matching precision of the left mesh figure and the right mesh figure, in the embodiment of the present application the three-dimensional parallax figure of the left mesh figure and the three-dimensional parallax figure of the right mesh figure are respectively polymerized according to a preset window, and the parallax value set of the left mesh figure and the parallax value set of the right mesh figure are obtained after polymerization. According to the parallax value set of the left mesh figure and the parallax value set of the right mesh figure, the parallax values of the left mesh figure and the right mesh figure are corrected, obtaining the corrected parallax values; according to the corrected parallax values, the disparity map of the left mesh figure and the right mesh figure is generated.
The mode of polymerizing the three-dimensional parallax figure of the left mesh figure is illustrated below.
The information of a single pixel is usually replaced by the information of the pixels within a preset window, thereby improving the reliability of the matching cost. The matching cost polymerization calculated with the preset window is as follows:
C_agg(u, v, d) = Σ_{(u', v') ∈ N(u, v)} C(u', v', d)
Wherein, N(u, v) in the formula is the preset window of pixel (u, v); the preset window is usually a window of preset fixed size.
After at least one polymerizing value is obtained, the smallest polymerizing value is selected by the WTA (winner-take-all) rule as the parallax value corresponding to each pixel of the left mesh figure, so as to obtain the minimum parallax corresponding to pixel (u, v). The specific formula is as follows:
d(u, v) = argmin_d C_agg(u, v, d)
And so on, the minimum parallax value corresponding to each pixel in the left mesh figure can be obtained respectively, obtaining the parallax value set of the left mesh figure. Similarly, the parallax value set of the minimum parallax values of the N pixels in the right mesh figure can be obtained.
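The polymerization and WTA selection above can be sketched on a cost volume. A fixed 3x3 box window and edge padding are illustrative assumptions; the patent only requires a preset fixed-size window.

```python
import numpy as np

def wta_disparity(cost_volume, win=3):
    """Polymerize a (H, W, D) Hamming-cost volume with a fixed box
    window (summing costs over the window at each disparity), then
    select the winner-take-all disparity per pixel."""
    H, W, D = cost_volume.shape
    pad = win // 2
    padded = np.pad(cost_volume, ((pad, pad), (pad, pad), (0, 0)), mode='edge')
    agg = np.zeros((H, W, D))
    for dy in range(win):               # shift-and-add box aggregation
        for dx in range(win):
            agg += padded[dy:dy + H, dx:dx + W, :]
    return agg.argmin(axis=2)           # smallest polymerizing value wins
```

For a volume whose cost is uniformly lowest at one disparity plane, every pixel is assigned that disparity, as the WTA rule dictates.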
In the ideal case, the parallax values of corresponding pixels in the left mesh figure and the right mesh figure are similar, but there may be processing abnormal conditions and the like; therefore, in the embodiment of the present application, the parallax values of the left mesh figure and the right mesh figure are corrected, obtaining the corrected parallax values.
Specifically, the parallax value set of the left mesh figure and the parallax value set of the right mesh figure are respectively subjected to fitting processing, obtaining the first sub-pixel point parallax value corresponding to the left mesh figure and the second sub-pixel point parallax value corresponding to the right mesh figure; based on the first sub-pixel point parallax value and the second sub-pixel point parallax value, the corrected parallax value is obtained. The sub-pixel point parallax value can be understood as a parallax value of higher accuracy obtained by refining the parallax value.
Wherein, the formula of the fitting processing is as follows:
d_sub = d_min + (c(d_(min-1)) - c(d_(min+1))) / (2 * (c(d_(min-1)) - 2 * c(d_min) + c(d_(min+1))))
Wherein, d_min indicates the coordinate value corresponding to the minimum value in the parallax value set of the left mesh figure, d_(min-1) indicates the coordinate value corresponding to the smaller of the two parallax values adjacent to the minimum value in the parallax value set of the left mesh figure, d_(min+1) indicates the coordinate value corresponding to the larger of the two parallax values adjacent to that minimum value, and c(·) indicates the corresponding polymerizing value.
Each parallax value in the parallax value set of the left mesh figure is fitted with the above formula, obtaining the first sub-pixel point parallax value corresponding to each pixel of the left mesh figure; similarly, the second sub-pixel point parallax value of each pixel of the right mesh figure can also be obtained.
Please refer to Fig. 8: the points in Fig. 8 refer to the parallax value sets corresponding to all pixels in the left mesh figure. The parallax value sets are fitted according to the above formula, so as to obtain the parallax value corresponding to point a in Fig. 8, i.e., the parallax value of the left mesh figure corresponding to the right mesh figure.
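The fitting step can be sketched as the standard three-point parabola refinement over the minimum cost and its two neighbours, which is what the formula above is assumed to express; names and the fallback behaviour at the boundaries are illustrative.

```python
def subpixel_parallax(costs):
    """Refine the integer WTA parallax with a three-point parabola
    fit over the minimum cost and its two adjacent costs."""
    d = min(range(len(costs)), key=costs.__getitem__)
    if d == 0 or d == len(costs) - 1:
        return float(d)               # no neighbours to fit against
    c0, c1, c2 = costs[d - 1], costs[d], costs[d + 1]
    denom = c0 - 2 * c1 + c2
    if denom == 0:
        return float(d)               # flat parabola: keep integer value
    return d + (c0 - c2) / (2 * denom)

print(subpixel_parallax([5, 2, 1, 2, 6]))  # 2.0 (symmetric minimum)
```

With symmetric neighbouring costs the refinement leaves the integer minimum unchanged; an asymmetric pair shifts it toward the cheaper side.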
If it is determined that the difference between the first sub-pixel point parallax value and the second sub-pixel point parallax value is less than or equal to the preset threshold, the average value of the first sub-pixel point parallax value and the second sub-pixel point parallax value is the corrected parallax value of the left mesh figure and the right mesh figure. If the difference between the first sub-pixel point parallax value and the second sub-pixel point parallax value is greater than the preset threshold, it is indicated that the error of the parallax value is larger, and the data is discarded.
The specific formula for correcting the parallax values of the left mesh figure and the right mesh figure to obtain the corrected parallax value is as follows:
DMsub(u, v) = (A + B) / 2
Wherein, DMsub(u, v) indicates the corrected parallax value of the left mesh figure and the right mesh figure, A indicates the first sub-pixel point parallax value corresponding to pixel (u, v) calculated based on the left mesh figure, and B indicates the second sub-pixel point parallax value corresponding to pixel (u, v) calculated based on the right mesh figure.
And so on, the corrected parallax value of each pixel is thus obtained, and the disparity map of the left mesh figure and the right mesh figure is obtained according to the corrected parallax values.
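The threshold-and-average correction just described can be sketched in a few lines. The threshold value of 1.0 pixel is an assumption; the patent leaves the preset threshold unspecified.

```python
def corrected_disparity(a, b, threshold=1.0):
    """Left-right sub-pixel consistency correction: if the first and
    second sub-pixel parallax values agree within the preset
    threshold, take their average as the corrected parallax value;
    otherwise discard the data (return None)."""
    if abs(a - b) <= threshold:
        return (a + b) / 2.0
    return None

print(corrected_disparity(4.0, 4.5))  # 4.25
print(corrected_disparity(4.0, 9.0))  # None (error too large, discarded)
```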
After the disparity map is obtained, the depth value of an object can be calculated according to the disparity map using the parallel binocular vision triangulation principle, so as to obtain the depth image of the object and thereby a three-dimensional image or the depth of field and the like, for the purpose of ranging or building a three-dimensional model.
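The parallel binocular triangulation step reduces to the standard relation Z = f * B / d for rectified cameras. The focal length and baseline values below are assumed for illustration.

```python
def depth_from_disparity(d, focal_px, baseline_m):
    """Parallel binocular triangulation: depth Z = f * B / d,
    with focal length f in pixels, baseline B in metres, and
    disparity d in pixels."""
    return focal_px * baseline_m / d

# Assumed rig: 800 px focal length, 12.5 cm baseline, 8 px disparity.
print(depth_from_disparity(8.0, 800.0, 0.125))  # 12.5 metres
```

Larger disparities map to nearer objects, which is why the disparity map suffices for ranging.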
On the basis of the method for determining a disparity map based on a binocular image discussed above, the embodiment of the present application also provides a device for determining a disparity map based on a binocular image. Please refer to Fig. 9: the device includes a receiving module 901 and a processing module 902. Wherein:
The receiving module 901 is configured to receive the left mesh figure;
The processing module 902 is configured to filter the left mesh figure and obtain the gradient information of the left mesh figure; wherein the gradient information is used for indicating the probability distribution of the characteristics of the image in the left mesh figure;
The processing module 902 is also configured to obtain, according to the gradient information, the N match windows corresponding to the N pixels in the left mesh figure; wherein the N match windows refer to the image-regions with each pixel of the N pixels as the center pixel, the sizes of the N match windows are not exactly the same, the image feature value of each match window of the N match windows is within the preset range, and N is a positive integer;
The processing module 902 is also configured to carry out the statistical Census transformation on the pixels in each match window of the N match windows, obtaining the N transformation sequences of the N pixels in the left mesh figure;
The processing module 902 is also configured to obtain, according to the N transformation sequences in the left mesh figure and the N transformation sequences corresponding to the N pixels of the right mesh figure, the disparity map of the left mesh figure and the right mesh figure; wherein the disparity map is used for indicating the visible sensation distance of two matched pixels of the left mesh figure and the right mesh figure, and the left mesh figure and the right mesh figure are the two corresponding figures in the binocular image.
In a kind of possible design, processing module 902 is specifically used for:
Gray processing is carried out on the left mesh figure, obtaining the gray-processed left mesh figure;
According to the preset operator, noise reduction filtering is carried out on the first direction and the second direction of the gray-processed left mesh figure, obtaining the noise-reduction-filtered left mesh figure, the first direction and the second direction being two different directions determined with the left mesh figure as the reference standard;
According to noise reduction filtering treated left mesh figure, the gradient information of left mesh figure is obtained.
In a kind of possible design, processing module 902 is specifically used for:
With the first pixel of the N pixels in the left mesh figure as the center pixel, the number of pixels around the first pixel is increased, obtaining the image-region after increase;
According to the gradient information of the left mesh figure, the image feature value of the image-region after increase is obtained;
If the image feature value of the image-region after increase is within the preset range, it is determined that the image-region after increase is the first match window of the first pixel, so as to obtain the N match windows corresponding to the N pixels.
In a kind of possible design, processing module 902 is specifically used for:
Compare the characteristic values of P pixels in the neighborhood window of each match window with the characteristic value of the central pixel point; the neighborhood window refers to the region in each match window other than the central pixel point, and P is a positive integer less than N;
If the characteristic value of a pixel among the P pixels is greater than or equal to the characteristic value of the central pixel point, the conversion code of that pixel is taken as the first value; if the characteristic value of a pixel among the P pixels is less than the characteristic value of the central pixel point, the conversion code of that pixel is taken as the second value, so as to obtain the conversion code string of each match window, thereby obtaining the N transformation sequences of the N match windows; the N transformation sequences of the N match windows are namely the N transformation sequences of the N pixels in the left mesh figure.
In a possible design, the P pixels are the set of pixels successively spaced by the preset step-length among all pixels of the neighborhood window.
In a kind of possible design, processing module 902 is used for:
According to the N transformation sequences in the left mesh figure and the N transformation sequences corresponding to the N pixels of the right mesh figure, obtain the K first Hamming distances of each pixel in the left mesh figure relative to K pixels of the right mesh figure and the K second Hamming distances of each pixel in the right mesh figure relative to K pixels of the left mesh figure, K being a positive integer less than or equal to N;
According to the K first Hamming distances and the K second Hamming distances, obtain the disparity map of the left mesh figure and the right mesh figure.
In a kind of possible design, processing module 902 is specifically used for:
According to the coordinates of each pixel in the left mesh figure and the K first Hamming distances, establish the three-dimensional parallax figure corresponding to the left mesh figure; and according to the coordinates of each pixel in the right mesh figure and the K second Hamming distances, establish the three-dimensional parallax figure corresponding to the right mesh figure;
According to the preset window, polymerize the three-dimensional parallax figure corresponding to the left mesh figure, obtaining at least one first polymerizing value; determine that the minimum value of the at least one first polymerizing value is the parallax value of each pixel in the left mesh figure, obtaining the parallax value set of the left mesh figure;
According to the preset window, polymerize the three-dimensional parallax figure corresponding to the right mesh figure, obtaining at least one second polymerizing value; determine that the minimum value of the at least one second polymerizing value is the parallax value of each pixel in the right mesh figure, obtaining the parallax value set of the right mesh figure;
According to the parallax value set of the left mesh figure and the parallax value set of the right mesh figure, correct the parallax values of the left mesh figure and the right mesh figure, obtaining the corrected parallax values;
According to the corrected parallax values, generate the disparity map of the left mesh figure and the right mesh figure.
In a kind of possible design, processing module 902 is specifically used for:
Fit the parallax value set of the left mesh figure, obtaining the first sub-pixel point parallax value corresponding to each pixel in the left mesh figure, and fit the parallax value set of the right mesh figure, obtaining the second sub-pixel point parallax value corresponding to each pixel in the right mesh figure;
If it is determined that the difference between the first sub-pixel point parallax value and the second sub-pixel point parallax value is less than or equal to the preset threshold, the average value of the first sub-pixel point parallax value and the second sub-pixel point parallax value is the corrected parallax value of the left mesh figure and the right mesh figure.
On the basis of the method for determining a disparity map based on a binocular image discussed above, the embodiment of the present application also provides a device for determining a disparity map based on a binocular image. Please refer to Fig. 10: the device includes:
at least one processor 1001, and
a memory 1002 in communication connection with the at least one processor 1001;
wherein the memory 1002 stores instructions executable by the at least one processor 1001, and the at least one processor 1001, by executing the instructions stored in the memory 1002, realizes any one of the methods as shown in Fig. 1.
Fig. 10 takes one processor 1001 as an example, but does not actually limit the quantity of processors 1001.
As one embodiment, the processing module 902 in Fig. 9 can be realized by the processor 1001 in Figure 10.
On the basis of the method for determining a disparity map based on a binocular image discussed above, the embodiment of the present application also provides a computer-readable storage medium. The computer-readable storage medium stores computer instructions which, when run on a computer, cause the computer to execute any one of the methods as shown in Fig. 1.
It should be understood by those skilled in the art that, embodiments herein can provide as method, system or computer program
Product.Therefore, complete hardware embodiment, complete software embodiment or reality combining software and hardware aspects can be used in the application
Apply the form of example.Moreover, it wherein includes the computer of computer usable program code that the application, which can be used in one or more,
The computer program implemented in usable storage medium (including but not limited to magnetic disk storage, CD-ROM, optical memory etc.) produces
The form of product.
The application is referring to method, the process of equipment (system) and computer program product according to the embodiment of the present application
Figure and/or block diagram describe.It should be understood that every one stream in flowchart and/or the block diagram can be realized by computer program instructions
The combination of process and/or box in journey and/or box and flowchart and/or the block diagram.It can provide these computer programs
Instruct the processor of general purpose computer, special purpose computer, Embedded Processor or other programmable data processing devices to produce
A raw machine, so that being generated by the instruction that computer or the processor of other programmable data processing devices execute for real
The device for the function of being specified in present one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions, which may also be stored in, is able to guide computer or other programmable data processing devices with spy
Determine in the computer-readable memory that mode works, so that it includes referring to that instruction stored in the computer readable memory, which generates,
Enable the manufacture of device, the command device realize in one box of one or more flows of the flowchart and/or block diagram or
The function of being specified in multiple boxes.
These computer program instructions also can be loaded onto a computer or other programmable data processing device, so that counting
Series of operation steps are executed on calculation machine or other programmable devices to generate computer implemented processing, thus in computer or
The instruction executed on other programmable devices is provided for realizing in one or more flows of the flowchart and/or block diagram one
The step of function of being specified in a box or multiple boxes.
Although the preferred embodiments of the application have been described, once persons skilled in the art know the basic creative concept, additional changes and modifications may be made to these embodiments. So the following claims are intended to be interpreted as including the preferred embodiments and all changes and modifications that fall within the scope of the application.
Obviously, those skilled in the art can make various modifications and variations to the application without departing from the spirit and scope of the application. In this way, if these modifications and variations of the application fall within the scope of the claims of the application and their equivalent technologies, the application is also intended to include these modifications and variations.
Claims (13)
1. A method for determining a disparity map based on binocular images, comprising:
filtering a left-eye image to obtain gradient information of the left-eye image, wherein the gradient information indicates the probability distribution of image features in the left-eye image;
obtaining, according to the gradient information, N matching windows corresponding to N pixels in the left-eye image, wherein each of the N matching windows is an image region whose center pixel is one of the N pixels, the sizes of the N matching windows are not all identical, the image feature value of each of the N matching windows lies within a preset range, and N is a positive integer;
performing a Census transform on the pixels in each of the N matching windows to obtain N transform sequences for the N pixels in the left-eye image;
obtaining a disparity map of the left-eye image and a right-eye image according to the N transform sequences of the left-eye image and N transform sequences corresponding to N pixels of the right-eye image, wherein the disparity map indicates the visual distance between each pair of matched pixels of the left-eye image and the right-eye image, the left-eye image and the right-eye image being the two corresponding images of a binocular image pair.
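For orientation only, the core of claim 1 — a per-pixel Census transform followed by Hamming-distance matching between the left and right images — can be sketched in a few lines of Python. This is an illustrative toy, not the claimed implementation: the 3x3 window, the disparity search range `max_d`, and the helper names are assumptions.

```python
import numpy as np

def census(img, y, x, half=1):
    """Census code at (y, x): one bit per neighbor in the
    (2*half+1)^2 window, set to 1 where neighbor >= center."""
    c, code = img[y, x], 0
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            if dy or dx:  # the center pixel itself contributes no bit
                code = (code << 1) | int(img[y + dy, x + dx] >= c)
    return code

def disparity_map(left, right, max_d=3):
    """Winner-takes-all disparity: for each left pixel, compare its
    census code against right-image candidates at disparities 0..max_d
    and keep the disparity with the smallest Hamming distance."""
    h, w = left.shape
    disp = np.zeros((h, w), dtype=int)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lc = census(left, y, x)
            costs = [bin(lc ^ census(right, y, x - d)).count("1")
                     for d in range(min(max_d, x - 1) + 1)]
            disp[y, x] = int(np.argmin(costs))
    return disp
```

With identical inputs the winner at every interior pixel is disparity 0, since the cost at d = 0 is exactly zero.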
2. The method of claim 1, wherein filtering the left-eye image to obtain the gradient information of the left-eye image comprises:
performing grayscale conversion on the left-eye image to obtain a grayscaled left-eye image;
performing, according to a preset operator, noise-reduction filtering on the grayscaled left-eye image in a first direction and a second direction to obtain a noise-reduced left-eye image, the first direction and the second direction being two different directions determined with the left-eye image as reference;
obtaining the gradient information of the left-eye image from the noise-reduced left-eye image.
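Claim 2 leaves the "preset operator" open; a common choice for two-direction filtering that yields gradient information is the Sobel operator. The sketch below is one such assumed instantiation, with the gradient magnitude standing in for the claim's gradient information.

```python
import numpy as np

def sobel_gradient(gray):
    """Filter a grayscale image with the 3x3 Sobel operator in two
    directions (horizontal and vertical) and return the gradient
    magnitude per pixel."""
    # Pad by 1 pixel (edge replication) so output matches input size.
    p = np.pad(gray.astype(np.float64), 1, mode="edge")
    # Sobel responses built from shifted views of the padded image:
    # gx weights columns, gy weights rows.
    gx = (p[:-2, 2:] + 2 * p[1:-1, 2:] + p[2:, 2:]
          - p[:-2, :-2] - 2 * p[1:-1, :-2] - p[2:, :-2])
    gy = (p[2:, :-2] + 2 * p[2:, 1:-1] + p[2:, 2:]
          - p[:-2, :-2] - 2 * p[:-2, 1:-1] - p[:-2, 2:])
    return np.hypot(gx, gy)
```

A flat image produces zero gradient everywhere; a vertical step edge produces a strong response along the edge columns.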
3. The method of claim 1, wherein obtaining, according to the gradient information of the left-eye image, the N matching windows corresponding to the N pixels in the left-eye image comprises:
taking a first pixel of the N pixels in the left-eye image as a center pixel and increasing the number of pixels around the first pixel to obtain an enlarged image region;
obtaining the image feature of the enlarged image region according to the gradient information of the left-eye image;
if the image feature value of the enlarged image region is within the preset range, determining the enlarged image region to be the first matching window of the first pixel, thereby obtaining the N matching windows corresponding to the N pixels.
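Claim 3's window growth can be sketched as follows, under stated assumptions: the "image feature value" is taken to be the summed gradient inside the window, and `feat_range` and `max_half` are illustrative parameters, not values from the patent.

```python
import numpy as np

def adaptive_window(grad, y, x, feat_range=(50.0, 500.0), max_half=7):
    """Grow a square window around pixel (y, x) until the summed
    gradient (standing in for the 'image feature value') falls inside
    feat_range. Returns the accepted half-width: window side is
    2*half + 1."""
    h, w = grad.shape
    for half in range(1, max_half + 1):
        y0, y1 = max(0, y - half), min(h, y + half + 1)
        x0, x1 = max(0, x - half), min(w, x + half + 1)
        feat = grad[y0:y1, x0:x1].sum()
        if feat_range[0] <= feat <= feat_range[1]:
            return half
    return max_half  # fall back to the largest window tried
```

The effect is that strongly textured regions (large gradients) get small windows, while flat regions grow larger windows — which is why the claim says the N window sizes are "not all identical".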
4. The method of any one of claims 1-3, wherein performing the Census transform on the pixels in each of the N matching windows to obtain the N transform sequences of the N pixels in the left-eye image comprises:
comparing the feature values of P pixels in a neighborhood window of each matching window against the feature value of the center pixel, wherein the neighborhood window is the region of each matching window excluding the center pixel, and P is a positive integer less than N;
if the feature value of a pixel among the P pixels is greater than or equal to the feature value of the center pixel, setting the transform code of that pixel to a first value; if the feature value of a pixel among the P pixels is less than the feature value of the center pixel, setting the transform code of that pixel to a second value; thereby obtaining the transform sequence of each matching window and hence the N transform sequences of the N matching windows, which are the N transform sequences of the N pixels in the left-eye image.
5. The method of claim 4, wherein the P pixels are the set of pixels sampled at a preset step interval from all pixels of the neighborhood window.
6. The method of any one of claims 1-3, wherein obtaining the disparity map of the left-eye image and the right-eye image according to the N transform sequences of the left-eye image and the N transform sequences corresponding to the N pixels of the right-eye image comprises:
obtaining, from the N transform sequences of the left-eye image and the N transform sequences corresponding to the N pixels of the right-eye image, K first Hamming distances between each pixel of the left-eye image and K pixels of the right-eye image, and K second Hamming distances between each pixel of the right-eye image and K pixels of the left-eye image, K being a positive integer less than or equal to N;
obtaining the disparity map of the left-eye image and the right-eye image according to the K first Hamming distances and the K second Hamming distances.
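The K Hamming distances of claim 6 are matching costs, one per candidate disparity. A small sketch (helper names are illustrative, not from the patent):

```python
def hamming(a, b):
    """Hamming distance between two census codes: the number of
    bit positions where they differ."""
    return bin(a ^ b).count("1")

def matching_costs(left_code, right_codes):
    """The K 'first Hamming distances' of one left-image pixel against
    K candidate right-image pixels (one candidate per disparity)."""
    return [hamming(left_code, c) for c in right_codes]
```

Swapping the roles of the two images yields the K "second Hamming distances" of the right image against the left.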
7. The method of claim 6, wherein obtaining the disparity map of the left-eye image and the right-eye image according to the K first Hamming distances and the K second Hamming distances comprises:
establishing a three-dimensional disparity map for the left-eye image according to the coordinates of each pixel in the left-eye image and the K first Hamming distances, and establishing a three-dimensional disparity map for the right-eye image according to the coordinates of each pixel in the right-eye image and the K second Hamming distances;
aggregating the three-dimensional disparity map of the left-eye image according to a preset window to obtain at least one first aggregation value, and determining the minimum of the at least one first aggregation value to be the disparity value of each pixel in the left-eye image, thereby obtaining a disparity value set of the left-eye image;
aggregating the three-dimensional disparity map of the right-eye image according to the preset window to obtain at least one second aggregation value, and determining the minimum of the at least one second aggregation value to be the disparity value of each pixel in the right-eye image, thereby obtaining a disparity value set of the right-eye image;
correcting the disparity values of the left-eye image and the right-eye image according to the disparity value set of the left-eye image and the disparity value set of the right-eye image to obtain corrected disparity values;
generating the disparity map of the left-eye image and the right-eye image according to the corrected disparity values.
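Claim 7's three-dimensional disparity map is what the stereo literature usually calls a cost volume (height x width x K), and the aggregation-then-minimum step is winner-takes-all over box-filtered costs. A minimal sketch, assuming a square box window of side `win` (the patent's "preset window" is not specified) and a summed-area-table box filter:

```python
import numpy as np

def wta_disparity(cost_volume, win=3):
    """Aggregate an (H, W, K) Hamming-cost volume with a win x win box
    window and pick, per pixel, the disparity index with the minimum
    aggregated cost (winner-takes-all), as in claim 7."""
    h, w, k = cost_volume.shape
    pad = win // 2
    p = np.pad(cost_volume, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    agg = np.zeros_like(cost_volume, dtype=np.float64)
    for d in range(k):
        # Summed-area table of the d-th cost slice; a zero row/column
        # is prepended so window sums are pure differences.
        s = p[:, :, d].cumsum(0).cumsum(1)
        s = np.pad(s, ((1, 0), (1, 0)))
        agg[:, :, d] = (s[win:, win:] - s[:-win, win:]
                        - s[win:, :-win] + s[:-win, :-win])
    return agg.argmin(axis=2)
```

Running this once on the left volume and once on the right volume yields the two disparity value sets that claim 7 then passes to the correction step.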
8. The method of claim 7, wherein correcting the disparity values of the left-eye image and the right-eye image according to the disparity value set of the left-eye image and the disparity value set of the right-eye image to obtain the corrected disparity values comprises:
fitting the disparity value set of the left-eye image to obtain a first sub-pixel disparity value corresponding to each pixel in the left-eye image, and fitting the disparity value set of the right-eye image to obtain a second sub-pixel disparity value corresponding to each pixel in the right-eye image;
if the difference between the first sub-pixel disparity value and the second sub-pixel disparity value is determined to be less than or equal to a preset threshold, taking the average of the first sub-pixel disparity value and the second sub-pixel disparity value as the corrected disparity value of the left-eye image and the right-eye image.
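Claim 8 does not fix the fitting method; a common choice is a parabola through the aggregated costs at the winning disparity and its two neighbors. The sketch below assumes that fit, plus the claim's threshold-and-average left-right consistency check (`None` marking a rejected pixel is an illustrative convention).

```python
def subpixel_disparity(costs, d):
    """Parabola fit through the aggregated costs at d-1, d, d+1,
    returning a sub-pixel disparity near the integer winner d."""
    c0, c1, c2 = costs[d - 1], costs[d], costs[d + 1]
    denom = c0 - 2 * c1 + c2
    return float(d) if denom == 0 else d + 0.5 * (c0 - c2) / denom

def lr_check(d_left, d_right, thresh=1.0):
    """Claim 8's correction: average the left and right sub-pixel
    disparities when they agree within a preset threshold, otherwise
    report the pixel as unmatched (None)."""
    if abs(d_left - d_right) <= thresh:
        return (d_left + d_right) / 2.0
    return None
```

Symmetric costs around the winner leave the disparity at the integer value; asymmetric costs shift it toward the cheaper side.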
9. An apparatus for determining a disparity map based on binocular images, comprising a receiving module and a processing module, wherein:
the receiving module is configured to obtain a left-eye image;
the processing module is configured to filter the left-eye image to obtain gradient information of the left-eye image, wherein the gradient information indicates the probability distribution of image features in the left-eye image;
the processing module is further configured to obtain, according to the gradient information, N matching windows corresponding to N pixels in the left-eye image, wherein each of the N matching windows is an image region whose center pixel is one of the N pixels, the sizes of the N matching windows are not all identical, the image feature value of each of the N matching windows lies within a preset range, and N is a positive integer;
the processing module is further configured to perform a Census transform on the pixels in each of the N matching windows to obtain N transform sequences of the N pixels in the left-eye image;
the processing module is further configured to obtain a disparity map of the left-eye image and a right-eye image according to the N transform sequences of the left-eye image and N transform sequences corresponding to N pixels of the right-eye image, wherein the disparity map indicates the visual distance between each pair of matched pixels of the left-eye image and the right-eye image, the left-eye image and the right-eye image being the two corresponding images of a binocular image pair.
10. The apparatus of claim 9, wherein the processing module is specifically configured to:
perform grayscale conversion on the left-eye image to obtain a grayscaled left-eye image;
perform, according to a preset operator, noise-reduction filtering on the grayscaled left-eye image in a first direction and a second direction to obtain a noise-reduced left-eye image, the first direction and the second direction being two different directions determined with the left-eye image as reference;
obtain the gradient information of the left-eye image from the noise-reduced left-eye image.
11. The apparatus of claim 9, wherein the processing module is specifically configured to:
take a first pixel of the N pixels in the left-eye image as a center pixel and increase the number of pixels around the first pixel to obtain an enlarged image region;
obtain the image feature of the enlarged image region according to the gradient information of the left-eye image;
if the image feature value of the enlarged image region is within the preset range, determine the enlarged image region to be the first matching window of the first pixel, thereby obtaining the N matching windows corresponding to the N pixels.
12. An apparatus for determining a disparity map based on binocular images, comprising:
at least one processor, and
a memory communicatively connected to the at least one processor;
wherein the memory stores instructions executable by the at least one processor, and the at least one processor implements the method of any one of claims 1-8 by executing the instructions stored in the memory.
13. A computer-readable storage medium storing computer instructions which, when run on a computer, cause the computer to perform the method of any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910062717.1A CN109448036A (en) | 2019-01-23 | 2019-01-23 | A kind of method and device determining disparity map based on binocular image |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109448036A true CN109448036A (en) | 2019-03-08 |
Family
ID=65544304
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910062717.1A Pending CN109448036A (en) | 2019-01-23 | 2019-01-23 | A kind of method and device determining disparity map based on binocular image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109448036A (en) |
Non-Patent Citations (3)
Title |
---|
吴振: "Research on Stereo Matching Algorithms Based on Binocular Vision and Their FPGA Implementation", Wanfang Dissertation Database * |
王志 et al.: "An Improved Guided-Filter Stereo Matching Algorithm", Journal of Zhejiang University (Engineering Science) * |
赵劲松: "Research and Implementation of Stereo Matching Algorithms Based on the Census Transform", Wanfang Dissertation Database * |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110374045A (en) * | 2019-07-29 | 2019-10-25 | 哈尔滨工业大学 | A kind of intelligence de-icing method |
CN110374045B (en) * | 2019-07-29 | 2021-09-28 | 哈尔滨工业大学 | Intelligent deicing method |
CN110853086A (en) * | 2019-10-21 | 2020-02-28 | 北京清微智能科技有限公司 | Depth image generation method and system based on speckle projection |
CN111508009A (en) * | 2020-07-02 | 2020-08-07 | 上海海栎创微电子有限公司 | Binocular stereo matching preprocessing method and device |
CN111739083A (en) * | 2020-07-02 | 2020-10-02 | 深兰人工智能芯片研究院(江苏)有限公司 | Matching method and device |
CN112784874A (en) * | 2020-12-28 | 2021-05-11 | 深兰人工智能芯片研究院(江苏)有限公司 | Binocular vision stereo matching method and device, electronic equipment and storage medium |
CN112784874B (en) * | 2020-12-28 | 2022-07-22 | 深兰人工智能芯片研究院(江苏)有限公司 | Binocular vision stereo matching method and device, electronic equipment and storage medium |
CN114332345A (en) * | 2021-09-23 | 2022-04-12 | 北京科技大学 | Metallurgy reservoir area local three-dimensional reconstruction method and system based on binocular vision |
CN115908170A (en) * | 2022-11-04 | 2023-04-04 | 浙江华诺康科技有限公司 | Binocular image noise reduction method and device, electronic device and storage medium |
CN115908170B (en) * | 2022-11-04 | 2023-11-21 | 浙江华诺康科技有限公司 | Noise reduction method and device for binocular image, electronic device and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109448036A (en) | A kind of method and device determining disparity map based on binocular image | |
CN104867135B (en) | A kind of High Precision Stereo matching process guided based on guide image | |
Yang | Dealing with textureless regions and specular highlights-a progressive space carving scheme using a novel photo-consistency measure | |
CN104899563B (en) | Two-dimensional face key feature point positioning method and system | |
CN110378838B (en) | Variable-view-angle image generation method and device, storage medium and electronic equipment | |
CN109640066B (en) | Method and device for generating high-precision dense depth image | |
CN110148181A (en) | A kind of general binocular solid matching process | |
CN108416791A (en) | A kind of monitoring of parallel institution moving platform pose and tracking based on binocular vision | |
CN102982334B (en) | The sparse disparities acquisition methods of based target edge feature and grey similarity | |
CN108322724B (en) | Image solid matching method and binocular vision equipment | |
CN108734776A (en) | A kind of three-dimensional facial reconstruction method and equipment based on speckle | |
CN108510540B (en) | Stereoscopic vision camera and height acquisition method thereof | |
CN110909693A (en) | 3D face living body detection method and device, computer equipment and storage medium | |
Chen et al. | Transforming a 3-d lidar point cloud into a 2-d dense depth map through a parameter self-adaptive framework | |
CN103530599A (en) | Method and system for distinguishing real face and picture face | |
CN106981081A (en) | A kind of degree of plainness for wall surface detection method based on extraction of depth information | |
CN102831601A (en) | Three-dimensional matching method based on union similarity measure and self-adaptive support weighting | |
CN107316326A (en) | Applied to disparity map computational methods of the binocular stereo vision based on side and device | |
CN113763269B (en) | Stereo matching method for binocular images | |
KR20110014067A (en) | Method and system for transformation of stereo content | |
CN106384363B (en) | A kind of quick self-adapted weight solid matching method | |
CN103516983A (en) | Image processing device, imaging device and image processing method | |
CN104200453B (en) | Parallax image correcting method based on image segmentation and credibility | |
CN107578430A (en) | A kind of solid matching method based on adaptive weight and local entropy | |
CN111105452B (en) | Binocular vision-based high-low resolution fusion stereo matching method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20190308 |