CN105335952B - Matching cost calculation method and apparatus, and disparity value calculation method and device - Google Patents
Matching cost calculation method and apparatus, and disparity value calculation method and device
- Publication number
- CN105335952B (application CN201410277105.1A / CN201410277105A)
- Authority
- CN
- China
- Prior art keywords
- pixel
- value
- matching cost
- representative
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
This application discloses a matching cost calculation method and apparatus, and a disparity value calculation method and device. The matching cost calculation method includes: determining, in a reference image, a reference representative pixel set for a reference pixel, the reference representative pixel set including the reference pixel and at least one pixel that lies in a first predetermined neighborhood of the reference pixel and whose significance degree exceeds a threshold; determining, in a target image, a target representative pixel set for a target pixel, the target representative pixel set including the target pixel and at least one pixel that lies in the first predetermined neighborhood of the target pixel and whose significance degree exceeds the threshold; and calculating the matching cost between the reference pixel and the target pixel from the pixel values of the pixels in the reference representative pixel set and the pixel values of the pixels in the target representative pixel set. A distinguishable matching cost can thus be obtained, and correct disparity information in turn.
Description
Technical field
The present application relates generally to the field of digital image processing and, more particularly, to a matching cost calculation method and apparatus, and a disparity value calculation method and device.
Background art
Stereo matching methods are widely applied in fields such as robotics, surveillance and intelligent vehicles. The disparity information (also called depth information) obtained by a stereo matching method can be used to estimate the relative distance between an image capture device and an object. Taking an intelligent vehicle as an example, the disparity information obtained by stereo matching makes it easy to detect the road surface, white lines and fences, and in turn to detect targets such as pedestrians and vehicles and to classify them, so that the overall driving situation of the vehicle can be controlled comprehensively.
The basic principle of stereo matching is to compare images of the same object captured by an image capture device (for example, a stereo camera) from two different viewing angles (in the case of a binocular camera) or from more viewing angles (in the case of a multi-view camera), find corresponding pixels, and compute the positional deviation between pixels of the images, thereby obtaining disparity information and drawing a disparity map from it.
Common stereo matching methods include local stereo matching methods (for example, block matching), global stereo matching methods (for example, dynamic programming) and semi-global stereo matching methods (for example, semi-global matching (SGM)). They usually all comprise the following four steps, or some of them: matching cost computation, cost aggregation (support summation), disparity computation/optimization and disparity refinement, among which matching cost computation is the most critical step.
Generally, current matching cost calculation methods select, for the pixel whose matching cost is to be calculated, neighboring pixels that have similar attributes to it or that lie at fixed positions around it, and calculate the matching cost between corresponding pixels of the multi-view images from the pixel information of the pixel in question and its neighboring pixels.
However, because the above process of selecting neighboring pixels does not fully consider whether the selected pixels carry representative information in the image concerned, it is quite possible that an indistinguishable matching cost is obtained, and in turn that wrong disparity information is produced in subsequent steps.
Summary of the invention
To solve the above technical problem, according to one aspect of the present application, a matching cost calculation method is provided. The method is used to calculate the matching cost between a reference pixel in a reference image and a target pixel in a target image, the reference image and the target image belonging to one original image pair. The method includes: determining, in the reference image, a reference representative pixel set for the reference pixel, the reference representative pixel set including the reference pixel and at least one pixel that lies in a first predetermined neighborhood of the reference pixel and whose significance degree exceeds a threshold; determining, in the target image, a target representative pixel set for the target pixel, the target representative pixel set including the target pixel and at least one pixel that lies in the first predetermined neighborhood of the target pixel and whose significance degree exceeds the threshold; and calculating the matching cost between the reference pixel and the target pixel from the pixel value of each pixel in the reference representative pixel set and the pixel value of each pixel in the target representative pixel set.
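The determination of a representative pixel set described above can be sketched as follows. The significance measure is left open here (specific examples appear later in the description), so the sketch below uses the absolute gray-level difference from the mean of a pixel's 4-connected neighbors as a placeholder significance measure; the function names and the sample image are illustrative assumptions, not part of the application.

```python
def significance(image, r, c):
    # Placeholder significance measure (an assumption): absolute gray-level
    # difference between a pixel and the mean of its 4-connected neighbors.
    neigh = [image[r - 1][c], image[r + 1][c], image[r][c - 1], image[r][c + 1]]
    return abs(image[r][c] - sum(neigh) / 4.0)

def representative_set(image, r0, c0, radius, threshold):
    # The representative pixel set: the center pixel itself, plus every pixel
    # in its (2*radius+1) x (2*radius+1) neighborhood whose significance
    # degree exceeds the threshold.
    reps = [(r0, c0)]
    for r in range(r0 - radius, r0 + radius + 1):
        for c in range(c0 - radius, c0 + radius + 1):
            if (r, c) != (r0, c0) and significance(image, r, c) > threshold:
                reps.append((r, c))
    return reps

# Illustrative 5x5 grayscale image with a single salient pixel (value 50).
img = [
    [10, 10, 10, 10, 10],
    [10, 10, 50, 10, 10],
    [10, 10, 10, 10, 10],
    [10, 10, 10, 10, 10],
    [10, 10, 10, 10, 10],
]
print(representative_set(img, 2, 2, 1, 20))  # [(2, 2), (1, 2)]
```

The matching cost is then computed from the gray values of the pixels in the reference and target representative sets, for example by pairing representatives according to their relative positions.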
In addition, according to another aspect of the present application, a disparity value calculation method is provided. The method is used to calculate the disparity value between a reference pixel in a reference image and a matched pixel in a target image, the reference image and the target image belonging to one original image pair. The method includes: determining a plurality of target pixels; for each target pixel, calculating the matching cost between each first pixel in a reference support pixel set and the corresponding second pixel in a corresponding target support pixel set, the reference support pixel set including at least one pixel in the reference image that lies in a second predetermined neighborhood of the reference pixel, and the target support pixel set including at least one pixel in the target image that lies in the second predetermined neighborhood of the target pixel; for each target pixel, summing all the calculated matching costs to obtain the overall matching cost between the reference pixel and that target pixel; and determining the matched pixel among the plurality of target pixels according to at least the plurality of overall matching costs between the reference pixel and the plurality of target pixels, thereby determining the disparity value between the reference pixel and the matched pixel. Here, calculating the matching cost between each first pixel in the reference support pixel set and the corresponding second pixel in the corresponding target support pixel set includes: determining, in the reference image, a first representative pixel set for the first pixel, the first representative pixel set including the first pixel and at least one pixel that lies in a first predetermined neighborhood of the first pixel and whose significance degree exceeds a threshold; determining, in the target image, a second representative pixel set for the second pixel, the second representative pixel set including the second pixel and at least one pixel that lies in the first predetermined neighborhood of the second pixel and whose significance degree exceeds the threshold; and calculating the matching cost between the first pixel and the second pixel from the pixel value of each pixel in the first representative pixel set and the pixel value of each pixel in the second representative pixel set.
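In its simplest form, the disparity selection described above reduces to a winner-takes-all choice over the aggregated costs: for each candidate target pixel (that is, each candidate disparity), the per-pixel matching costs over the support set are summed, and the candidate with the smallest overall cost is taken as the match. A minimal sketch under that assumption (the cost values below are made up for illustration):

```python
def overall_costs(per_pixel_costs):
    # per_pixel_costs[d] holds the matching costs between each first pixel in
    # the reference support set and its corresponding second pixel, for
    # candidate disparity d; the overall cost is their sum.
    return [sum(costs) for costs in per_pixel_costs]

def best_disparity(per_pixel_costs):
    # Winner-takes-all: the candidate with the smallest overall matching cost.
    totals = overall_costs(per_pixel_costs)
    return min(range(len(totals)), key=totals.__getitem__)

# Hypothetical per-support-pixel costs for candidate disparities 0..3.
costs = [
    [9, 8, 7],   # d = 0 -> overall 24
    [4, 3, 5],   # d = 1 -> overall 12
    [1, 1, 2],   # d = 2 -> overall 4
    [6, 5, 6],   # d = 3 -> overall 17
]
print(best_disparity(costs))  # 2
```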
According to a further aspect of the present application, a matching cost calculation apparatus is provided. The apparatus is used to calculate the matching cost between a reference pixel in a reference image and a target pixel in a target image, the reference image and the target image belonging to one original image pair. The apparatus includes: a reference set determination unit, for determining, in the reference image, a reference representative pixel set for the reference pixel, the reference representative pixel set including the reference pixel and at least one pixel that lies in a first predetermined neighborhood of the reference pixel and whose significance degree exceeds a threshold; a target set determination unit, for determining, in the target image, a target representative pixel set for the target pixel, the target representative pixel set including the target pixel and at least one pixel that lies in the first predetermined neighborhood of the target pixel and whose significance degree exceeds the threshold; and a matching cost calculation unit, for calculating the matching cost between the reference pixel and the target pixel from the pixel value of each pixel in the reference representative pixel set and the pixel value of each pixel in the target representative pixel set.
According to yet another aspect of the present application, a disparity value calculation device is provided. The device is used to calculate the disparity value between a reference pixel in a reference image and a matched pixel in a target image, the reference image and the target image belonging to one original image pair. The device includes: a target pixel determination means, for determining a plurality of target pixels; a matching cost calculation means, for calculating, for each target pixel, the matching cost between each first pixel in a reference support pixel set and the corresponding second pixel in a corresponding target support pixel set, the reference support pixel set including at least one pixel in the reference image that lies in a second predetermined neighborhood of the reference pixel, and the target support pixel set including at least one pixel in the target image that lies in the second predetermined neighborhood of the target pixel; an overall cost obtaining means, for summing, for each target pixel, all the calculated matching costs to obtain the overall matching cost between the reference pixel and that target pixel; and a disparity value calculation means, for determining the matched pixel among the plurality of target pixels according to at least the plurality of overall matching costs between the reference pixel and the plurality of target pixels, thereby determining the disparity value between the reference pixel and the matched pixel. Here, the matching cost calculation means calculates the matching cost between each first pixel in the reference support pixel set and the corresponding second pixel in the corresponding target support pixel set through the following operations: determining, in the reference image, a first representative pixel set for the first pixel, the first representative pixel set including the first pixel and at least one pixel that lies in a first predetermined neighborhood of the first pixel and whose significance degree exceeds a threshold; determining, in the target image, a second representative pixel set for the second pixel, the second representative pixel set including the second pixel and at least one pixel that lies in the first predetermined neighborhood of the second pixel and whose significance degree exceeds the threshold; and calculating the matching cost between the first pixel and the second pixel from the pixel value of each pixel in the first representative pixel set and the pixel value of each pixel in the second representative pixel set.
Compared with the prior art, the embodiments of the present application provide a matching cost calculation method and apparatus in which the concept of support structure information is introduced into the matching cost calculation process, where support structure information refers to a number of representative neighboring pixels. In other words, the embodiments of the present application provide a support-structure-based matching cost calculation method for stereo matching. In the matching cost calculation method according to the embodiments of the present application, when selecting neighboring pixels for the pixel whose matching cost is to be calculated, whether a selected pixel carries representative information in the image concerned is fully considered, and representative neighboring pixels are selected adaptively, rather than simply choosing neighboring pixels that have similar attributes or that lie at fixed positions. By selecting these neighboring pixels that carry representative information, a distinguishable matching cost can be obtained, and correct disparity information can in turn be obtained in subsequent steps.
In addition, the embodiments of the present application further provide a disparity value calculation method and device in which the concept of support structure information is introduced into the matching cost calculation process. When selecting neighboring pixels for the pixel whose matching cost is to be calculated, whether a selected pixel carries representative information in the image concerned is fully considered, and representative neighboring pixels are selected adaptively, rather than simply choosing neighboring pixels that have similar attributes or that lie at fixed positions. By selecting these neighboring pixels that carry representative information, a distinguishable matching cost can be obtained and applied to various stereo matching algorithms, so that correct disparity information is obtained.
Other features and advantages of the present application will be set forth in the following description, and will in part become apparent from the description or be understood by practicing the application. The objects and other advantages of the application can be realized and obtained by the structures particularly pointed out in the description, the claims and the accompanying drawings.
Brief description of the drawings
The accompanying drawings are provided for a further understanding of the present application and constitute a part of the specification; together with the embodiments of the application they serve to explain the application, and they do not constitute a limitation on the application. In the drawings:
Fig. 1A illustrates the matching algorithm based on the sum of absolute gray-level differences according to the prior art.
Fig. 1B illustrates the matching algorithm based on the sum of squared gray-level differences according to the prior art.
Fig. 1C illustrates the matching algorithm based on the basic census (Census) transform according to the prior art.
Fig. 1D illustrates the matching algorithm based on the first enhanced Census transform according to the prior art.
Fig. 1E illustrates the matching algorithm based on the second enhanced Census transform according to the prior art.
Fig. 2A illustrates the left image captured by a binocular camera from the left viewing angle.
Fig. 2B illustrates the disparity map obtained by the matching algorithm based on the second enhanced Census transform according to the prior art.
Fig. 3 illustrates an overall flowchart of the matching cost calculation method according to an embodiment of the present application.
Fig. 4 illustrates a flowchart of the first specific example of the matching cost calculation method according to an embodiment of the present application.
Fig. 5A illustrates the calculation of the significance degree of a pixel in a first case.
Fig. 5B illustrates the calculation of the significance degree of a pixel in a second case.
Fig. 6A illustrates the determination of corresponding pixels in the first case.
Fig. 6B illustrates the determination of corresponding pixels in the second case.
Fig. 7 illustrates a flowchart of the second specific example of the matching cost calculation method according to an embodiment of the present application.
Fig. 8A illustrates the Census transform result of the left image in the first case.
Fig. 8B illustrates the Census transform result of the left image in the second case.
Fig. 9 illustrates an overall flowchart of the disparity value calculation method according to an embodiment of the present application.
Fig. 10A illustrates the disparity map obtained by the matching algorithm based on the second enhanced Census transform according to the prior art.
Fig. 10B illustrates the disparity map obtained by the disparity value calculation method according to an embodiment of the present application.
Fig. 11 illustrates a functional configuration block diagram of the matching cost calculation apparatus according to an embodiment of the present application.
Fig. 12 illustrates a functional configuration block diagram of the disparity value calculation device according to an embodiment of the present application.
Fig. 13 illustrates a functional structure diagram of the disparity value calculation system according to an embodiment of the present application.
Fig. 14 illustrates a general hardware block diagram of a hardware system for disparity value calculation according to an embodiment of the present application.
Detailed description of the embodiments
Each embodiment according to the present application will be described in detail with reference to the accompanying drawings. It should be noted here that, in the drawings, parts having substantially the same or similar structures and functions are assigned the same reference numerals, and repeated description of them will be omitted.
In order to help those skilled in the art better understand the present application, the application will be described in further detail in the following order.
1. Brief introduction of the prior art
2. Overview of the idea of the application
3. Matching cost calculation method
3.1. First specific example
3.2. Second specific example
4. Disparity value calculation method
5. Matching cost calculation apparatus
6. Disparity value calculation device
7. Disparity value calculation system
8. Hardware system for disparity value calculation
1. Brief introduction of the prior art
Before describing the embodiments of the present application, for ease of understanding, the technical principles of the matching cost calculation methods according to the prior art, and their technical problems, will first be briefly introduced.
As the critical step in stereo matching, two classes of matching cost calculation methods currently exist for calculating the matching cost between corresponding pixels of multi-view images. In the following, for ease of description, the set of multi-view images captured by an image capture device for the same object is taken, by way of example, to be an original image pair consisting of only two images: a reference image and a target image.
The first class of matching cost calculation methods uses the information of a reference pixel in the reference image together with the information of neighboring pixels having similar attributes to it, and the information of a target pixel in the target image together with the information of neighboring pixels having similar attributes to it, to calculate the matching cost between the reference pixel and the target pixel.
It should be noted that, in the following description, when no distinction is made between the reference pixel and the target pixel, both may simply be referred to as the center pixel.
Specifically, as an example of the first class of matching cost calculation methods, the prior art includes a matching algorithm based on gray-level similarity. When selecting neighboring pixels with similar attributes to the center pixel, it builds, according to gray-level similarity, a connected pixel sequence (that is, an adaptive curve) for the current center pixel, takes this as the matching element, and then usually performs a three-level matching process to calculate the matching cost between the center pixels of the reference image and the target image. In other words, this algorithm constructs its matching element using gray-level similarity, choosing as the matching element a connected pixel sequence built from pixels with gray levels similar to the center pixel.
By contrast, the second class of matching cost calculation methods uses the information of a reference pixel in the reference image together with the information of neighboring pixels at fixed positions around it, and the information of a target pixel in the target image together with the information of neighboring pixels at fixed positions around it, to calculate the matching cost between the reference pixel and the target pixel.
Specifically, at present, the second class of matching cost calculation methods mainly includes the matching algorithm based on the sum of absolute differences (SAD), the matching algorithm based on the sum of squared differences (SSD), and the matching algorithm based on the census (Census) transform.
First, the SAD and SSD matching algorithms according to the prior art will be described with reference to Figs. 1A and 1B.
Fig. 1A illustrates the matching algorithm based on the sum of absolute gray-level differences according to the prior art, and Fig. 1B illustrates the matching algorithm based on the sum of squared gray-level differences according to the prior art.
It should be noted that, for convenience, it is still assumed that the set of multi-view images captured for the same object is one original image pair, consisting of a reference image and a target image captured by a binocular camera from the left and right viewing angles, and that the reference image and the target image are grayscale images, that is, each pixel in the images has its own gray value. It is further assumed that the reference image is the left image captured by the binocular camera from the left viewing angle, and that the target image is the right image captured from the right viewing angle; that is, in the drawings, the 9 pixels on the left come from the left image and the 9 pixels on the right come from the right image, and there is a known positional deviation between them.
Based on the above assumptions, as illustrated in Fig. 1A, in order to calculate the matching cost between a reference pixel in the reference image and a target pixel in the target image (assume here that the reference pixel is the center pixel with gray value 166 in the left image shown in Fig. 1A, and the target pixel is the center pixel with gray value 165 in the right image shown in Fig. 1A), the SAD matching algorithm includes: selecting, in the reference image, the neighboring pixels in a 3×3 neighborhood around the reference pixel; selecting, in the target image, the neighboring pixels in the same neighborhood (that is, a 3×3 neighborhood) around the target pixel; computing one by one the absolute value of the difference between the gray value of the pixel at a given position on the left and the gray value of the corresponding pixel at the same position on the right (that is, the absolute gray-level difference) (for example, in Fig. 1A, the top-left pixel with gray value 135 in the left image corresponds to the top-left pixel with gray value 133 in the right image, and so on); and summing the absolute gray-level differences of all 9 positions to obtain the matching cost between the reference pixel and the target pixel, i.e. C=9.
As illustrated in Fig. 1B, the SSD matching algorithm differs from the SAD matching algorithm in that, after selecting the neighboring pixels around the center pixels, it computes one by one the square of the difference between the gray value of the pixel at a given position on the left and the gray value of the corresponding pixel at the same position on the right (that is, the squared gray-level difference), and sums the squared gray-level differences of all 9 positions to obtain the matching cost between the reference pixel and the target pixel, i.e. C=18.
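The SAD and SSD costs described above can be sketched directly. Note that the window values below are illustrative, not the exact gray values of Figs. 1A and 1B; only the center values 166 and 165 are taken from the running example.

```python
def sad(left_win, right_win):
    # Sum of absolute gray-level differences over two flattened 3x3 windows.
    return sum(abs(a - b) for a, b in zip(left_win, right_win))

def ssd(left_win, right_win):
    # Sum of squared gray-level differences over the same windows.
    return sum((a - b) ** 2 for a, b in zip(left_win, right_win))

# Hypothetical 3x3 windows, flattened row by row, centered on gray values
# 166 (left) and 165 (right).
left = [135, 140, 150, 160, 166, 170, 155, 158, 162]
right = [133, 140, 149, 160, 165, 170, 154, 158, 161]
print(sad(left, right))  # 6
print(ssd(left, right))  # 8
```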
In general, the SAD and SSD matching algorithms are both based on the assumption that the same pixel in the reference image and the target image should have the same gray value under ideal illumination conditions. In actual use, however, the image capture device is easily affected by external factors (for example, illumination changes, viewpoint changes, occlusion and deformation), so that the captured images differ considerably from the ideal case (for example, the left image and the right image may be captured under different illumination conditions), and the final results obtained by the above algorithms perform poorly.
To overcome the above problem and obtain a more robust matching performance, the Census transform matching algorithm has further been provided. The Census transform matching algorithm is also very common among matching cost calculation methods; it first chooses neighborhoods in the left image and the right image and applies the Census transform to them, and then performs the matching cost calculation.
Specifically, like the SAD and SSD matching algorithms, the Census transform matching algorithm still calculates the matching cost of a center pixel from the pixel information of the center pixel and of neighboring pixels at fixed positions around it. Unlike the SAD and SSD matching algorithms, however, it uses the ordering relation between the gray values of neighboring pixels, rather than the gray values themselves, as the similarity measure, which strengthens its robustness against external factors.
At present, the main Census transform matching algorithms comprise three types: the basic Census transform matching algorithm, the first enhanced Census transform matching algorithm and the second enhanced Census transform matching algorithm.
Next, the three kinds of Census matching algorithms according to the prior art will be described with reference to Figs. 1C to 1E.
Fig. 1C illustrates the matching algorithm based on the basic census (Census) transform according to the prior art, Fig. 1D illustrates the matching algorithm based on the first enhanced Census transform according to the prior art, and Fig. 1E illustrates the matching algorithm based on the second enhanced Census transform according to the prior art.
As illustrated in Fig. 1C, in order to calculate the matching cost between a reference pixel in the reference image and a target pixel in the target image (assume here that the reference pixel is the center pixel with gray value 166 in the left image shown in Fig. 1C, and the target pixel is the center pixel with gray value 165 in the right image shown in Fig. 1C), the basic Census transform matching algorithm includes: selecting, in the reference image, the neighboring pixels in a 3×3 neighborhood around the reference pixel; selecting, in the target image, the neighboring pixels in the same neighborhood (that is, a 3×3 neighborhood) around the target pixel; applying the Census transform to the window area in each single image, that is, by the formula shown in Fig. 1C, using the gray value I0 of the center pixel in the left image and in the right image respectively as a threshold and applying single-bit binarization to the gray values Ii of the 8 neighboring pixels in the 3×3 neighborhood, so as to obtain the binary string 10111101 for the 8 neighboring pixels in the left image and the binary string 10111100 for the 8 neighboring pixels in the right image; and obtaining the matching cost between the reference pixel and the target pixel by computing the Hamming distance between the two binary strings, i.e. C=1.
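The basic Census transform step above can be sketched as follows. The 3×3 windows below are hypothetical (the full gray values of Fig. 1C are not reproduced in the text), but they are constructed so that the transform yields the bit strings 10111101 and 10111100 and the cost C=1 given above; the bit ordering (row by row, center skipped) and the >= comparison are assumptions, since conventions vary.

```python
def census_3x3(window):
    # Single-bit binarization: each of the 8 neighbors is compared against
    # the center gray value I0; the bit is 1 if Ii >= I0, else 0.
    center = window[1][1]
    bits = ""
    for r in range(3):
        for c in range(3):
            if (r, c) != (1, 1):
                bits += "1" if window[r][c] >= center else "0"
    return bits

def hamming(a, b):
    # Number of bit positions at which the two strings differ.
    return sum(x != y for x, y in zip(a, b))

left = [[170, 135, 168], [167, 166, 166], [169, 150, 172]]    # center 166
right = [[170, 134, 168], [166, 165, 165], [168, 150, 160]]   # center 165
print(census_3x3(left))   # 10111101
print(census_3x3(right))  # 10111100
print(hamming(census_3x3(left), census_3x3(right)))  # 1
```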
As illustrated in Fig. 1D, unlike the basic Census transform matching algorithm, the first enhanced Census transform matching algorithm, when applying the Census transform to the window area in each single image, uses the formula shown in Fig. 1D: it takes as thresholds the gray value I0 of the center pixel in the left image and in the right image together with a tolerance δ (which can be set from empirical values), and applies two-bit binarization to the gray values Ii of the 8 neighboring pixels in the 3×3 neighborhood, so as to obtain the binary string 0100010100010000 for the 8 neighboring pixels in the left image and the binary string 0100000100010000 for the 8 neighboring pixels in the right image; it then obtains the matching cost between the reference pixel and the target pixel by computing the Hamming distance between the two binary strings, i.e. C=1.
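A common way to realize the two-bit binarization with tolerance δ described above is a ternary census: each neighbor is encoded with two bits according to whether it lies above, inside, or below the band [I0-δ, I0+δ]. The exact two-bit code of Fig. 1D is not given in the text, so the encoding below ("10" above, "01" below, "00" inside) is an assumption, as are the sample values.

```python
def census_2bit(window, delta):
    # Two-bit code per neighbor relative to the center value I0 with
    # tolerance delta: "10" if Ii > I0 + delta, "01" if Ii < I0 - delta,
    # "00" if Ii lies within the tolerance band.
    center = window[1][1]
    bits = ""
    for r in range(3):
        for c in range(3):
            if (r, c) != (1, 1):
                v = window[r][c]
                if v > center + delta:
                    bits += "10"
                elif v < center - delta:
                    bits += "01"
                else:
                    bits += "00"
    return bits

# Hypothetical window centered on 166, with tolerance delta = 10.
win = [[200, 166, 100], [160, 166, 172], [166, 166, 190]]
print(census_2bit(win, 10))  # 1000010000000010
```

The resulting 16-bit strings from the left and right windows are then compared by Hamming distance, exactly as in the basic algorithm.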
Further, in order to obtain a more accurate matching cost, as illustrated in Fig. 1E, the second enhanced Census transform matching algorithm may consider a neighborhood of larger range and a more complex pattern of neighboring pixels than the first enhanced Census transform matching algorithm, as shown by the dark pixels around the center pixel.
In conclusion in the prior art, either first kind Matching power flow computational methods, or the second class Matching power flow
Computational methods are either right based on fixed position or based on close attribute in the calculation during Matching power flow between imago element
Reference pixel and object pixel selection neighbour structure, and cause each pixel in neighborhood to be respectively provided with identical weight.
Thus, it will be seen that since the prior art does not take into full account adjacent pixel in neighbour structure in associated picture
In whether be the pixel of representative information, it is possible that the Matching power flow of undistinguishable can be produced, and then in subsequent step
In obtain mistake parallax information.Especially when in a certain region of original image (for example, in left image or right image)
During variable quantity smaller (for example, lacking texture) between each pixel value, it will be unable to obtain on this using the above-mentioned prior art
The disparity map of good performance in region.
In the following, the defects of the stereo matching algorithms according to the prior art will be described with reference to Fig. 2A and Fig. 2B.
Fig. 2A is a schematic diagram illustrating the left image captured by the binocular camera at the left viewing angle, and Fig. 2B illustrates the disparity map obtained by the matching algorithm based on the second enhanced census transform according to the prior art.
Specifically, as Fig. 2A illustrates, in the left image captured by the binocular camera at the left viewing angle, it can be seen that there is almost no texture in the road surface region. In this case, even if the SGM method based on the second enhanced census transform matching algorithm, whose performance is relatively outstanding in the prior art, is used for this region to calculate the disparity, not many correct disparity values can be obtained on the road surface. Finally, after the erroneous disparity values are filtered out by a disparity refinement step (for example, noise filtering), there is almost no disparity value on the road surface in the final disparity map illustrated in Fig. 2B. Obviously, such a result is unsatisfactory.
2. Overview of the Idea of the Present Application
In the following, the main idea of the present application will first be briefly described.
In order to solve the technical problems in the prior art, a matching cost computation method and apparatus and a disparity value computation method and device are proposed in the present application, which can, in the process of selecting adjacent pixels for a pixel whose matching cost is to be calculated, fully consider whether the selected pixels carry representative information in the associated image, and adaptively select representative adjacent pixels, rather than simply choosing adjacent pixels with similar attributes or at fixed positions around it; as a result, distinguishable matching costs can be obtained, and correct disparity information is then obtained in the subsequent steps.
3. Matching Cost Computation Method
Hereinafter, an example of the overall procedure of the matching cost computation method according to the embodiment of the present application will be described with reference to Fig. 3.
The matching cost computation method according to the embodiment of the present application can be used to calculate the matching cost between corresponding pixels of multi-view images.
Usually, the multi-view images can refer to images of the same object captured by an image capturing device (for example, a stereoscopic camera) at multiple different viewing angles. Obviously, the multiple different viewing angles can include more than two viewing angles, and a captured image can be composed of a series of pixels at different position coordinates, where each pixel in the image can have an identical or different pixel value. For example, the image can be a gray-scale map, in which each pixel is represented by a gray value (one gray component). Alternatively, the image can also be a color map, in which each pixel is represented by color values (for example, three RGB components).
In the following, for ease of description, dual-view gray-scale images captured by a binocular camera for the same object at the left and right viewing angles will be used as the example of the multi-view images. For example, this stereoscopic image set can refer to an original image pair including two gray-scale images, a reference image and a target image. Specifically, it can be assumed that the reference image is the left image captured by the binocular camera at the left viewing angle, and the target image is the right image captured by the binocular camera at the right viewing angle.
However, it should be noted that the present application can be applied both to calculating the matching cost between the corresponding pixels of a pair of stereoscopic images and to calculating the matching cost between the corresponding pixels of more multi-view images. In addition, the present application can equally be applied to matching cost operations on gray-scale maps or color maps. Also, the right image can be used as the reference image, with the left image used as the target image.
Fig. 3 illustrates an overview flowchart of the matching cost computation method according to the embodiment of the present application.
As shown in Fig. 3, the matching cost computation method can include:
In step S110, a reference representative pixel set for a reference pixel is determined in a reference image.
When the matching cost between a certain reference pixel in the reference image and a certain target pixel in the target image needs to be calculated, adjacent pixels with representative information can first be selected for the reference pixel in the reference image.
For example, whether an adjacent pixel carries representative information can be judged by its significance degree. Here, an adjacent pixel (or simply, a pixel) can be chosen from the whole image or from a sub-region. Specifically, in order to reduce the computation amount of the computing device and to improve the computational efficiency, at least one pixel whose significance degree is greater than a threshold can be found in a first predetermined neighborhood of the reference pixel, as the above adjacent pixel.
In addition, the significance degree can be measured by various indexes; for example, it can be the jump degree of the pixel value (for example, the gray value) of the pixel, the stability degree of the pixel in the reference image, a structural feature of the pixel, or another characteristic, etc.
Herein, for ease of description, the reference pixel in the reference image, together with the at least one pixel that is in the first predetermined neighborhood of the reference pixel and whose significance degree is greater than the threshold, can be referred to as the reference representative pixel set, or the supporting structure of the reference pixel.
In step S120, a target representative pixel set for a target pixel is determined in the target image.
Next, similarly to step S110, adjacent pixels with representative information can be selected for the target pixel in the target image. For example, in order to calculate the matching cost between the reference pixel and the target pixel accurately, supporting structures based on the same concept can be defined in the reference image and the target image relative to the respective center pixels (the reference pixel in the reference image and the target pixel in the target image, respectively). That is, in the target image, the supporting structure can likewise be at least one pixel that is in the first predetermined neighborhood of the target pixel and whose significance degree is greater than the threshold.
Similarly, for ease of description, the target pixel in the target image, together with the at least one pixel that is in the first predetermined neighborhood of the target pixel and whose significance degree is greater than the threshold, can be referred to as the target representative pixel set, or the supporting structure of the target pixel.
Furthermore, it should be noted that, although step S120 is described herein as being performed after step S110, in practice step S120 can also be performed before step S110, or both can be performed at the same time.
In step S130, the matching cost between the reference pixel and the target pixel is calculated.
After the reference representative pixel set and the target representative pixel set have been determined, the matching cost between the reference pixel and the target pixel can be calculated according to the pixel value of each pixel in the reference representative pixel set and the pixel value of each pixel in the target representative pixel set.
Specifically, first, the corresponding pixel in the target representative pixel set that corresponds to each pixel in the reference representative pixel set can be determined respectively.
For example, this correspondence between pixels can be a strictly accurate correspondence, in which the coordinates of the two pixels must correspond completely. Alternatively, this correspondence between pixels can also be an elastic correspondence with a certain tolerance, as long as the coordinates of the two pixels correspond within the certain tolerance.
Then, the distance metric value between each pixel in the reference representative pixel set and the corresponding pixel in the target representative pixel set is calculated respectively.
The distance metric value can be determined depending on the matching cost algorithm used. For example, the distance metric value can be the pixel value difference (for example, the gray-scale difference) between the corresponding pixels, or it can also be the coding distance (for example, the Hamming distance) between the corresponding pixels, etc.
Further, a weighting process can also be applied to the above distance metric values according to the relative position relationship between the adjacent coordinates currently being processed and the center coordinate, to obtain weighted distance metric values with more accurate metric performance.
Finally, all the distance metric values calculated between each pixel in the reference representative pixel set and the corresponding pixel in the target representative pixel set can be summed, to obtain the matching cost between the reference pixel and the target pixel.
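The aggregation described in steps S110 to S130 can be sketched as follows; the representative sets, the gray values, the strict-correspondence rule, and the absolute-difference metric are illustrative assumptions, since the method leaves the distance metric and the significance test open:

```python
# Representative sets given as {(x, y): gray_value}; correspondence is the
# strict kind: the reference abscissa is shifted by the known deviation d.
def matching_cost(ref_set, tgt_set, d):
    cost = 0
    for (x, y), g_ref in ref_set.items():
        key = (x - d, y)              # corresponding coordinate in the target image
        if key in tgt_set:
            cost += abs(g_ref - tgt_set[key])  # gray-difference distance metric
    return cost

ref_set = {(10, 5): 100, (11, 5): 140, (10, 6): 90}   # hypothetical values
tgt_set = {(8, 5): 105, (9, 5): 120, (8, 6): 90}
print(matching_cost(ref_set, tgt_set, 2))  # → 25
```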
It can be seen from the above that the embodiments of the present application provide a matching cost computation method in which the concept of supporting structure information is incorporated into the matching cost calculation process, where supporting structure information refers to some representative adjacent pixels. That is, a matching cost computation method based on supporting structures is provided in the embodiments of the present application for the stereo matching process. In the matching cost computation method according to the embodiment of the present application, in the process of selecting adjacent pixels for a pixel whose matching cost is to be calculated, whether the selected pixels carry representative information in the associated image is fully considered, and representative adjacent pixels are adaptively selected, rather than simply choosing adjacent pixels with similar attributes or at fixed positions; and, by selecting these adjacent pixels with representative information, distinguishable matching costs can be obtained, and correct disparity information is then obtained in the subsequent steps.
3.1. First Specific Example
Hereinafter, the overall procedure of the matching cost computation method according to the first specific example of the embodiment of the present application will be described with reference to Fig. 4 to Fig. 6B.
The matching cost computation method according to the first specific example of the embodiment of the present application is used to calculate the matching cost between a reference pixel in a reference image and a target pixel in a target image, where the matching cost is measured by the pixel value difference between corresponding pixels.
In the first specific example, it is still assumed that the reference image is the left image captured by the binocular camera at the left viewing angle, and the target image is the right image captured by the binocular camera at the right viewing angle. The reference pixel can be called the center pixel in the reference image, and the target pixel can be called the center pixel in the target image; there is a known position deviation between the two, i.e., a deviation in both the abscissa and the ordinate.
Fig. 4 illustrates a flowchart of the matching cost computation method according to the first specific example of the embodiment of the present application.
As shown in Fig. 4, the matching cost computation method can include:
In step S210, a first predetermined neighborhood is selected in the reference image.
For example, a region around the reference pixel and with a predetermined shape can be selected in the reference image (for example, the left image) as the first predetermined neighborhood.
Specifically, in the reference image, the reference pixel serves as the center pixel. For example, a rectangular region of width w and height h can be determined around and centered on the reference pixel. However, the present application is not limited to this; the reference pixel can also be located not at the center of the first predetermined neighborhood but at a position offset from the center of the first predetermined neighborhood by a certain distance, and the first neighborhood can also have another shape, such as a circle, an ellipse, a square, etc.
In step S220, the significance degree of at least one pixel, other than the reference pixel, in the first predetermined neighborhood is calculated.
For example, the significance degree can be measured by various indexes.
In the first case, the significance degree can be the jump degree of the pixel value (for example, the gray value) of the pixel. That is, the adjacent pixels with representative information can be the gray-scale edge pixels at gray value jump positions.
In this case, step S220 can be realized by the following operation: determining the pixel value jump degree between a specific pixel and its adjacent pixel in multiple different directions with the reference pixel as the starting point, and using the pixel value jump degree as the significance degree of the specific pixel.
Fig. 5A illustrates a schematic diagram of calculating the significance degree of a pixel in the first case.
With reference to Fig. 5A, in the reference image, centered on the reference pixel (marked as A), scanning can be carried out along multiple different directions, until the region boundary of the defined first predetermined neighborhood is reached. In Fig. 5A, the above scanning is completed using 8 directions, i.e., whether gray-level jump pixels exist outside the center pixel is determined along the 0°, 45°, 90°, 135°, 180°, 225°, 270°, and 315° directions, with the scanning process proceeding from the center pixel toward the region boundary. Then, the position of the gray value jump can be marked for each direction.
Specifically, for two adjacent pixels P_{r,i} and P_{r,i+1} on a direction r, if the difference between their gray values g(P_{r,i}) and g(P_{r,i+1}) is greater than a threshold η, that is to say, if the following formula (1) is satisfied, it is considered that a gray value jump occurs, and the pixel P_{r,i} is marked as the jump position:

|g(P_{r,i}) − g(P_{r,i+1})| > η    formula (1)

where η can take an empirical value, and in one example it can take different values according to different ranges of the gray value.
Fig. 5A gives an example of the supporting structure of a certain pixel of the left image. In the figure, the circle labeled with the letter A represents the center pixel; white, gray, dark, and black represent gray values changing from small to large; and the circles labeled with the letters B to P represent the gray value jump positions that are finally marked.
However, it should be noted that the present application is not limited to this. The scanning for gray value jumps can also be completed in more or fewer directions, and, when the gray value difference is greater than the threshold, the pixel P_{r,i+1} rather than the pixel P_{r,i} can also be marked as the jump position.
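The 8-direction scan with the jump test of formula (1) can be sketched as follows; the synthetic image, the threshold η = 30, and the scanning radius are illustrative assumptions:

```python
import numpy as np

def jump_positions(img, cx, cy, eta, max_r):
    """Scan 8 directions from (cx, cy); in each direction, mark the pixel
    P_{r,i} whose gray difference to P_{r,i+1} first exceeds eta (formula (1))."""
    dirs = [(1, 0), (1, 1), (0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1)]
    marks = []
    h, w = img.shape
    for dx, dy in dirs:
        for i in range(max_r):
            x0, y0 = cx + i * dx, cy + i * dy
            x1, y1 = x0 + dx, y0 + dy
            if not (0 <= x1 < w and 0 <= y1 < h):
                break
            if abs(int(img[y0, x0]) - int(img[y1, x1])) > eta:
                marks.append((x0, y0))   # one jump position per direction
                break
    return marks

img = np.full((7, 7), 50, dtype=np.uint8)
img[:, 4:] = 200                 # a vertical gray edge between columns 3 and 4
print(jump_positions(img, 1, 3, eta=30, max_r=4))  # → [(3, 3), (3, 5), (3, 1)]
```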
In the second case, the significance degree can be the stability degree of the pixel in the reference image. That is, the adjacent pixels with representative information can be pixels having stability.
In this case, step S220 can be realized by the following operation: calculating the scale stability degree of a specific pixel by performing scale-invariant feature description on the specific pixel, and using the scale stability degree as the significance degree of the specific pixel.
Fig. 5B illustrates a schematic diagram of calculating the significance degree of a pixel in the second case.
With reference to Fig. 5B, after an original image (gray-scale or color) is given, this method can apply iterative Gaussian filtering to the original image (or only to the selected first predetermined neighborhood of w × h therein) to obtain a cluster of images, where each image in the cluster corresponds to a certain specific number of Gaussian filtering iterations. In Fig. 5B, this iteration number is represented by T, where T is an integer. For example, T=0 indicates that the associated image is the original image to which no Gaussian filtering has been applied, T=1 indicates that the associated image is the original image to which Gaussian filtering has been applied once, T=2 indicates that the associated image is the original image to which Gaussian filtering has been applied twice, and so on. In the image processing art, this cluster of images is referred to as the scale space of the original image.
Then, extreme points can be found in the scale space, the extreme points being the pixel points with a higher scale stability degree.
For example, in the present embodiment, this step of searching for extreme points can be realized using the Scale-Invariant Feature Transform (SIFT) algorithm. However, it should be noted that the present application is not limited to this; this step can also be realized using any one of the following algorithms: the Speeded-Up Robust Features (SURF) algorithm, the Affine Scale-Invariant Feature Transform (ASIFT) algorithm, etc.
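A toy sketch of the stability idea follows; it replaces the true Gaussian with repeated 3×3 binomial smoothing and replaces SIFT's full extremum detection with a plain local-maximum test across iterations, so it only illustrates the concept of scale stability, not the SIFT algorithm itself:

```python
import numpy as np

def smooth(img):
    """One iteration of 3x3 binomial smoothing (a cheap Gaussian stand-in)."""
    k = np.array([1.0, 2.0, 1.0]) / 4.0
    p = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            out[y, x] = k @ p[y:y + 3, x:x + 3] @ k  # separable 3x3 kernel
    return out

def is_stable_maximum(img, x, y, iterations=3):
    """A pixel counts as scale-stable here if it stays a strict 3x3 local
    maximum across every smoothing iteration T = 0 .. iterations."""
    cur = img.astype(float)
    for _ in range(iterations + 1):
        win = cur[y - 1:y + 2, x - 1:x + 2]
        if cur[y, x] < win.max() or (win == cur[y, x]).sum() > 1:
            return False
        cur = smooth(cur)
    return True

img = np.zeros((9, 9))
img[4, 4] = 10.0          # an isolated bright blob: stable across scales
img[1, 1] = 1.0
img[1, 2] = 1.0           # a flat two-pixel ridge: not a strict maximum
print(is_stable_maximum(img, 4, 4), is_stable_maximum(img, 1, 1))  # → True False
```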
In the third case, the significance degree can be a structural attribute of the pixel in the reference image. That is, the adjacent pixels with representative information can be specific pixels such as corner pixels.
For example, a matrix showing the intensity structure of the local neighborhood of a pixel can be calculated, and whether the pixel is a corner pixel can be judged through the eigenvalues of this matrix.
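One common instance of such a test is the structure matrix built from summed gradient products, whose two eigenvalues are both large at a corner; the window size, the synthetic image, and the threshold below are illustrative assumptions, not values from the patent:

```python
import numpy as np

def structure_matrix(img, x, y, r=1):
    """2x2 matrix of summed gradient products over the local neighborhood."""
    gy, gx = np.gradient(img.astype(float))   # per-axis derivatives
    wy, wx = slice(y - r, y + r + 1), slice(x - r, x + r + 1)
    Ixx = (gx[wy, wx] ** 2).sum()
    Iyy = (gy[wy, wx] ** 2).sum()
    Ixy = (gx[wy, wx] * gy[wy, wx]).sum()
    return np.array([[Ixx, Ixy], [Ixy, Iyy]])

def is_corner(img, x, y, thresh=1.0):
    """Corner test: both eigenvalues of the structure matrix are large."""
    evals = np.linalg.eigvalsh(structure_matrix(img, x, y))
    return bool(evals.min() > thresh)

img = np.zeros((8, 8))
img[4:, 4:] = 100.0       # a bright square: (4, 4) is its corner point
print(is_corner(img, 4, 4), is_corner(img, 6, 4))  # corner vs. plain edge → True False
```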
Obviously, the present application is not limited to the above three cases. The significance degree of a certain pixel in the first predetermined neighborhood can also be determined by one or more other characteristics.
In step S230, the set of the reference pixel and all the pixels whose significance degree is greater than the threshold is determined as the reference representative pixel set.
After all the pixels whose significance degree is greater than the threshold have been determined in the first predetermined neighborhood around the reference pixel, the reference pixel in the reference image, together with the at least one pixel that is in the first predetermined neighborhood of the reference pixel and whose significance degree is greater than the threshold, can be referred to as the reference representative pixel set, or the supporting structure of the reference pixel.
In step S240, the first predetermined neighborhood is selected in the target image.
In step S250, the significance degree of at least one pixel, other than the target pixel, in the first predetermined neighborhood is calculated.
In step S260, the set of the target pixel and all the pixels whose significance degree is greater than the threshold is determined as the target representative pixel set.
Since steps S240 to S260 correspond respectively to the above steps S210 to S230, differing only in that steps S240 to S260 determine the set of representative adjacent pixels for the target pixel in the target image (for example, the right image), their repeated description will be omitted here.
After all the pixels whose significance degree is greater than the threshold have been determined in the first predetermined neighborhood around the target pixel, the target pixel in the target image, together with the at least one pixel that is in the first predetermined neighborhood of the target pixel and whose significance degree is greater than the threshold, can be referred to as the target representative pixel set, or the supporting structure of the target pixel.
In step S270, the corresponding pixel in the target representative pixel set that corresponds to each pixel in the reference representative pixel set is determined respectively.
As noted above, the position deviation in the abscissa and the ordinate between the reference pixel and the target pixel is known, as are the coordinate position of the reference pixel in the reference image, the coordinate position of the target pixel in the target image, the coordinate position in the reference image of each pixel in the reference representative pixel set, and the coordinate position in the target image of each pixel in the target representative pixel set. It is therefore possible to judge, according to the coordinates of each pixel, whether, for a certain pixel in the reference representative pixel set (for example, referred to as a first pixel), there exists a corresponding pixel in the target representative pixel set (for example, referred to as a second pixel).
Here, it should be noted that, since the reference image and the target image generally need to undergo height calibration in practice before the matching cost calculation is carried out, for simplicity there can be only a position deviation Δx in the abscissa between the reference pixel and the target pixel, the ordinates having been calibrated to be identical to each other. In the following, for convenience, the description will continue with this case as the example.
For example, the correspondence between pixels in the left image and the right image can be a strict correspondence or an elastic correspondence.
In the first case, the coordinates of the two pixels can correspond strictly and accurately; that is to say, the coordinates of the two pixels must correspond completely between the left image and the right image.
Fig. 6A illustrates a schematic diagram of determining the corresponding pixel in the first case.
As illustrated in Fig. 6A, for a certain first pixel in the reference representative pixel set, the first coordinate (x0, y0) in the left image of the first pixel of the reference representative pixel set can first be determined, then an offset in the abscissa is applied to the first coordinate to obtain a second coordinate (x0+Δx, y0), and it is judged whether there exists in the target representative pixel set a second pixel whose coordinate is (x0+Δx, y0).
In the second case, the coordinates of the two pixels can correspond elastically with a certain tolerance; that is to say, it suffices that the coordinates of the two pixels correspond within the certain tolerance between the left image and the right image.
Fig. 6B illustrates a schematic diagram of determining the corresponding pixel in the second case.
As Fig. 6B illustrates, for a certain first pixel in the reference representative pixel set, the first coordinate (x0, y0) in the left image of the first pixel of the reference representative pixel set can first be determined, and then an offset in the abscissa is applied to the first coordinate to obtain the second coordinate (x0+Δx, y0). Next, in addition to judging whether there exists in the target representative pixel set a second pixel whose coordinate is (x0+Δx, y0), it is also determined whether there exists in the target representative pixel set a second pixel whose coordinate is within an elastic correspondence neighborhood around the second coordinate.
For this purpose, a region around the second coordinate and with a predetermined shape can be selected in the target image as the elastic correspondence neighborhood. For example, the elastic correspondence neighborhood can be a rectangular region, a circular region, or a region of another shape centered on the second coordinate.
For example, the elastic correspondence neighborhood can be defined as a rectangular shape by the following formula (2):

|x − (x0+Δx)| <= 1, and |y − y0| <= 1    formula (2)

where x is the abscissa of a certain pixel in the elastic correspondence neighborhood, and y is the ordinate of a certain pixel in the elastic correspondence neighborhood.
Then, a second pixel in the target representative pixel set whose coordinate is within the elastic correspondence neighborhood can be determined as the corresponding pixel corresponding to the first pixel.
It should be noted that, when multiple corresponding pixels exist in the elastic correspondence neighborhood, the scope of the elastic correspondence neighborhood can be appropriately reduced, or the pixel whose coordinate has the minimum distance to the second coordinate can be directly selected as the second pixel.
In addition, when determining the elastic correspondence neighborhood, besides the coordinates, other metric values can also be used as the standard. For example, the radius r from the center pixel to the marked position (that is, to the adjacent pixel with representative information) and the angle θ of each scanning line from the horizontal rightward direction can be used as the standard for forming the elastic correspondence neighborhood.
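Coordinate-based elastic correspondence with the rectangular window of formula (2) can be sketched as follows; the function name and the closest-candidate tie-break are illustrative, the latter mirroring the option of selecting the pixel with the minimum distance to the second coordinate:

```python
def find_elastic_match(first_coord, dx, target_coords, tol=1):
    """Return the coordinate in target_coords matching first_coord shifted
    by dx: an exact hit if present, otherwise the closest candidate inside
    the rectangular tolerance window of formula (2), else None."""
    x0, y0 = first_coord
    shifted = (x0 + dx, y0)
    if shifted in target_coords:
        return shifted
    candidates = [
        (x, y) for (x, y) in target_coords
        if abs(x - shifted[0]) <= tol and abs(y - shifted[1]) <= tol
    ]
    if not candidates:
        return None
    return min(candidates,
               key=lambda c: (c[0] - shifted[0]) ** 2 + (c[1] - shifted[1]) ** 2)

targets = [(7, 5), (8, 6), (12, 3)]              # hypothetical target-set coordinates
print(find_elastic_match((10, 5), -2, targets))  # no exact hit → elastic match (7, 5)
```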
In step S280, the pixel value difference between each pixel in the reference representative pixel set and the corresponding pixel in the target representative pixel set is calculated respectively.
If it is determined in step S270 that, for a certain pixel in the reference representative pixel set (for example, referred to as a first pixel), there exists a corresponding pixel in the target representative pixel set (for example, referred to as a second pixel), then the pixel value difference between the two can be calculated. For example, the pixel value difference can be the absolute value of the difference between the two pixel values (for example, gray values) (that is, the absolute gray-scale difference). Alternatively, the pixel value difference can also be the square of the difference between the two pixel values (for example, gray values) (that is, the squared gray-scale difference).
Then, the above operation can be repeated for each pixel in the reference representative pixel set, to obtain respectively the pixel value difference between each pixel in the reference representative pixel set and the corresponding pixel in the target representative pixel set.
In addition, in one example, when calculating the pixel value difference between each pixel in the reference representative pixel set and the corresponding pixel in the target representative pixel set, instead of assigning an identical weight value to each pixel in the representative pixel sets, the weight coefficient of each pixel in the representative pixel sets can be determined by the position relationship between that pixel and the center pixel, to obtain a weighted pixel value difference.
Specifically, it is considered in the present application that the representative pixels obtained in the preceding steps can have different weights, and the weight value can be related to the distance between the representative pixel and the center pixel: the closer to the center pixel, the larger its weight; conversely, the farther from the center pixel, the smaller its weight. Alternatively, with a different position marking standard, the weight value can be related to the radius (r) and the angle (θ) of the representative pixel: pixels at different angles have different weights, and pixels at different radii have different weights, with a smaller radius giving a larger weight and, conversely, a larger radius giving a smaller weight.
In this case, calculating respectively the pixel value difference between each pixel in the reference representative pixel set and the corresponding pixel in the target representative pixel set can include: determining a first weight coefficient from the position relationship between a first pixel in the reference representative pixel set and the reference pixel; determining a second weight coefficient from the position relationship between a second pixel in the target representative pixel set and the target pixel, where the second pixel is the corresponding pixel in the target representative pixel set that corresponds to the first pixel in the reference representative pixel set; calculating the pixel value difference between the first pixel and the second pixel; and using the first weight coefficient and the second weight coefficient to weight the pixel value difference between the first pixel and the second pixel, to obtain the weighted pixel difference between the first pixel and the second pixel.
If, on the contrary, it is determined in step S270 that, for the first pixel in the reference representative pixel set, there does not exist any corresponding second pixel in the target representative pixel set, then the method may be unable to continue and complete the above calculation operation of the pixel value difference.
In this case, in order to ensure that the method can continue to be performed, in a simple example the first pixel in the reference representative pixel set can be directly ignored, and the processing continues with the next pixel in the reference representative pixel set.
However, in order to obtain a more accurate matching cost, in another example a predetermined value can be set as the pixel value difference between the first pixel and the second pixel, where the predetermined value is greater than the maximum value among the pixel value differences between each pixel in the reference representative pixel set and the corresponding pixel in the target representative pixel set.
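Step S280's weighted differences, together with a penalty for a first pixel that has no second pixel, can be sketched as follows; the inverse-distance weight and all numeric values are illustrative assumptions, since the patent only requires the penalty to exceed every real difference:

```python
import math

def weight(pixel, center):
    """Weight coefficient: the closer to the center pixel, the larger the weight."""
    return 1.0 / (1.0 + math.dist(pixel, center))

def weighted_cost(ref_set, ref_center, tgt_set, tgt_center, d):
    diffs, unmatched = [], 0
    for (x, y), g_ref in ref_set.items():
        key = (x - d, y)                      # strict correspondence
        if key in tgt_set:
            w = weight((x, y), ref_center) * weight(key, tgt_center)
            diffs.append(w * abs(g_ref - tgt_set[key]))
        else:
            unmatched += 1                    # no corresponding second pixel
    penalty = (max(diffs) + 1.0) if diffs else 1.0  # exceeds every real difference
    return sum(diffs) + unmatched * penalty

ref_set = {(10, 5): 100, (12, 5): 140}        # hypothetical representative sets
tgt_set = {(8, 5): 90}
print(weighted_cost(ref_set, (9, 5), tgt_set, (7, 5), 2))  # → 6.0
```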
In step S290, the matching cost between the reference pixel and the target pixel is obtained by summing all the pixel value differences calculated.
It can be seen from the above that, in the first specific example of the embodiment of the present application, in the process of selecting adjacent pixels for the center pixel, some sparse pixels can be adaptively selected as the matching elements; they are obtained through the supporting structure of the center pixel (representative adjacent pixels, such as gray-scale/color edge pixels, key point pixels, corner pixels, etc.), and a distinguishable matching cost is obtained directly from the pixel value differences between the reference image and the target image of the center pixels and/or the adjacent pixels; the matching cost is then applied in the subsequent stereo matching algorithm to obtain correct disparity information.
3.2. Second Specific Example
Hereinafter, the overall procedure of the matching cost computation method according to the second specific example of the embodiment of the present application will be described with reference to Fig. 7 to Fig. 8B.
The matching cost computation method according to the second specific example of the embodiment of the present application is used to calculate the matching cost between a reference pixel in a reference image and a target pixel in a target image, where the matching cost is measured by the numerical coding distance between corresponding pixels.
Fig. 7 illustrates a flowchart of the matching cost computation method according to the second specific example of the embodiment of the present application.
As shown in Fig. 7, the matching cost computation method can include:
In step S310, a first predetermined neighborhood is selected in the reference image.
In step S320, the significance degree of at least one pixel, other than the reference pixel, in the first predetermined neighborhood is calculated.
In step S330, the set of the reference pixel and all the pixels whose significance degree is greater than the threshold is determined as the reference representative pixel set.
In step S340, the first predetermined neighborhood is selected in the target image.
In step S350, the significance degree of at least one pixel, other than the target pixel, in the first predetermined neighborhood is calculated.
In step S360, the set of the target pixel and all the pixels whose significance degree is greater than the threshold is determined as the target representative pixel set.
In step S370, the corresponding pixel in the target representative pixel set that corresponds to each pixel in the reference representative pixel set is determined respectively.
Since step S310 to the S370 in second specific example and follow-up step S390 correspond respectively to the first tool
Above-mentioned steps S210 to S270 and step S290 in body example, so its repeated description will be omitted herein.
In step S375, numerical coding is performed on each pixel in the reference representative pixel set.
For example, taking the pixel value of the reference pixel as a baseline, each pixel in the reference representative pixel set may be numerically coded according to its pixel value. For example, this numerical coding may be a single-bit binarization coding, a two-bit binarization coding, or the like.
In step S380, numerical coding is performed on each pixel in the target representative pixel set.
Similarly to step S375, taking the pixel value of the target pixel as a baseline, each pixel in the target representative pixel set may be numerically coded according to its pixel value.
In step S385, the numerical coding distance between each pixel in the reference representative pixel set and the corresponding pixel in the corresponding target representative pixel set is calculated respectively.
After the numerical code value of each pixel in the reference representative pixel set and the numerical code value of each pixel in the target representative pixel set have been obtained, they can be compared one by one in units of pixels to obtain the coding distance between corresponding pixels, so that the matching cost between the reference pixel and the target pixel can be obtained in step S390 by summing the coding distances.
In step S390, the matching cost between the reference pixel and the target pixel is obtained by summing all of the calculated numerical coding distances.
It should be noted that although in steps S375 to S390 above the coding distances are obtained pair by pair and the coding distances between all corresponding pixels are summed to obtain the matching cost between the reference pixel and the target pixel, the present application is not limited thereto.
In another example, the numerical code values of the pixels in the obtained reference representative pixel set may instead be concatenated in step S375 into a string of digits in a certain order according to the pixel positions in the reference image, and the same operation may be performed in step S380 on the numerical code values of the corresponding pixels in the target representative pixel set, so that in the subsequent step the coding distance between the two resulting digit strings can be compared directly, and the matching cost between the reference pixel and the target pixel can thus be obtained more conveniently. Obviously, in such processing, since the code of the center pixel is identical in the reference image and the target image, that is, the coding distance between the two is necessarily 0, the numerical code value of the center pixel may be omitted directly from the formed digit string so as to save processing resources when calculating the coding distance.
To aid understanding, the following description takes as an example a reference representative pixel set that includes gray-scale edge pixels obtained by scanning along multiple directions. Needless to say, however, steps S375 to S385 are equally applicable to other representative pixel points such as color edge pixels, key-point pixels, and corner pixels.
For example, in step S375, for each center pixel of the left image, the left Census value of the left image can be calculated directly in each direction using the gray value at the marked position and the gray value of the center pixel.
The step of calculating the left Census value may use any type of Census transform matching algorithm, for example, the basic Census transform matching algorithm, the first enhanced Census transform matching algorithm, the second enhanced Census transform matching algorithm, and the like.
For example, in the left image, the following formula (3) can be used to perform two-bit binarization on the gray value I_i of each pixel in the reference representative pixel set, so as to obtain a binary string for each adjacent pixel.
Formula (3): L(x, y) = 01, if I_i > I_0 + δ; 00, if |I_i − I_0| ≤ δ; 10, if I_i < I_0 − δ
where I_0 is the gray value of the center pixel, and δ is a tolerance that can be set as a constant based on empirical values.
In one example, δ may also be determined from the gray value I_0 of the center pixel according to the following formula (4):
Formula (4)
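As an illustration of the two-bit binarization described around formula (3), the following sketch encodes neighbor gray values against the center gray value with a tolerance; the mapping of the three code values (01 brighter, 10 darker, 00 similar) and the helper name `dibit_census` are assumptions for illustration, not the patent's normative definition:

```python
def dibit_census(center, neighbors, delta=8):
    """Two-bit (three-valued) Census code of each neighbor gray value,
    relative to the center gray value with tolerance delta (assumed mapping)."""
    codes = []
    for g in neighbors:
        if g > center + delta:
            codes.append("01")   # clearly brighter than the center
        elif g < center - delta:
            codes.append("10")   # clearly darker than the center
        else:
            codes.append("00")   # within the tolerance band
    return "".join(codes)

# Center gray value 100, tolerance 8: similar, brighter, darker neighbors
print(dibit_census(100, [95, 120, 80]))  # -> "000110"
```

With such a three-valued code, the Hamming distance between two codes is at most 2, which matches the observation made later in the text.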
Next, for example, after the two-bit binarization has been performed on the gray value I_i of each pixel in the reference representative pixel set, the result can be arranged into a table according to the position identification reference of the pixels, for lookup.
As described above, in order to identify each pixel in the reference representative pixel set, various parameters can be used as the position identification reference.
For example, in the first case, the coordinates (x, y) of a pixel in a two-dimensional coordinate system constructed based on the reference image can be used as its position identification reference.
Fig. 8A is a schematic diagram illustrating the Census transform result of the left image in the first case.
In the table shown in Fig. 8A, x and y respectively represent the abscissa and the ordinate of each pixel in the reference representative pixel set in the two-dimensional coordinate system constructed based on the reference image, where the coordinate system is a rectangular coordinate system established with the center pixel as the origin, the horizontal rightward direction as the positive x-axis, and the vertical upward direction as the positive y-axis, and L(x, y) represents the Census code value of each pixel in the reference representative pixel set.
Then, the Census values of all representative pixel points in the left image can be concatenated in Fig. 8A in top-to-bottom order to generate the left Census value of the left image, namely 001000001000010000010001001001.
Alternatively, in the second case, the coordinates (r, θ) of a pixel in a polar coordinate system constructed based on the center coordinates can be used as its position identification reference.
Fig. 8B is a schematic diagram illustrating the Census transform result of the left image in the second case.
In the table shown in Fig. 8B, r represents the radius from the center pixel to the marked position, θ represents the angle of each scan line measured from the horizontal rightward direction, and L(r, θ) represents the Census code value of each pixel in the reference representative pixel set. In this table, the Census transform is a three-valued transform, and * indicates that there is no value at that position.
Then, the Census values of all representative pixel points in the left image can be concatenated in Fig. 8B in left-to-right, top-to-bottom order to generate the left Census value of the left image, namely 001000001000010000010001001001.
Obviously, the final results obtained by these two ways of generating the left Census value are consistent.
Next, in step S380, for each center pixel of the right image, the right Census value of the right image can be calculated in each direction using the gray value at the marked position in the right image and the gray value of the center pixel.
Obviously, the calculation process of this step is identical to that of the previous step; therefore, after the calculation, the Census code value R(x, y) or R(r, θ) of each pixel in the target representative pixel set in the right image can be obtained. Then, according to the correspondence with each representative pixel point in the left image, the Census values of all representative pixel points in the right image are concatenated to generate the right Census value of the right image.
Finally, the Hamming distance between the two digit strings, the left Census value and the right Census value, can be obtained by direct comparison, so that the matching cost between the reference pixel and the target pixel can be obtained more conveniently.
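The direct comparison of the two concatenated digit strings can be sketched as follows; the left bit string is the example string from Fig. 8A, while the right one is made up for illustration:

```python
def hamming(left, right):
    """Bitwise Hamming distance between two equal-length Census bit strings."""
    assert len(left) == len(right)
    return sum(a != b for a, b in zip(left, right))

left_census  = "001000001000010000010001001001"  # 15 pixels x 2 bits
right_census = "001000011000010000010001001000"  # illustrative right string
print(hamming(left_census, right_census))  # -> 2
```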
It should be noted that, as already described in the first specific example, the correspondence between pixels of the left image and the right image mentioned in the above steps may be an absolute correspondence or an elastic correspondence.
In addition, when there is no corresponding second pixel in the target representative pixel set for a first pixel in the reference representative pixel set, a predetermined value greater than the maximum coding distance between a pair of corresponding pixels can be set as the coding distance between the first pixel and the second pixel. For example, in the above example, it can be seen from formula (3) that, in the case of two-bit binarization, the maximum coding distance between two pixels is 2. In this case, a predetermined value greater than 2 (for example, 3) can be set as the Hamming distance between the first pixel and the second pixel.
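The missing-correspondence rule can be folded into the per-pair distance as in the following sketch; the penalty value 3 follows the example in the text, while the function and constant names are illustrative assumptions:

```python
MAX_DIBIT_DISTANCE = 2  # maximum Hamming distance between two 2-bit codes
MISSING_PENALTY = 3     # any value greater than the maximum, per the example

def code_distance(l_code, r_code):
    """Hamming distance between two 2-bit Census codes; if the corresponding
    right-image pixel is absent (None), a penalty above the maximum is used."""
    if r_code is None:
        return MISSING_PENALTY
    return sum(a != b for a, b in zip(l_code, r_code))

print(code_distance("01", "10"))  # -> 2
print(code_distance("01", None))  # -> 3
```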
In addition, when calculating the Census code value of each representative pixel in the left and right images, the weight of each representative pixel in the left image and the right image can further be calculated.
For example, in the first case, in which the coordinates (x, y) of a pixel in the two-dimensional coordinate system constructed based on the reference image are used as its position identification reference, the weight value can be set according to the distance from the representative pixel to the center pixel. Since it is assumed here that the coordinates of the center pixel are the origin (0, 0), the weight value is set directly according to the coordinates of the representative pixel. Specifically, the weight value w(x, y) of the pixel (x, y) can be set using formula (5):
Formula (5): w(x, y) = exp(−(x^2 / (2σ_X^2) + y^2 / (2σ_Y^2)))
where σ_X and σ_Y are width parameters used to control the range of action of the weight function. Through calculation, the weighting functions of the left image and the right image, w_l(x, y) and w_r(x, y), are respectively obtained.
At this point, the weighted Hamming distance between the left Census value and the right Census value, which serves as the matching cost, can be calculated according to the following formula (6):
Formula (6): C = Σ_(x, y) w_l(x, y) · w_r(x, y) · n(L(x, y) ⊕ R(x, y))
where w_l(x, y) and w_r(x, y) are respectively the weighting functions of the left image and the right image, and the function n(x) counts the number of 1s in the bit string x.
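Under a Gaussian reading of formula (5) and a weighted popcount reading of formula (6), both of which are plausible reconstructions rather than the patent's literal formulas, the weighted Hamming cost can be sketched as:

```python
import math

def weight(x, y, sigma_x=2.0, sigma_y=2.0):
    """Gaussian weight of a representative pixel at offset (x, y) from the
    center pixel (one plausible reading of formula (5))."""
    return math.exp(-(x * x / (2 * sigma_x ** 2) + y * y / (2 * sigma_y ** 2)))

def weighted_hamming(left_codes, right_codes):
    """Weighted Hamming distance over per-pixel two-bit codes: each code pair
    is XOR-compared (popcount) and weighted by the product of the left and
    right weights, here taken at the same offset for simplicity."""
    cost = 0.0
    for xy, l_code in left_codes.items():
        r_code = right_codes.get(xy)
        if r_code is None:
            continue  # missing correspondences are handled separately in the text
        n = sum(a != b for a, b in zip(l_code, r_code))  # popcount of XOR
        cost += weight(*xy) * weight(*xy) * n            # w_l * w_r
    return cost

left  = {(0, 1): "01", (1, 0): "10", (-1, 0): "00"}  # illustrative codes
right = {(0, 1): "01", (1, 0): "00", (-1, 0): "00"}
print(round(weighted_hamming(left, right), 4))  # -> 0.7788
```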
Alternatively, in the second case, in which the coordinates (r, θ) of a pixel in the polar coordinate system constructed based on the center coordinates are used as its position identification reference, the weight value can be set according to the distance and direction from the representative pixel to the center pixel. For example, the weight value can be set according to the radius r from the center pixel to the marked position and the angle θ of each scan line measured from the horizontal rightward direction.
As can be seen from the above, in the second specific example of the embodiment of the present application, in the process of selecting adjacent pixels for the center pixel, representative adjacent pixels (for example, gray-scale/color edge pixels, key-point pixels, and corner pixels) can be adaptively selected. Since the pixels in the neighborhood have different weights, this matching cost calculation method is more robust. Moreover, this method can perform numerical coding on the center pixels and/or adjacent pixels in the reference image and the target image, obtain a distinguishable matching cost according to the numerical coding distance between corresponding pixels, and then apply the matching cost in a subsequent stereo matching algorithm to obtain correct disparity information.
4. Disparity Value Calculation Method
After the matching cost between the reference pixel in the reference image and the target pixel in the target image has been obtained using the matching cost calculation method according to the embodiment of the present application, the subsequent steps of a stereo matching method (for example, support aggregation, disparity computation/optimization, and disparity refinement) can be performed according to the matching cost, so as to finally calculate the disparity value between the reference pixel in the reference image and the matched pixel in the target image.
Fig. 9 is an overview flowchart illustrating the disparity value calculation method according to the embodiment of the present application.
As shown in Fig. 9, the disparity value calculation method may include:
In step S410, a plurality of target pixels are determined.
As mentioned above, a stereo camera can capture images of the same object from two or more different viewing angles, namely a reference image and a target image.
In order to calculate the disparity value between a reference pixel in the reference image and the matched pixel in the target image, it is first necessary to search the target image for the matched pixel that matches the reference pixel in the reference image. For this purpose, a plurality of target pixels can be determined as candidate matched pixels.
As mentioned above, since in practice the reference image and the target image generally need to undergo height rectification before the matching cost calculation is performed, for simplicity, a position deviation Δx between the reference pixel and a candidate matched pixel can exist only in the horizontal direction, and the position deviation Δx depends on the stereo camera parameters and is thus within a limited range of pixels.
Therefore, assuming that the coordinates of the reference pixel in the reference image are (x_0, y_0), it is only necessary to determine the limited number of pixels located from coordinates (x_0, y_0) to (x_0 + Δx_max, y_0) in the target image as the target pixels, where the limited pixel count Δx_max can be determined based on empirical or experimental values.
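The candidate enumeration of step S410 then amounts to walking the horizontal scan line; in this sketch the coordinates and the value of Δx_max are arbitrary illustrative choices:

```python
def candidate_target_pixels(x0, y0, dx_max):
    """Candidate matched pixels for a reference pixel at (x0, y0): after
    height rectification only a horizontal deviation up to dx_max remains."""
    return [(x0 + dx, y0) for dx in range(dx_max + 1)]

print(candidate_target_pixels(10, 5, 3))  # -> [(10, 5), (11, 5), (12, 5), (13, 5)]
```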
In step S420, for each target pixel, the matching cost between each first pixel in a reference support pixel set and the corresponding second pixel in a corresponding target support pixel set is calculated respectively, where the reference support pixel set includes at least one pixel in a second predetermined neighborhood of the reference pixel in the reference image, and the target support pixel set includes at least one pixel in the second predetermined neighborhood of the target pixel in the target image.
First, the reference support pixel set for the reference pixel in the reference image can be determined, where the reference support pixel set (or simply, the support pixels of the reference pixel) includes at least one pixel in the second predetermined neighborhood of the reference pixel.
The support pixels of a pixel can be all pixels having an approximate disparity value thereto. For example, one common way is to select the adjacent pixels of a target pixel in a gray-scale image (including the target pixel itself) as the support pixels of the target pixel. Alternatively, another common way is to select the adjacent pixels that are on the same region block as the target pixel in the gray-scale image (including the target pixel itself) as the support pixels of the target pixel.
Then, similarly, the target support pixel set for the target pixel can be determined in the target image, where the target support pixel set includes at least one pixel in the second predetermined neighborhood of the target pixel.
Next, the corresponding pixel in the target support pixel set corresponding to each pixel in the reference support pixel set can be determined respectively according to the coordinates of the pixels.
Finally, the matching cost calculation method according to the embodiment of the present application described above can be used to calculate, for each target pixel, the matching cost between each first pixel in the reference support pixel set and the corresponding second pixel in the corresponding target support pixel set respectively.
Since the matching cost calculation method of the embodiment of the present application has been described in detail above in conjunction with the embodiment and the two specific examples, its repeated description will be omitted here.
In step S430, for each target pixel, the overall matching cost between the reference pixel and the target pixel is obtained by summing all of the calculated matching costs.
For example, the matching costs can be summed, and the sum of the matching costs can be taken as the overall matching cost between the reference pixel and the target pixel.
In step S440, the matched pixel among the plurality of target pixels is determined according to at least the plurality of overall matching costs between the reference pixel and the plurality of target pixels, so as to determine the disparity value between the reference pixel and the matched pixel.
Obviously, the matching cost calculation method according to the embodiment of the present application can be applied to an arbitrary stereo matching method (for example, block matching, SGM, and the like). That is, the processing subsequent to the matching cost calculation method is not the focus of the present application, and those skilled in the art can determine the disparity value using any existing stereo matching method based on the matching cost calculation result.
For example, for a block matching method, it is only necessary to find the minimum overall (accumulated) matching cost to determine the matched pixel among the plurality of target pixels. For the SGM method, the matched pixel can further be determined by finding the minimum accumulated matching cost through multi-directional dynamic programming.
Finally, the disparity value between the reference pixel and the matched pixel can be determined by calculating the coordinate distance between the two.
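For the block-matching case, steps S430 and S440 reduce to summing the support-pixel costs per candidate and taking the minimum; a minimal winner-take-all sketch with made-up aggregated costs:

```python
def disparity_by_wta(ref_xy, candidates, overall_costs):
    """Winner-take-all: pick the candidate with the minimum overall matching
    cost and return the horizontal coordinate distance as the disparity."""
    best = min(zip(candidates, overall_costs), key=lambda t: t[1])[0]
    return abs(best[0] - ref_xy[0])

candidates = [(10, 5), (11, 5), (12, 5), (13, 5)]
costs = [7.0, 2.5, 4.0, 6.0]  # illustrative aggregated matching costs
print(disparity_by_wta((10, 5), candidates, costs))  # -> 1
```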
In the following, the effect of the stereo matching algorithm according to the embodiment of the present application will be described with reference to Fig. 10A and Fig. 10B.
Fig. 10A illustrates the disparity map obtained by a prior-art matching algorithm based on the second enhanced Census transform, and Fig. 10B illustrates the disparity map obtained by the disparity value calculation method according to the embodiment of the present application.
Comparing Fig. 10A and Fig. 10B, it can be seen that, compared with the result obtained by the prior-art stereo matching algorithm, the disparity map obtained by the disparity value calculation method according to the embodiment of the present application has more correct disparity values, and this effect is especially obvious in textureless road-surface regions. After such a disparity map with a large number of correct disparity values has been obtained, even if a few invalid disparity values occasionally remain in it, they can be filled using subsequent operations (for example, plane fitting) to obtain a denser disparity map. Thus, it can be seen that the experimental results confirm that the method proposed according to the embodiment of the present application is indeed effective.
As can be seen from the above, the embodiment of the present application provides a disparity value calculation method in which the concept of supporting structure information is introduced into the matching cost calculation process, so that in the process of selecting adjacent pixels for the pixel whose matching cost is to be calculated, whether a selected pixel carries representative information in the associated image is fully considered, and representative adjacent pixels are adaptively selected. That is, in this disparity value calculation method, more representative information (for example, gray-scale edge information) can be utilized, so that the method is more robust, the calculated matching cost is more accurate, and the disparity map obtained thereby also becomes more accurate.
5. Matching Cost Calculation Apparatus
The embodiment of the present application can also be implemented by a matching cost calculation apparatus. In the following, the functional configuration block diagram of the matching cost calculation apparatus according to the embodiment of the present application will be described with reference to Fig. 11.
Fig. 11 illustrates the functional configuration block diagram of the matching cost calculation apparatus according to the embodiment of the present application.
As shown in Fig. 11, the matching cost calculation apparatus 100 can be used to calculate the matching cost between a reference pixel in a reference image and a target pixel in a target image, the reference image and the target image belonging to one original image pair, and the apparatus can include:
a reference set determination unit 110 for determining a reference representative pixel set for the reference pixel in the reference image, the reference representative pixel set including the reference pixel and at least one pixel that is in a first predetermined neighborhood of the reference pixel and whose significance degree is greater than a threshold;
a target set determination unit 120 for determining a target representative pixel set for the target pixel in the target image, the target representative pixel set including the target pixel and at least one pixel that is in the first predetermined neighborhood of the target pixel and whose significance degree is greater than the threshold; and
a matching cost calculation unit 130 for calculating the matching cost between the reference pixel and the target pixel according to the pixel value of each pixel in the reference representative pixel set and the pixel value of each pixel in the target representative pixel set.
In one example, the reference set determination unit 110 can determine the reference representative pixel set for the reference pixel in the reference image through the following operations: selecting a region that surrounds the reference pixel and has a predetermined shape in the reference image as the first predetermined neighborhood; calculating the significance degree of at least one pixel in the first predetermined neighborhood other than the reference pixel; and determining the set consisting of the reference pixel and all pixels whose significance degree is greater than the threshold as the reference representative pixel set.
Specifically, the reference set determination unit 110 can calculate the significance degree of at least one pixel in the first predetermined neighborhood other than the reference pixel in at least one of the following ways: determining the pixel-value jump degree between a specific pixel and its adjacent pixels in a plurality of different directions starting from the reference pixel, and taking the pixel-value jump degree as the significance degree of the specific pixel; calculating the scale stability degree of a specific pixel by performing scale-invariant feature description on the specific pixel, and taking the scale stability degree as the significance degree of the specific pixel; and determining the structure attribute of a specific pixel by calculating the intensity structure of the specific pixel, and taking the structure attribute as the significance degree of the specific pixel.
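The first of the listed significance measures, the pixel-value jump degree, can be sketched on a toy gray-scale array as follows; the restriction to four directions is an illustrative simplification rather than the apparatus's prescribed scan pattern:

```python
def jump_degree(img, x, y):
    """Maximum absolute gray-value jump between pixel (x, y) and its
    4-neighbors, used here as the significance degree of the pixel."""
    h, w = len(img), len(img[0])
    best = 0
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < w and 0 <= ny < h:
            best = max(best, abs(img[y][x] - img[ny][nx]))
    return best

img = [[10, 10, 10],
       [10, 10, 90],
       [10, 10, 10]]
print(jump_degree(img, 1, 1))  # -> 80
```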
In one example, the matching cost calculation unit 130 can calculate the matching cost between the reference pixel and the target pixel through the following operations: determining respectively the corresponding pixel in the target representative pixel set corresponding to each pixel in the reference representative pixel set; calculating respectively the distance metric value between each pixel in the reference representative pixel set and the corresponding pixel in the corresponding target representative pixel set; and obtaining the matching cost between the reference pixel and the target pixel by summing all of the calculated distance metric values.
For example, the matching cost calculation unit 130 can determine respectively the corresponding pixel in the target representative pixel set corresponding to each pixel in the reference representative pixel set through the following operations: determining a first coordinate of a first pixel in the reference representative pixel set in the reference image; selecting a region that surrounds a second coordinate and has a predetermined shape in the target image as an elastic corresponding neighborhood, a predetermined offset existing between the second coordinate and the first coordinate; and determining a second pixel in the target representative pixel set whose coordinates are in the elastic corresponding neighborhood as the corresponding pixel corresponding to the first pixel.
For example, the matching cost calculation unit 130 can calculate respectively the distance metric value between each pixel in the reference representative pixel set and the corresponding pixel in the corresponding target representative pixel set in any one of the following ways: calculating respectively the pixel-value difference between each pixel in the reference representative pixel set and the corresponding pixel in the corresponding target representative pixel set; or, taking the pixel value of the reference pixel as a baseline, performing numerical coding on each pixel in the reference representative pixel set according to its pixel value, taking the pixel value of the target pixel as a baseline, performing numerical coding on each pixel in the target representative pixel set according to its pixel value, and calculating respectively the numerical coding distance between each pixel in the reference representative pixel set and the corresponding pixel in the corresponding target representative pixel set.
For example, the matching cost calculation unit 130 can calculate respectively the distance metric value between each pixel in the reference representative pixel set and the corresponding pixel in the corresponding target representative pixel set through the following operations: determining a first weight coefficient according to the positional relationship between a first pixel in the reference representative pixel set and the reference pixel; determining a second weight coefficient according to the positional relationship between a second pixel in the target representative pixel set and the target pixel, the second pixel being the corresponding pixel in the target representative pixel set corresponding to the first pixel in the reference representative pixel set; calculating the distance metric value between the first pixel and the second pixel; and weighting the distance metric value between the first pixel and the second pixel using the first weight coefficient and the second weight coefficient to obtain the weighted distance metric value between the first pixel and the second pixel.
For example, the matching cost calculation unit 130 can calculate respectively the distance metric value between each pixel in the reference representative pixel set and the corresponding pixel in the corresponding target representative pixel set through the following operation: when it is determined that there is no second pixel in the target representative pixel set corresponding to a first pixel in the reference representative pixel set, setting a predetermined value as the distance metric value between the first pixel and the second pixel, the predetermined value being greater than the maximum among the distance metric values between the pixels in the reference representative pixel set and the corresponding pixels in the corresponding target representative pixel set.
The specific functions and operations of the above-mentioned reference set determination unit 110, target set determination unit 120, and matching cost calculation unit 130 have already been discussed in detail in the matching cost calculation methods described with reference to Fig. 1 to Fig. 10B, and therefore their repeated description will be omitted.
It should be noted that the components of the above-mentioned matching cost calculation apparatus 100 can be realized with a software program, for example, realized through the CPU of a general-purpose computer in combination with RAM, ROM, and the like, together with the software code running therein. The software program can be stored on a storage medium such as a flash memory, floppy disk, hard disk, or optical disk, and be loaded onto a random access memory (RAM) at run time to be executed by the CPU. In addition, besides on a general-purpose computer, the apparatus can also be realized through the cooperation between an application-specific integrated circuit and software. The integrated circuit includes, for example, at least one of an MPU (micro processing unit), a DSP (digital signal processor), an FPGA (field programmable gate array), an ASIC (application-specific integrated circuit), and the like. Such a general-purpose computer or application-specific integrated circuit can, for example, be loaded on a specific location (for example, a vehicle), and communicate with an imaging device such as a camera installed at the location for imaging the road and the objects associated with the road, so as to extract supporting structures from the two-dimensional images and/or stereo images captured by the camera, calculate the weighted matching cost, and then calculate the disparity value through the subsequent steps of the stereo matching algorithm. In addition, each component of the matching cost calculation apparatus 100 can be realized with dedicated hardware, such as a specific field programmable gate array, an application-specific integrated circuit, and the like. Moreover, each component of the matching cost calculation apparatus 100 can also be realized using a combination of software and hardware.
6. Disparity Value Calculation Device
The embodiment of the present application can also be implemented by a disparity value calculation device. In the following, the functional configuration block diagram of the disparity value calculation device according to the embodiment of the present application will be described with reference to Fig. 12.
Fig. 12 illustrates the functional configuration block diagram of the disparity value calculation device according to the embodiment of the present application.
As shown in Fig. 12, the disparity value calculation device 200 can be used to calculate the disparity value between a reference pixel in a reference image and the matched pixel in a target image, the reference image and the target image belonging to one original image pair, and the device can include:
a target pixel determination means 210 for determining a plurality of target pixels;
a matching cost calculation means 220 for calculating, for each target pixel, respectively the matching cost between each first pixel in a reference support pixel set and the corresponding second pixel in a corresponding target support pixel set, the reference support pixel set including at least one pixel in a second predetermined neighborhood of the reference pixel in the reference image, and the target support pixel set including at least one pixel in the second predetermined neighborhood of the target pixel in the target image;
an overall cost obtaining means 230 for obtaining, for each target pixel, the overall matching cost between the reference pixel and the target pixel by summing all of the calculated matching costs; and
a disparity value calculation means 240 for determining the matched pixel among the plurality of target pixels according to at least the plurality of overall matching costs between the reference pixel and the plurality of target pixels, so as to determine the disparity value between the reference pixel and the matched pixel,
Wherein, the Matching power flow computing device 220 is calculated with reference in support pixel set respectively by following operation
Each first pixel and corresponding target support the Matching power flow between corresponding second pixel in pixel set:Described
The first representative pixel set for first pixel is determined in reference picture, the described first representative pixel set includes
In first pixel and the first predetermined neighborhood in first pixel and its significance degree is more than threshold value at least
One pixel;The second representative pixel set for second pixel, the second generation are determined in the target image
Table pixel set includes that in second pixel and the described first predetermined neighborhood in second pixel and it is aobvious
Work degree is more than at least one pixel of the threshold value;And according to each pixel in the described first representative pixel set
The pixel value of each pixel in pixel value and the second representative pixel set, come calculate first pixel with it is described
Matching power flow between second pixel.
In one example, the disparity value computing apparatus 200 may be implemented, for example, with the configuration of the matching cost computing device 100 shown in FIG. 11.
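The selection performed by the disparity value calculating device 240 can be sketched as a winner-takes-all search: for each candidate disparity, sum the per-support-pixel matching costs into an overall cost and keep the candidate with the smallest overall cost. The function and parameter names below are illustrative assumptions, not part of the claimed apparatus; `cost_fn` stands in for the matching cost computed by device 220.

```python
def disparity_winner_takes_all(cost_fn, max_disparity, support_offsets):
    """Return the candidate disparity whose aggregated matching cost is
    smallest (winner-takes-all).

    cost_fn(d, dx, dy) is assumed to return the matching cost between the
    support pixel at offset (dx, dy) from the reference pixel and its
    counterpart shifted by candidate disparity d in the target image.
    """
    best_d, best_cost = 0, float("inf")
    for d in range(max_disparity + 1):          # each candidate target pixel
        # overall matching cost: sum over the support pixel set
        total = sum(cost_fn(d, dx, dy) for dx, dy in support_offsets)
        if total < best_cost:
            best_d, best_cost = d, total
    return best_d
```

With a toy cost function whose minimum lies at disparity 3, the search returns 3.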
7. Disparity value computing system

In addition, the present application may also be implemented as a disparity value computing system. Hereinafter, the functional structure of the disparity value computing system according to an embodiment of the present application will be described with reference to FIG. 13.

FIG. 13 illustrates a functional structure diagram of the disparity value computing system according to an embodiment of the present application.
As shown in FIG. 13, the disparity value computing system 300 may include: an imaging device 310 for imaging objects, such as a monocular camera, a binocular camera, a multi-lens camera, or the like; and a disparity value computing apparatus 320 for extracting supporting structures from the two-dimensional images and/or stereo images captured by the imaging device 310 and then calculating the weighted matching cost, so that a disparity value can be calculated in the subsequent steps of a stereo matching algorithm. The disparity value computing apparatus 320 may be implemented, for example, with the configuration of the disparity value computing apparatus 200 shown in FIG. 12.

Specifically, the input of the disparity value computing system 300 is a gray-scale image, a color image, or the like, which may be obtained, for example, by a binocular camera installed at a specific position. After the input passes through the disparity value computing apparatus, the resulting disparity information is output; the output may take various forms, for example, a visible disparity map shown on a display.
8. Hardware system for disparity value calculation

The present application may also be implemented as a hardware system for disparity value calculation. Hereinafter, the hardware system for disparity value calculation according to an embodiment of the present application will be described with reference to FIG. 14.

FIG. 14 illustrates a general hardware block diagram of the hardware system for disparity value calculation according to an embodiment of the present application.

As shown in FIG. 14, the hardware system 400 may include: an input device 410 for inputting relevant information from the outside, such as gray-scale images, color images, and camera configuration information, which may include, for example, a keyboard, a mouse, a communication network and the remote input devices connected to it, and may further include an imaging device for imaging objects and a decoding device for decoding the formed images; a processing device 420 for carrying out the disparity value calculating method according to the embodiments of the present application described above, or implemented as the disparity value computing apparatus described above, which may include, for example, the central processing unit of a computer or another chip with processing capability, and may be connected to a network (not shown) such as the Internet to transmit the processed results to a remote location as the processing requires; an output device 430 for outputting the results of the disparity value calculating process to the outside, which may include, for example, a display, a printer, a communication network and the remote output devices connected to it; and a storage device 440 for storing, in a volatile or non-volatile manner, the data involved in the disparity value calculating process, such as disparity maps, which may include, for example, various volatile or non-volatile memories such as a random access memory (RAM), a read-only memory (ROM), a hard disk, or a semiconductor memory.
Each embodiment of the present application has been described in detail above. However, those skilled in the art should appreciate that various modifications, combinations, or sub-combinations may be made to these embodiments without departing from the principle and spirit of the present application, and such modifications should fall within the scope of the present application.
Claims (10)
1. A matching cost calculating method, characterized in that the method is used to calculate the matching cost between a reference pixel in a reference image and a target pixel in a target image, the reference image and the target image belonging to one original image pair, and the method comprises:
determining, in the reference image, a reference representative pixel set for the reference pixel, the reference representative pixel set including the reference pixel and at least one pixel that is within a first predetermined neighborhood of the reference pixel and whose significance degree exceeds a threshold, wherein the significance degree of a pixel is represented by at least one of the following: the jump degree of the pixel value of the pixel, the stability degree of the pixel, and the structural feature of the pixel in the image;
determining, in the target image, a target representative pixel set for the target pixel, the target representative pixel set including the target pixel and at least one pixel that is within the first predetermined neighborhood of the target pixel and whose significance degree exceeds the threshold; and
calculating the matching cost between the reference pixel and the target pixel according to the pixel values of the pixels in the reference representative pixel set and the pixel values of the pixels in the target representative pixel set.
2. The method according to claim 1, characterized in that determining, in the reference image, the reference representative pixel set for the reference pixel comprises:
selecting, in the reference image, a region surrounding the reference pixel and having a predetermined shape, as the first predetermined neighborhood;
calculating the significance degree of at least one pixel other than the reference pixel in the first predetermined neighborhood; and
determining the set of the reference pixel and all pixels whose significance degree exceeds the threshold as the reference representative pixel set,
wherein calculating the significance degree of the at least one pixel other than the reference pixel in the first predetermined neighborhood includes at least one of the following:
determining, in a plurality of different directions starting from the reference pixel, the pixel value jump degree between a specific pixel and its adjacent pixel, and taking the pixel value jump degree as the significance degree of the specific pixel;
calculating the scale stability degree of the specific pixel by performing scale-invariant feature description on the specific pixel, and taking the scale stability degree as the significance degree of the specific pixel; and
determining the structural attribute of the specific pixel by calculating the structure strength of the specific pixel, and taking the structural attribute as the significance degree of the specific pixel.
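One plausible reading of the pixel value jump degree in claim 2 is to compare each neighborhood pixel with the pixel one step closer to the center along the same direction, keeping only pixels whose jump exceeds the threshold. The sketch below follows that reading; the function name, the square neighborhood, and the concrete threshold handling are assumptions for illustration, not the claimed method itself.

```python
def representative_set(img, x, y, radius, threshold):
    """Collect the center pixel plus every neighborhood pixel whose
    pixel-value jump (absolute difference to the pixel one step closer to
    the center along the same direction) exceeds `threshold`."""
    h, w = len(img), len(img[0])
    reps = [(x, y)]                       # the center pixel always belongs
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            nx, ny = x + dx, y + dy
            if not (0 <= nx < w and 0 <= ny < h):
                continue
            # the adjacent pixel one step toward the center along (dx, dy)
            sx = nx - (dx > 0) + (dx < 0)
            sy = ny - (dy > 0) + (dy < 0)
            jump = abs(img[ny][nx] - img[sy][sx])
            if jump > threshold:          # significant: keep as representative
                reps.append((nx, ny))
    return reps
```

On a small image with a vertical intensity edge, only the pixels across the edge join the center pixel in the representative set.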
3. The method according to claim 1, characterized in that calculating the matching cost between the reference pixel and the target pixel comprises:
determining, for each pixel in the reference representative pixel set, the corresponding pixel in the target representative pixel set;
calculating the distance metric value between each pixel in the reference representative pixel set and the corresponding pixel in the target representative pixel set; and
obtaining the matching cost between the reference pixel and the target pixel by summing all the calculated distance metric values.
4. The method according to claim 3, characterized in that determining, for each pixel in the reference representative pixel set, the corresponding pixel in the target representative pixel set comprises:
determining the first coordinate of a first pixel in the reference representative pixel set in the reference image;
selecting, in the target image, a region surrounding a second coordinate and having a predetermined shape, as an elastic corresponding neighborhood, there being a predetermined offset between the second coordinate and the first coordinate; and
determining a second pixel in the target representative pixel set whose coordinate falls within the elastic corresponding neighborhood as the corresponding pixel of the first pixel.
5. The method according to claim 3, characterized in that calculating the distance metric value between each pixel in the reference representative pixel set and the corresponding pixel in the target representative pixel set includes any one of the following:
calculating the pixel value difference between each pixel in the reference representative pixel set and the corresponding pixel in the target representative pixel set; and
numerically encoding each pixel in the reference representative pixel set according to its pixel value, with the pixel value of the reference pixel as a baseline; numerically encoding each pixel in the target representative pixel set according to its pixel value, with the pixel value of the target pixel as a baseline; and calculating the coding distance between the numerical code of each pixel in the reference representative pixel set and that of the corresponding pixel in the target representative pixel set.
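The numerical coding of claim 5, with the center pixel value as the baseline, resembles a census transform, where each neighbor contributes one bit and codes are compared by their Hamming distance. The sketch below works under that assumption; the function names and the interpretation as a census code are illustrative, not dictated by the claim.

```python
def numeric_code(img, x, y, offsets):
    """One bit per neighbor: 1 if the neighbor is brighter than the
    center pixel (whose value serves as the baseline), else 0."""
    code = 0
    for dx, dy in offsets:
        code = (code << 1) | (img[y + dy][x + dx] > img[y][x])
    return code

def coding_distance(a, b):
    """Coding distance between two codes = number of differing bits."""
    return bin(a ^ b).count("1")
```

Because each bit records only a brightness relation to the center pixel, the code is robust to radiometric differences between the reference and target images, which is why census-style costs are popular in stereo matching.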
6. The method according to claim 3, characterized in that calculating the distance metric value between each pixel in the reference representative pixel set and the corresponding pixel in the target representative pixel set comprises:
determining a first weight coefficient according to the positional relationship between a first pixel in the reference representative pixel set and the reference pixel;
determining a second weight coefficient according to the positional relationship between a second pixel in the target representative pixel set and the target pixel, the second pixel being the pixel in the target representative pixel set corresponding to the first pixel in the reference representative pixel set;
calculating the distance metric value between the first pixel and the second pixel; and
weighting the distance metric value between the first pixel and the second pixel with the first weight coefficient and the second weight coefficient, to obtain the weighted distance metric value between the first pixel and the second pixel.
7. The method according to claim 3, characterized in that calculating the distance metric value between each pixel in the reference representative pixel set and the corresponding pixel in the target representative pixel set comprises:
when it is determined that the target representative pixel set does not contain a second pixel corresponding to a first pixel in the reference representative pixel set, setting the distance metric value between the first pixel and the second pixel to a predetermined value, the predetermined value being greater than the maximum among the distance metric values between the pixels in the reference representative pixel set and the corresponding pixels in the target representative pixel set.
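The penalty rule of claim 7 can be sketched as follows; the dictionary-based bookkeeping and the `+ 1.0` margin used to make the penalty exceed every computed distance are illustrative assumptions.

```python
def distance_or_penalty(first_pixel, correspondence, distances):
    """Return the computed distance for a matched first pixel; for a first
    pixel with no corresponding second pixel, return a predetermined value
    larger than every computed distance metric value."""
    if first_pixel in correspondence:
        return distances[first_pixel]
    return max(distances.values(), default=0.0) + 1.0
```

This penalizes unmatched representative pixels, so a target pixel whose representative set fails to cover the reference set accrues a larger matching cost.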
8. A disparity value calculating method, characterized in that the method is used to calculate the disparity value between a reference pixel in a reference image and a matched pixel in a target image, the reference image and the target image belonging to one original image pair, and the method comprises:
determining a plurality of target pixels;
for each target pixel, calculating the matching cost between each first pixel in a reference support pixel set and the corresponding second pixel in a corresponding target support pixel set, the reference support pixel set including at least one pixel in the reference image that is within a second predetermined neighborhood of the reference pixel, and the target support pixel set including at least one pixel in the target image that is within the second predetermined neighborhood of the target pixel;
for each target pixel, obtaining the overall matching cost between the reference pixel and that target pixel by summing all the calculated matching costs; and
determining the matched pixel among the plurality of target pixels according to at least the plurality of overall matching costs between the reference pixel and the plurality of target pixels, thereby determining the disparity value between the reference pixel and the matched pixel,
wherein calculating the matching cost between each first pixel in the reference support pixel set and the corresponding second pixel in the target support pixel set comprises:
determining, in the reference image, a first representative pixel set for the first pixel, the first representative pixel set including the first pixel and at least one pixel that is within a first predetermined neighborhood of the first pixel and whose significance degree exceeds a threshold, wherein the significance degree of a pixel is represented by at least one of the following: the jump degree of the pixel value of the pixel, the stability degree of the pixel, and the structural feature of the pixel in the image;
determining, in the target image, a second representative pixel set for the second pixel, the second representative pixel set including the second pixel and at least one pixel that is within the first predetermined neighborhood of the second pixel and whose significance degree exceeds the threshold; and
calculating the matching cost between the first pixel and the second pixel according to the pixel values of the pixels in the first representative pixel set and the pixel values of the pixels in the second representative pixel set.
9. A matching cost computing device, characterized in that the device is used to calculate the matching cost between a reference pixel in a reference image and a target pixel in a target image, the reference image and the target image belonging to one original image pair, and the device comprises:
a reference set determining unit for determining, in the reference image, a reference representative pixel set for the reference pixel, the reference representative pixel set including the reference pixel and at least one pixel that is within a first predetermined neighborhood of the reference pixel and whose significance degree exceeds a threshold, wherein the significance degree of a pixel is represented by at least one of the following: the jump degree of the pixel value of the pixel, the stability degree of the pixel, and the structural feature of the pixel in the image;
a target set determining unit for determining, in the target image, a target representative pixel set for the target pixel, the target representative pixel set including the target pixel and at least one pixel that is within the first predetermined neighborhood of the target pixel and whose significance degree exceeds the threshold; and
a matching cost calculating unit for calculating the matching cost between the reference pixel and the target pixel according to the pixel values of the pixels in the reference representative pixel set and the pixel values of the pixels in the target representative pixel set.
10. A disparity value computing apparatus, characterized in that the apparatus is used to calculate the disparity value between a reference pixel in a reference image and a matched pixel in a target image, the reference image and the target image belonging to one original image pair, and the apparatus comprises:
a target pixel determining device for determining a plurality of target pixels;
a matching cost computing device for calculating, for each target pixel, the matching cost between each first pixel in a reference support pixel set and the corresponding second pixel in a corresponding target support pixel set, the reference support pixel set including at least one pixel in the reference image that is within a second predetermined neighborhood of the reference pixel, and the target support pixel set including at least one pixel in the target image that is within the second predetermined neighborhood of the target pixel;
an overall cost obtaining device for obtaining, for each target pixel, the overall matching cost between the reference pixel and that target pixel by summing all the calculated matching costs; and
a disparity value calculating device for determining the matched pixel among the plurality of target pixels according to at least the plurality of overall matching costs between the reference pixel and the plurality of target pixels, thereby determining the disparity value between the reference pixel and the matched pixel,
wherein the matching cost computing device calculates the matching cost between each first pixel in the reference support pixel set and the corresponding second pixel in the target support pixel set by:
determining, in the reference image, a first representative pixel set for the first pixel, the first representative pixel set including the first pixel and at least one pixel that is within a first predetermined neighborhood of the first pixel and whose significance degree exceeds a threshold, wherein the significance degree of a pixel is represented by at least one of the following: the jump degree of the pixel value of the pixel, the stability degree of the pixel, and the structural feature of the pixel in the image;
determining, in the target image, a second representative pixel set for the second pixel, the second representative pixel set including the second pixel and at least one pixel that is within the first predetermined neighborhood of the second pixel and whose significance degree exceeds the threshold; and
calculating the matching cost between the first pixel and the second pixel according to the pixel values of the pixels in the first representative pixel set and the pixel values of the pixels in the second representative pixel set.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410277105.1A CN105335952B (en) | 2014-06-19 | 2014-06-19 | Matching power flow computational methods and device and parallax value calculating method and equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105335952A CN105335952A (en) | 2016-02-17 |
CN105335952B true CN105335952B (en) | 2018-04-17 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102572485A (en) * | 2012-02-02 | 2012-07-11 | 北京大学 | Self-adaptive weighted stereo matching algorithm, stereo display and collecting device and system |
CN103440653A (en) * | 2013-08-27 | 2013-12-11 | 北京航空航天大学 | Binocular vision stereo matching method |
Legal Events

Date | Code | Title | Description
---|---|---|---
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |
| CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20180417 |