CN105335952A - Matching cost calculation method and apparatus, and parallax value calculation method and equipment

Matching cost calculation method and apparatus, and parallax value calculation method and equipment

Info

Publication number
CN105335952A
Authority
CN
China
Prior art keywords
pixel
value
matching cost
representative
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410277105.1A
Other languages
Chinese (zh)
Other versions
CN105335952B (en)
Inventor
刘振华
刘媛
师忠超
刘殿超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ricoh Co Ltd
Original Assignee
Ricoh Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ricoh Co Ltd filed Critical Ricoh Co Ltd
Priority to CN201410277105.1A
Publication of CN105335952A
Application granted
Publication of CN105335952B
Expired - Fee Related
Anticipated expiration

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The application discloses a matching cost calculation method and apparatus, and a parallax value calculation method and equipment. The matching cost calculation method comprises: determining, in a reference image, a reference representative pixel set for a reference pixel, the reference representative pixel set containing the reference pixel and at least one pixel that is located in a first predetermined neighborhood of the reference pixel and has a saliency value greater than a threshold; determining, in a target image, a target representative pixel set for a target pixel, the target representative pixel set containing the target pixel and at least one pixel that is located in the first predetermined neighborhood of the target pixel and has a saliency value greater than the threshold; and calculating the matching cost between the reference pixel and the target pixel according to the pixel value of each pixel in the reference representative pixel set and the pixel value of each pixel in the target representative pixel set. In this way, a distinguishable matching cost can be obtained, and correct parallax information can therefore be obtained.

Description

Matching cost calculation method and apparatus, and parallax value calculation method and equipment
Technical field
The present application relates generally to the field of digital image processing, and more specifically to a matching cost calculation method and apparatus, and a parallax value calculation method and equipment.
Background art
Stereo matching methods are widely used in fields such as robotics, surveillance and intelligent vehicles. The parallax information (also referred to as depth information) obtained by stereo matching can be used to estimate the relative distance between the image capture device and an object. For example, for an intelligent vehicle, the parallax information obtained by stereo matching makes it easy to detect the road surface, white lines and fences, and further to detect and classify targets such as pedestrians and vehicles, so that the overall driving situation of the vehicle can be controlled comprehensively.
The basic principle of stereo matching is to compare images of the same object captured by an image capture device (for example, a stereo camera) from two different viewing angles (in the case of a binocular camera) or from more viewing angles (in the case of a multi-ocular camera), to compute the position deviation between pixels of the images by finding corresponding pixels, thereby obtaining parallax information and drawing a disparity map from it.
Conventional stereo matching methods include local stereo matching methods (for example, block matching), global stereo matching methods (for example, dynamic programming) and semi-global stereo matching methods (for example, the semi-global matching (SGM) method). They usually all include the following four steps, or several of them: matching cost calculation, cost aggregation, disparity computation/optimization and disparity refinement, among which matching cost calculation is the most critical step.
In general, current matching cost calculation methods select, for the pixel whose matching cost is to be calculated, neighboring pixels that have attributes close to it or that lie at fixed positions relative to it, and calculate the matching cost between corresponding pixels of the multi-view images from the pixel information of the pixel whose matching cost is to be calculated and of its neighboring pixels.
However, because this process of selecting neighboring pixels does not fully consider whether the selected pixels carry representative information in the associated image, an indistinguishable matching cost may be obtained, which in turn leads to wrong parallax information in subsequent steps.
Summary of the invention
In order to solve the above technical problem, according to one aspect of the present application, a matching cost calculation method is provided for calculating the matching cost between a reference pixel in a reference image and a target pixel in a target image, the reference image and the target image belonging to one original image pair. The method comprises: determining, in the reference image, a reference representative pixel set of the reference pixel, the reference representative pixel set comprising the reference pixel and at least one pixel that is located in a first predetermined neighborhood of the reference pixel and whose saliency is greater than a threshold; determining, in the target image, a target representative pixel set of the target pixel, the target representative pixel set comprising the target pixel and at least one pixel that is located in the first predetermined neighborhood of the target pixel and whose saliency is greater than the threshold; and calculating the matching cost between the reference pixel and the target pixel according to the pixel value of each pixel in the reference representative pixel set and the pixel value of each pixel in the target representative pixel set.
In addition, according to another aspect of the present application, a parallax value calculation method is provided for calculating the parallax value between a reference pixel in a reference image and a matched pixel in a target image, the reference image and the target image belonging to one original image pair. The method comprises: determining a plurality of target pixels; for each target pixel, respectively calculating the matching cost between each first pixel in a reference support pixel set and the corresponding second pixel in the corresponding target support pixel set, the reference support pixel set comprising at least one pixel in the reference image that is located in a second predetermined neighborhood of the reference pixel, and the target support pixel set comprising at least one pixel in the target image that is located in the second predetermined neighborhood of the target pixel; for each target pixel, obtaining an overall matching cost between the reference pixel and the target pixel by summing all the calculated matching costs; and determining the matched pixel among the plurality of target pixels at least according to the groups of matching costs between the reference pixel and the plurality of target pixels, thereby determining the parallax value between the reference pixel and the matched pixel. Here, respectively calculating the matching cost between each first pixel in the reference support pixel set and the corresponding second pixel in the corresponding target support pixel set comprises: determining, in the reference image, a first representative pixel set of the first pixel, the first representative pixel set comprising the first pixel and at least one pixel that is located in a first predetermined neighborhood of the first pixel and whose saliency is greater than a threshold; determining, in the target image, a second representative pixel set of the second pixel, the second representative pixel set comprising the second pixel and at least one pixel that is located in the first predetermined neighborhood of the second pixel and whose saliency is greater than the threshold; and calculating the matching cost between the first pixel and the second pixel according to the pixel value of each pixel in the first representative pixel set and the pixel value of each pixel in the second representative pixel set.
According to another aspect of the present application, a matching cost calculation apparatus is provided for calculating the matching cost between a reference pixel in a reference image and a target pixel in a target image, the reference image and the target image belonging to one original image pair. The apparatus comprises: a reference set determination unit for determining, in the reference image, a reference representative pixel set of the reference pixel, the reference representative pixel set comprising the reference pixel and at least one pixel that is located in a first predetermined neighborhood of the reference pixel and whose saliency is greater than a threshold; a target set determination unit for determining, in the target image, a target representative pixel set of the target pixel, the target representative pixel set comprising the target pixel and at least one pixel that is located in the first predetermined neighborhood of the target pixel and whose saliency is greater than the threshold; and a matching cost calculation unit for calculating the matching cost between the reference pixel and the target pixel according to the pixel value of each pixel in the reference representative pixel set and the pixel value of each pixel in the target representative pixel set.
According to another aspect of the present application, parallax value calculation equipment is provided for calculating the parallax value between a reference pixel in a reference image and a matched pixel in a target image, the reference image and the target image belonging to one original image pair. The equipment comprises: a target pixel determination device for determining a plurality of target pixels; a matching cost calculation device for, for each target pixel, respectively calculating the matching cost between each first pixel in a reference support pixel set and the corresponding second pixel in the corresponding target support pixel set, the reference support pixel set comprising at least one pixel in the reference image that is located in a second predetermined neighborhood of the reference pixel, and the target support pixel set comprising at least one pixel in the target image that is located in the second predetermined neighborhood of the target pixel; an overall cost obtaining device for, for each target pixel, obtaining an overall matching cost between the reference pixel and the target pixel by summing all the calculated matching costs; and a parallax value calculation device for determining the matched pixel among the plurality of target pixels at least according to the groups of matching costs between the reference pixel and the plurality of target pixels, thereby determining the parallax value between the reference pixel and the matched pixel. Here, the matching cost calculation device respectively calculates the matching cost between each first pixel in the reference support pixel set and the corresponding second pixel in the corresponding target support pixel set by: determining, in the reference image, a first representative pixel set of the first pixel, the first representative pixel set comprising the first pixel and at least one pixel that is located in a first predetermined neighborhood of the first pixel and whose saliency is greater than a threshold; determining, in the target image, a second representative pixel set of the second pixel, the second representative pixel set comprising the second pixel and at least one pixel that is located in the first predetermined neighborhood of the second pixel and whose saliency is greater than the threshold; and calculating the matching cost between the first pixel and the second pixel according to the pixel value of each pixel in the first representative pixel set and the pixel value of each pixel in the second representative pixel set.
Compared with the prior art, the embodiments of the present application provide a matching cost calculation method and apparatus in which the concept of support structure information is incorporated into the matching cost calculation process, where support structure information refers to a number of representative neighboring pixels. That is, the embodiments of the present application provide a support-structure-based matching cost calculation method for use in stereo matching. In the matching cost calculation method according to the embodiments of the present application, when selecting neighboring pixels for the pixel whose matching cost is to be calculated, whether the selected pixels carry representative information in the associated image is fully considered, and representative neighboring pixels are selected adaptively, instead of simply selecting neighboring pixels with close attributes or at fixed positions. By selecting these representative neighboring pixels, a distinguishable matching cost can be obtained, and correct parallax information can then be obtained in subsequent steps.
In addition, the embodiments of the present application further provide a parallax value calculation method and equipment in which the concept of support structure information is incorporated into the matching cost calculation process. When selecting neighboring pixels for the pixel whose matching cost is to be calculated, whether the selected pixels carry representative information in the associated image is fully considered, and representative neighboring pixels are selected adaptively, instead of simply selecting neighboring pixels with close attributes or at fixed positions. By selecting these representative neighboring pixels, a distinguishable matching cost can be obtained; the distinguishable matching cost can then be applied to various stereo matching algorithms, so that correct parallax information is obtained.
Further features and advantages of the present application will be set forth in the following description, will partly become apparent from the specification, or may be learned by practicing the present application. The objects and other advantages of the present application can be realized and obtained by the structures particularly pointed out in the specification, the claims and the accompanying drawings.
Accompanying drawing explanation
The accompanying drawings are provided for a further understanding of the present application and constitute a part of the specification. Together with the embodiments of the present application they serve to explain the present application, and they do not limit the present application. In the drawings:
Fig. 1A is a schematic diagram illustrating the matching algorithm based on the sum of absolute grayscale differences according to the prior art.
Fig. 1B is a schematic diagram illustrating the matching algorithm based on the sum of squared grayscale differences according to the prior art.
Fig. 1C is a schematic diagram illustrating the matching algorithm based on the basic Census transform according to the prior art.
Fig. 1D is a schematic diagram illustrating the matching algorithm based on the first enhanced Census transform according to the prior art.
Fig. 1E is a schematic diagram illustrating the matching algorithm based on the second enhanced Census transform according to the prior art.
Fig. 2A is a schematic diagram illustrating a left image captured from the left viewing angle by a binocular camera.
Fig. 2B is a diagram illustrating the disparity map obtained based on the matching algorithm of the second enhanced Census transform according to the prior art.
Fig. 3 is an overall flowchart illustrating the matching cost calculation method according to an embodiment of the present application.
Fig. 4 is a flowchart illustrating the matching cost calculation method according to a first concrete example of the embodiment of the present application.
Fig. 5A is a schematic diagram illustrating the calculation of the saliency of a pixel in a first case.
Fig. 5B is a schematic diagram illustrating the calculation of the saliency of a pixel in a second case.
Fig. 6A is a schematic diagram illustrating the determination of a corresponding pixel in a first case.
Fig. 6B is a schematic diagram illustrating the determination of a corresponding pixel in a second case.
Fig. 7 is a flowchart illustrating the matching cost calculation method according to a second concrete example of the embodiment of the present application.
Fig. 8A is a schematic diagram illustrating the Census transform result of the left image in a first case.
Fig. 8B is a schematic diagram illustrating the Census transform result of the left image in a second case.
Fig. 9 is an overall flowchart illustrating the parallax value calculation method according to an embodiment of the present application.
Fig. 10A is a diagram illustrating the disparity map obtained based on the matching algorithm of the second enhanced Census transform according to the prior art.
Fig. 10B is a diagram illustrating the disparity map obtained by the parallax value calculation method according to the embodiment of the present application.
Fig. 11 is a functional block diagram of the matching cost calculation apparatus according to an embodiment of the present application.
Fig. 12 is a functional block diagram of the parallax value calculation equipment according to an embodiment of the present application.
Fig. 13 is a functional diagram of the parallax value calculation system according to an embodiment of the present application.
Fig. 14 is a general hardware block diagram of a hardware system for parallax value calculation according to an embodiment of the present application.
Detailed description of the embodiments
Embodiments of the present application will be described in detail below with reference to the accompanying drawings. It should be noted that in the drawings, components having substantially the same or similar structures and functions are given the same reference numerals, and repeated descriptions of them are omitted.
In order to enable those skilled in the art to better understand the present application, the present application is described in further detail in the following order.
1. Brief introduction to the prior art
2. Overview of the idea of the present application
3. Matching cost calculation method
3.1 First concrete example
3.2 Second concrete example
4. Parallax value calculation method
5. Matching cost calculation apparatus
6. Parallax value calculation equipment
7. Parallax value calculation system
8. Hardware system for parallax value calculation
1. Brief introduction to the prior art
Before the embodiments of the present application are described, for ease of understanding, the technical principle of the matching cost calculation methods according to the prior art and their technical problems are first briefly introduced.
As the key step of stereo matching, there currently exist mainly two classes of matching cost calculation methods for calculating the matching cost between corresponding pixels of multi-view images. Below, for convenience of description, the case is considered where the set of multi-view images captured by the image capture device for the same object is an original image pair consisting of only two images, a reference image and a target image.
The first class of matching cost calculation methods uses the information of a certain reference pixel in the reference image and of its neighboring pixels with attributes close to it, together with the information of a certain target pixel in the target image and of its neighboring pixels with attributes close to it, to calculate the matching cost between this reference pixel and this target pixel.
It should be noted that in the following description, when there is no need to distinguish between the reference pixel and the target pixel, both may simply be referred to as the center pixel.
Specifically, as an example of the first class, there is a matching algorithm in the prior art based on grayscale similarity. When selecting neighboring pixels with attributes close to those of the center pixel, it builds for the current center pixel, according to grayscale similarity, a connected pixel sequence (that is, an adaptive curve), takes this as the matching element, and usually performs a three-level matching process, thereby calculating the matching cost between the center pixels in the reference image and the target image. In other words, this matching algorithm uses grayscale similarity to build the matching element, selecting as the matching element the connected pixel sequence built from pixels that are similar in grayscale to the center pixel.
In comparison, the second class of matching cost calculation methods uses the information of a certain reference pixel in the reference image and of its neighboring pixels at fixed positions relative to it, together with the information of a certain target pixel in the target image and of its neighboring pixels at fixed positions relative to it, to calculate the matching cost between this reference pixel and this target pixel.
Specifically, at present, the second class of matching cost calculation methods mainly includes the matching algorithm based on the sum of absolute grayscale differences (SAD), the matching algorithm based on the sum of squared grayscale differences (SSD) and the matching algorithm based on the Census transform.
First, the SAD matching algorithm and the SSD matching algorithm according to the prior art are described with reference to Fig. 1A and Fig. 1B.
Fig. 1A is a schematic diagram illustrating the matching algorithm based on the sum of absolute grayscale differences according to the prior art, and Fig. 1B is a schematic diagram illustrating the matching algorithm based on the sum of squared grayscale differences according to the prior art.
It should be noted that, for convenience, it is still assumed that the set of multi-view images captured for the same object is an original image pair consisting of a reference image and a target image captured by a binocular camera from the left and right viewing angles, and that the reference image and the target image are grayscale images, that is, each pixel in an image has a respective grayscale value. It is further assumed that the reference image is the left image captured by the binocular camera from the left viewing angle and the target image is the right image captured from the right viewing angle; that is, in the drawings, the 9 pixels on the left are from the left image, the 9 pixels on the right are from the right image, and a known position deviation exists between them.
Based on the above assumptions, as illustrated in Fig. 1A, in order to calculate the matching cost between a certain reference pixel in the reference image and a certain target pixel in the target image (here it is assumed that the reference pixel is the center pixel with grayscale value 166 in the left image shown in Fig. 1A, and the target pixel is the center pixel with grayscale value 165 in the right image shown in Fig. 1A), the SAD matching algorithm comprises: selecting the neighboring pixels in a 3 × 3 neighborhood around the reference pixel in the reference image; selecting the neighboring pixels in the same neighborhood (that is, a 3 × 3 neighborhood) around the target pixel in the target image; calculating, one by one, the absolute value of the difference between the grayscale value of the pixel at a certain position on the left and the grayscale value of the corresponding pixel at the same position on the right (that is, the absolute grayscale difference) (for example, in Fig. 1A, the upper-left pixel with grayscale value 135 in the left image corresponds to the upper-left pixel with grayscale value 133 in the right image, and so on); and obtaining the matching cost between the reference pixel and the target pixel by summing the absolute grayscale differences over all 9 positions, i.e. C = 9.
As illustrated in Fig. 1B, the SSD matching algorithm differs from the SAD matching algorithm in that, after the neighboring pixels around the center pixels are selected, the squared value of the difference between the grayscale value of the pixel at a certain position on the left and the grayscale value of the corresponding pixel at the same position on the right (that is, the squared grayscale difference) is calculated one by one, and the matching cost between the reference pixel and the target pixel is obtained by summing the squared grayscale differences over all 9 positions, i.e. C = 18.
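As an illustration of the two window-based costs just described, the following is a minimal sketch that computes the SAD and SSD costs over a 3 × 3 window. The NumPy representation and the example grayscale values are assumptions for illustration; they are not the values of Fig. 1A and Fig. 1B.

```python
import numpy as np

def sad_cost(left_patch: np.ndarray, right_patch: np.ndarray) -> int:
    """Sum of absolute grayscale differences over the window."""
    return int(np.abs(left_patch.astype(int) - right_patch.astype(int)).sum())

def ssd_cost(left_patch: np.ndarray, right_patch: np.ndarray) -> int:
    """Sum of squared grayscale differences over the window."""
    diff = left_patch.astype(int) - right_patch.astype(int)
    return int((diff * diff).sum())

# Hypothetical 3x3 windows around the reference pixel (left image) and target pixel (right image).
left = np.array([[135, 148, 145],
                 [155, 166, 160],
                 [162, 170, 158]])
right = np.array([[133, 147, 144],
                  [154, 165, 159],
                  [161, 169, 157]])

print(sad_cost(left, right), ssd_cost(left, right))
```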
In general, both the SAD matching algorithm and the SSD matching algorithm are based on the assumption that the same pixel in the reference image and in the target image should have the same grayscale value under ideal illumination conditions. In actual use, however, because the image capture device is easily affected by external factors (for example, illumination changes, viewing-angle changes, occlusion and deformation), the captured images may differ considerably from the ideal situation (for example, the left image and the right image may be under different illumination conditions), so that the final results obtained by the above algorithms do not perform well.
In order to overcome the above problems and obtain a more robust matching performance, the Census transform matching algorithm has further been proposed. The Census transform matching algorithm is also very common among matching cost calculation methods; it first selects a neighborhood in both the left image and the right image and applies the Census transform to the neighborhood, and then performs the matching cost calculation.
Specifically, similarly to the SAD and SSD matching algorithms, the Census transform matching algorithm still calculates the matching cost for a center pixel from the pixel information of the center pixel and of the neighboring pixels at fixed positions relative to it. Unlike the SAD and SSD matching algorithms, however, the Census transform matching algorithm uses the ordering relationship between the grayscale values of neighboring pixels rather than the grayscale values themselves as the similarity measure, which enhances its robustness against external factors.
At present, the main Census transform matching algorithms fall into three types: the basic Census transform matching algorithm, the first enhanced Census transform matching algorithm and the second enhanced Census transform matching algorithm.
Next, the three Census matching algorithms according to the prior art are described with reference to Fig. 1C to Fig. 1E.
Fig. 1C is a schematic diagram illustrating the matching algorithm based on the basic Census transform according to the prior art, Fig. 1D is a schematic diagram illustrating the matching algorithm based on the first enhanced Census transform according to the prior art, and Fig. 1E is a schematic diagram illustrating the matching algorithm based on the second enhanced Census transform according to the prior art.
As illustrated in Fig. 1C, in order to calculate the matching cost between a certain reference pixel in the reference image and a certain target pixel in the target image (here it is assumed that the reference pixel is the center pixel with grayscale value 166 in the left image shown in Fig. 1C, and the target pixel is the center pixel with grayscale value 165 in the right image shown in Fig. 1C), the basic Census transform matching algorithm comprises: selecting the neighboring pixels in a 3 × 3 neighborhood around the reference pixel in the reference image; selecting the neighboring pixels in the same neighborhood (that is, a 3 × 3 neighborhood) around the target pixel in the target image; applying the Census transform to the window area in each single image, namely, using the formula shown in Fig. 1C, taking the grayscale value I_0 of the center pixel as the threshold in the left image and in the right image respectively, and applying a single-bit binarization to the grayscale values I_i of the 8 neighboring pixels in the 3 × 3 neighborhood, thereby obtaining the binary string 10111101 for the 8 neighboring pixels in the left image and the binary string 10111100 for the 8 neighboring pixels in the right image; and obtaining the matching cost between the reference pixel and the target pixel by calculating the Hamming distance between the two binary strings, i.e. C = 1.
As illustrated in Fig. 1D, unlike the basic Census transform matching algorithm, the first enhanced Census transform matching algorithm, when applying the Census transform to the window area in each single image, uses the formula shown in Fig. 1D: in the left image and in the right image respectively, the grayscale value I_0 of the center pixel together with a tolerance δ (which may be set from empirical values) is used as the threshold, and a two-bit binarization is applied to the grayscale values I_i of the 8 neighboring pixels in the 3 × 3 neighborhood, thereby obtaining the binary string 0100010100010000 for the 8 neighboring pixels in the left image and the binary string 0100000100010000 for the 8 neighboring pixels in the right image; the matching cost between the reference pixel and the target pixel is then obtained by calculating the Hamming distance between the two binary strings, i.e. C = 1.
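The following sketch shows the basic single-bit Census transform of Fig. 1C and its Hamming-distance cost for 3 × 3 patches. The comparison direction (neighbor versus center) is an assumption here; the formula in Fig. 1C fixes the actual convention.

```python
import numpy as np

def census_bits(patch: np.ndarray) -> list:
    """Single-bit Census transform of a 3x3 patch: one bit per neighbor, compared with the center I_0."""
    center = patch[1, 1]
    bits = []
    for r in range(3):
        for c in range(3):
            if (r, c) == (1, 1):
                continue  # the center pixel itself produces no bit
            bits.append(1 if patch[r, c] >= center else 0)
    return bits

def census_cost(left_patch: np.ndarray, right_patch: np.ndarray) -> int:
    """Matching cost = Hamming distance between the two Census bit strings."""
    return sum(b1 != b2 for b1, b2 in zip(census_bits(left_patch), census_bits(right_patch)))
```

The first enhanced variant of Fig. 1D would emit two bits per neighbor (below I_0 − δ, inside the tolerance band, or above I_0 + δ) instead of one, but the cost is still the Hamming distance between the resulting strings.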
Further, in order to obtain a more accurate matching cost, as illustrated in Fig. 1E, the second enhanced Census transform matching algorithm may consider a larger neighborhood and a more complex pattern of neighboring pixels than the first enhanced Census transform matching algorithm, as shown by the dark pixel points around the center pixel.
In summary, in the prior art, whether the first or the second class of matching cost calculation methods is used, when calculating the matching cost between center pixels the neighborhood structure for the reference pixel and the target pixel is selected either on the basis of fixed positions or on the basis of close attributes, and every pixel in the neighborhood is given the same weight.
It can therefore be seen that, because the prior art does not fully consider whether the neighboring pixels in the neighborhood structure are pixels carrying representative information in the associated image, an indistinguishable matching cost may be produced, which in turn leads to wrong parallax information in subsequent steps. This is especially the case when the variation between pixel values in a certain region of the original image (for example, the left image or the right image) is small (for example, the region lacks texture); for such a region, a disparity map of good performance cannot be obtained with the above prior art.
Below, the defects of the stereo matching algorithms according to the prior art are described with reference to Fig. 2A and Fig. 2B.
Fig. 2A is a schematic diagram illustrating a left image captured from the left viewing angle by a binocular camera, and Fig. 2B is a diagram illustrating the disparity map obtained based on the matching algorithm of the second enhanced Census transform according to the prior art.
Specifically, as illustrated in Fig. 2A, in the left image captured by the binocular camera from the left viewing angle it can be seen that the road surface region has almost no texture. In this case, even with the SGM method based on the second enhanced Census transform matching algorithm, whose performance is relatively outstanding in the prior art, not many correct parallax values can be calculated for the road surface region. In the end, after the wrong parallax values have been removed by the disparity refinement step (for example, noise filtering), the final disparity map, as illustrated in Fig. 2B, has almost no parallax values on the road surface. Obviously, such a result is not satisfactory.
2. Overview of the idea of the present application
Below, the main idea of the present application is first described briefly.
In order to solve the technical problems in the prior art, the present application proposes a matching cost calculation method and apparatus, and a parallax value calculation method and equipment, which, when selecting neighboring pixels for the pixel whose matching cost is to be calculated, fully consider whether the selected pixels carry representative information in the associated image and select representative neighboring pixels adaptively, instead of simply selecting neighboring pixels with close attributes or at fixed positions, so that a distinguishable matching cost can be obtained and correct parallax information can then be obtained in subsequent steps.
3. Matching cost calculation method
Hereinafter, an overall flow example of the matching cost calculation method according to an embodiment of the present application is described with reference to Fig. 3.
The matching cost calculation method according to the embodiment of the present application may be used to calculate the matching cost between corresponding pixels of multi-view images.
In general, the multi-view images may be images of the same object captured by an image capture device (for example, a stereo camera) from a plurality of different viewing angles. Obviously, the plurality of different viewing angles may include two or more viewing angles, a captured image may be composed of a series of pixels at different position coordinates, and each pixel in the image may have the same or a different pixel value. For example, the image may be a grayscale image in which each pixel is represented by a grayscale value (a single gray component). Alternatively, the image may be a color image in which each pixel is represented by color values (for example, the three RGB components).
Below, for convenience of description, the example is used in which the multi-view images are two-view grayscale images of the same object captured by a binocular camera from the left and right viewing angles. For example, this stereo image set may be an original image pair comprising two grayscale images, a reference image and a target image. Specifically, it may be assumed that the reference image is the left image captured by the binocular camera from the left viewing angle, and the target image is the right image captured from the right viewing angle.
It should be noted, however, that the present application can be applied not only to calculating the matching cost between corresponding pixels of a stereo image pair, but also to calculating the matching cost between corresponding pixels of more multi-view images. In addition, the present application can similarly be applied to matching cost operations on grayscale images or color images. Furthermore, the right image may also be used as the reference image and the left image as the target image.
Fig. 3 is an overall flowchart illustrating the matching cost calculation method according to the embodiment of the present application.
As shown in Fig. 3, the matching cost calculation method may comprise:
In step S110, the reference representative pixel set of the reference pixel is determined in the reference image.
When the matching cost between a certain reference pixel in the reference image and a certain target pixel in the target image needs to be calculated, neighboring pixels carrying representative information may first be selected for the reference pixel in the reference image.
For example, whether a neighboring pixel carries representative information may be judged by its saliency. Here, a neighboring pixel (or simply, a pixel) may be selected in the whole image or in a partial region. Specifically, in order to reduce the computational load of the computing equipment and improve computational efficiency, at least one pixel whose saliency is greater than a threshold may be sought in a first predetermined neighborhood of the reference pixel as such a neighboring pixel.
In addition, the saliency may be measured by various indicators; for example, it may be the degree of abrupt change of the pixel value (for example, grayscale value) of the pixel, the stability of the pixel in the reference image, a structural feature of the pixel, or another characteristic.
Here, for convenience of description, the reference pixel in the reference image and the at least one pixel that is located in the first predetermined neighborhood of the reference pixel and whose saliency is greater than the threshold may together be referred to as the reference representative pixel set, or the support structure of the reference pixel.
In step S120, the target representative pixel set of the target pixel is determined in the target image.
Next, similarly to step S110, neighboring pixels carrying representative information may be selected for the target pixel in the target image. For example, in order to calculate the matching cost between the reference pixel and the target pixel accurately, support structures based on the same concept may be defined in the reference image and the target image relative to the respective center pixels (the reference pixel in the reference image and the target pixel in the target image). That is, in the target image, such a neighboring pixel may likewise be at least one pixel that is located in the first predetermined neighborhood of the target pixel and whose saliency is greater than the threshold.
Similarly, for convenience of description, the target pixel in the target image and the at least one pixel that is located in the first predetermined neighborhood of the target pixel and whose saliency is greater than the threshold may together be referred to as the target representative pixel set, or the support structure of the target pixel.
In addition, it should be noted that although step S120 is described here as being performed after step S110, in practice step S120 may also be performed before step S110, or both may be performed simultaneously.
In step S130, the matching cost between the reference pixel and the target pixel is calculated.
After the reference representative pixel set and the target representative pixel set have been determined, the matching cost between the reference pixel and the target pixel can be calculated according to the pixel value of each pixel in the reference representative pixel set and the pixel value of each pixel in the target representative pixel set.
Specifically, first, the corresponding pixel in the target representative pixel set that corresponds to each pixel in the reference representative pixel set may be determined.
For example, this correspondence between pixels may be a strict, exact correspondence, in which the coordinates of the two pixels must correspond exactly. Alternatively, it may be an elastic correspondence with a certain tolerance, in which the coordinates of the two pixels only need to correspond within that tolerance.
Then, the distance metric value between each pixel in the reference representative pixel set and the corresponding pixel in the target representative pixel set is calculated respectively.
The distance metric value may be chosen depending on the matching cost algorithm adopted. For example, it may be the pixel value difference (for example, grayscale difference) between the corresponding pixels, or a coding distance (for example, Hamming distance) between them.
In addition, the distance metric values may be assigned weights according to the relative position relationship between the neighboring coordinates currently being processed and the center coordinates, so as to obtain weighted distance metric values with more accurate metric performance.
Finally, the matching cost between the reference pixel and the target pixel can be obtained by summing all the calculated distance metric values between the pixels in the reference representative pixel set and the corresponding pixels in the target representative pixel set.
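The following is a minimal sketch of the flow of steps S110 to S130, assuming a generic saliency function, strict coordinate correspondence and the absolute grayscale difference as the distance metric; the function names and these choices are illustrative assumptions rather than the only configuration covered by the method.

```python
import numpy as np

def representative_set(img, cy, cx, half, saliency, thresh):
    """Center pixel plus every pixel in a (2*half+1)x(2*half+1) neighborhood whose saliency exceeds thresh."""
    h, w = img.shape
    pixels = [(cy, cx)]
    for y in range(max(0, cy - half), min(h, cy + half + 1)):
        for x in range(max(0, cx - half), min(w, cx + half + 1)):
            if (y, x) != (cy, cx) and saliency(img, y, x) > thresh:
                pixels.append((y, x))
    return pixels

def matching_cost(ref_img, tgt_img, ref_px, tgt_px, half, saliency, thresh):
    """Sum of absolute grayscale differences between corresponding representative pixels."""
    ry, rx = ref_px
    ty, tx = tgt_px
    dy, dx = ty - ry, tx - rx  # known position deviation between the two center pixels
    ref_set = representative_set(ref_img, ry, rx, half, saliency, thresh)
    tgt_set = set(representative_set(tgt_img, ty, tx, half, saliency, thresh))
    cost = 0
    for (y, x) in ref_set:
        if (y + dy, x + dx) in tgt_set:  # strict correspondence; an elastic version would search a small window
            cost += abs(int(ref_img[y, x]) - int(tgt_img[y + dy, x + dx]))
    return cost
```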
As can be seen, the embodiment of the present application provides a matching cost calculation method in which the concept of support structure information is incorporated into the matching cost calculation process, where support structure information refers to a number of representative neighboring pixels. That is, the embodiment of the present application provides a support-structure-based matching cost calculation method for use in stereo matching. In this method, when selecting neighboring pixels for the pixel whose matching cost is to be calculated, whether the selected pixels carry representative information in the associated image is fully considered and representative neighboring pixels are selected adaptively, instead of simply selecting neighboring pixels with close attributes or at fixed positions. By selecting these representative neighboring pixels, a distinguishable matching cost can be obtained, and correct parallax information can then be obtained in subsequent steps.
3.1 First concrete example
Hereinafter, the overall flow of the matching cost calculation method according to the first concrete example of the embodiment of the present application is described with reference to Fig. 4 to Fig. 6B.
The matching cost calculation method according to the first concrete example is used to calculate the matching cost between a reference pixel in the reference image and a target pixel in the target image, where the matching cost is measured by the pixel value difference between corresponding pixels.
In this first concrete example, it is still assumed that the reference image is the left image captured by the binocular camera from the left viewing angle and the target image is the right image captured from the right viewing angle. The reference pixel may be called the center pixel in the reference image and the target pixel the center pixel in the target image; there is a known position deviation between the two, that is, their abscissas and ordinates differ by known amounts.
Fig. 4 is a flowchart illustrating the matching cost calculation method according to the first concrete example of the embodiment of the present application.
As shown in Fig. 4, the matching cost calculation method may comprise:
In step S210, the first predetermined neighborhood is selected in the reference image.
For example, a region of a predetermined shape located around the reference pixel in the reference image (for example, the left image) may be selected as the first predetermined neighborhood.
Specifically, in the reference image, the reference pixel serves as the center pixel. For example, a rectangular region of size w × h may be determined around the reference pixel with the reference pixel at its center. However, the present application is not limited to this: the reference pixel may also lie not at the center of the first predetermined neighborhood but at some offset from its center, and the first predetermined neighborhood may also have another shape, such as a circle, an ellipse or a square.
In step S220, the saliency of at least one pixel in the first predetermined neighborhood other than the reference pixel is calculated.
For example, the saliency may be measured by various indicators.
In a first case, the saliency may be the degree of abrupt change of the pixel value (for example, grayscale value) of the pixel. That is, a neighboring pixel carrying representative information may be a grayscale edge pixel located at a position where the grayscale value jumps.
In this case, step S220 may be implemented by the following operation: in a plurality of different directions starting from the reference pixel, determining the degree of pixel value jump between a specific pixel and its neighboring pixel, and taking this degree of pixel value jump as the saliency of the specific pixel.
Fig. 5A is a schematic diagram illustrating the calculation of the saliency of a pixel in the first case.
Referring to Fig. 5A, in the reference image, scanning may be performed along a plurality of different directions with the reference pixel (marked A) as the center, until the boundary of the defined first predetermined neighborhood is reached. In Fig. 5A, 8 directions are used for this scanning, namely the 0°, 45°, 90°, 135°, 180°, 225°, 270° and 315° directions outward from the center pixel; in each direction it is determined whether there is a grayscale-jump pixel, the scan proceeding from the center pixel toward the region boundary. Then, for each direction, the positions where the grayscale value jumps may be marked.
Specifically, for two neighboring pixels P_{r,i} and P_{r,i+1} on a direction r, if the difference between their grayscale values g(P_{r,i}) and g(P_{r,i+1}) is greater than a threshold η, that is, if the following formula (1) is satisfied, a grayscale value jump is considered to occur, and the pixel P_{r,i} is marked as a jump position:
|g(P_{r,i}) − g(P_{r,i+1})| > η    (1)
where η may take an empirical value; in one example, it may take different values for different ranges of grayscale values.
Referring to the support structure example of a certain pixel of the left image given in Fig. 5A: in this figure, the circle marked with the letter A represents the center pixel; white, gray, dark and black represent grayscale values from small to large; and the circles marked with the letters B to P represent the grayscale-jump positions that are finally marked.
It should be noted, however, that the present application is not limited to this. The scan for grayscale value jumps may also be performed in more or fewer directions, and when the grayscale difference is greater than the threshold, the pixel P_{r,i+1} may be marked as the jump position instead of the pixel P_{r,i}.
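A minimal sketch of this directional jump scan is given below. The use of 8 unit-step directions, a fixed η and the marking of P_{r,i+1} (the alternative convention mentioned above) are example choices, not required values.

```python
import numpy as np

# 8 scan directions (dy, dx) corresponding to 0°, 45°, ..., 315°
DIRECTIONS = [(0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1), (1, 0), (1, 1)]

def jump_pixels(img: np.ndarray, cy: int, cx: int, half: int, eta: float):
    """Mark grayscale-jump positions along each direction inside a (2*half+1)x(2*half+1) neighborhood."""
    h, w = img.shape
    marked = []
    for dy, dx in DIRECTIONS:
        y, x = cy, cx
        for _ in range(half):
            ny, nx = y + dy, x + dx
            if not (0 <= ny < h and 0 <= nx < w):
                break
            if abs(int(img[y, x]) - int(img[ny, nx])) > eta:  # formula (1)
                marked.append((ny, nx))  # marking P_{r,i+1}; formula (1) as stated marks P_{r,i}
            y, x = ny, nx
    return marked
```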
In a second case, the saliency may be the stability of the pixel in the reference image. That is, a neighboring pixel carrying representative information may be a pixel with stability.
In this case, step S220 may be implemented by the following operation: calculating the scale stability of a specific pixel by performing a scale-invariant feature description on it, and taking the scale stability as the saliency of the specific pixel.
Fig. 5B is a schematic diagram illustrating the calculation of the saliency of a pixel in the second case.
Referring to Fig. 5B, given an original image (grayscale or color), this method may apply iterative Gaussian filtering to the original image (or only to the selected first predetermined neighborhood of size w × h) to obtain a family of images, each image in the family corresponding to a particular number of Gaussian filtering iterations. In Fig. 5B this number is denoted by T, where T is an integer. For example, T = 0 indicates that the associated image is the original image with no Gaussian filtering applied, T = 1 indicates the original image with Gaussian filtering applied once, T = 2 indicates the original image with Gaussian filtering applied twice, and so on. In the image processing art, this family of images is referred to as the scale space of the original image.
Then, extreme points can be found in the scale space; these extreme points are the pixels with higher scale stability.
For example, in this embodiment, the steps of constructing the scale space and finding the extreme points may be implemented using the Scale-Invariant Feature Transform (SIFT) algorithm. It should be noted, however, that the present application is not limited to this; these steps may also be implemented using any of the following algorithms: the Speeded-Up Robust Features (SURF) algorithm, the Affine Scale-Invariant Feature Transform (ASIFT) algorithm, and so on.
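As a rough illustration of this second saliency measure only, the sketch below builds an iterative-Gaussian image family with SciPy and treats local extrema of the resulting difference-of-Gaussian layers as the scale-stable pixels. This is a simplified stand-in for the SIFT-style extremum search named above, under assumed values for the iteration count and filter width.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter, minimum_filter

def scale_stable_mask(img: np.ndarray, iterations: int = 4, sigma: float = 1.0) -> np.ndarray:
    """Boolean map of pixels that are local extrema of a difference-of-Gaussian stack (T = 0..iterations)."""
    base = img.astype(float)
    stack = [base]
    for _ in range(iterations):
        stack.append(gaussian_filter(stack[-1], sigma))  # one more Gaussian filtering iteration
    stable = np.zeros(base.shape, dtype=bool)
    for t in range(iterations):
        dog = stack[t + 1] - stack[t]                    # difference-of-Gaussian layer
        is_max = dog == maximum_filter(dog, size=3)      # local maxima within a 3x3 window
        is_min = dog == minimum_filter(dog, size=3)      # local minima within a 3x3 window
        stable |= (is_max | is_min)
    return stable
```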
In a third case, the saliency may be a structural attribute of the pixel in the reference image. That is, a neighboring pixel carrying representative information may be a specific pixel such as a corner pixel.
For example, a matrix describing the intensity structure of the local neighborhood of a pixel may be calculated, and whether the pixel is a corner pixel may be judged from the eigenvalues of this matrix.
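A minimal sketch of such an eigenvalue-based corner test is shown below, using a Harris-style structure matrix built from image gradients; the smoothing width, the constant k and the threshold are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def corner_response(img: np.ndarray, sigma: float = 1.0, k: float = 0.04) -> np.ndarray:
    """Harris-style response derived from the eigenvalues of the local intensity-structure matrix."""
    img = img.astype(float)
    gy, gx = np.gradient(img)
    sxx = gaussian_filter(gx * gx, sigma)   # structure-matrix entries, averaged over the local neighborhood
    syy = gaussian_filter(gy * gy, sigma)
    sxy = gaussian_filter(gx * gy, sigma)
    det = sxx * syy - sxy * sxy             # product of the two eigenvalues
    trace = sxx + syy                       # sum of the two eigenvalues
    return det - k * trace ** 2             # large response -> both eigenvalues large -> corner pixel

def is_corner(img: np.ndarray, y: int, x: int, thresh: float = 1e4) -> bool:
    return bool(corner_response(img)[y, x] > thresh)
```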
Obviously, the present application is not limited to the above three cases. The saliency of a pixel in the first predetermined neighborhood may also be determined from one or more other characteristics.
In step S230, the set consisting of the reference pixel and all pixels whose saliency is greater than the threshold is determined as the reference representative pixel set.
After all pixels whose saliency is greater than the threshold have been determined in the first predetermined neighborhood around the reference pixel, the reference pixel in the reference image and the at least one pixel that is located in the first predetermined neighborhood of the reference pixel and whose saliency is greater than the threshold may together be referred to as the reference representative pixel set, or the support structure of the reference pixel.
In step S240, the first predetermined neighborhood is selected in the target image.
In step S250, the saliency of at least one pixel in the first predetermined neighborhood other than the target pixel is calculated.
In step S260, the set consisting of the target pixel and all pixels whose saliency is greater than the threshold is determined as the target representative pixel set.
Since steps S240 to S260 correspond respectively to steps S210 to S230 above, differing only in that they determine the set of representative neighboring pixels for the target pixel in the target image (for example, the right image), their repeated description is omitted here.
After all pixels whose saliency is greater than the threshold have been determined in the first predetermined neighborhood around the target pixel, the target pixel in the target image and the at least one pixel that is located in the first predetermined neighborhood of the target pixel and whose saliency is greater than the threshold may together be referred to as the target representative pixel set, or the support structure of the target pixel.
In step S270, the corresponding pixel in the target representative pixel set that corresponds to each pixel in the reference representative pixel set is determined respectively.
As mentioned above, the position deviation in abscissa and ordinate between the reference pixel and the target pixel is known, the coordinate position of the reference pixel in the reference image and of the target pixel in the target image can be known, and likewise the coordinate position in the reference image of each pixel in the reference representative pixel set and the coordinate position in the target image of each pixel in the target representative pixel set can be known. Therefore, it can be judged from the coordinates of each pixel whether, for a certain pixel in the reference representative pixel set (referred to, for example, as a first pixel), there is a corresponding pixel in the target representative pixel set (referred to, for example, as a second pixel).
Here it should be noted that, since in practice the reference image and the target image often need to be rectified (vertically aligned) before the matching cost calculation, it is sufficient, for simplicity, to consider only the position deviation Δx in the abscissa between the reference pixel and the target pixel, the ordinates being calibrated to be identical. Below, for convenience, the description continues with this as the example.
Such as, the corresponding relation in left image and right image between pixel can be that strict correspondence or elasticity are corresponding.
In a first case, the coordinate of two pixels can be proper accurate correspondence, and that is, between left image and right image, the coordinate of two pixels must be completely corresponding.
Fig. 6 A be a diagram that the schematic diagram determining respective pixel in a first case.
As illustrated in fig. 6 a, for reference to some first pixels in representative pixels set, can first determine with reference to the first coordinate (x of this first pixel in left image in representative pixels set 0, y 0), then this first coordinate is carried out to the skew on horizontal ordinate, thus obtain the second coordinate (x 0+ Δ x, y 0), and judge that in the set of target representation pixel, whether there is its coordinate is (x 0+ Δ x, y 0) the second pixel.
In a second situation, the coordinate of two pixels can be the elasticity correspondence with certain tolerance, that is, as long as the coordinate of two pixels is corresponding in certain tolerance between left image and right image.
Fig. 6 B be a diagram that the schematic diagram determining respective pixel in a second situation.
Illustrated in Fig. 6 B, for reference to some first pixels in representative pixels set, can first determine with reference to the first coordinate (x of this first pixel in left image in representative pixels set 0, y 0), then this first coordinate is carried out to the skew on horizontal ordinate, thus obtain the second coordinate (x 0+ Δ x, y 0).Next, except judging that whether there is its coordinate in the set of target representation pixel is (x 0+ Δ x, y 0) the second pixel outside, also further judge whether to exist the second pixel in the elasticity correspondence neighborhood that its coordinate is in around the second coordinate in the set of target representation pixel.
For this reason, can select to be in around the second coordinate in described target image and there is the region of reservation shape, as the corresponding neighborhood of this elasticity.Such as, the corresponding neighborhood of this elasticity can be the region of a rectangular area centered by the second coordinate, border circular areas or other shapes.
For example, the elastic correspondence neighborhood can be defined as a rectangle by the following formula (2):
|x - (x0 + Δx)| <= 1 and |y - y0| <= 1    formula (2)
wherein x is the horizontal ordinate and y is the ordinate of a pixel in the elastic correspondence neighborhood.
Subsequently, the second pixel in the target representation pixel set whose coordinate falls within the elastic correspondence neighborhood can be determined as the respective pixel corresponding to the first pixel.
It should be noted that, when multiple pixels exist in the elastic correspondence neighborhood, the scope of the elastic correspondence neighborhood can be reduced appropriately, or the pixel whose coordinate is closest to the second coordinate can be selected directly as this second pixel.
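To make the correspondence lookup of Fig. 6A and Fig. 6B concrete, the following is a minimal Python sketch under the assumption that the target representation pixel set is available as a list of (x, y) coordinates; the function name find_corresponding_pixel and the tolerance parameter tol are illustrative rather than part of the described method (tol=0 reproduces the strict case, tol=1 the rectangular elastic neighborhood of formula (2)).

```python
def find_corresponding_pixel(first_xy, delta_x, target_set, tol=1):
    """Return the pixel of target_set corresponding to the reference
    representative pixel first_xy shifted by delta_x, or None if no pixel
    falls inside the elastic neighborhood; when several pixels qualify,
    the one closest to the second coordinate is chosen."""
    x0, y0 = first_xy
    cx, cy = x0 + delta_x, y0                  # the "second coordinate"
    best, best_d2 = None, None
    for (x, y) in target_set:
        if abs(x - cx) <= tol and abs(y - cy) <= tol:
            d2 = (x - cx) ** 2 + (y - cy) ** 2
            if best is None or d2 < best_d2:
                best, best_d2 = (x, y), d2
    return best
```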
In addition, when determining the elastic correspondence neighborhood, other metrics besides the coordinates can also be used as the standard. For example, the radius r from the center pixel to the marked position (that is, to the representative neighboring pixel) and the angle θ of each scan line measured from the horizontal rightward direction can be used as the standard for forming the elastic correspondence neighborhood.
In step S280, the pixel value difference between each pixel in the reference representative pixels set and the respective pixel in the target representation pixel set corresponding to it is calculated respectively.
If it is determined in step S270 that, for a certain pixel in the reference representative pixels set (for example, referred to as the first pixel), a corresponding pixel exists in the target representation pixel set (for example, referred to as the second pixel), then the pixel value difference between the two can be calculated. For example, this pixel value difference can be the absolute value of the difference between the two pixel values (for example, gray-scale values), that is, the absolute gray-scale difference. Alternatively, this pixel value difference can be the square of the difference between the two pixel values, that is, the squared gray-scale difference.
Then, the above operations can be repeated for each pixel in the reference representative pixels set, so as to obtain the pixel value difference between each pixel in the reference representative pixels set and the respective pixel in the target representation pixel set corresponding to it.
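As a small illustration of the two measures just mentioned, the helper below computes either the absolute or the squared gray-scale difference; the function name and the mode flag are assumptions made for this sketch.

```python
def pixel_value_difference(i_ref, i_tgt, mode="abs"):
    """Absolute gray-scale difference (mode='abs') or squared gray-scale
    difference (mode='sq') between two corresponding representative pixels."""
    d = float(i_ref) - float(i_tgt)
    return abs(d) if mode == "abs" else d * d
```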
In addition, in one example, in the process of calculating the pixel value difference between each pixel in the reference representative pixels set and the respective pixel in the target representation pixel set corresponding to it, instead of assigning the same weight to every pixel in the representative pixels set, a weight coefficient can be determined for each pixel according to the position relationship between that pixel and the center pixel, so as to obtain a weighted pixel value difference.
Specifically, it is considered in this application that the representative pixels obtained in the preceding steps can have different weights, and the weight value can be related to the distance between the representative pixel and the center pixel: the closer to the center pixel, the larger the weight; conversely, the farther from the center pixel, the smaller the weight. Alternatively, with a different position identification standard, the weight value can also be related to the radius (r) and the angle (θ) of the representative pixel: pixels at different angles have different weights, and pixels at different radii have different weights, a smaller radius giving a larger weight and a larger radius a smaller weight.
At this moment, this pixel value difference respectively in the set of computing reference representative pixels in each pixel and the target representation pixel set corresponding with it between respective pixel can comprise: determine the first weight coefficient according to described with reference to the position relationship between the first pixel in representative pixels set and described reference pixel; Determine the second weight coefficient according to the position relationship between the second pixel in the set of described target representation pixel and described object pixel, described second pixel is and the respective pixel in the described described target representation pixel set corresponding with reference to the first pixel in representative pixels set; Calculate the pixel value difference between described first pixel and described second pixel; And use described first weight coefficient and described second weight coefficient, the pixel value difference between described first pixel and described second pixel is weighted, to obtain the weighted pixel difference between described first pixel and described second pixel.
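The weighted variant can be sketched as follows; the Gaussian fall-off with distance is only one possible realization of "closer to the center, larger the weight", and the names weighted_pixel_difference and sigma are assumptions made for this sketch.

```python
import math

def weighted_pixel_difference(p_ref, center_ref, i_ref,
                              p_tgt, center_tgt, i_tgt, sigma=2.0):
    """Weight the gray-scale difference of two corresponding representative
    pixels by the product of two distance-based weight coefficients."""
    def weight(p, c):
        dx, dy = p[0] - c[0], p[1] - c[1]
        # closer to the center pixel -> larger weight
        return math.exp(-(dx * dx + dy * dy) / (2.0 * sigma * sigma))

    w1 = weight(p_ref, center_ref)             # first weight coefficient
    w2 = weight(p_tgt, center_tgt)             # second weight coefficient
    return w1 * w2 * abs(float(i_ref) - float(i_tgt))
```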
On the contrary, if it is determined in step S270 that, for a first pixel in the reference representative pixels set, no corresponding second pixel exists in the target representation pixel set, then the method cannot continue the above pixel value difference calculation for that pixel.
At this moment, in order to ensure that the method can continue, in a simple example the first pixel in the reference representative pixels set can simply be ignored, and processing continues with the next pixel in the reference representative pixels set.
However, in order to obtain a more accurate Matching power flow, in another example a predetermined value can be assigned as the pixel value difference for the first pixel, wherein the predetermined value is greater than the maximum among the pixel value differences between the pixels in the reference representative pixels set and the respective pixels in the target representation pixel set corresponding to them.
In step S290, the Matching power flow between the reference pixel and the object pixel is obtained by summing all the calculated pixel value differences.
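Putting steps S270 to S290 together, a minimal sketch of the first concrete example could look like the following; it reuses find_corresponding_pixel from the sketch above, assumes 8-bit grayscale images stored as NumPy arrays indexed as img[y, x], and the penalty of 255.0 for unmatched pixels is simply one value that exceeds any possible absolute gray-scale difference.

```python
def matching_cost_first_example(ref_set, tgt_set, ref_img, tgt_img,
                                delta_x, penalty=255.0, tol=1):
    """Sum of absolute gray-scale differences over corresponding
    representative pixels; pixels without a correspondence contribute a
    penalty larger than the largest possible difference."""
    cost = 0.0
    for (x, y) in ref_set:
        match = find_corresponding_pixel((x, y), delta_x, tgt_set, tol)
        if match is None:
            cost += penalty                    # no second pixel found
        else:
            mx, my = match
            cost += abs(float(ref_img[y, x]) - float(tgt_img[my, mx]))
    return cost
```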
As can be seen, in the first concrete example of the embodiment of the present application, when selecting neighboring pixels for the center pixel, some sparse pixels can be selected adaptively as matching elements; they are obtained from the supporting structure of the center pixel (representative neighboring pixels such as gray-scale/color edge pixels, key point pixels and corner pixels), and a distinguishable Matching power flow is obtained directly from the pixel value differences of the center pixel and/or the neighboring pixels between the reference image and the target image. This Matching power flow can then be applied in a subsequent stereo matching algorithm to obtain correct parallax information.
3.2, the second concrete example
Hereinafter, the overall procedure of the Matching power flow computing method according to the second concrete example of the embodiment of the present application is described with reference to Fig. 7 to Fig. 8B.
The Matching power flow computing method according to the second concrete example of the embodiment of the present application is used for computing the Matching power flow between a reference pixel in a reference image and an object pixel in a target image, wherein this Matching power flow is estimated by the quantization coding distance between respective pixels.
Fig. 7 is a flow chart illustrating the Matching power flow computing method according to the second concrete example of the embodiment of the present application.
As shown in Fig. 7, this Matching power flow computing method can comprise:
In step S310, select the first predetermined neighborhood in a reference image.
In step S320, the significance degree of at least one pixel in the first predetermined neighborhood except the reference pixel is calculated.
In step S330, the set consisting of the reference pixel and all pixels whose significance degree is greater than the threshold value is determined as the reference representative pixels set.
In step S340, select the first predetermined neighborhood in the target image.
In step S350, calculate the significance degree of at least one pixel in the first predetermined neighborhood except object pixel.
In step S360, the set consisting of the object pixel and all pixels whose significance degree is greater than the threshold value is determined as the target representation pixel set.
In step S370, determine the respective pixel in the target representation pixel set corresponding with each pixel in the set of reference representative pixels respectively.
Because steps S310 to S370 and the subsequent step S390 in the second concrete example correspond respectively to the above steps S210 to S270 and step S290 in the first concrete example, their repeated description is omitted here.
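Since the detailed selection of the representative pixel sets is given in the first concrete example, only a compact sketch is shown here. It assumes a grayscale image stored as a NumPy array indexed as img[y, x], uses the gray-value jump between a pixel and its four direct neighbors as the significance degree (one of the saliency measures named in this application), and keeps every pixel of the square neighborhood whose jump exceeds the threshold rather than scanning direction by direction; the names representative_pixel_set, radius and tau are illustrative.

```python
def representative_pixel_set(img, cx, cy, radius=7, tau=20):
    """Center pixel plus all pixels of its square neighborhood whose
    gray-value jump (significance degree) exceeds the threshold tau."""
    h, w = img.shape
    pixels = [(cx, cy)]                        # the center pixel always belongs
    for y in range(max(1, cy - radius), min(h - 1, cy + radius + 1)):
        for x in range(max(1, cx - radius), min(w - 1, cx + radius + 1)):
            if (x, y) == (cx, cy):
                continue
            v = int(img[y, x])
            jump = max(abs(v - int(img[y, x - 1])), abs(v - int(img[y, x + 1])),
                       abs(v - int(img[y - 1, x])), abs(v - int(img[y + 1, x])))
            if jump > tau:
                pixels.append((x, y))
    return pixels
```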
In step S375, each pixel in the reference representative pixels set is quantization-coded.
For example, taking the pixel value of the reference pixel as the reference, each pixel in the reference representative pixels set can be quantization-coded according to its pixel value. For example, this quantization coding can be single-bit binarization coding, dibit (two-bit) binarization coding, and so on.
In step S380, each pixel in the target representation pixel set is quantization-coded.
Similarly to step S375, taking the pixel value of the object pixel as the reference, each pixel in the target representation pixel set can be quantization-coded according to its pixel value.
In step S385, the quantization coding distance between each pixel in the reference representative pixels set and the respective pixel in the target representation pixel set corresponding to it is calculated respectively.
After the quantization code values of the pixels in the reference representative pixels set and of the pixels in the target representation pixel set have been obtained, the coding distances between respective pixels can be compared and obtained pixel by pixel, so that in step S390 the coding distances can be summed to obtain the Matching power flow between the reference pixel and the object pixel.
In step S390, the Matching power flow between the reference pixel and the object pixel is obtained by summing all the calculated quantization coding distances.
It should be noted that, although in the above steps S375 to S390 the coding distances are obtained pair by pair for the respective pixels and the Matching power flow between the reference pixel and the object pixel is obtained by summing the coding distances over all pairs of respective pixels, the application is not limited thereto.
In another example, the quantization code values of the pixels in the reference representative pixels set obtained in step S375 can also be concatenated, according to the pixel positions in the reference image, into one string of values in a certain order, and the same operation can be performed in step S380 on the quantization code values of the respective pixels in the target representation pixel set, so that in the subsequent step the coding distance between the two resulting strings can be compared directly, thereby obtaining the Matching power flow between the reference pixel and the object pixel more conveniently. Obviously, in such a process, because the code of the center pixel is identical in the reference image and the target image (that is, the coding distance between the two center codes is necessarily 0), the quantization code value of the center pixel can be omitted directly from the formed string, so as to save processing resources when computing the coding distance.
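A sketch of this string variant: the per-pixel quantization codes (for example the two-bit codes of formula (3) below) are concatenated in a fixed scan order with the center pixel's code skipped, and the coding distance of the two strings is their bitwise Hamming distance; the helper names are assumptions made for this sketch.

```python
def concatenate_codes(codes):
    """Concatenate per-pixel quantization codes (e.g. '10', '01', '00') into
    one bit string in a fixed scan order; the center pixel is omitted."""
    return "".join(codes)

def hamming_distance(a, b):
    """Bitwise Hamming distance between two equal-length bit strings."""
    return sum(ca != cb for ca, cb in zip(a, b))
```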
To aid understanding, the following description takes as an example a reference representative pixels set consisting of the gray-scale edge pixels scanned along multiple directions. Needless to say, steps S375 to S385 apply equally to other representative pixels such as color edge pixels, key point pixels and corner pixels.
For example, in step S375, for the left image and for each center pixel, the gray-scale values at the marked positions in all directions and the gray-scale value of the center pixel can be used directly to calculate the left Census value of the left image.
The step of calculating the left Census value can adopt any Census transform matching algorithm, for example the basic Census transform matching algorithm, the first enhanced Census transform matching algorithm, the second enhanced Census transform matching algorithm, and so on.
For example, in the left image, the following formula (3) can be used to perform dibit binarization on the gray-scale value I_i of each pixel in the reference representative pixels set, thereby obtaining a binary string for each neighboring pixel.
b_i = 10, if I_i > I_0 + δ
b_i = 01, if I_i < I_0 - δ        formula (3)
b_i = 00, if |I_i - I_0| <= δ,    i = 1, 2, ..., 8
wherein I_0 is the gray-scale value of the center pixel, and δ is a tolerance that can be set as a constant based on empirical values. In one example, δ can also be determined from the gray-scale value I_0 of the center pixel according to the following formula (4):
δ = 0, if I_0 ∈ [0, 50)
δ = 1, if I_0 ∈ [50, 100)
δ = 2, if I_0 ∈ [100, 150)        formula (4)
δ = 3, if I_0 ∈ [150, 200)
δ = 4, if I_0 ∈ [200, 255]
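As a direct transcription of formulas (3) and (4), a Python sketch of the dibit coding could read as follows (the name dibit_code is illustrative).

```python
def dibit_code(i_i, i_0):
    """Two-bit code b_i of formula (3) for a representative pixel with gray
    value i_i, relative to the center pixel's gray value i_0."""
    delta = min(int(i_0) // 50, 4)             # tolerance of formula (4)
    if i_i > i_0 + delta:
        return "10"
    if i_i < i_0 - delta:
        return "01"
    return "00"                                # |i_i - i_0| <= delta
```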
For example, next, after the dibit binarization of the gray-scale value I_i of each pixel in the reference representative pixels set, the results can be recorded in a table so that they can be looked up according to the position identification standard of each pixel.
As mentioned above, in order to identify each pixel in the reference representative pixels set, various parameters can be used as the position identification standard.
For example, in the first case, the coordinate (x, y) of a pixel in the two-dimensional coordinate system constructed on the reference image can be used as its position identification standard.
Fig. 8A is a schematic diagram illustrating the Census transform result of the left image in the first case.
In the table shown in Fig. 8A, x and y respectively denote the horizontal ordinate and the ordinate of each pixel in the reference representative pixels set in the two-dimensional coordinate system constructed on the reference image, where this coordinate system is a rectangular coordinate system with the center pixel as the origin, the horizontal rightward direction as the positive x-axis and the vertical upward direction as the positive y-axis, and L(x, y) denotes the Census code value of each pixel in the reference representative pixels set.
Then, the Census values of all the representative pixels in the left image can be concatenated in the top-to-bottom order of Fig. 8A to generate the left Census value of the left image, namely 001000001000010000010001001001.
Alternatively, in the second case, the coordinate (r, θ) of a pixel in the polar coordinate system constructed around the center coordinate can be used as its position identification standard.
Fig. 8B is a schematic diagram illustrating the Census transform result of the left image in the second case.
In the table shown in Fig. 8B, r denotes the radius from the center pixel to the marked position, θ denotes the angle of each scan line measured from the horizontal rightward direction, and L(r, θ) denotes the Census code value of each pixel in the reference representative pixels set. In this table the Census transform is a three-valued transform, and * indicates that there is no value at that position.
Then, the Census values of all the representative pixels in the left image can be concatenated in the left-to-right, top-to-bottom order of Fig. 8B to generate the left Census value of the left image, namely 001000001000010000010001001001.
Obviously, the two ways of generating the left Census value yield the same final result.
Next, in step S380, for the right image and for each center pixel, the gray-scale values at the marked positions of the right image in all directions and the gray-scale value of the center pixel can be used to calculate the right Census value of the right image.
Obviously, the computation of this step is identical to that of the previous step; therefore, after the calculation, the Census code value R(x, y) or R(r, θ) of each pixel in the target representation pixel set in the right image can be obtained. Then, according to the correspondence with the representative pixels in the left image, the Census values of all the representative pixels in the right image are concatenated to generate the right Census value of the right image.
Finally, the Hamming distance between the two strings, i.e. the left Census value and the right Census value, can be obtained by direct comparison, so that the Matching power flow between the reference pixel and the object pixel is obtained more conveniently.
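Combining the pieces, the Census-based Matching power flow of this example can be sketched as follows, reusing dibit_code and hamming_distance from the sketches above; ref_vals and tgt_vals are assumed to hold the gray values of the corresponding representative pixels in the same scan order, and i0_ref and i0_tgt are the two center gray values.

```python
def census_matching_cost(ref_vals, tgt_vals, i0_ref, i0_tgt):
    """Build the left/right Census strings from the dibit codes of the
    representative pixels and compare them by their Hamming distance."""
    left = "".join(dibit_code(v, i0_ref) for v in ref_vals)
    right = "".join(dibit_code(v, i0_tgt) for v in tgt_vals)
    return hamming_distance(left, right)
```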
It should be noted that, as described in the first concrete example, the correspondence between respective pixels of the left image and the right image mentioned in the above steps can be either a strict correspondence or an elastic correspondence.
In addition, when, for a first pixel in the reference representative pixels set, no corresponding second pixel exists in the target representation pixel set, a predetermined value greater than the maximum possible coding distance between respective pixels can be assigned as the coding distance for that pixel. For example, in the above example it can be seen from formula (3) that, with dibit binarization, the maximum coding distance between two pixels is 2. In this case, a predetermined value greater than 2 (for example, 3) can be assigned as the Hamming distance for the first pixel.
In addition, when calculating the Census code value of each representative pixel in the left and right images, the weight of each representative pixel in the left image and the right image can also be calculated.
For example, in the first case, in which the coordinate (x, y) of a pixel in the two-dimensional coordinate system constructed on the reference image is used as its position identification standard, the weight value can be set according to the distance from the representative pixel to the center pixel. Since the coordinate of the center pixel is assumed here to be the origin (0, 0), the weight value can be set directly from the coordinate of the representative pixel. Specifically, formula (5) can be used to set the weight value w(x, y) of the pixel (x, y):
w(x, y) = exp(-x^2 / (2σ_x^2)) · exp(-y^2 / (2σ_y^2))    formula (5)
wherein σ_x and σ_y are width parameters that control the spread of the weighting function. By this calculation, the weighting functions of the left image and the right image, w_l(x, y) and w_r(x, y), are obtained respectively.
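Formula (5) written out as a small sketch; the default width parameters of 2.0 are an arbitrary assumption made here.

```python
import math

def census_weight(x, y, sigma_x=2.0, sigma_y=2.0):
    """Separable Gaussian weight of formula (5); (x, y) is the representative
    pixel's coordinate relative to the center pixel (the origin)."""
    return (math.exp(-x * x / (2.0 * sigma_x * sigma_x)) *
            math.exp(-y * y / (2.0 * sigma_y * sigma_y)))
```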
At this moment, the weighted Hamming distance between the left Census value and the right Census value, which serves as the Matching power flow, can be calculated according to the following formula (6):
C = Σ_x Σ_y ( w_l(x, y) · w_r(x, y) · n(L(x, y) ^ R(x, y)) ) / Σ_x Σ_y ( w_l(x, y) · w_r(x, y) )    formula (6)
wherein w_l(x, y) and w_r(x, y) are respectively the weighting functions of the left image and the right image, ^ denotes the bitwise exclusive OR, and the function n(x) counts the number of 1s in the bit string x.
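A sketch of formula (6): each pair of corresponding per-pixel codes contributes its bit differences n(L ^ R), weighted by w_l · w_r, and the sum is normalized by the total weight; the lists are assumed to be aligned per representative pixel, and the function name is illustrative.

```python
def weighted_census_cost(codes_l, codes_r, weights_l, weights_r):
    """Weighted Hamming distance of formula (6) over aligned per-pixel codes."""
    num, den = 0.0, 0.0
    for cl, cr, wl, wr in zip(codes_l, codes_r, weights_l, weights_r):
        n = sum(a != b for a, b in zip(cl, cr))   # n(L(x, y) ^ R(x, y))
        num += wl * wr * n
        den += wl * wr
    return num / den if den > 0 else 0.0
```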
Alternatively, in the second case, in which the coordinate (r, θ) of a pixel in the polar coordinate system constructed around the center coordinate is used as its position identification standard, the weight value can be set according to the distance and direction from the representative pixel to the center pixel. For example, the weight value can be set according to the radius r from the center pixel to the marked position and the angle θ of each scan line measured from the horizontal rightward direction.
As can be seen, in the second concrete example of the embodiment of the present application, representative neighboring pixels (for example, gray-scale/color edge pixels, key point pixels and corner pixels) can be selected adaptively when selecting neighboring pixels for the center pixel. Because the pixels in the neighborhood can have different weights, this Matching power flow computing method is more robust. Moreover, the method quantization-codes the center pixel and/or the neighboring pixels in the reference image and the target image, obtains a distinguishable Matching power flow from the quantization coding distances between respective pixels, and then applies this Matching power flow in a subsequent stereo matching algorithm to obtain correct parallax information.
4, parallax value computing method
After the Matching power flow between a reference pixel in the reference image and an object pixel in the target image has been obtained with the Matching power flow computing method according to the embodiment of the present application, the subsequent steps of a stereo matching method (for example, support summation, disparity computation/optimization and disparity refinement) can be performed on the basis of this Matching power flow, so as to finally calculate the parallax value between the reference pixel in the reference image and the matched pixel in the target image.
Fig. 9 is an overview flow chart illustrating the parallax value computing method according to the embodiment of the present application.
As shown in Fig. 9, this parallax value computing method can comprise:
In step S410, determine multiple object pixel.
As mentioned before, a stereoscopic camera can capture images of the same object from two or more different viewing angles, i.e. a reference image and a target image.
In order to calculate the parallax value between a reference pixel in the reference image and the matched pixel in the target image, the matched pixel that matches the reference pixel must first be searched for in the target image. To this end, multiple object pixels can be determined as candidate matched pixels.
As mentioned above, since in practice the reference image and the target image often need to be height-calibrated before the Matching power flow calculation is carried out, for simplicity it can be assumed that there is only a position deviation Δx on the horizontal ordinate between the reference pixel and a candidate matched pixel; this position deviation Δx depends on the parameters of the stereoscopic camera and is therefore within a limited pixel range.
Therefore, assuming that the coordinate of the reference pixel in the reference image is (x0, y0), only the limited set of pixels located in the target image at the coordinates from (x0, y0) to (x0+Δx_max, y0) needs to be determined as the object pixels, where the finite number Δx_max can be determined based on empirical or experimental values.
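In code form, the candidate object pixels for one reference pixel are simply the pixels of the same row within the disparity search range; candidate_target_pixels and max_disp are illustrative names for this sketch.

```python
def candidate_target_pixels(x0, y0, max_disp):
    """Coordinates (x0, y0) .. (x0 + max_disp, y0) in the target image,
    following the height-calibrated assumption of the text."""
    return [(x0 + d, y0) for d in range(max_disp + 1)]
```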
In step S420, for each object pixel, the Matching power flow between each first pixel in the reference support pixel set and the corresponding second pixel in the target support pixel set corresponding to it is calculated respectively, where the reference support pixel set includes at least one pixel located in the second predetermined neighborhood of the reference pixel in the reference image, and the target support pixel set includes at least one pixel located in the second predetermined neighborhood of the object pixel in the target image.
First, the reference support pixel set of the reference pixel (or, in other words, the support pixels of the reference pixel) can be determined in the reference image; it comprises at least one pixel located in the second predetermined neighborhood of the reference pixel.
The support pixels of a pixel can be all pixels having approximately the same parallax value as that pixel. For example, one common way is to select, as the support pixels, the neighboring pixels of the pixel concerned (including the pixel itself) in the gray-scale image. Alternatively, another common way is to select, as the support pixels, the neighboring pixels (including the pixel itself) that lie on the same region block as the pixel concerned in the gray-scale image.
Then, in the same way, the target support pixel set of the object pixel can be determined in the target image; it comprises at least one pixel located in the second predetermined neighborhood of the object pixel.
Next, according to the coordinates of the pixels, the respective pixel in the target support pixel set corresponding to each pixel in the reference support pixel set can be determined respectively.
Finally, the Matching power flow computing method according to the embodiment of the present application described above can be used to calculate, for each object pixel, the Matching power flow between each first pixel in the reference support pixel set and the corresponding second pixel in the target support pixel set.
Since the Matching power flow computing method of the embodiment of the present application has been described in detail above in conjunction with the embodiment and its two concrete examples, its repeated description is omitted here.
In step S430, for each object pixel, the overall matching cost between the reference pixel and that object pixel is obtained by summing all the calculated Matching power flows.
For example, the Matching power flows can be summed, and the sum of the Matching power flows is taken as the overall matching cost between the reference pixel and the object pixel.
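The aggregation over the support pixel sets can be sketched as follows; cost_fn stands for any per-pixel Matching power flow from the previous sections (for example matching_cost_first_example or census_matching_cost), and the pairing of the support pixels is assumed to be given.

```python
def overall_matching_cost(support_pairs, cost_fn):
    """Sum the per-pixel Matching power flows over the pairs
    (first pixel, second pixel) of the reference and target support sets."""
    return sum(cost_fn(p_ref, p_tgt) for p_ref, p_tgt in support_pairs)
```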
In step S440, the matched pixel is determined at least among the multiple object pixels according to the groups of Matching power flows between the reference pixel and the multiple object pixels, so as to determine the parallax value between the reference pixel and the matched pixel.
Obviously, the Matching power flow computing method according to the embodiment of the present application can be applied to any stereo matching method (for example, block matching, SGM, etc.). That is, the processing subsequent to the Matching power flow computation is not the key point of the application, and those skilled in the art can use any existing stereo matching method to determine the parallax value based on the Matching power flow calculation results.
For example, for the block matching method, the matched pixel can be determined among the multiple object pixels simply by finding the minimum overall (aggregated) Matching power flow. For the SGM method, the minimum accumulated Matching power flow can further be found by multi-directional dynamic programming to determine the matched pixel.
Finally, the parallax value between the two can be determined by calculating the coordinate distance between the reference pixel and the matched pixel.
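For the block matching case, the final selection reduces to a winner-take-all over the aggregated costs, and the winning index equals the horizontal coordinate distance, i.e. the parallax value; select_disparity is an illustrative name for this sketch.

```python
import numpy as np

def select_disparity(overall_costs):
    """overall_costs[d] is the overall matching cost between the reference
    pixel and the candidate object pixel at disparity d; block matching
    takes the disparity with the minimum cost."""
    return int(np.argmin(overall_costs))
```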
Below, the effect of the Stereo Matching Algorithm according to the embodiment of the present application is described with reference to Figure 10 A and Figure 10 B.
Figure 10A illustrates the disparity map obtained by a matching algorithm based on the second enhanced Census transform according to the prior art, and Figure 10B illustrates the disparity map obtained by the parallax value computing method according to the embodiment of the present application.
Comparing Figure 10A and Figure 10B, it can be seen that, compared with the result obtained by the stereo matching algorithm of the prior art, the disparity map obtained by the parallax value computing method according to the embodiment of the present application contains more correct parallax values, and this effect is especially obvious in the texture-less road surface region. With a disparity map containing such a large number of correct parallax values, even if a few invalid parallax values occasionally remain, subsequent operations (for example, plane fitting) can be used to fill them in, so as to obtain a denser disparity map. The results therefore show that the method proposed in the embodiment of the present application is indeed effective.
As can be seen, the embodiment of the application provides a parallax value computing method in which the concept of supporting structure information is incorporated into the Matching power flow computation, so that, when selecting neighboring pixels for the pixel whose Matching power flow is to be calculated, full consideration is given to whether a selected pixel carries representative information in the associated image, and representative neighboring pixels are selected adaptively. That is, in this parallax value computing method, more representative information (e.g., gray-scale edge information) is utilized, which makes the method more robust, the calculated Matching power flow more accurate, and the resulting disparity map more accurate as well.
5, Matching power flow calculation element
The embodiment of the application can also be implemented by a kind of Matching power flow calculation element.Hereinafter, the functional configuration block diagram of the Matching power flow calculation element according to the embodiment of the present application is described with reference to Figure 11.
Figure 11 illustrates the functional configuration block diagram of the Matching power flow calculation element according to the embodiment of the present application.
As shown in figure 11, this Matching power flow calculation element 100 may be used for the Matching power flow between the reference pixel in computing reference image and the object pixel in target image, described reference picture and described target image belong to an original image pair, and described device can comprise:
Reference set determining unit 110, for determining the reference representative pixels set of described reference pixel in described reference picture, described at least one pixel comprising described reference pixel with reference to representative pixels set and be in that in the first predetermined neighborhood of described reference pixel and its significance degree is greater than threshold value;
Goal set determining unit 120, for determining the target representation pixel set of described object pixel in described target image, the set of described target representation pixel comprises described object pixel and is at least one pixel that in the described first predetermined neighborhood of described object pixel and its significance degree is greater than described threshold value; And
Matching power flow computing unit 130, for the pixel value according to each pixel in the described pixel value with reference to each pixel in representative pixels set and the set of described target representation pixel, calculate the Matching power flow between described reference pixel and described object pixel.
In one example, reference set determining unit 110 can determine the reference representative pixels set of described reference pixel in described reference picture by following operation: select be in around described reference pixel and have the region of reservation shape, as described first predetermined neighborhood in described reference picture; Calculate the significance degree of at least one pixel in described first predetermined neighborhood except described reference pixel; And the set of all pixels described reference pixel and its significance degree being greater than described threshold value is defined as described with reference to representative pixels set.
Particularly, reference set determining unit 110 can calculate the significance degree of at least one pixel in described first predetermined neighborhood except described reference pixel by least one in following various mode: in the multiple different directions being starting point with described reference pixel, determine the pixel value saltus step degree between specific pixel and neighbor thereof, and using the significance degree of described pixel value saltus step degree as described specific pixel; By performing scale invariant feature to specific pixel, the dimension stable degree calculating described specific pixel is described, and using the significance degree of described dimension stable degree as described specific pixel; And determined the structure attribute of described specific pixel by the strength structure calculating specific pixel, and using the significance degree of described structure attribute as described specific pixel.
In one example, Matching power flow computing unit 130 can calculate Matching power flow between described reference pixel with described object pixel by following operation: determine respectively and respective pixel in the described described target representation pixel set corresponding with reference to each pixel in representative pixels set; Calculate described with reference to the distance metric value in each pixel in representative pixels set and the described target representation pixel set corresponding with it between respective pixel respectively; And by the Matching power flow obtained between described reference pixel and described object pixel of suing for peace to calculated all distance metric value.
Such as, Matching power flow computing unit 130 can be determined and the respective pixel in the described described target representation pixel set corresponding with reference to each pixel in representative pixels set respectively by following operation: determine described with reference to first coordinate of the first pixel in described reference picture in representative pixels set; Select be in around the second coordinate and there is the region of reservation shape in described target image, as the corresponding neighborhood of elasticity, between described second coordinate and described first coordinate, there is predetermined migration; And by the set of described target representation pixel, the second pixel of being in the corresponding neighborhood of described elasticity of its coordinate is defined as the respective pixel corresponding with described first pixel.
Such as, Matching power flow computing unit 130 can calculate described with reference to the distance metric value in each pixel in representative pixels set and the described target representation pixel set corresponding with it between respective pixel respectively by any one in following various mode: calculate respectively described with reference to each pixel in representative pixels set and with respective pixel in its corresponding described target representation pixel set between margin of image element, and using the pixel value of described reference pixel as benchmark, according to the described pixel value with reference to each pixel in representative pixels set, to quantize coding with reference to each pixel in representative pixels set to described, using the pixel value of described object pixel as benchmark, according to the pixel value of each pixel in the set of described target representation pixel, each pixel in the set of described target representation pixel is quantized coding, and calculate described with reference to the coding distance that quantizes in each pixel in representative pixels set and the described target representation pixel set corresponding with it between respective pixel respectively.
Such as, Matching power flow computing unit 130 can calculate described with reference to the distance metric value in each pixel in representative pixels set and the described target representation pixel set corresponding with it between respective pixel respectively by following operation: determine the first weight coefficient according to described with reference to the position relationship between the first pixel in representative pixels set and described reference pixel; Determine the second weight coefficient according to the position relationship between the second pixel in the set of described target representation pixel and described object pixel, described second pixel is and the respective pixel in the described described target representation pixel set corresponding with reference to the first pixel in representative pixels set; Calculate the distance metric value between described first pixel and described second pixel; And use described first weight coefficient and described second weight coefficient, the distance metric value between described first pixel and described second pixel is weighted, to obtain the Weighted distance metric between described first pixel and described second pixel.
Such as, Matching power flow computing unit 130 can calculate described with reference to the distance metric value in each pixel in representative pixels set and the described target representation pixel set corresponding with it between respective pixel respectively by following operation: when determining in the set of described target representation pixel less than during with described second pixel corresponding with reference to the first pixel in representative pixels set, predetermined value is set to the distance metric value between described first pixel and described second pixel, described predetermined value is greater than described with reference to the maximal value in the distance metric value in each pixel in representative pixels set and the described target representation pixel set corresponding with it between respective pixel.
Concrete function and the operation of above-mentioned reference set determining unit 110, goal set determining unit 120 and Matching power flow computing unit 130 are introduced in detail in the Matching power flow computing method described above with reference to Fig. 1 to Figure 10 B, and therefore, its repeated description will be omitted.
It should be noted that the components of the above-described device 100 can be realized by software programs, for example by the CPU of a general-purpose computer in combination with RAM, ROM and the software code running therein. The software programs can be stored on a storage medium such as a flash memory, a floppy disk, a hard disk or an optical disc, and are loaded at run time into, for example, a random access memory (RAM) to be executed by the CPU. Besides the general-purpose computer, the components can also be realized by cooperation between application-specific integrated circuits and software. The integrated circuits include at least one of, for example, an MPU (micro processing unit), a DSP (digital signal processor), an FPGA (field programmable gate array) and an ASIC (application-specific integrated circuit). Such a general-purpose computer or application-specific integrated circuit can, for example, be mounted at a specific location (for example, on a vehicle) and communicate with an imaging device, such as a camera, installed at that location for imaging the road and the objects associated with the road, so that supporting structures are extracted from the two-dimensional images and/or stereo images captured by the camera, the weighted Matching power flow is then calculated, and the parallax value can later be calculated by the subsequent steps of a stereo matching algorithm. In addition, each component of the device 100 can be realized by dedicated hardware, such as a specific field programmable gate array or application-specific integrated circuit. Furthermore, each component of the device 100 can also be realized by a combination of software and hardware.
6, parallax value computing equipment
The embodiment of the application can also be implemented by a kind of parallax value computing equipment.Hereinafter, the functional configuration block diagram of the parallax value computing equipment according to the embodiment of the present application is described with reference to Figure 12.
Figure 12 illustrates the functional configuration block diagram of the parallax value computing equipment according to the embodiment of the present application.
As shown in figure 12, this parallax value computing equipment 200 may be used for the parallax value between the reference pixel in computing reference image and the matched pixel in target image, described reference picture and described target image belong to an original image pair, and described equipment can comprise:
Object pixel determining device 210, for determining multiple object pixel;
Matching power flow calculation element 220, for for each object pixel, computing reference supports that in pixel set, each first pixel and the target corresponding with it support the Matching power flow in pixel set between corresponding second pixel respectively, described with reference to supporting that pixel set is included in described reference picture at least one pixel be in the second predetermined neighborhood of described reference pixel, and described target supports that pixel set is included in described target image at least one pixel be in the described second predetermined neighborhood of described object pixel;
Overall cost obtaining means 230, for for each object pixel, by the overall matching cost obtained between described reference pixel and described object pixel of suing for peace to calculated all Matching power flow; And
Parallax value calculation element 240, for at least determining described matched pixel according to the several groups Matching power flow between described reference pixel and described multiple object pixel in described multiple object pixel, thus determine the parallax value between described reference pixel and described matched pixel
Wherein, described Matching power flow calculation element 220 is distinguished computing reference by following operation and is supported that in pixel set, each first pixel and the target corresponding with it support the Matching power flow in pixel set between corresponding second pixel: the first representative pixel set determining described first pixel in described reference picture, and described first representative pixel set comprises described first pixel and is at least one pixel that in the first predetermined neighborhood of described first pixel and its significance degree is greater than threshold value; In described target image, determine the second representative pixel set of described second pixel, described second representative pixel set comprises described second pixel and is at least one pixel that in the described first predetermined neighborhood of described second pixel and its significance degree is greater than described threshold value; And according to the pixel value of each pixel in the pixel value of each pixel in described first representative pixel set and described second representative pixel set, calculate the Matching power flow between described first pixel and described second pixel.
In one example, this Matching power flow calculation element 220 such as can utilize the configuration of the Matching power flow calculation element 100 shown in Figure 11 to realize.
7, parallax value computing system
In addition, the application can also be implemented by a kind of parallax value computing system.Hereinafter, the functional structure of the parallax value computing system according to the embodiment of the present application is described with reference to Figure 13.
Figure 13 illustrates the functional structure chart of the parallax value computing system according to the embodiment of the present application.
As shown in Figure 13, this parallax value computing system 300 can comprise: an imaging device 310 for imaging an object, such as a monocular camera, a binocular camera or a multi-view camera; and a parallax value computing equipment 320 for extracting supporting structures from the two-dimensional images and/or stereo images captured by the imaging device 310 and then calculating the weighted Matching power flow, so that the parallax value can be calculated by the subsequent steps of a stereo matching algorithm. This parallax value computing equipment 320 can, for example, be realized with the configuration of the parallax value computing equipment 200 shown in Figure 12.
Specifically, the input of this parallax value computing system 300 is a gray-scale image or a color image, etc., which can be captured, for example, by a binocular camera installed at a specific position. After this input passes through the parallax value computing equipment, the parallax information calculation result is output; the output form can vary, for example a visible disparity map shown on a display.
8, hardware system for parallax value calculation
The application can also be implemented by a hardware system for parallax value calculation. Hereinafter, the hardware system for parallax value calculation according to the embodiment of the present application is described with reference to Figure 14.
Figure 14 illustrates the general hardware block diagram of the hardware system for parallax value calculation according to the embodiment of the present application.
As shown in Figure 14, this hardware system 400 can comprise: an input device 410 for inputting relevant information from outside, such as gray-scale images, color images and camera configuration information, which can for example include a keyboard, a mouse, a communication network and the remote input devices connected to it, as well as an imaging device for imaging an object and a decoding device for decoding the formed images; a processing device 420 for implementing the above parallax value computing method according to the embodiment of the present application, or implemented as the above parallax value computing equipment, which can for example include a central processing unit of a computer or another chip with processing capability, and which can be connected to a network (not shown) such as the Internet so as to transmit the processed results to a remote location as required by the processing; an output device 430 for outputting the results of the above disparity computation to the outside, which can for example include a display, a printer, a communication network and the remote output devices connected to it; and a storage device 440 for storing, in a volatile or non-volatile manner, the data involved in the above parallax value computation, such as disparity maps, which can for example include various volatile or non-volatile memories such as a random access memory (RAM), a read-only memory (ROM), a hard disk or a semiconductor memory.
Each embodiment of the application is described in detail above.But, it should be appreciated by those skilled in the art that when not departing from principle and the spirit of the application, various amendment can be carried out to these embodiments, combination or sub-portfolio, and such amendment should fall in the scope of the application.

Claims (10)

1. Matching power flow computing method, it is characterized in that, described method is for the Matching power flow between the reference pixel in computing reference image and the object pixel in target image, and described reference picture and described target image belong to an original image pair, and described method comprises:
The reference representative pixels set of described reference pixel is determined, described at least one pixel comprising described reference pixel with reference to representative pixels set and be in that in the first predetermined neighborhood of described reference pixel and its significance degree is greater than threshold value in described reference picture;
In described target image, determine the target representation pixel set of described object pixel, the set of described target representation pixel comprises described object pixel and is at least one pixel that in the described first predetermined neighborhood of described object pixel and its significance degree is greater than described threshold value; And
According to the pixel value of each pixel in the described pixel value with reference to each pixel in representative pixels set and the set of described target representation pixel, calculate the Matching power flow between described reference pixel and described object pixel.
2. method according to claim 1, is characterized in that, the reference representative pixels set determining described reference pixel in described reference picture comprises:
Select be in around described reference pixel and there is the region of reservation shape, as described first predetermined neighborhood in described reference picture;
Calculate the significance degree of at least one pixel in described first predetermined neighborhood except described reference pixel; And
The set described reference pixel and its significance degree being greater than all pixels of described threshold value is defined as described with reference to representative pixels set,
Wherein, the significance degree of at least one pixel calculated in described first predetermined neighborhood except described reference pixel comprises at least one in following various mode:
The pixel value saltus step degree between specific pixel and neighbor thereof is determined in the multiple different directions being starting point with described reference pixel, and using the significance degree of described pixel value saltus step degree as described specific pixel;
By performing scale invariant feature to specific pixel, the dimension stable degree calculating described specific pixel is described, and using the significance degree of described dimension stable degree as described specific pixel; And
The structure attribute of described specific pixel is determined by the strength structure calculating specific pixel, and using the significance degree of described structure attribute as described specific pixel.
3. method according to claim 1, is characterized in that, the Matching power flow calculated between described reference pixel and described object pixel comprises:
Determine respectively and the respective pixel in the described described target representation pixel set corresponding with reference to each pixel in representative pixels set;
Calculate described with reference to the distance metric value in each pixel in representative pixels set and the described target representation pixel set corresponding with it between respective pixel respectively; And
By the Matching power flow obtained between described reference pixel and described object pixel of suing for peace to calculated all distance metric value.
4. method according to claim 3, is characterized in that, determines respectively to comprise with the respective pixel in the described described target representation pixel set corresponding with reference to each pixel in representative pixels set:
Determine described with reference to first coordinate of the first pixel in described reference picture in representative pixels set;
Select be in around the second coordinate and there is the region of reservation shape in described target image, as the corresponding neighborhood of elasticity, between described second coordinate and described first coordinate, there is predetermined migration; And
By in the set of described target representation pixel, the second pixel of being in the corresponding neighborhood of described elasticity of its coordinate is defined as the respective pixel corresponding with described first pixel.
5. method according to claim 3, it is characterized in that, calculate respectively described with reference to the distance metric value in each pixel in representative pixels set and the described target representation pixel set corresponding with it between respective pixel comprise in following various mode any one:
Calculate described with reference to the margin of image element in each pixel in representative pixels set and the described target representation pixel set corresponding with it between respective pixel respectively; And
Using the pixel value of described reference pixel as benchmark, according to the described pixel value with reference to each pixel in representative pixels set, to quantize coding with reference to each pixel in representative pixels set to described, using the pixel value of described object pixel as benchmark, according to the pixel value of each pixel in the set of described target representation pixel, each pixel in the set of described target representation pixel is quantized coding, and calculate described with reference to the coding distance that quantizes in each pixel in representative pixels set and the described target representation pixel set corresponding with it between respective pixel respectively.
6. method according to claim 3, is characterized in that, calculates respectively describedly to comprise with reference to the distance metric value in each pixel in representative pixels set and the described target representation pixel set corresponding with it between respective pixel:
The first weight coefficient is determined with reference to the position relationship between the first pixel in representative pixels set and described reference pixel according to described;
Determine the second weight coefficient according to the position relationship between the second pixel in the set of described target representation pixel and described object pixel, described second pixel is and the respective pixel in the described described target representation pixel set corresponding with reference to the first pixel in representative pixels set;
Calculate the distance metric value between described first pixel and described second pixel; And
Use described first weight coefficient and described second weight coefficient, the distance metric value between described first pixel and described second pixel is weighted, to obtain the Weighted distance metric between described first pixel and described second pixel.
7. method according to claim 3, is characterized in that, calculates respectively describedly to comprise with reference to the distance metric value in each pixel in representative pixels set and the described target representation pixel set corresponding with it between respective pixel:
When determining the second pixel not corresponding with the first pixel in the set of described reference representative pixels in the set of described target representation pixel, predetermined value is set to the distance metric value between described first pixel and described second pixel, described predetermined value is greater than described with reference to the maximal value in the distance metric value in each pixel in representative pixels set and the described target representation pixel set corresponding with it between respective pixel.
8. A parallax value calculation method, characterized in that the method is for calculating a parallax value between a reference pixel in a reference image and a matched pixel in a target image, the reference image and the target image belonging to one original image pair, and the method comprises:
Determining a plurality of target pixels;
For each target pixel, separately calculating the matching cost between each first pixel in a reference support pixel set and a corresponding second pixel in a target support pixel set, the reference support pixel set comprising at least one pixel of the reference image located in a second predetermined neighborhood of the reference pixel, and the target support pixel set comprising at least one pixel of the target image located in the second predetermined neighborhood of the target pixel;
For each target pixel, obtaining an overall matching cost between the reference pixel and the target pixel by summing all the calculated matching costs; and
Determining the matched pixel among the plurality of target pixels at least according to the several groups of matching costs between the reference pixel and the plurality of target pixels, thereby determining the parallax value between the reference pixel and the matched pixel,
wherein separately calculating the matching cost between each first pixel in the reference support pixel set and the corresponding second pixel in the target support pixel set comprises:
Determining, in the reference image, a first representative pixel set of the first pixel, the first representative pixel set comprising the first pixel and at least one pixel that is located in a first predetermined neighborhood of the first pixel and has a saliency value greater than a threshold value;
Determining, in the target image, a second representative pixel set of the second pixel, the second representative pixel set comprising the second pixel and at least one pixel that is located in the first predetermined neighborhood of the second pixel and has a saliency value greater than the threshold value; and
Calculating the matching cost between the first pixel and the second pixel according to the pixel value of each pixel in the first representative pixel set and the pixel value of each pixel in the second representative pixel set.
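For the representative-pixel-set step recited in claim 8, the following sketch shows one way such a set could be gathered, assuming a grayscale image stored as a 2-D NumPy array and using the horizontal intensity gradient as a stand-in saliency measure. The patent does not prescribe this particular saliency definition, and every name here is hypothetical.

```python
# Sketch of the representative-pixel-set construction: the set holds the pixel
# itself plus every neighbor in the first predetermined neighborhood whose
# saliency value exceeds a threshold. The saliency measure used here
# (horizontal intensity gradient) and all names are assumptions.

def representative_pixel_set(image, y, x, radius=2, threshold=10.0):
    """Return {(dy, dx): intensity} for the pixel at (y, x) of a 2-D grayscale
    array and its salient neighbors within a (2*radius+1) x (2*radius+1)
    first predetermined neighborhood."""
    h, w = image.shape
    rep = {(0, 0): float(image[y, x])}  # the pixel itself is always included
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            ny, nx = y + dy, x + dx
            if (dy, dx) == (0, 0) or not (0 <= ny < h and 1 <= nx <= w - 2):
                continue
            # Illustrative saliency value: horizontal intensity gradient.
            saliency = abs(float(image[ny, nx + 1]) - float(image[ny, nx - 1]))
            if saliency > threshold:
                rep[(dy, dx)] = float(image[ny, nx])
    return rep
```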
9. A matching cost calculation apparatus, characterized in that the apparatus is for calculating a matching cost between a reference pixel in a reference image and a target pixel in a target image, the reference image and the target image belonging to one original image pair, and the apparatus comprises:
A reference set determining unit, configured to determine a reference representative pixel set of the reference pixel in the reference image, the reference representative pixel set comprising the reference pixel and at least one pixel that is located in a first predetermined neighborhood of the reference pixel and has a saliency value greater than a threshold value;
A target set determining unit, configured to determine a target representative pixel set of the target pixel in the target image, the target representative pixel set comprising the target pixel and at least one pixel that is located in the first predetermined neighborhood of the target pixel and has a saliency value greater than the threshold value; and
A matching cost calculating unit, configured to calculate the matching cost between the reference pixel and the target pixel according to the pixel value of each pixel in the reference representative pixel set and the pixel value of each pixel in the target representative pixel set.
10. Parallax value calculation equipment, characterized in that the equipment is for calculating a parallax value between a reference pixel in a reference image and a matched pixel in a target image, the reference image and the target image belonging to one original image pair, and the equipment comprises:
A target pixel determining device, configured to determine a plurality of target pixels;
A matching cost calculation device, configured to, for each target pixel, separately calculate the matching cost between each first pixel in a reference support pixel set and a corresponding second pixel in a target support pixel set, the reference support pixel set comprising at least one pixel of the reference image located in a second predetermined neighborhood of the reference pixel, and the target support pixel set comprising at least one pixel of the target image located in the second predetermined neighborhood of the target pixel;
An overall cost obtaining device, configured to, for each target pixel, obtain an overall matching cost between the reference pixel and the target pixel by summing all the calculated matching costs; and
A parallax value calculation device, configured to determine the matched pixel among the plurality of target pixels at least according to the several groups of matching costs between the reference pixel and the plurality of target pixels, thereby determining the parallax value between the reference pixel and the matched pixel,
wherein the matching cost calculation device separately calculates the matching cost between each first pixel in the reference support pixel set and the corresponding second pixel in the target support pixel set by the following operations:
Determining, in the reference image, a first representative pixel set of the first pixel, the first representative pixel set comprising the first pixel and at least one pixel that is located in a first predetermined neighborhood of the first pixel and has a saliency value greater than a threshold value;
Determining, in the target image, a second representative pixel set of the second pixel, the second representative pixel set comprising the second pixel and at least one pixel that is located in the first predetermined neighborhood of the second pixel and has a saliency value greater than the threshold value; and
Calculating the matching cost between the first pixel and the second pixel according to the pixel value of each pixel in the first representative pixel set and the pixel value of each pixel in the second representative pixel set.
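Putting the pieces together, claims 8 and 10 describe summing per-pixel matching costs over a support neighborhood into an overall matching cost for every candidate target pixel and then keeping the candidate with the best cost. The sketch below reuses the hypothetical helpers from the two sketches above and selects the disparity by a simple winner-takes-all minimum; it is one possible reading under the stated assumptions, not the patented method.

```python
import numpy as np

# End-to-end sketch under the same illustrative assumptions: for every
# candidate disparity the per-pixel matching costs over the second
# predetermined (support) neighborhood are summed into an overall matching
# cost, and the candidate with the smallest overall cost wins. It reuses the
# hypothetical helpers representative_pixel_set and weighted_matching_cost
# defined in the sketches above.

def disparity_at(ref_img, tgt_img, y, x, max_disparity=64,
                 support_radius=3, w1=1.0, w2=1.0):
    h, w = ref_img.shape
    best_disparity, best_cost = 0, float("inf")
    for d in range(max_disparity + 1):           # candidate target pixel (y, x - d)
        if x - d - support_radius < 0:
            break
        overall_cost = 0.0
        for dy in range(-support_radius, support_radius + 1):
            for dx in range(-support_radius, support_radius + 1):
                ry, rx = y + dy, x + dx          # first pixel of the reference support set
                if not (0 <= ry < h and 0 <= rx < w):
                    continue
                ref_set = representative_pixel_set(ref_img, ry, rx)
                tgt_set = representative_pixel_set(tgt_img, ry, rx - d)
                overall_cost += weighted_matching_cost(ref_set, tgt_set, w1, w2)
        if overall_cost < best_cost:             # winner-takes-all over the overall costs
            best_cost, best_disparity = overall_cost, d
    return best_disparity

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    left = rng.integers(0, 256, size=(40, 60)).astype(float)
    right = np.roll(left, -5, axis=1)            # synthetic 5-pixel horizontal shift
    print(disparity_at(left, right, y=20, x=40)) # should recover the shift of 5
```

In a full disparity-map computation this per-pixel search would be repeated for every reference pixel; the synthetic example above only checks a single location.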
CN201410277105.1A 2014-06-19 2014-06-19 Matching power flow computational methods and device and parallax value calculating method and equipment Expired - Fee Related CN105335952B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410277105.1A CN105335952B (en) 2014-06-19 2014-06-19 Matching power flow computational methods and device and parallax value calculating method and equipment

Publications (2)

Publication Number Publication Date
CN105335952A true CN105335952A (en) 2016-02-17
CN105335952B CN105335952B (en) 2018-04-17

Family

ID=55286459

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410277105.1A Expired - Fee Related CN105335952B (en) 2014-06-19 2014-06-19 Matching power flow computational methods and device and parallax value calculating method and equipment

Country Status (1)

Country Link
CN (1) CN105335952B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120155747A1 (en) * 2010-12-17 2012-06-21 Electronics And Telecommunications Research Institute Stereo image matching apparatus and method
US20120237114A1 (en) * 2011-03-16 2012-09-20 Electronics And Telecommunications Research Institute Method and apparatus for feature-based stereo matching
CN102572485A (en) * 2012-02-02 2012-07-11 北京大学 Self-adaptive weighted stereo matching algorithm, stereo display and collecting device and system
CN103440653A (en) * 2013-08-27 2013-12-11 北京航空航天大学 Binocular vision stereo matching method

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107770512A (en) * 2016-08-22 2018-03-06 现代自动车株式会社 The system and method for disparity map are produced by matching stereo-picture
CN107770512B (en) * 2016-08-22 2020-11-06 现代自动车株式会社 System and method for generating disparity map by matching stereo images
CN108074250A (en) * 2016-11-10 2018-05-25 株式会社理光 Matching power flow computational methods and device
CN108074250B (en) * 2016-11-10 2022-01-04 株式会社理光 Matching cost calculation method and device
CN109724537A (en) * 2019-02-11 2019-05-07 吉林大学 A kind of binocular three-dimensional imaging method and system
CN110363235A (en) * 2019-06-29 2019-10-22 苏州浪潮智能科技有限公司 A kind of high-definition picture matching process and system
CN110363235B (en) * 2019-06-29 2021-08-06 苏州浪潮智能科技有限公司 High-resolution image matching method and system
CN111210481A (en) * 2020-01-10 2020-05-29 大连理工大学 Depth estimation acceleration method of multiband stereo camera
CN113407756A (en) * 2021-05-28 2021-09-17 山西云时代智慧城市技术发展有限公司 Lung nodule CT image reordering method based on self-adaptive weight
CN115018850A (en) * 2022-08-09 2022-09-06 深圳市领拓实业有限公司 Method for detecting burrs of punched hole of precise electronic part based on image processing
CN115018850B (en) * 2022-08-09 2022-11-01 深圳市领拓实业有限公司 Method for detecting burrs of punched hole of precise electronic part based on image processing

Also Published As

Publication number Publication date
CN105335952B (en) 2018-04-17

Similar Documents

Publication Publication Date Title
CN105335952A (en) Matching cost calculation method and apparatus, and parallax value calculation method and equipment
US9576367B2 (en) Object detection method and device
KR101622344B1 (en) A disparity caculation method based on optimized census transform stereo matching with adaptive support weight method and system thereof
US9418313B2 (en) Method for searching for a similar image in an image database based on a reference image
CN104700062A (en) Method and equipment for identifying two-dimension code
CN104574347A (en) On-orbit satellite image geometric positioning accuracy evaluation method on basis of multi-source remote sensing data
CN104915949A (en) Image matching algorithm of bonding point characteristic and line characteristic
CN105528588A (en) Lane line recognition method and device
CN110033484B (en) High canopy density forest sample plot tree height extraction method combining UAV image and TLS point cloud
CN105389774A (en) Method and device for aligning images
JP2016206837A5 (en)
CN106709512B (en) Infrared target detection method based on local sparse representation and contrast
JP2012073845A (en) Computer system and method for alignment of image and graph
CN109214254B (en) Method and device for determining displacement of robot
CN110084743B (en) Image splicing and positioning method based on multi-flight-zone initial flight path constraint
CN102446356A (en) Parallel and adaptive matching method for acquiring remote sensing images with homogeneously-distributed matched points
CN104915927A (en) Parallax image optimization method and apparatus
CN111553296B (en) Two-value neural network stereo vision matching method based on FPGA
CN111444923A (en) Image semantic segmentation method and device under natural scene
KR101461108B1 (en) Recognition device, vehicle model recognition apparatus and method
Laupheimer et al. The importance of radiometric feature quality for semantic mesh segmentation
CN105389825A (en) Image processing method and system
CN116091706A (en) Three-dimensional reconstruction method for multi-mode remote sensing image deep learning matching
JP2020173584A (en) Object detection device
CN105447451A (en) Method and device for retrieving object markers

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20180417