CN100590658C - Method for matching two dimensional object point and image point with bilateral constraints - Google Patents


Info

Publication number
CN100590658C
CN100590658C (application CN200810167782A)
Authority
CN
China
Prior art keywords
point
object point
image point
transformation relation
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN200810167782A
Other languages
Chinese (zh)
Other versions
CN101393639A (en)
Inventor
魏振忠
王巍
张广军
赵征
李庆波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN200810167782A
Publication of CN101393639A
Application granted
Publication of CN100590658C

Abstract

The invention discloses a method for matching two-dimensional object points and image points under a bidirectional constraint. A transformation relation between the object points and the image points of a spatial planar object is built on a weak perspective model. The method comprises: extracting the edges of the geometric features of the image of the spatial planar object to obtain an edge image point set, and sampling points on the geometric features of the spatial planar object to obtain an object point set; obtaining, from the object point set and the edge image point set, a matching matrix that characterizes the transformation relation between object points and image points; giving an initial estimate of the transformation relation between the object points and the image points; and obtaining an estimated transformation relation between the object points and the image points from the object point set, the edge image point set, the initial estimate and the matching matrix, and then obtaining the image point set matched with the object point set from the estimated transformation relation and the object point set. The method weakens the restrictions on the matching conditions of object points and image points, and enables the object points and image points of a spatial planar object to be matched quickly.

Description

Method for matching two-dimensional object points and image points under a bidirectional constraint
Technical field
The present invention relates to the field of image processing, and in particular to a method for matching two-dimensional object points and image points under a bidirectional constraint.
Background technology
A basic problem in machine vision is the matching of different point sets, which is significant for pose calculation, solving the camera's extrinsic parameters, image alignment, and so on. Point-set matching divides into two classes: matching image points between different images, and matching object points to image points. The following methods are currently in common use:
One is the class of point matching methods based on the Hausdorff distance, representative of methods that select points at random and match point sets according to the features of each point's neighborhood. The drawbacks of this class are a large computational load and the fact that each step of the algorithm processes only local features of the object, so the matching result sometimes falls into a local optimum and cannot reach the global optimum; moreover, such algorithms are not robust in the presence of outliers.
Another is the RANSAC point matching method, a point matching algorithm with good noise resistance. Like the neighborhood-feature methods, RANSAC also selects points at random; a threshold must be set when selecting points, and matching is carried out on the point sets that satisfy the threshold relation. Its drawbacks are likewise a large computational load, processing of only local features of the object at each step, and the difficulty of choosing a suitable threshold when selecting points.
There are also methods based on geometric invariants, which must compute invariants of coplanar points, or of coplanar points and coplanar lines. For each object an accumulator array representing the pose space is built, each element of the array corresponding to a "bucket" of the pose space; geometric invariants are constructed from each geometric image of the object, hypotheses are generated from each structural feature group of the object and the corresponding pose parameters determined, and the accumulator value of each hypothesis generated by an object structural feature group is incremented by 1. The geometric invariants thus vote, and the vote yields the point-set matching. The drawbacks of these methods are that the matching is restricted by the geometric features of the object, and that the size of the voting "buckets" is hard to determine.
It can be seen that the restrictions imposed by existing point-set matching algorithms are relatively strict, and that during computation they easily fall into locally optimal matches, failing to reach the globally optimal match.
Summary of the invention
In view of this, the main purpose of the present invention is to provide a method for matching two-dimensional object points and image points under a bidirectional constraint, which weakens the restrictions on the matching conditions of object points and image points and matches the object points and image points of a spatial planar object.
To achieve the above purpose, the technical scheme of the present invention is realized as follows:
The invention provides a method for matching two-dimensional object points and image points under a bidirectional constraint, comprising:
a0. establishing the transformation relation between the object points and the image points of a spatial planar object based on a weak perspective model;
a. extracting the edges of the geometric features of the image of the spatial planar object to obtain an edge image point set, and sampling points on the geometric features of the spatial planar object to obtain an object point set;
b. obtaining, from the object point set and the edge image point set, a matching matrix that characterizes the transformation relation between object points and image points;
c. giving an initial estimate of the transformation relation between the object points and the image points;
d. obtaining an estimated transformation relation between object points and image points from the object point set, the edge image point set, the initial estimate and the matching matrix, and obtaining the image point set matched with the object point set from the resulting estimated transformation relation and the object point set.
In step a, the edges of the geometric features are extracted by applying the Canny operator to the image of the spatial planar object.
In step c, the initial estimate of the transformation relation between object points and image points is given by a random number generator.
In step d, the estimated transformation relation between object points and image points is obtained with a deterministic annealing algorithm.
Step d further comprises:
d1. setting the initial parameters of the deterministic annealing algorithm according to the initial value of the matching matrix;
d2. updating the matching matrix with the Sinkhorn algorithm;
d3. computing the estimated transformation relation between object points and image points from the updated matching matrix with the Gauss-Seidel iterative method;
d4. obtaining the image point set matched with the object point set from the estimated transformation relation and the object point set, using the transformation expression between object points and image points.
Updating the matching matrix with the Sinkhorn algorithm in step d2 comprises:
initializing the matching matrix;
normalizing each row and each column of the matching matrix;
repeating the normalization of the matching matrix in a loop.
The matching method provided by the present invention expresses the matching of object points and image points in an explicit matrix form, which is very intuitive to handle. Because the matching is completed within the deterministic annealing process, which computes over the global features of the object, the matching result does not easily fall into a locally optimal match, so a globally optimal matching result is obtained as far as possible.
In addition, the present invention does not require the spatial planar object to have points with distinctive features: the object processed can be an edge point set or a scattered point set on the plane. The matching algorithm is independent of the geometric shape of the planar object, so its scope of application is wide; it can be used in cases of occlusion and clutter, and with spurious or missing image points. Moreover, the deterministic annealing algorithm used to compute the matching of object points and image points has shown good matching results in experiments under Gaussian noise of different levels of standard deviation, so the method of the present invention is robust in Gaussian noise environments of different levels and has high practical value.
Description of drawings
Fig. 1 is a flow chart of the matching method of the present invention;
Fig. 2 is a schematic diagram of the weak perspective model of a plane;
Fig. 3 is a schematic diagram of circular-object image geometric features extracted with the Canny operator.
Embodiment
The technical solution of the present invention is further elaborated below in conjunction with the drawings and specific embodiments.
The invention provides an object point and image point matching method for spatial planar objects based on a weak perspective model; it is a bidirectionally constrained matching method that computes the matching between object points and image points of a spatial planar object by combining a deterministic annealing algorithm with a bidirectional constraint algorithm.
As shown in Fig. 1, the matching method of the present invention mainly comprises the following steps:
Step 101: establish the transformation relation between the object points and the image points of a spatial planar object based on the weak perspective model.
Here, the transformation relation between planar object points and image points under weak perspective is the pair {A, B}, where A is a 2 × 2 affine transformation matrix and B is a 2 × 1 vector representing the image coordinates of the object's centroid.
By establishing the transformation relation between object points and image points, the transformation expression between them is obtained. Thus, when the object points are known, the image points matching them one by one can be obtained from this expression.
The weak perspective model is established as follows:
The present invention applies to any spatial planar object satisfying weak perspective, which in general requires the distance from the object to the camera to be at least ten times the variation of the object's surface depth. If the camera's field of view is small and the distance from the object to the camera is large relative to the variation of surface depth, being at least ten times that variation, then the depth of every point on the object can be approximated by a fixed depth value z_0, the depth coordinate of the object's centroid on the optical axis.
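As a quick numeric sanity check on this approximation, the sketch below uses the distances from the experiment section (a circle of radius 2 m about 300 km from a camera with f = 66.885 m) and a single point along one axis; the 2 m depth relief is an illustrative assumption.

```python
# Weak-perspective sanity check: when the object's depth relief dz is tiny
# compared with its distance z0, projecting with the fixed depth z0
# (formula (1)) barely differs from true perspective projection.
f = 66.885           # focal length, from the experiment section (meters)
z0 = 300_000.0       # centroid distance, ~300 km
x_c, dz = 2.0, 2.0   # a point 2 m off-axis with 2 m of depth relief

exact = f * x_c / (z0 + dz)   # true perspective projection
weak = (f / z0) * x_c         # weak-perspective projection
rel_err = abs(weak - exact) / abs(exact)   # equals dz / z0
```

Here the relative error is dz/z0, about 6.7e-6, far below the one-tenth depth-to-distance bound the text requires.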
Suppose the camera focal length is f, the coordinates of an arbitrary point P on the spatial planar object in the camera coordinate system are (x_c, y_c, z_c)^T, and (x, y)^T are the coordinates of the image point of P on the camera's image plane. The weak perspective model can then be expressed as formula (1):

$$\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} f/z_0 & 0 \\ 0 & f/z_0 \end{pmatrix} \begin{pmatrix} x_c \\ y_c \end{pmatrix} \qquad (1)$$

where f/z_0 is the scaling constant.
Once the weak perspective model is set up, the transformation expression between object points and image points is derived from the model expression, formula (1), as follows:
If the geometric feature of the spatial planar object is a circle, let the coordinates of the center of the space circle in the camera coordinate system be o_w(x_c0, y_c0, z_c0), as in the weak perspective model diagram of Fig. 2. With o_w as origin, establish the three-dimensional coordinate system o_w x_w y_w z_w of the space circle. Let o'_w be the centroid of the space circle, coinciding with the circle center o_w; take the plane through the centroid o'_w parallel to the image plane xoy as the coordinate plane x'_w o'_w y'_w, and with o'_w as origin establish the three-dimensional coordinate system o'_w x'_w y'_w z'_w, where the axes satisfy o'_w x'_w // ox and o'_w y'_w // oy with consistent directions. For an arbitrary point p'_w on the plane x'_w o'_w y'_w, let its coordinates in the camera coordinate system o_c x_c y_c z_c be (x_c, y_c, z_c0) and its coordinates in o'_w x'_w y'_w z'_w be (x'_w, y'_w, 0). Substituting z_0 = z_c0 into formula (1) reduces it to formula (2):

$$\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} s & 0 \\ 0 & s \end{pmatrix} \begin{pmatrix} x_c \\ y_c \end{pmatrix} \qquad (2)$$

where s = f/z_c0 is the scaling constant.
From the transformation relation between the coordinate systems o'_w x'_w y'_w z'_w and o_c x_c y_c z_c, we have (x_c, y_c)^T = (x'_w, y'_w)^T + (x_c0, y_c0)^T; substituting this into formula (2) gives formula (3):

$$\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} s & 0 \\ 0 & s \end{pmatrix} \begin{pmatrix} x'_w \\ y'_w \end{pmatrix} + \begin{pmatrix} b_1 \\ b_2 \end{pmatrix} \qquad (3)$$

where (b_1, b_2)^T = s (x_c0, y_c0)^T.
The rotational transformation between the coordinate systems o_w x_w y_w z_w and o'_w x'_w y'_w z'_w is described by formula (4):

$$\begin{pmatrix} x'_w \\ y'_w \\ z'_w \end{pmatrix} = R \begin{pmatrix} x_w \\ y_w \\ z_w \end{pmatrix} \qquad (4)$$

where R = (r_ij), 1 ≤ i, j ≤ 3, is an orthogonal matrix.
For a point on the space circle's plane x_w o_w y_w we have z_w = 0, so formula (4) yields formula (5):

$$\begin{pmatrix} x'_w \\ y'_w \end{pmatrix} = \begin{pmatrix} r_{11} & r_{12} \\ r_{21} & r_{22} \end{pmatrix} \begin{pmatrix} x_w \\ y_w \end{pmatrix} \qquad (5)$$
Substituting formula (5) into formula (3) gives the transformation {A, B} between object points and image points:

$$\begin{pmatrix} x \\ y \end{pmatrix} = A \begin{pmatrix} x_w \\ y_w \end{pmatrix} + B \qquad (6)$$

where A is the 2 × 2 affine transformation matrix

$$A = s \begin{pmatrix} r_{11} & r_{12} \\ r_{21} & r_{22} \end{pmatrix}$$

and B = (b_1, b_2)^T is a 2 × 1 vector representing the image coordinates of the object's centroid. From formula (6), the image coordinates of the space circle's centroid are B = s (x_c0, y_c0)^T.
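The transformation of formula (6) is just an affine map; a minimal sketch of applying an assumed {A, B} to a set of planar object points (the values and function name here are illustrative, not from the patent):

```python
import numpy as np

def project_points(object_pts, A, B):
    """Apply the weak-perspective transform of formula (6): p = A P + B.

    object_pts: (h, 2) array of planar object points (x_w, y_w).
    A: 2x2 affine matrix; B: length-2 vector (image of the centroid).
    Returns an (h, 2) array of predicted image points.
    """
    object_pts = np.asarray(object_pts, dtype=float)
    return object_pts @ np.asarray(A, dtype=float).T + np.asarray(B, dtype=float)

# Example: pure scaling s = 0.5 with the centroid imaged at (100, 80).
s = 0.5
A = s * np.eye(2)
B = np.array([100.0, 80.0])
pts = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0]])
img = project_points(pts, A, B)
```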
Step 102: extract the edges of the geometric features of the spatial planar object's image to obtain the edge image point set, and sample points on the geometric features of the spatial planar object to obtain the object point set.
Here, any existing algorithm can be used to extract the edges of the geometric features of the image; the Canny operator is taken as the example below. Extract the edges of the image's geometric features with the Canny edge operator to obtain the edge image point set {p_j}, 1 ≤ j ≤ g, where g, the number of edge image points, is determined by the Canny operator. Sample object points at intervals along the edge of the object's planar geometric feature to obtain the object point set {P_i}, 1 ≤ i ≤ h, h ≥ g, where h is the number of object points. The sampling follows the prior art and is random; the intervals between object points may be equal or unequal, but the number of object points must be no less than the number of edge image points.
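The patent specifies the Canny operator for this step; as a dependency-free illustration, the sketch below uses a plain Sobel gradient-magnitude threshold as a simplified stand-in (the function name and threshold are assumptions, and a real implementation would use a proper Canny detector such as OpenCV's):

```python
import numpy as np

def edge_point_set(gray, thresh=0.5):
    """Simplified stand-in for the Canny step: mark pixels whose Sobel
    gradient magnitude exceeds `thresh` * max magnitude as edge points.
    Returns a (g, 2) array of (x, y) pixel coordinates."""
    gray = np.asarray(gray, dtype=float)
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(gray, 1, mode="edge")
    H, W = gray.shape
    gx = sum(kx[a, b] * pad[a:a + H, b:b + W] for a in range(3) for b in range(3))
    gy = sum(ky[a, b] * pad[a:a + H, b:b + W] for a in range(3) for b in range(3))
    mag = np.hypot(gx, gy)
    ys, xs = np.nonzero(mag > thresh * mag.max())
    return np.stack([xs, ys], axis=1).astype(float)

# A white square on black: the detected points sit on the square's border,
# and constant interior pixels produce no edge points.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
edges = edge_point_set(img)
```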
Step 103: obtain, from the object point set and the edge image point set, a matching matrix that characterizes the transformation relation between object points and image points.
Suppose the planar object point set obtained is {P_i}, P_i = (x_wi, y_wi)^T, 1 ≤ i ≤ h, and the edge image point set is {p_j}, p_j = (x_j, y_j)^T, 1 ≤ j ≤ g, h ≥ g. Formula (6) expresses the transformation between object points and image points, so the constraint between the object point set {P_i} and the edge image point set {p_j} can be expressed through formula (6), i.e. through the transformation {A, B}. Each point of {P_i} can match at most one point of the edge image point set {p_j}, and at the same time each point of {p_j} can match at most one point of {P_i}; this is a bidirectionally constrained optimization problem, expressed by formula (7):
$$Q_{g \times h} = \begin{pmatrix} -(x_1 - x'_1)^2 & -(x_1 - x'_2)^2 & \cdots & -(x_1 - x'_h)^2 \\ -(x_2 - x'_1)^2 & -(x_2 - x'_2)^2 & \cdots & -(x_2 - x'_h)^2 \\ \vdots & \vdots & \ddots & \vdots \\ -(x_g - x'_1)^2 & -(x_g - x'_2)^2 & \cdots & -(x_g - x'_h)^2 \end{pmatrix} + \begin{pmatrix} -(y_1 - y'_1)^2 & \cdots & -(y_1 - y'_h)^2 \\ \vdots & \ddots & \vdots \\ -(y_g - y'_1)^2 & \cdots & -(y_g - y'_h)^2 \end{pmatrix} \qquad (7)$$

where (x'_j, y'_j)^T, 1 ≤ j ≤ h, are the estimated image points obtained by applying the transformation {A, B} to the object points.
Formula (7) can be written as the matrix Q:

$$Q_{g \times h} = \begin{pmatrix} q_{11} & q_{12} & \cdots & q_{1h} \\ q_{21} & q_{22} & \cdots & q_{2h} \\ \vdots & \vdots & \ddots & \vdots \\ q_{g1} & q_{g2} & \cdots & q_{gh} \end{pmatrix} \qquad (8)$$

From the bidirectionally constrained optimization of the object point set and the edge image point set, the matrix Q_{g×h} is obtained. The matching problem between object points and image points is then converted into seeking max q_ij, the maximum over row i and column j of Q_{g×h}. A g × h matching matrix M characterizing the matching relation between object points and image points can thus be constructed. M consists of elements m_ij, and the numbers of rows and columns of M are identical to those of Q_{g×h}. Find the maximal q_ij in each row or column of Q_{g×h} and set the m_ij at the corresponding position of M to 1, i.e. m_ij = 1, meaning that image point i matches object point j. The resulting matching matrix M has exactly one entry equal to 1 in each row and each column, with all remaining entries 0.
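The row/column maximum search under the one-to-one constraint can be sketched as a greedy pass over Q: repeatedly take the largest remaining q_ij, set m_ij = 1, and strike out that row and column. This is an illustrative greedy sketch, not an optimal-assignment solver:

```python
import numpy as np

def hard_match_matrix(Q):
    """Greedy construction of a hard match matrix M from Q (formula (8)):
    each selected m_ij = 1 consumes its row and column, so every row and
    column of M holds at most one 1 (the bidirectional constraint)."""
    Q = np.asarray(Q, dtype=float).copy()
    g, h = Q.shape
    M = np.zeros((g, h))
    for _ in range(min(g, h)):
        i, j = np.unravel_index(np.argmax(Q), Q.shape)
        M[i, j] = 1.0
        Q[i, :] = -np.inf   # row i is used up
        Q[:, j] = -np.inf   # column j is used up
    return M

# Two image points vs. two estimates: the greedy pass pairs each image
# point with its nearest estimate under the one-to-one constraint.
M0 = hard_match_matrix(np.array([[0.0, -5.0], [-5.0, -1.0]]))
```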
Here, the phenomenon of missing points must be considered. For example, a cube has 8 corner points; if it is photographed from the front, the four corner points on its back are covered by the four corner points on its front, so in the camera's coordinate system the four back corners coincide with the four front corners, and the four back corners are exactly missing points. To account for the inaccuracy that missing points bring to object pose computation, a row and a column are added to the matching matrix M, so M becomes a (g+1) × (h+1) matrix. If m_{i,h+1} = 1, 1 ≤ i ≤ g, then image point i cannot be matched with any object point; likewise, m_{g+1,j} = 1, 1 ≤ j ≤ h, means object point j cannot be matched with any image point. This discrete problem can be converted into a continuous one by introducing a control variable β (β > 0) and initializing the matching matrix M as

$$m^0_{ij} = \gamma\, e^{\beta (q_{ij} + \alpha)}, \qquad 1 \le i \le g,\; 1 \le j \le h \qquad (9)$$

which guarantees m^0_ij > 0. The subsequent deterministic annealing algorithm is carried out according to formula (9), where α, γ, β are the parameter values of the deterministic annealing algorithm: γ is a scaling constant coefficient; β simulates the temperature in the deterministic annealing algorithm, with a very small initial value; and α is a very small constant expressing the closeness of q_ij to 0.
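Formula (9) plus the extra outlier row and column can be sketched as below; the precise form of (9) as reconstructed here and the constants (γ and α as in the parameter settings given later, 10⁻³ for the slack entries) are assumptions:

```python
import numpy as np

def init_match_matrix(Q, beta=1e-5, gamma=1.0, alpha=1e-5, eps=1e-3):
    """Soft initialisation of the (g+1) x (h+1) match matrix per formula (9):
    m0_ij = gamma * exp(beta * (q_ij + alpha)) > 0 in the inner block, plus
    an outlier row/column set to the small constant eps for unmatchable
    points. Sketch only: the exact form of (9) is reconstructed here."""
    g, h = Q.shape
    M = np.full((g + 1, h + 1), eps)
    M[:g, :h] = gamma * np.exp(beta * (Q + alpha))
    return M

# Closer pairs (q_ij nearer 0) get strictly larger initial soft weights.
Q1 = np.array([[0.0, -1e5], [-1e5, 0.0]])
M1 = init_match_matrix(Q1)
```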
Step 104: give the initial estimate of the transformation relation between object points and image points.
Here, a random number generator can be used to give the initial estimate of the transformation, {A_0, B_0}, where A_0 is a 2 × 2 matrix and B_0 is a 2 × 1 vector.
From the initial estimate {A_0, B_0} and the known object point set, the estimated image point set can be obtained; then, from the edge image point set {p_j} and the estimated image point set, the values of the matrix Q_{g×h} can be computed. These values serve as the basis for setting the parameter β of the deterministic annealing algorithm.
Preferably, the magnitude of the entries of A_0 is set to about 10; then, when matching object points and image points, the global function to be optimized,

$$E = \sum_{i=1}^{g} \sum_{j=1}^{h} m_{ij}\, \lVert p_i - (A P_j + B) \rVert^2,$$

easily converges to the global optimum. When it does, the value of the transformation {A, B} is exactly the estimated transformation {A', B'} that is ultimately sought; this is explained in the later steps.
From the initial transformation {A_0, B_0}, the transformation {A, B} of formula (6) gives the estimated image point set {p'_i}, 1 ≤ i ≤ h, where h is the number of object points. Following the matrix construction of the prior art, build the matrices U and V from the edge image point set {p_j}:

$$U = \begin{pmatrix} x_1 & x_1 & \cdots & x_1 \\ x_2 & x_2 & \cdots & x_2 \\ \vdots & \vdots & \ddots & \vdots \\ x_g & x_g & \cdots & x_g \end{pmatrix}_{g \times h} \qquad V = \begin{pmatrix} y_1 & y_1 & \cdots & y_1 \\ y_2 & y_2 & \cdots & y_2 \\ \vdots & \vdots & \ddots & \vdots \\ y_g & y_g & \cdots & y_g \end{pmatrix}_{g \times h} \qquad (10)$$

and from the estimated image point set {p'_i} build U' and V':

$$U' = \begin{pmatrix} x'_1 & x'_2 & \cdots & x'_h \\ x'_1 & x'_2 & \cdots & x'_h \\ \vdots & \vdots & \ddots & \vdots \\ x'_1 & x'_2 & \cdots & x'_h \end{pmatrix}_{g \times h} \qquad V' = \begin{pmatrix} y'_1 & y'_2 & \cdots & y'_h \\ y'_1 & y'_2 & \cdots & y'_h \\ \vdots & \vdots & \ddots & \vdots \\ y'_1 & y'_2 & \cdots & y'_h \end{pmatrix}_{g \times h} \qquad (11)$$

Substituting formulas (10) and (11) into formula (7) yields the matrix Q_{g×h}; the values of Q_{g×h} are used to decide the value of β, as described in the steps below.
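Formulas (10), (11) and (7) amount to broadcasting the two coordinate lists against each other; a small sketch (the point values are invented for illustration):

```python
import numpy as np

p = np.array([[1.0, 2.0], [3.0, 4.0]])                # extracted image points (g = 2)
pe = np.array([[0.0, 0.0], [1.0, 2.0], [5.0, 5.0]])   # estimated image points (h = 3)
g, h = len(p), len(pe)

# Formula (10): U, V repeat each extracted coordinate along the columns.
U = np.tile(p[:, 0:1], (1, h))
V = np.tile(p[:, 1:2], (1, h))
# Formula (11): U', V' repeat each estimated coordinate along the rows.
Up = np.tile(pe[:, 0], (g, 1))
Vp = np.tile(pe[:, 1], (g, 1))
# Formula (7): elementwise negative squared differences.
Q = -(U - Up) ** 2 - (V - Vp) ** 2
```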
Step 105: from the object point set, the edge image point set, the initial estimate of the transformation relation between object points and image points, and the matching matrix, obtain the estimated transformation relation between object points and image points; then, from the estimated transformation and the object point set, obtain the image point set matched with the object point set.
The key of this step is computing the estimated transformation relation between object points and image points: this estimated transformation is the true transformation relation between them, from which the actual image points matching the object points one by one, and hence the actual image point set, can be obtained. The estimated transformation is found by the deterministic annealing algorithm, as follows:
The deterministic annealing algorithm is carried out according to formula (9), m^0_ij = γ e^{β(q_ij + α)}, 1 ≤ i ≤ g, 1 ≤ j ≤ h. The parameter to be solved by the deterministic annealing algorithm is the transformation relation {A, B} between object points and image points; the computation yields the estimated transformation {A', B'}, from which the image point set {p_i} matching the object point set {P_i} is obtained. In each pass of the deterministic annealing process, a value of the matching matrix M is computed, and from that value a new transformation {A_{i+1}, B_{i+1}} between object points and image points is obtained. When {A_{i+1}, B_{i+1}} makes the global optimization function E reach the global optimum, i.e. the iteration stopping criterion of the deterministic annealing (defined through Δ and tol_1 below) is satisfied, the annealing loop ends, and this {A_{i+1}, B_{i+1}} is the estimated transformation {A', B'}. The concrete procedure is:
First, the parameter values of the deterministic annealing algorithm, i.e. the parameters of formula (9), must be set.
They are set as follows: α is a very small value, generally α = 10^-5. γ has little influence on the iteration in the annealing process and can be set to 1. β influences the iteration more strongly; writing β = 1/T, where T represents the temperature parameter of the deterministic annealing algorithm, β simulates the temperature of the algorithm. The initial value β_0 of β should be set with reference to the order of magnitude of the elements of Q_{g×h}, whose values were computed in step 104 from the estimated image point set {p'_i} and the image point set {p_j}: if most elements of Q_{g×h} have order of magnitude 10^5, then β_0 should be set to β_0 = 10^-5. A β_0 set too high or too low easily makes the iteration fall into a local minimum. The stop value β_final of β is usually set to 0.5, and the update multiple β_update to 1.05, with β_i the β value of each annealing pass and β_{i+1} = β_i × β_update. Δ denotes the mean Euclidean distance between the estimated image points and the extracted true image points, and tol_1 is a very small value related to the noise level: noiseStd denotes the standard deviation of the noise, but in real image processing the noise standard deviation is unknown, so the value of tol_1 is usually set to 0.5.
After the parameter values of the deterministic annealing algorithm are set, update the matching matrix M with the Sinkhorn algorithm, as follows:
1. Initialize the matching matrix M according to formula (9), m^0_ij = γ e^{β(q_ij + α)}, 1 ≤ i ≤ g, 1 ≤ j ≤ h; assign a very small constant, e.g. 10^-3, to m_{i,h+1}, 1 ≤ i ≤ g+1, and a very small constant, e.g. 10^-3, to m_{g+1,j}, 1 ≤ j ≤ h+1.
2. Normalize each row element and each column element of the matrix M. Each row element is normalized with formula (12):

$$m^{l+1}_{ij} = \frac{m^l_{ij}}{\sum_{j=1}^{h+1} m^l_{ij}}, \qquad 1 \le i \le g,\; 1 \le j \le h+1 \qquad (12)$$

where l denotes the number of normalization passes carried out.
Each column element is normalized with formula (13):

$$m^{l+1}_{ij} = \frac{m^l_{ij}}{\sum_{i=1}^{g+1} m^l_{ij}}, \qquad 1 \le i \le g+1,\; 1 \le j \le h \qquad (13)$$

where l again denotes the number of normalization passes carried out.
3. Repeat the normalization of the matching matrix M in a loop.
Here, the end condition of the loop is that tol_2 falls below a specified value, e.g. 0.005, where

$$tol_2 = \sum_{i} \sum_{j} \left| m^{l+1}_{ij} - m^l_{ij} \right|.$$

Each completed normalization pass of the matching matrix M produces a new matrix M; substituting it into the expression above gives a tol_2, and if tol_2 < 0.005, the normalization of M is stopped. Alternatively, when the number of normalization passes exceeds a set maximum, e.g. 80 passes, the normalization of M is likewise stopped.
By the sinkhorn algorithm coupling matrix M is whenever circulated after normalization calculates, will obtain new coupling matrix M, calculate the new object point and the transformation relation { A of picture point according to the coupling matrix M that calculates I+1, B I+1, order
∂ E ∂ a 11 = 0 ∂ E ∂ a 12 = 0 ∂ E ∂ a 21 = 0 ∂ E ∂ a 22 = 0 ∂ E ∂ b 1 = 0 ∂ E ∂ b 2 = 0 - - - ( 14 )
Wherein, a 11, a 12, a 21, a 22Be the element in the matrix A, b 1, b 2Be the element in the matrix B, with the global optimization function
Figure C20081016778200151
Bring formula (14) into, can obtain system of linear equations CX=N after then formula (14) being launched, wherein C is one 6 * 6 a matrix:
Figure C20081016778200152
X=(a 11, a 12, a 21, a 22, b 1, b 2) T, N is one 6 * 1 a matrix:
It should be pointed out that when constructing the linear system CX = N, a technical simplification can be made for concrete problems. When the image coordinates of the center of the object's geometric feature can be computed, this image point can approximately replace the image point of the object's centroid; by formula (6), B represents the image coordinates of the object's centroid. For example, when the geometric feature of the spatial planar object is a circle, the image of this space circle is an ellipse; the elliptical edge image point set can then be obtained by Canny extraction, and the image coordinates of the ellipse's center obtained by ellipse fitting. Because the distortion offset between the ellipse-center image point and the space-circle-centroid image point is only a few microns to a few tens of microns, the image coordinates of the ellipse center can approximately replace those of the space circle's centroid. B in formula (6) then becomes a known quantity, and the only unknown to be solved in the deterministic annealing process is the 2 × 2 matrix A.
After B is approximately replaced by the image coordinates of the center of the object's geometric feature, formula (6) can be expressed as:

$$\begin{pmatrix} x' \\ y' \end{pmatrix} = \begin{pmatrix} x - b_1 \\ y - b_2 \end{pmatrix} = A \begin{pmatrix} x_w \\ y_w \end{pmatrix}$$

Now the global function to be optimized in the deterministic annealing process is

$$E = \sum_{i=1}^{g} \sum_{j=1}^{h} m_{ij}\, \lVert p'_i - A P_j \rVert^2$$

where p'_i = (x', y')^T. The linear system CX = N is then constructed with C a 4 × 4 matrix, X = (a_11, a_12, a_21, a_22)^T, and N the 4 × 1 matrix

$$N = \sum_{i=1}^{g} \sum_{j=1}^{h} m_{ij} \begin{pmatrix} u_i\, x_{wj} \\ u_i\, y_{wj} \\ v_i\, x_{wj} \\ v_i\, y_{wj} \end{pmatrix}$$

with u_i = x_i - b_1 and v_i = y_i - b_2 the translated image coordinates.
Then, solving the linear system CX = N with the Gauss-Seidel iterative method yields the new transformation {A_{i+1}, B_{i+1}}.
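The reduced case above (B already known, four unknowns) can be sketched as follows: assemble the normal equations C X = N from the matching matrix and solve them by Gauss-Seidel sweeps. The point sets, match matrix, and function name are hypothetical illustration, not the patent's code:

```python
import numpy as np

def solve_for_A(m, obj, img_c, sweeps=200):
    """Assemble the normal equations C X = N for X = (a11, a12, a21, a22)^T,
    i.e. the minimiser of sum_ij m_ij * ||p'_i - A P_wj||^2 with B already
    known, and solve them by Gauss-Seidel iteration."""
    w = m.sum(axis=0)                        # per-object-point weights sum_i m_ij
    Sxx = np.sum(w * obj[:, 0] ** 2)
    Sxy = np.sum(w * obj[:, 0] * obj[:, 1])
    Syy = np.sum(w * obj[:, 1] ** 2)
    C = np.array([[Sxx, Sxy, 0.0, 0.0],
                  [Sxy, Syy, 0.0, 0.0],
                  [0.0, 0.0, Sxx, Sxy],
                  [0.0, 0.0, Sxy, Syy]])
    N = np.array([img_c[:, 0] @ m @ obj[:, 0],
                  img_c[:, 0] @ m @ obj[:, 1],
                  img_c[:, 1] @ m @ obj[:, 0],
                  img_c[:, 1] @ m @ obj[:, 1]])
    X = np.zeros(4)
    for _ in range(sweeps):                  # Gauss-Seidel sweeps
        for k in range(4):
            X[k] = (N[k] - C[k] @ X + C[k, k] * X[k]) / C[k, k]
    return X.reshape(2, 2)

# Demo with an exact one-to-one match matrix (identity) and noiseless data
obj = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, -1.0], [-1.0, 2.0]])
A_true = np.array([[1.5, 0.2], [-0.3, 1.1]])
m = np.eye(5)
A_est = solve_for_A(m, obj, obj @ A_true.T)
print(A_est)  # ≈ A_true
```

Because C is block-diagonal and symmetric positive definite here, the Gauss-Seidel sweeps converge quickly; a direct solver would give the same result.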
After the deterministic annealing step is executed, β is updated by β_{i+1} = β_i · β_update, and the deterministic annealing step is executed again, producing a new matching matrix M and a new transformation {A_{i+1}, B_{i+1}}. The deterministic annealing step is repeated in this loop until the iteration stopping criterion of deterministic annealing is satisfied, i.e. β_final = 0.5. Once the stopping criterion is reached, the algorithm exits the deterministic annealing loop.
It should be pointed out that each pass of the deterministic annealing process produces a new transformation {A_{i+1}, B_{i+1}}. When a transformation {A_{i+1}, B_{i+1}} drives the global optimization function to its global optimum, i.e. the iteration stopping criterion of deterministic annealing is satisfied, the deterministic annealing loop ends, and this {A_{i+1}, B_{i+1}} is the estimated transformation relation {A', B'} between object points and picture points.
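The deterministic annealing loop just described (build the matching matrix from residuals at temperature 1/β, normalize it with Sinkhorn, re-estimate the transform, increase β) can be sketched as follows. This is a simplified softassign-style illustration under assumed parameters, not the patent's implementation: the β schedule depends on coordinate scale, and the transform update is solved directly by weighted least squares rather than by Gauss-Seidel iteration:

```python
import numpy as np

def softassign_match(obj, img, beta0, beta_update, beta_final, sinkhorn_iters=30):
    """Sketch of the deterministic-annealing (softassign) loop: at each
    temperature 1/beta, build the matching matrix from the current residuals,
    make it (approximately) doubly stochastic with Sinkhorn row/column
    normalisation, then re-estimate the transform {A, B}."""
    A, B = np.eye(2), np.zeros(2)               # initial transform estimate
    beta, m = beta0, None
    while beta < beta_final:
        # matching matrix from residuals at the current temperature
        d2 = ((img[:, None, :] - (obj @ A.T + B)[None, :, :]) ** 2).sum(-1)
        m = np.exp(-beta * d2)
        for _ in range(sinkhorn_iters):         # Sinkhorn normalisation
            m /= m.sum(axis=1, keepdims=True)
            m /= m.sum(axis=0, keepdims=True)
        # weighted least-squares update of {A, B} from the soft matches
        W = np.column_stack([obj, np.ones(len(obj))])    # rows (x_w, y_w, 1)
        S = np.linalg.solve(W.T @ (m.sum(axis=0)[:, None] * W),
                            W.T @ (m.T @ img))
        A, B = S[:2].T, S[2]
        beta *= beta_update
    return A, B, m

# Toy data: 12 object points on a circle, imaged by a known near-identity
# transform and presented in scrambled order (unknown correspondence).
ang = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
obj = np.column_stack([np.cos(ang), np.sin(ang)])
A_true = np.array([[1.02, 0.02], [-0.01, 0.99]])
B_true = np.array([0.05, -0.03])
img = np.roll(obj @ A_true.T + B_true, 5, axis=0)
A_est, B_est, m = softassign_match(obj, img, beta0=50.0, beta_update=2.0,
                                   beta_final=400.0)
```

On this noiseless toy problem the loop recovers {A, B} essentially exactly once the matching matrix sharpens to the correct permutation.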
Substituting the estimated transformation relation {A', B'} and the object point set {P_i} into formula (6) yields, by computation, the picture point set {p_i} matched with the object point set.
The solution of the present invention is illustrated below by a hardware-in-the-loop simulation experiment.
The Canny operator is used to extract the geometric-feature edges of the spatial planar object image; the image from which the edges are extracted is a virtual image generated by a semi-physical simulation platform developed in the laboratory. Figure 3 is a schematic diagram of Canny extraction of the geometric feature of a circular object image. The image is generated from a virtual circle of radius 2 meters whose center is about 300 kilometers from the virtual camera; the camera focal length is 66.885 meters, the generated image is 1024 × 768 pixels, and the image of the circle occupies roughly 80 × 80 pixels of the whole image plane.
First the Canny operator extracts the edge of the circle, yielding 100 edge picture points {p_j}; 100 object points {P_i} are taken at even intervals on the object circle. The transformation relation {A, B} between object points and picture points is then computed, and from {A, B} the picture points {p_i} matching the object points {P_i} one to one are obtained by computation.
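The final matching step of the experiment, mapping each object point through formula (6) to its matched picture point, can be illustrated with hypothetical numbers (the estimated transform {A', B'} below is assumed for the example, not taken from the patent):

```python
import numpy as np

# 100 object points evenly spaced on a circle of radius 2 m, mapped by an
# assumed estimated transform {A', B'} via formula (6): p_i = A' P_i + B'.
theta = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
P = 2.0 * np.column_stack([np.cos(theta), np.sin(theta)])   # object points (m)
A_est = np.array([[20.0, 1.0], [-1.0, 20.0]])               # assumed A' (px/m)
B_est = np.array([512.0, 384.0])                            # assumed B' (px)
p = P @ A_est.T + B_est        # matched picture point for each object point
print(p.shape)                 # (100, 2)
```

Each row of `p` is the picture point matched one to one with the corresponding object point.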
The above are only preferred embodiments of the present invention and are not intended to limit the protection scope of the present invention.

Claims (6)

1. A method for matching two-dimensional object points and picture points with bidirectional constraints, characterized in that the method comprises:
a0. establishing a transformation relation between object points and picture points of a spatial planar object based on the weak perspective model;
a. extracting the edges of the geometric features of the spatial planar object image to obtain an edge picture point set, and taking points on the geometric features of the spatial planar object to obtain an object point set;
b. obtaining, from said object point set and said edge picture point set, a matching matrix characterizing the transformation relation between object points and picture points;
c. giving an initial variation estimate of the transformation relation between object points and picture points;
d. obtaining an estimated transformation relation between object points and picture points from said object point set, said edge picture point set, said initial variation estimate and said matching matrix, and obtaining, from the resulting estimated transformation relation and said object point set, a picture point set matched with said object point set.
2. The method for matching two-dimensional object points and picture points with bidirectional constraints according to claim 1, characterized in that extracting the edges of the geometric features in step a is: extracting the edges of the geometric features of the spatial planar object image with the Canny operator.
3. The method for matching two-dimensional object points and picture points with bidirectional constraints according to claim 1, characterized in that giving the initial variation estimate of the transformation relation between object points and picture points in step c is: producing said initial variation estimate with a random number generator.
4. The method for matching two-dimensional object points and picture points with bidirectional constraints according to claim 1, characterized in that obtaining the estimated transformation relation between object points and picture points in step d is: obtaining the estimated transformation relation between object points and picture points with the deterministic annealing algorithm.
5. The method for matching two-dimensional object points and picture points with bidirectional constraints according to claim 4, characterized in that step d further comprises:
d1. setting the initial parameters of the deterministic annealing algorithm according to the initial value of said matching matrix;
d2. updating said matching matrix with the Sinkhorn algorithm;
d3. computing the estimated transformation relation between object points and picture points with the Gauss-Seidel iterative method according to the updated matching matrix;
d4. obtaining, from said estimated transformation relation and said object point set, the picture point set matched with the object point set using the transformation relation expression between object points and picture points.
6. The method for matching two-dimensional object points and picture points with bidirectional constraints according to claim 5, characterized in that updating said matching matrix with the Sinkhorn algorithm in step d2 comprises:
initializing said matching matrix;
performing normalization calculation on each row and each column of said matching matrix;
performing the normalization calculation of said matching matrix cyclically.
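Outside the claim language, the Sinkhorn update recited in claim 6 (initialize, normalize each row and column, repeat cyclically) can be sketched as follows; the example matrix is hypothetical:

```python
import numpy as np

def sinkhorn(m, iters=100, tol=1e-9):
    """Alternately normalise every row and every column of a positive
    matching matrix until it is (approximately) doubly stochastic."""
    m = np.asarray(m, dtype=float).copy()    # initialize the matching matrix
    for _ in range(iters):                   # cyclic normalization
        m /= m.sum(axis=1, keepdims=True)    # normalize each row
        m /= m.sum(axis=0, keepdims=True)    # normalize each column
        if np.allclose(m.sum(axis=1), 1.0, atol=tol):
            break
    return m

m = sinkhorn(np.array([[4.0, 1.0], [2.0, 3.0]]))
print(m.sum(axis=0))  # each column sums to 1
```

The diagonal scaling preserves the cross-ratio m11·m22 / (m12·m21) of the input matrix, so relative match strengths survive the normalization.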
CN200810167782A 2008-07-11 2008-10-07 Method for matching two dimensional object point and image point with bilateral constraints Expired - Fee Related CN100590658C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN200810116560 2008-07-11
CN200810116560.8 2008-07-11
CN200810167782A CN100590658C (en) 2008-07-11 2008-10-07 Method for matching two dimensional object point and image point with bilateral constraints

Publications (2)

Publication Number Publication Date
CN101393639A CN101393639A (en) 2009-03-25
CN100590658C true CN100590658C (en) 2010-02-17

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102763123B (en) * 2009-12-02 2015-03-25 高通股份有限公司 Improving performance of image recognition algorithms by pruning features, image scaling, and spatially constrained feature matching
CN104376555B (en) * 2014-10-24 2017-02-15 南京大学 Navigation array matching method and device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101846514A (en) * 2010-06-17 2010-09-29 中国人民解放军信息工程大学 Image point matching method for industrial digital photogrammetry
CN101846514B (en) * 2010-06-17 2011-11-23 中国人民解放军信息工程大学 Image point matching method for industrial digital photogrammetry
CN103345642A (en) * 2013-06-28 2013-10-09 华中科技大学 Image matching method based on dotted line dual
CN103345642B (en) * 2013-06-28 2016-05-25 华中科技大学 A kind of image matching method based on dotted line antithesis

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20100217

Termination date: 20191007