CN103745220A - Method and device for obtaining affine local invariant features of an image

Info

Publication number: CN103745220A
Application number: CN201410043639.8A
Authority: CN (China)
Other languages: Chinese (zh)
Legal status: Pending
Inventors: 林睿, 孙荣川, 任子武, 厉茂海, 孙立宁
Current Assignee: Suzhou University
Original Assignee: Suzhou University
Application filed by Suzhou University
Priority to CN201410043639.8A
Publication of CN103745220A

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a method and device for obtaining affine local invariant features of an image, addressing the problem of extracting local invariant features in machine vision. The method comprises the following steps: obtaining a target image containing a plurality of pixels; establishing a local feature transformation model corresponding to each pixel; obtaining, according to each local feature transformation model, the local direction tensor corresponding to each pixel; determining, according to each local direction tensor, the maximum of the smaller eigenvalues of the pixels within the preset region corresponding to each pixel, and determining the pixel corresponding to that maximum as an initial interest point; using an affine recursive algorithm to converge each initial interest point to an affine interest point and an affine feature region; and obtaining the image coordinates and characteristic scale of the affine interest point together with the affine feature region corresponding to it, thereby obtaining the affine local invariant features of the image.

Description

Method and device for obtaining affine local invariant features of an image
Technical field
The present application relates to the technical field of image processing, and in particular to a method and device for obtaining affine local invariant features of an image.
Background art
An image local feature captures a particular local property of an image, for example low-level feature information such as edges, points, lines and surfaces. An affine transformation of an image is a compound transformation built from basic transformations such as translation, rotation and scaling. For example, as a mobile robot moves through its working scene, it can acquire images of the scene with a binocular camera. Because the robot is moving, the viewing angle of the camera changes, so the same scene undergoes certain transformations in the acquired images, for example changes of position, scale and viewpoint; taken together, these transformations can be regarded as an affine transformation.
An affine local invariant feature of an image is a local feature that remains unchanged under affine transformation: even though the image has undergone an affine transformation, certain local features in the image do not change with it. Because local features carry a small amount of data with low redundancy, they can serve as affine invariant features of the image. Problems such as object recognition in images and panorama stitching require the affine local invariant features of an image.
However, the inventors found through research that no method for obtaining the affine local invariant features of an image is currently available.
Summary of the invention
In view of this, the present application provides a method for obtaining affine local invariant features of an image, to solve the technical problem that the prior art offers no such method. The technical solution provided by the present application is as follows:
A method for obtaining affine local invariant features of an image, comprising:
obtaining a target image containing a plurality of pixels;
establishing the local feature transformation model corresponding to each pixel;
obtaining, according to each local feature transformation model, the local direction tensor corresponding to each pixel;
determining, according to each local direction tensor, the maximum of the smaller eigenvalues of the pixels in the preset region corresponding to each pixel;
determining the pixel corresponding to said maximum as an initial interest point;
using an affine recursive algorithm to converge each initial interest point to an affine interest point and an affine feature region;
obtaining the image coordinates and characteristic scale of the affine interest point and the affine feature region corresponding to the affine interest point.
In the above method, preferably, the target image is a two-dimensional discrete image.
In the above method, preferably, the local feature transformation model corresponding to each pixel is a second-order polynomial expansion, namely:

f(x) ~ r_1 + r_2 x + r_3 y + r_4 x^2 + r_5 y^2 + r_6 xy = x^T A x + b^T x + c;

where f(x) is the gray value of the pixel x; (x, y)^T are the image coordinates of the pixel x; A is a 2×2 symmetric matrix characterizing the even local feature components other than the constant; b is a 2-dimensional vector characterizing the odd local feature components; c is the constant component and equals r_1; and the basis functions are {1, x, y, x^2, y^2, xy}.
In the above method, preferably, the local direction tensor is

T = \sigma_D^4 \left( A A^T + \frac{1}{4\sigma_D^4}\, b b^T \right);

and the preset region is the 8-neighborhood of each pixel.
Determining, according to each local direction tensor, the maximum of the smaller eigenvalues of the pixels in the preset region corresponding to each pixel comprises:
determining the 8-neighborhood corresponding to each pixel;
obtaining, according to the local direction tensor T corresponding to each pixel, the smaller of the two non-negative eigenvalues λ_1, λ_2 of each pixel in the 8-neighborhood, where

\lambda_{1,2} = \frac{1}{2}\left( t_{11} + t_{22} \pm \sqrt{(t_{11}-t_{22})^2 + 4 t_{12}^2} \right);

determining the maximum of the smaller eigenvalues of the pixels in the 8-neighborhood;
where T is a positive semidefinite symmetric matrix, σ_D is the characteristic scale, and t_ij (i, j = 1, 2) are the elements of the positive semidefinite symmetric matrix T.
In the above method, preferably, using the affine recursive algorithm to converge each initial interest point to an affine interest point and an affine feature region comprises:
initializing a unit matrix U^{(1)}, the feature region corresponding to U^{(1)} being the unit circle centered on the coordinates of the initial interest point;
taking the matrix U^{(i)} as the current matrix and the initial interest point as the current interest point, and using the current matrix to determine the transformed local image region, where i is the current recursion count of the affine recursive algorithm and U^{(i)} is U^{(1)} when the current recursion count is 1;
determining the characteristic scale σ_D^{(i)} corresponding to the transformed local image region, and searching, according to the characteristic scale σ_D^{(i)}, the transformed local image region for the candidate interest point nearest to the current interest point;
obtaining the direction tensor T^{(i)} corresponding to the candidate interest point; applying the U transform to the direction tensor T^{(i)} to obtain the matrix U^{(i+1)}; normalizing the matrix U^{(i+1)} as U^{(i+1)} = U^{(i+1)} / λ_1(U^{(i+1)}) so that λ_1 equals 1;
judging whether the value of λ_2(T^{(i)}) / λ_1(T^{(i)}) is less than a preset convergence threshold and the current recursion count has not reached a preset recursion count threshold;
if so, adding 1 to the current recursion count i, taking the matrix U^{(i+1)} as the current matrix and the candidate interest point as the current interest point, and returning to the step of using the current matrix to determine the transformed local image region;
if not, judging whether the value of λ_2(T^{(i)}) / λ_1(T^{(i)}) is greater than or equal to the preset convergence threshold; if so, determining the candidate interest point as an affine interest point, and determining the affine feature region corresponding to the affine interest point as x^T U^{(i+1)} x = 1, where U^{(i+1)} is a positive semidefinite symmetric matrix.
The present application also provides a device for obtaining affine local invariant features of an image, comprising:
a target image acquiring unit, configured to obtain a target image containing a plurality of pixels;
a transformation model establishing unit, configured to establish the local feature transformation model corresponding to each pixel;
a direction tensor acquiring unit, configured to obtain, according to each local feature transformation model, the local direction tensor corresponding to each pixel;
an eigenvalue determining unit, configured to determine, according to each local direction tensor, the maximum of the smaller eigenvalues of the pixels in the preset region corresponding to each pixel;
an initial interest point determining unit, configured to determine the pixel corresponding to said maximum as an initial interest point;
an interest point convergence unit, configured to converge each initial interest point to an affine interest point and an affine feature region using the affine recursive algorithm;
an invariant feature acquiring unit, configured to obtain the image coordinates and characteristic scale of the affine interest point and the affine feature region corresponding to the affine interest point.
In the above device, preferably, the target image obtained by the target image acquiring unit is a two-dimensional discrete image.
In the above device, preferably, the local feature transformation model established by the transformation model establishing unit is a second-order polynomial expansion, namely:

f(x) ~ r_1 + r_2 x + r_3 y + r_4 x^2 + r_5 y^2 + r_6 xy = x^T A x + b^T x + c;

where f(x) is the gray value of the pixel x; (x, y)^T are the image coordinates of the pixel x; A is a 2×2 symmetric matrix characterizing the even local feature components other than the constant; b is a 2-dimensional vector characterizing the odd local feature components; c is the constant component and equals r_1; and the basis functions are {1, x, y, x^2, y^2, xy}.
In the above device, preferably, the local direction tensor obtained by the direction tensor acquiring unit is

T = \sigma_D^4 \left( A A^T + \frac{1}{4\sigma_D^4}\, b b^T \right);

and the preset region used by the eigenvalue determining unit when determining the maximum of the smaller eigenvalues is the 8-neighborhood of each pixel.
The eigenvalue determining unit comprises:
an 8-neighborhood determining subunit, configured to determine the 8-neighborhood corresponding to each pixel;
a smaller eigenvalue obtaining subunit, configured to obtain, according to the local direction tensor T corresponding to each pixel, the smaller of the two non-negative eigenvalues λ_1, λ_2 of each pixel in the 8-neighborhood, where

\lambda_{1,2} = \frac{1}{2}\left( t_{11} + t_{22} \pm \sqrt{(t_{11}-t_{22})^2 + 4 t_{12}^2} \right);

a maximum determining subunit, configured to determine the maximum of the smaller eigenvalues of the pixels in the 8-neighborhood;
where T is a positive semidefinite symmetric matrix, σ_D is the characteristic scale, and t_ij (i, j = 1, 2) are the elements of the positive semidefinite symmetric matrix T.
In the above device, preferably, the interest point convergence unit comprises:
a unit matrix initialization subunit, configured to initialize a unit matrix U^{(1)}, the feature region corresponding to U^{(1)} being the unit circle centered on the coordinates of the initial interest point;
a region transformation determining subunit, configured to take the matrix U^{(i)} as the current matrix and the initial interest point as the current interest point, and to determine the transformed local image region using the current matrix, where i is the current recursion count of the affine recursive algorithm and U^{(i)} is U^{(1)} when the current recursion count is 1;
a candidate interest point determining subunit, configured to determine the characteristic scale σ_D^{(i)} corresponding to the transformed local image region, and to search, according to the characteristic scale σ_D^{(i)}, the transformed local image region for the candidate interest point nearest to the current interest point;
an affine matrix obtaining subunit, configured to obtain the direction tensor T^{(i)} corresponding to the candidate interest point, obtain the matrix U^{(i+1)} from the direction tensor T^{(i)}, and normalize the matrix U^{(i+1)} as U^{(i+1)} = U^{(i+1)} / λ_1(U^{(i+1)}) so that λ_1 equals 1;
a judging subunit, configured to judge whether the value of λ_2(T^{(i)}) / λ_1(T^{(i)}) is less than the preset convergence threshold and the current recursion count has not reached the preset recursion count threshold; if so, to trigger a first result unit; if not, to trigger a second result unit;
the first result unit, configured to add 1 to the current recursion count i, take the matrix U^{(i+1)} as the current matrix and the candidate interest point as the current interest point, and trigger the region transformation determining subunit to determine the transformed local image region using the current matrix;
the second result unit, configured to judge whether the value of λ_2(T^{(i)}) / λ_1(T^{(i)}) is greater than or equal to the preset convergence threshold, and if so, to determine the candidate interest point as an affine interest point and to determine the affine feature region corresponding to the affine interest point as x^T U^{(i+1)} x = 1, where U^{(i+1)} is a positive semidefinite symmetric matrix.
As can be seen from the above technical solution, the present application provides a method and device for obtaining affine local invariant features of an image. The method first obtains a target image containing a plurality of pixels and establishes the local feature transformation model corresponding to each pixel; it then obtains, from each local feature transformation model, the local direction tensor corresponding to each pixel; according to each local direction tensor it determines, in the preset region corresponding to each pixel, the maximum of the smaller eigenvalues of the pixels in that region, and determines the pixel corresponding to the maximum as an initial interest point; it then uses the affine recursive algorithm to converge each initial interest point to an affine interest point and affine feature region; and finally it obtains the image coordinates and characteristic scale of each affine interest point and the affine feature region corresponding to it, thereby obtaining the affine local invariant features of the image.
Brief description of the drawings
In order to describe the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those of ordinary skill in the art may obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an affine transformation of an image signal;
Fig. 2 is a flowchart of Embodiment 1 of the method for obtaining affine local invariant features of an image provided by the present application;
Fig. 3 is an example diagram used in Embodiment 1 of the method provided by the present application;
Fig. 4 is another example diagram used in Embodiment 1 of the method provided by the present application;
Fig. 5 is a detailed flowchart within Embodiment 1 of the method provided by the present application;
Fig. 6 is a schematic diagram of the affine normalization process of a local direction tensor matrix;
Fig. 7 is a partial flowchart of Embodiment 2 of the method for obtaining affine local invariant features of an image provided by the present application;
Fig. 8 is an example diagram of normalizing an affine feature region with the affine recursive algorithm in the present application;
Fig. 9 is a structural block diagram of Embodiment 1 of the device for obtaining affine local invariant features of an image provided by the present application;
Fig. 10 is a block diagram of a specific structure within Embodiment 1 of the device provided by the present application;
Fig. 11 is a structural block diagram of Embodiment 2 of the device provided by the present application;
Fig. 12 shows the local invariant features obtained by the algorithm of the present application on sequences of real transformed images;
Fig. 13 shows the repetition rate and number of matches of the algorithm of the present application on real image sequences.
Detailed description of the embodiments
The technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art from the embodiments of the present application without creative effort fall within the scope of protection of the present application.
In the field of machine vision, problems such as recognizing target objects in visual images and stitching panoramas require a method for obtaining the affine local invariant features of an image. The affine transformation of an image is explained as follows.
In an image transformation model, the basic geometric transformations are translation, rotation, scaling (isotropic and anisotropic), reflection, perspective and so on; combining these basic geometric transformations yields compound transformations such as perspective transformations, similarity transformations and affine transformations. In the field of image vision, a perspective transformation can, under certain conditions, be approximated by an affine transformation. For example, as a mobile robot moves through its working scene, the scene images it acquires undergo certain perspective transformations due to changes of camera viewing angle, illumination and so on; but when the optical axes of the binocular camera are parallel or approximately parallel, the perspective transformation can be approximated by an affine transformation.
For example, referring to Fig. 1, which shows a schematic diagram of an affine transformation of an image signal, an image signal f(x) becomes, after affine transformations such as changes of camera viewing angle, rotation and brightness, the image signal

\tilde{f}(x) = f\left( \begin{bmatrix} a & b \\ c & d \end{bmatrix} x + \begin{bmatrix} e \\ f \end{bmatrix} \right), \quad \begin{vmatrix} a & b \\ c & d \end{vmatrix} = ad - bc \neq 0.

The matrix P = \begin{bmatrix} a & b \\ c & d \end{bmatrix} is decomposed as:

P = \begin{bmatrix} a & b \\ c & d \end{bmatrix} = t \begin{bmatrix} \cos\psi & -\sin\psi \\ \sin\psi & \cos\psi \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & 1/q \end{bmatrix} \begin{bmatrix} \cos\varphi & -\sin\varphi \\ \sin\varphi & \cos\varphi \end{bmatrix}    (1)

where:
t — the scale parameter;
ψ — the rotation angle of the camera about its optical axis;
φ — the angle between the projection of the optical axis onto the image plane and the image coordinate axis;
θ — the angle between the optical axis and the image plane normal (Z-axis), θ = arccos(1/q) (q ≥ 1).

Analysis of the mathematical model in formula (1) shows that there are six independent affine transformation parameters in total: t, ψ, q, φ, e and f. Here P is the non-translational transformation matrix and [e, f]^T is the translation vector; the model includes two special geometric transformations, namely the rigid transformation model and the similarity transformation model. The rigid transformation model mainly describes the translation of the camera and its rotation about its optical axis; in this case t = 1, φ = 0 and q = 1, leaving only the three free parameters ψ, e and f. The similarity transformation model mainly describes the translation, rotation and scaling motions of the camera; it does not distort the original shape of an object, although the size of the object in the image may change; in this case φ = 0 and q = 1, leaving the four free parameters t, ψ, e and f. Extracting local invariant features of an image means detecting the same corresponding local feature points stably and reliably, unaffected by these geometric transformations of the image.
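To make decomposition (1) concrete, the following Python sketch (NumPy is used here only for illustration; the parameter values are arbitrary examples, not values from the patent) composes P from t, ψ, q and φ and checks its determinant.

```python
import numpy as np

def rot(angle):
    """2x2 rotation matrix."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s], [s, c]])

def affine_P(t, psi, q, phi):
    """Compose the non-translational part P = t * R(psi) * diag(1, 1/q) * R(phi) of formula (1)."""
    return t * rot(psi) @ np.diag([1.0, 1.0 / q]) @ rot(phi)

# Arbitrary example parameters (not taken from the patent).
t, psi, q, phi = 1.3, np.deg2rad(20), 1.8, np.deg2rad(35)
P = affine_P(t, psi, q, phi)

# The determinant of P must be non-zero for a valid affine transformation.
print("P =\n", P)
print("det(P) =", np.linalg.det(P))   # equals t**2 / q here
```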
Referring to Fig. 2, which shows the flowchart of Embodiment 1 of the method for obtaining affine local invariant features of an image provided by the present application, the present embodiment may comprise:
Step 101: obtain a target image containing a plurality of pixels.
The target image may be an image acquired by a machine in the field of machine vision, for example a scene image obtained by a mobile robot in its working scene.
Step 102: establish the local feature transformation model corresponding to each pixel.
The local feature transformation model is a mathematical transformation model; in this step each pixel of the target image is represented by applying the mathematical transformation model.
The gray values of the target image are processed to obtain the local feature transformation model. The processing may use mathematical or geometric transform methods, such as the Fourier transform, the wavelet transform, Gabor filtering, or a polynomial expansion. The polynomial expansion is a mathematical model of a local image transformation: each pixel of the image is expanded in terms of polynomial coefficients to establish the local feature transformation model, on the assumption that the polynomial coefficients are sufficient to express the feature information of the local image.
Preferably, the local feature transformation model is the second-order polynomial expansion, namely:

f(x) ~ r_1 + r_2 x + r_3 y + r_4 x^2 + r_5 y^2 + r_6 xy = x^T A x + b^T x + c    (2)

where f(x) is the gray value of the pixel x; (x, y)^T are the image coordinates of the pixel x; A is a 2×2 symmetric matrix characterizing the even local feature components other than the constant; b is a 2-dimensional vector characterizing the odd local feature components; c is the constant component and equals r_1; and the basis functions are {1, x, y, x^2, y^2, xy}. Referring to Fig. 3, which shows an example diagram used in this embodiment, the figure illustrates the basis functions {1, x, y, x^2, y^2, xy} of the second-order polynomial expansion of the local feature transformation model of the target image.
Here:

A = \begin{bmatrix} r_4 & r_6/2 \\ r_6/2 & r_5 \end{bmatrix}, \qquad b = \begin{bmatrix} r_2 \\ r_3 \end{bmatrix}.
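To illustrate the second-order polynomial expansion (2), the following sketch fits the coefficients r_1 ... r_6 to a small gray-value patch by ordinary least squares and assembles A and b; the least-squares fit over a square window is only one possible estimation procedure and is an assumption of this sketch, not a requirement of the patent.

```python
import numpy as np

def fit_polynomial_expansion(patch):
    """Fit f(x, y) ~ r1 + r2*x + r3*y + r4*x^2 + r5*y^2 + r6*x*y to a square
    gray-value patch by least squares and return (A, b, c) of formula (2)."""
    n = patch.shape[0]
    half = n // 2
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    x, y = xs.ravel().astype(float), ys.ravel().astype(float)
    # Basis {1, x, y, x^2, y^2, xy}, one column per basis function.
    G = np.column_stack([np.ones_like(x), x, y, x**2, y**2, x * y])
    r, *_ = np.linalg.lstsq(G, patch.ravel().astype(float), rcond=None)
    r1, r2, r3, r4, r5, r6 = r
    A = np.array([[r4, r6 / 2], [r6 / 2, r5]])
    b = np.array([r2, r3])
    c = r1
    return A, b, c

# Example on a synthetic 5x5 patch.
rng = np.random.default_rng(0)
patch = rng.random((5, 5))
A, b, c = fit_polynomial_expansion(patch)
print("A =\n", A, "\nb =", b, "\nc =", c)
```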
Step 103: obtain, according to each local feature transformation model, the local direction tensor corresponding to each pixel.
The local direction tensor combines a parameter-adjusted Hessian matrix of the target image with its second-moment matrix, and describes the energy variation of the corresponding pixel along the two perpendicular feature directions of its local structure.
It should be noted that the local direction tensor corresponding to each pixel can be obtained from the local feature transformation model of that pixel. If the local feature transformation model is the second-order polynomial expansion, the local direction tensor can be expressed as:

T = A A^T + \gamma b b^T    (3)

where γ is a non-negative weight factor. For a single γ the local direction tensor is only phase invariant, so γ is also called the adjustment parameter of the even and odd feature components.
It should be noted that, in solving formula (3), only the values of the elements of A and b need to be determined. An image signal whose partial derivatives exist at the origin has the Maclaurin expansion:

f(x) = f(0) + (\nabla f)^T x + \frac{1}{2} x^T H x + O(\|x\|^3)    (4)

where the gradient ∇f contains the first-order partial derivatives of the signal f at the origin and H contains the second-order partial derivatives. For a second-order signal the expressions are:

\nabla f = \begin{bmatrix} f_x(0) \\ f_y(0) \end{bmatrix}, \qquad H = \begin{bmatrix} f_{xx}(0) & f_{xy}(0) \\ f_{xy}(0) & f_{yy}(0) \end{bmatrix}    (5)

It should be noted that if A = H/2, b = ∇f and c = f(0), the Maclaurin expansion (4) has the same form as formula (2) above, and the expression of the local direction tensor T can be obtained. However, the elements obtained in this way are very sensitive to noise and are meaningless in practical computation. Moreover, only one-dimensional structure and direction information can be estimated in this way, which cannot reflect the local direction or gradient structure information of the target image. In addition, the Maclaurin expansion (4) requires the partial derivatives of each order at the origin, a requirement that the discrete signal of the target image does not satisfy directly.
Therefore, the target image must first be Gaussian filtered at some scale σ, performing a Gaussian weighted average within a window, and the partial derivatives of each order are then computed on the convolved image. This also helps to reduce the influence of noise on the target image:

h = f \otimes g, \qquad g(x, \sigma) = \frac{1}{2\pi\sigma^2} \exp\left( -\frac{\|x\|^2}{2\sigma^2} \right)    (6)

The first- and second-order partial derivatives corresponding to A and b can be determined with an isotropic circular Gaussian kernel of standard deviation σ. For a second-order discrete image signal, H and M = bb^T are defined as:

H = \begin{bmatrix} h_{11} & h_{12} \\ h_{21} & h_{22} \end{bmatrix} = \begin{bmatrix} h_{xx}(\sigma) & h_{xy}(\sigma) \\ h_{xy}(\sigma) & h_{yy}(\sigma) \end{bmatrix}, \qquad M = b b^T = \begin{bmatrix} h_x^2(\sigma) & h_x(\sigma) h_y(\sigma) \\ h_x(\sigma) h_y(\sigma) & h_y^2(\sigma) \end{bmatrix}    (7)

where σ is the differentiation scale and h_d is the partial derivative along direction d.
It should be noted that the value of the differentiation scale σ determines how strongly the target image is smoothed. Specifically, as σ increases, the smoothing of the target image becomes more pronounced, but the loss of detail also becomes more severe. By the convolution rule, the partial derivatives of the filtered image h can be obtained by first computing the corresponding partial derivatives of the Gaussian function and then convolving them with the original image f. The standardized Gaussian first- and second-order partial derivatives are shown in Fig. 4, another example diagram of this embodiment, which illustrates the isotropic circular Gaussian kernel and the first- and second-order partial derivatives corresponding to A and b in the second-order polynomial expansion, with σ = 1.5.
As can be seen from formula (7), the values of A and b differ for different differentiation scales σ, and the local direction tensor T changes accordingly. In order to express the energy and direction of the local features of the target image accurately, the local direction tensor T should be computed at a certain typical scale σ_D, called the characteristic scale. To reduce the influence of the differentiation scale σ on the computed partial derivatives, the partial derivative functions must be standardized with respect to the differentiation scale σ; the standardization of an m-th order scale partial derivative is:

D_{i_1 \dots i_m} = \sigma^m h_{i_1 \dots i_m}(x, \sigma) = \sigma^m f(x) * g_{i_1 \dots i_m}(\sigma)    (8)

In theory, the same partial derivative values can then be obtained at different differentiation scales. Applying this standardization to the local direction tensor T gives the final expression of the standardized local direction tensor:

T = \sigma_D^4 \left( A A^T + \frac{1}{4\sigma_D^4}\, b b^T \right)    (9)

Therefore, the standardized formula (9) is used to represent the local direction tensor T.
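The computation of formulas (7) and (9) can be sketched as follows, assuming NumPy and SciPy; the identification A = H/2, b = ∇f and the weight 1/(4σ_D^4) inside the parentheses follow the reconstruction of formula (9) given above and should be read as assumptions of this sketch rather than a definitive implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def local_direction_tensor(image, sigma_d):
    """Per-pixel standardized local direction tensor T (formula (9)).
    Returns an array of shape (H, W, 2, 2)."""
    img = image.astype(float)
    # Gaussian partial derivatives of formula (7); order=(row, col) = (y, x).
    hx  = gaussian_filter(img, sigma_d, order=(0, 1))
    hy  = gaussian_filter(img, sigma_d, order=(1, 0))
    hxx = gaussian_filter(img, sigma_d, order=(0, 2))
    hyy = gaussian_filter(img, sigma_d, order=(2, 0))
    hxy = gaussian_filter(img, sigma_d, order=(1, 1))
    # A = H/2 (even part), b = gradient (odd part), as in formulas (4)-(5).
    A = 0.5 * np.stack([np.stack([hxx, hxy], -1),
                        np.stack([hxy, hyy], -1)], -2)      # (H, W, 2, 2)
    b = np.stack([hx, hy], -1)[..., None]                   # (H, W, 2, 1)
    AAT = A @ np.swapaxes(A, -1, -2)
    bbT = b @ np.swapaxes(b, -1, -2)
    return sigma_d**4 * (AAT + bbT / (4.0 * sigma_d**4))

# Example on a random image.
rng = np.random.default_rng(0)
T = local_direction_tensor(rng.random((64, 64)), sigma_d=1.5)
print(T.shape)   # (64, 64, 2, 2)
```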
Step 104: determine, according to each local direction tensor, the maximum of the smaller eigenvalues of the pixels in the preset region corresponding to each pixel.
Here the 8-neighborhood of each pixel is preset as the region corresponding to that pixel. Of course, the preset region can be, but is not limited to, the 8-neighborhood. Referring to Fig. 5, which shows a detailed flowchart within Embodiment 1 provided by the present application, step 104 can be implemented by the following steps:
Step 201: determine the 8-neighborhood corresponding to each pixel.
An 8-neighborhood is determined for each pixel in the target image; the 8-neighborhood of a pixel consists of the 8 pixels surrounding it, which together with the pixel itself form a 3 × 3 image window.
Step 202: obtain, according to the local direction tensor T corresponding to each pixel, the smaller of the two non-negative eigenvalues λ_1, λ_2 of each pixel in the 8-neighborhood, where

\lambda_{1,2} = \frac{1}{2}\left( t_{11} + t_{22} \pm \sqrt{(t_{11}-t_{22})^2 + 4 t_{12}^2} \right).

As represented by formula (9), each local direction tensor T is a positive semidefinite symmetric matrix, and t_ij (i, j = 1, 2) are its elements. From this positive semidefinite symmetric matrix the two non-negative eigenvalues λ_1, λ_2 can be determined, which express the energy values of the local signal along the two perpendicular directions (the eigenvectors).
Step 203: determine the maximum of the smaller eigenvalues of the pixels in the 8-neighborhood.
It should be noted that each pixel in the target image has its own local direction tensor T, and since each local direction tensor T has two non-negative eigenvalues λ_1 and λ_2, each pixel in the target image has two corresponding non-negative eigenvalues λ_1 and λ_2.
That is, for each pixel in the target image the corresponding 8-neighborhood is determined, so that the 9 pixels lie in a 3 × 3 image window. For each pixel in this 3 × 3 window, the smaller of its two non-negative eigenvalues λ_1 and λ_2 is taken, giving 9 smaller eigenvalues, and the maximum of these 9 smaller eigenvalues is determined.
For example, for the pixel x5, the pixels in its 8-neighborhood are determined to be x1 to x4 and x6 to x9. The smaller eigenvalues corresponding to these 9 pixels are λ_21, λ_22, λ_23, λ_24, ..., λ_29 respectively; the maximum of these 9 smaller eigenvalues is determined, say λ_23.
Step 105: determine the pixel corresponding to the maximum as an initial interest point.
It should be noted that an 8-neighborhood is determined for each pixel of the target image, so multiple 8-neighborhoods are determined. For each 8-neighborhood, the maximum of the smaller eigenvalues of its 9 pixels is determined and the pixel corresponding to that maximum is determined as an initial interest point; multiple initial interest points can therefore be determined.
For example, the pixel x3 corresponding to λ_23 is determined as an initial interest point.
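Steps 104 and 105 can be sketched as follows (again assuming NumPy/SciPy; the small response threshold and the border handling are choices made here, not requirements of the patent): the smaller eigenvalue of T is computed at every pixel, and a pixel is kept as an initial interest point when that value is the maximum within its 3 × 3 window.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def smaller_eigenvalue(T):
    """Smaller eigenvalue lambda_2 of each 2x2 tensor, per the closed form above."""
    t11, t12, t22 = T[..., 0, 0], T[..., 0, 1], T[..., 1, 1]
    root = np.sqrt((t11 - t22) ** 2 + 4.0 * t12 ** 2)
    return 0.5 * (t11 + t22 - root)

def initial_interest_points(T, min_response=1e-6):
    """Steps 104-105: pixels whose smaller eigenvalue is the maximum of its 8-neighborhood."""
    lam2 = smaller_eigenvalue(T)
    local_max = maximum_filter(lam2, size=3, mode="nearest")
    mask = (lam2 >= local_max) & (lam2 > min_response)
    return np.argwhere(mask), lam2     # (row, col) coordinates of initial interest points

# Example with a stand-in tensor field (an isotropic tensor at every pixel).
rng = np.random.default_rng(0)
T = np.eye(2) * rng.random((64, 64, 1, 1))
points, lam2 = initial_interest_points(T)
print(points.shape)
```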
Step 106: use the affine recursive algorithm to converge each initial interest point to an affine interest point and an affine feature region.
Using the affine recursive algorithm to converge each initial interest point to an affine interest point is a process of affine normalization of the local direction tensor corresponding to each pixel. The principle of this affine normalization is explained as follows.
When an image undergoes an affine transformation, the scales along the x and y directions also change. In order to predict the corresponding scales and adapt to the local feature structure of the image, the first- and second-order partial derivatives of the image are computed in an affine Gaussian scale space, replacing the uniform circular convolution window with an elliptical Gaussian convolution window. Because there is an affine transformation between the two images, the Gaussian functions generating their scale spaces are also related by an affine transformation, so an affine Gaussian function is used to define the corresponding matrix. The affine Gaussian kernel has 3 free parameters, and some constraints are needed to reduce the complexity.
When the signal f is affinely transformed to \tilde{f}(x) = f(Bx), where B is the affine transformation matrix, the new local direction tensor \tilde{T} is:

\tilde{T} = (B^T A B)(B^T A B)^T + \gamma (B^T b)(B^T b)^T = B^T \left[ A (B B^T) A^T + \gamma b b^T \right] B    (10)

while the original local direction tensor is T = A A^T + \gamma b b^T. In this form it is difficult to see the relation between the two. Experiments show that for images with small affine deformation (σ_x and σ_y not very different), the local direction tensor can be approximated, in order to reduce the computational complexity, as:

\tilde{T} \approx B^T \left[ A A^T + \gamma b b^T \right] B = B^T T B    (11)

In this case, the direction tensor corresponding to a pixel of the affinely transformed image is approximately invariant to local affine transformations. It follows from the related theory that, as long as the local data are related by an affine transformation, the corresponding Gaussian scale space kernels are also related by an affine transformation. For the elliptical feature regions before and after the affine transformation, the normalization process can be expressed as:

B = \tilde{T}^{-1/2} R T^{1/2}    (12)

where R is a rotation matrix. According to the above derivation, the elliptical feature regions corresponding to the feature points x_1 and x_2 can be normalized to circular regions by the transformations x_1' = T^{1/2} x_1 and x_2' = \tilde{T}^{1/2} x_2. The relation between the two normalized feature regions is x_2' = R x_1', i.e. the original affine transformation relation is reduced to a rotation.
The affine normalization process of the autocorrelation matrix (the local direction tensor matrix) is shown in Fig. 6.
Through the above affine normalization process, each affine interest point of the target image and the affine region corresponding to each affine interest point can be determined.
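To make the normalization behind formula (12) concrete, the following sketch (an illustration of the general principle under the stated assumptions, not the patent's exact procedure) maps points of an elliptical region x^T U x = 1 onto the unit circle using the matrix square root U^{1/2} obtained from an eigendecomposition.

```python
import numpy as np

def sqrtm_spd(U):
    """Matrix square root of a symmetric positive definite 2x2 matrix."""
    w, V = np.linalg.eigh(U)
    return V @ np.diag(np.sqrt(w)) @ V.T

# An example elliptical region x^T U x = 1 (U chosen arbitrarily here).
U = np.array([[2.0, 0.6],
              [0.6, 0.5]])
S = sqrtm_spd(U)

# Points on the ellipse: x = U^{-1/2} [cos t, sin t]^T satisfies x^T U x = 1.
t = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
circle = np.stack([np.cos(t), np.sin(t)])          # points on the unit circle
ellipse = np.linalg.inv(S) @ circle                # corresponding ellipse points

# Normalization: x' = U^{1/2} x maps the ellipse back onto the unit circle,
# since x'^T x' = x^T U x = 1.
normalized = S @ ellipse
print(np.allclose(np.sum(normalized**2, axis=0), 1.0))   # True
```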
Step 107: obtain the image coordinates and characteristic scale of the affine interest point and the affine feature region corresponding to the affine interest point.
As can be seen from the above, this embodiment first obtains a target image containing a plurality of pixels and establishes the local feature transformation model corresponding to each pixel; it then obtains, from each local feature transformation model, the local direction tensor corresponding to each pixel; according to each local direction tensor it determines, in the preset region corresponding to each pixel, the maximum of the smaller eigenvalues of the pixels in that region, and determines the pixel corresponding to the maximum as an initial interest point; it then uses the affine recursive algorithm to converge each initial interest point to an affine interest point and affine feature region; and finally it obtains the image coordinates and characteristic scale of each affine interest point and its corresponding affine feature region, thereby obtaining the affine local invariant features of the image.
Regarding the computation of the local direction tensor, it should be noted that the anisotropic affine Gaussian kernel has three variables (the characteristic scale variable can be fixed as a constant, leaving two real variables). To save computation, the image is not filtered directly with the affine Gaussian kernel; instead a U transform is first applied, namely U^{(k)} = (T^{-1/2})^{(k)} \cdots (T^{-1/2})^{(1)} U^{(0)}, and the U-transformed image is then convolved with the isotropic circular Gaussian kernel to obtain the first- and second-order partial derivatives of the image signal, from which the corresponding local direction tensor T is computed.
In addition, regarding the determination of the characteristic scale: in the U-transformed image, the corresponding local direction tensors T^{(i)} are computed for a series of characteristic scales, and the differentiation scale at which λ_min(T^{(i)})/λ_max(T^{(i)}) is maximal is chosen as the characteristic scale. At this characteristic scale the local direction tensor T is computed again in the U-transformed image, and the final affine interest point and its corresponding feature region are then determined.
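The characteristic scale selection described above can be sketched as follows, reusing the local_direction_tensor helper from the sketch after formula (9); the candidate scale ladder is an arbitrary example, and evaluating the full-image tensor for a single pixel is done only for brevity.

```python
import numpy as np
# local_direction_tensor() is the helper defined in the sketch after formula (9).

def characteristic_scale(image, row, col, scales=(1.0, 1.5, 2.0, 3.0, 4.5)):
    """Pick, for one pixel, the scale sigma_D that maximizes lambda_min/lambda_max
    of the local direction tensor, as described in the text."""
    best_scale, best_ratio = None, -1.0
    for s in scales:
        T = local_direction_tensor(image, s)[row, col]
        lam = np.linalg.eigvalsh(T)            # ascending: lam[0] <= lam[1]
        ratio = lam[0] / lam[1] if lam[1] > 0 else 0.0
        if ratio > best_ratio:
            best_scale, best_ratio = s, ratio
    return best_scale, best_ratio

rng = np.random.default_rng(0)
img = rng.random((64, 64))
print(characteristic_scale(img, 32, 32))
```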
Specifically, referring to Fig. 7, which shows a partial flowchart of Embodiment 2 of the method for obtaining affine local invariant features of an image provided by the present application, step 106 of the above embodiment can be implemented as follows:
Step 301: initialize a unit matrix U^{(1)}, whose corresponding feature region is the unit circle centered on the coordinates of the initial interest point.
Step 302: take the matrix U^{(i)} as the current matrix and the initial interest point as the current interest point, and determine the transformed local image region using the current matrix, where i is the current recursion count of the affine recursive algorithm and U^{(i)} is U^{(1)} when the current recursion count i is 1.
The determination of the transformed local image region is related to the affine recursive algorithm. Specifically, the current recursion count of the affine recursive algorithm is i; when the current recursion count is 1, U^{(i)} is the initialized unit matrix U^{(1)}.
The determination proceeds as follows: first the image coordinates of the initial interest point are obtained; the feature region corresponding to the matrix U^{(i)}, centered on those image coordinates, is then transformed with the matrix U^{(i)}, which determines the transformed local image region. It should be noted that when the current recursion count of the affine recursive algorithm is 1, the image coordinates used are those of the initial interest point; when the current recursion count is not 1, the image coordinates used are those of the candidate interest point found in step 303.
Step 303: determine the characteristic scale σ_D^{(i)} corresponding to the transformed local image region, and search, according to the characteristic scale σ_D^{(i)}, the transformed local image region for the candidate interest point nearest to the current interest point.
Specifically, the transformed local feature region contains multiple pixels; the local direction tensor T^{(i)} corresponding to each of these pixels is obtained, and when λ_2(T^{(i)})/λ_1(T^{(i)}) of some pixel is maximal, the differentiation scale at that point is selected as the characteristic scale σ_D^{(i)}.
In the transformed local image region determined in step 302, the interest point nearest to the current interest point is searched for, and this nearest interest point is determined as the candidate interest point. The process of searching for the nearest candidate interest point follows the determination of initial interest points in steps 102 to 105 and is not repeated here.
Step 304: obtain the direction tensor T^{(i)} corresponding to the candidate interest point; apply the U transform to the direction tensor T^{(i)} to obtain the matrix U^{(i+1)}; normalize the matrix U^{(i+1)} as U^{(i+1)} = U^{(i+1)} / λ_1(U^{(i+1)}) so that λ_1 equals 1.
It should be noted that the U transform of the direction tensor T^{(i)} is U^{(i+1)} = (T^{-1/2})^{(i+1)} \cdots (T^{-1/2})^{(1)} U^{(0)}, which yields the matrix U^{(i+1)}. The matrix U^{(i+1)} must be normalized so that λ_1 equals 1, where λ_1 is the larger of the two non-negative eigenvalues corresponding to the local direction tensor T^{(i)}.
Step 305: judge whether the value of λ_2(T^{(i)})/λ_1(T^{(i)}) is less than the preset convergence threshold and the current recursion count has not reached the preset recursion count threshold; if so, go to step 306; if not, go to step 307.
The preset convergence threshold for λ_2(T^{(i)})/λ_1(T^{(i)}) can be set to a value close to 1, for example a value between 0.9 and 1.2. Optionally, the preset convergence threshold is 0.95, and the value of λ_2(T^{(i)})/λ_1(T^{(i)}) is compared with 0.95. Optionally, the preset recursion count threshold can be set to 10.
It should be noted that the condition for executing step 306 has two parts: the value of λ_2(T^{(i)})/λ_1(T^{(i)}) is less than the preset convergence threshold, and the current recursion count has not reached the preset recursion count threshold. When either part is not satisfied, step 307 is executed.
Step 306: add 1 to the current recursion count i, take the matrix U^{(i+1)} as the current matrix and the candidate interest point as the current interest point, and return to the step of determining the transformed local image region using the current matrix.
Executing this step indicates that the current interest point has not yet converged to an affine local invariant feature region, and the next iteration of the affine recursive algorithm is performed.
Step 307: judge whether the value of λ_2(T^{(i)})/λ_1(T^{(i)}) is greater than or equal to the preset convergence threshold; if so, determine the candidate interest point as an affine interest point, and determine the affine feature region corresponding to the affine interest point as x^T U^{(i+1)} x = 1, where U^{(i+1)} is a positive semidefinite symmetric matrix.
Referring to Fig. 8, which shows an example of normalizing an affine feature region with the affine recursive algorithm in this embodiment, the value of λ_2(T^{(i)})/λ_1(T^{(i)}) is denoted α and the preset convergence threshold is 0.95. Figure (a) shows the initial detection result on the target image, figure (b) the final detection result, and figure (c) the affine recursive process.
In this embodiment, the transformation properties of the local direction tensor T of each pixel of the target image are first analyzed: a parameter-adjusted Hessian matrix is combined with the second-moment matrix of the image to describe the energy variation of the corresponding pixel along the two perpendicular feature directions of its local structure, and the final mathematical expression of the local direction tensor is obtained. Then the maximum of the smaller eigenvalue of the local direction tensor of each pixel is searched in its local 8-neighborhood to extract the initial interest points of the target image. Finally, the affine recursive algorithm is used to make each initial interest point converge to an affine invariant feature region, and the final affine interest points are obtained.
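A much simplified sketch of steps 301 to 307 for a single initial interest point is given below (reusing local_direction_tensor from the sketch after formula (9)); it keeps the characteristic scale fixed, omits the search for the nearest candidate interest point, and warps the whole image instead of only a local region, so it illustrates the structure of the recursion rather than the patent's exact procedure. The convergence threshold 0.95 and the recursion count threshold 10 are the optional values mentioned in the text.

```python
import numpy as np
from scipy.ndimage import affine_transform
# local_direction_tensor() is the helper defined in the sketch after formula (9).

def affine_adapt(image, point, sigma_d=1.5, conv_threshold=0.95, max_iter=10):
    """Simplified sketch of steps 301-307 for a single initial interest point."""
    U = np.eye(2)                                    # step 301: unit matrix U^(1)
    pt = np.asarray(point, dtype=float)              # current interest point (row, col)
    for i in range(max_iter):                        # preset recursion count threshold
        # Step 302: warp the image with the current U, keeping the point fixed.
        offset = pt - U @ pt
        warped = affine_transform(image, U, offset=offset, order=1)
        # Steps 303-304: measure the direction tensor at the point in the warped image.
        T = local_direction_tensor(warped, sigma_d)[int(pt[0]), int(pt[1])]
        lam = np.linalg.eigvalsh(T)                  # ascending: lam[0]=lambda_2, lam[1]=lambda_1
        ratio = lam[0] / lam[1] if lam[1] > 0 else 0.0
        # U update via T^{-1/2}, then normalization so the largest eigenvalue of U is 1.
        w, V = np.linalg.eigh(T)
        T_inv_sqrt = V @ np.diag(1.0 / np.sqrt(np.maximum(w, 1e-12))) @ V.T
        U = T_inv_sqrt @ U
        U = U / np.abs(np.linalg.eigvals(U)).max()
        # Steps 305-307: stop when the tensor is (nearly) isotropic.
        if ratio >= conv_threshold:
            return pt, U, True                       # affine feature region: x^T U x = 1
    return pt, U, False                              # not converged within max_iter

rng = np.random.default_rng(0)
img = rng.random((64, 64))
print(affine_adapt(img, (32.0, 32.0)))
```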
Corresponding to the implementation of Method Embodiment 1 above, the present application provides a device embodiment. Referring to Fig. 9, which shows the structural block diagram of Embodiment 1 of the device for obtaining affine local invariant features of an image provided by the present application, the present embodiment may comprise a target image acquiring unit 401, a transformation model establishing unit 402, a direction tensor acquiring unit 403, an eigenvalue determining unit 404, an initial interest point determining unit 405, an interest point convergence unit 406 and an invariant feature acquiring unit 407, wherein:
the target image acquiring unit 401 is configured to obtain a target image containing a plurality of pixels;
the transformation model establishing unit 402 is configured to establish the local feature transformation model corresponding to each pixel;
the direction tensor acquiring unit 403 is configured to obtain, according to each local feature transformation model, the local direction tensor corresponding to each pixel;
the eigenvalue determining unit 404 is configured to determine, according to each local direction tensor, the maximum of the smaller eigenvalues of the pixels in the preset region corresponding to each pixel;
the initial interest point determining unit 405 is configured to determine the pixel corresponding to said maximum as an initial interest point;
the interest point convergence unit 406 is configured to converge each initial interest point to an affine interest point and an affine feature region using the affine recursive algorithm;
the invariant feature acquiring unit 407 is configured to obtain the image coordinates and characteristic scale of the affine interest point and the affine feature region corresponding to the affine interest point.
For the explanation of this device embodiment, refer to Method Embodiment 1 above, which is not repeated here.
It should be noted that, optionally, the target image obtained by the target image acquiring unit 401 in the above device embodiment is a two-dimensional discrete image.
Correspondingly, optionally, the local feature transformation model established by the transformation model establishing unit is the second-order polynomial expansion, namely:

f(x) ~ r_1 + r_2 x + r_3 y + r_4 x^2 + r_5 y^2 + r_6 xy = x^T A x + b^T x + c;

where f(x) is the gray value of the pixel x; (x, y)^T are the image coordinates of the pixel x; A is a 2×2 symmetric matrix characterizing the even local feature components other than the constant; b is a 2-dimensional vector characterizing the odd local feature components; c is the constant component and equals r_1; and the basis functions are {1, x, y, x^2, y^2, xy}.
Optionally, the local direction tensor obtained by the direction tensor acquiring unit 403 in Device Embodiment 1 above is

T = \sigma_D^4 \left( A A^T + \frac{1}{4\sigma_D^4}\, b b^T \right);

and the preset region used by the eigenvalue determining unit 404 when determining the maximum of the smaller eigenvalues is the 8-neighborhood of each pixel. Referring to Fig. 10, which shows a block diagram of a specific structure within Embodiment 1 of the device provided by the present application, the eigenvalue determining unit 404 can be implemented with an 8-neighborhood determining subunit 4041, a smaller eigenvalue obtaining subunit 4042 and a maximum determining subunit 4043, wherein:
the 8-neighborhood determining subunit 4041 is configured to determine the 8-neighborhood corresponding to each pixel;
the smaller eigenvalue obtaining subunit 4042 is configured to obtain, according to the local direction tensor T corresponding to each pixel, the smaller of the two non-negative eigenvalues λ_1, λ_2 of each pixel in the 8-neighborhood, where

\lambda_{1,2} = \frac{1}{2}\left( t_{11} + t_{22} \pm \sqrt{(t_{11}-t_{22})^2 + 4 t_{12}^2} \right);

the maximum determining subunit 4043 is configured to determine the maximum of the smaller eigenvalues of the pixels in the 8-neighborhood;
where T is a positive semidefinite symmetric matrix, σ_D is the characteristic scale, and t_ij (i, j = 1, 2) are the elements of the positive semidefinite symmetric matrix T.
Referring to Fig. 11, which shows the structural block diagram of Embodiment 2 of the device for obtaining affine local invariant features of an image provided by the present application, the interest point convergence unit 406 can be implemented with a unit matrix initialization subunit 501, a region transformation determining subunit 502, a candidate interest point determining subunit 503, an affine matrix obtaining subunit 504, a judging subunit 505, a first result unit 506 and a second result unit 507, wherein:
the unit matrix initialization subunit 501 is configured to initialize a unit matrix U^{(1)}, the feature region corresponding to U^{(1)} being the unit circle centered on the coordinates of the initial interest point;
the region transformation determining subunit 502 is configured to take the matrix U^{(i)} as the current matrix and the initial interest point as the current interest point, and to determine the transformed local image region using the current matrix, where i is the current recursion count of the affine recursive algorithm and U^{(i)} is U^{(1)} when the current recursion count is 1;
the candidate interest point determining subunit 503 is configured to determine the characteristic scale σ_D^{(i)} corresponding to the transformed local image region, and to search, according to the characteristic scale σ_D^{(i)}, the transformed local image region for the candidate interest point nearest to the current interest point;
the affine matrix obtaining subunit 504 is configured to obtain the direction tensor T^{(i)} corresponding to the candidate interest point, to obtain the matrix U^{(i+1)} from the direction tensor T^{(i)}, and to normalize the matrix U^{(i+1)} as U^{(i+1)} = U^{(i+1)} / λ_1(U^{(i+1)}) so that λ_1 equals 1;
the judging subunit 505 is configured to judge whether the value of λ_2(T^{(i)})/λ_1(T^{(i)}) is less than the preset convergence threshold and the current recursion count has not reached the preset recursion count threshold; if so, to trigger the first result unit 506; if not, to trigger the second result unit 507;
the first result unit 506 is configured to add 1 to the current recursion count i, take the matrix U^{(i+1)} as the current matrix and the candidate interest point as the current interest point, and trigger the region transformation determining subunit 502 to determine the transformed local image region using the current matrix.
It should be noted that the first result unit 506 is connected to the region transformation determining subunit 502: the first result unit 506 adds 1 to the current recursion count i, takes the matrix U^{(i+1)} as the current matrix and the candidate interest point as the current interest point, and triggers the region transformation determining subunit 502 to operate; in this case the region transformation determining subunit 502 does not need to set the current matrix and current interest point again, but only needs to determine the transformed local image region using the current matrix U^{(i+1)} determined by the first result unit 506.
The second result unit 507 is configured to judge whether the value of λ_2(T^{(i)})/λ_1(T^{(i)}) is greater than or equal to the preset convergence threshold, and if so, to determine the candidate interest point as an affine interest point and to determine the affine feature region corresponding to the affine interest point as x^T U^{(i+1)} x = 1, where U^{(i+1)} is a positive semidefinite symmetric matrix.
The explanation of this device embodiment refers to said method embodiment bis-, at this, does not repeat.
To demonstrate the effectiveness of the embodiments provided in this application, a simulation comparison experiment on real scene images is presented below.
Three image transformation sequences are used: (1) a viewpoint transformation sequence; (2) a scale and rotation transformation sequence; (3) a brightness (illumination) transformation sequence. Each sequence contains six images: the first is the reference image, and the others are acquired by the camera after a viewpoint, brightness or similar transformation. The repetition rate criterion is used to assess the effectiveness of the interest point detection algorithms; it mainly evaluates whether the detected interest point locations and scales are accurate. For a pair of images related by a transformation, the repetition rate is the ratio of the number of repeated interest points determined by the homography matrix to the total number of interest points in the image containing fewer interest points. Corresponding interest points x_a and x_b in the two images are considered a match when the following conditions hold (a computational sketch of these conditions follows the list):
(1) The localization error of the corresponding interest points is less than 1.5 pixels: ||x_a − H·x_b|| < 1.5.
(2) The overlap error of the two interest point neighborhoods is less than 0.4, that is, ε_S < 0.4, with ε_S = 1 − |R_a ∩ R_b| / |R_a ∪ R_b|, where R_a and R_b denote the interest point elliptical regions defined by x^T μ x = 1 and centered at x_a and x_b respectively, and R_a ∩ R_b and R_a ∪ R_b denote the intersection and union of the two elliptical regions.
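As a concrete illustration of conditions (1) and (2), the following sketch checks them for a single pair of interest points. It is not part of the original disclosure: it assumes that the second region μ_b has already been mapped into the coordinate frame of the first image, and it estimates the overlap error numerically on a sampling grid.

```python
import numpy as np

def ellipse_mask(mu, center, xs, ys):
    """Boolean mask of the elliptical region (p - center)^T mu (p - center) <= 1."""
    dx, dy = xs - center[0], ys - center[1]
    return mu[0, 0]*dx*dx + 2.0*mu[0, 1]*dx*dy + mu[1, 1]*dy*dy <= 1.0

def is_repeated(xa, mu_a, xb, mu_b, H, loc_tol=1.5, overlap_tol=0.4, grid=400):
    """Check matching conditions (1) and (2) for interest points xa (image A) and xb (image B).

    mu_b is assumed to be expressed in image A's coordinate frame already.
    """
    # condition (1): localization error under the homography H
    p = H @ np.array([xb[0], xb[1], 1.0])
    xb_in_a = p[:2] / p[2]
    if np.linalg.norm(np.asarray(xa, dtype=float) - xb_in_a) >= loc_tol:
        return False
    # condition (2): overlap error eps_S = 1 - |Ra ∩ Rb| / |Ra ∪ Rb|, estimated on a grid
    semi_axis = max(1.0 / np.sqrt(np.linalg.eigvalsh(mu_a)[0]),
                    1.0 / np.sqrt(np.linalg.eigvalsh(mu_b)[0]))   # longest ellipse semi-axis
    span = 2.0 * semi_axis + loc_tol
    xs, ys = np.meshgrid(np.linspace(xa[0] - span, xa[0] + span, grid),
                         np.linspace(xa[1] - span, xa[1] + span, grid))
    Ra = ellipse_mask(mu_a, np.asarray(xa, dtype=float), xs, ys)
    Rb = ellipse_mask(mu_b, xb_in_a, xs, ys)
    union = np.logical_or(Ra, Rb).sum()
    if union == 0:
        return False
    eps_s = 1.0 - np.logical_and(Ra, Rb).sum() / union
    return eps_s < overlap_tol
```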
Figure 12 shows the local invariant features obtained by the algorithm of this application on the real image transformation sequences. Images (a), (b), (c), (d), (e) and (f) contain 1124, 1117, 1865, 1068, 1000 and 385 interest points, respectively. Figure 13 shows the repetition rate and matching number results on the real image sequences: for the different transformation sequences it compares the image repetition rate and the corresponding number of matches of the algorithm of the present invention, the PLOT (polynomial local orientation tensor) algorithm, with those of other commonly used interest point detection operators, namely Harris-Affine, Hessian-Affine and MSER. In Figure 13, plots (a) and (b) show the results for the viewpoint transformation sequence: as the viewpoint changes, the repetition rate of the PLOT algorithm drops from 76.8% to 23.5%. Plots (c) and (d) show the results for the scale and rotation sequence: as the scale and rotation angle change, the repetition rate of the PLOT algorithm drops from 78.7% to 28.0%. Plots (e) and (f) show the results for the brightness transformation sequence: as the brightness decreases, the repetition rate of the PLOT algorithm drops only from 79.7% to 55.0%; this gentle decline shows that the algorithm of the present invention is robust to brightness changes.
An ideal repetition rate curve would be flat and close to 100%. In practice, as the affine transformations between images (scale, rotation and so on) grow larger, the repetition rate curve of any detector declines, and the curves also differ between scene types such as structured and textured scenes. Most of the experimental results show that the PLOT algorithm of the present invention achieves better results than the Hessian-Affine and Harris-Affine operators and is second only to MSER; in addition, the PLOT algorithm detects a comparatively large number of interest point regions, which is sufficient to express the local feature information of the image. A sketch of the repetition rate computation is given below.
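For reference, the repetition rate can then be computed per image pair roughly as follows; this reuses the illustrative is_repeated check from the previous sketch, and the function name and argument layout are assumptions rather than the evaluation code actually used for Figure 13.

```python
def repetition_rate(points_a, regions_a, points_b, regions_b, H):
    """Repetition rate for one image pair: repeated interest points (determined via the
    homography H) divided by the smaller of the two interest point counts."""
    repeated = 0
    for xa, mu_a in zip(points_a, regions_a):
        if any(is_repeated(xa, mu_a, xb, mu_b, H)      # check from the previous sketch
               for xb, mu_b in zip(points_b, regions_b)):
            repeated += 1
    return repeated / min(len(points_a), len(points_b))
```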
It should be noted that the embodiments in this specification are described in a progressive manner: each embodiment focuses on its differences from the other embodiments, and identical or similar parts of the embodiments may be understood by reference to one another.
The method and device for obtaining affine local invariant features of an image provided by the present invention have been described in detail above. The above description of the disclosed embodiments enables those skilled in the art to implement or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A method for obtaining affine local invariant features of an image, characterized by comprising:
obtaining a target image that includes a plurality of pixels;
establishing a local feature transformation model corresponding to each pixel;
obtaining, according to each local feature transformation model, a local direction tensor corresponding to each pixel;
determining, according to each local direction tensor, the maximum value among the smaller eigenvalues of the pixels in a preset area corresponding to each pixel;
determining the pixel corresponding to the maximum value as an initial interest point;
converging each initial interest point to an affine interest point and an affine feature region using an affine recursive algorithm; and
obtaining the image coordinate and characteristic scale of the affine interest point and the affine feature region corresponding to the affine interest point.
2. The method according to claim 1, characterized in that the target image is a two-dimensional discrete image.
3. The method according to claim 2, characterized in that the local feature transformation model corresponding to each pixel is a second-order polynomial expansion, namely:
f(x) ≈ r_1 + r_2·x + r_3·y + r_4·x² + r_5·y² + r_6·x·y = x^T·A·x + b^T·x + c;
wherein f(x) is the gray value of the pixel x; (x, y)^T is the image coordinate of the pixel x; A is a 2×2 symmetric matrix characterizing the even-order local features other than the constant term; b is a 2-dimensional vector characterizing the odd-order local features; c is the constant term and equals r_1; and the basis functions are {1, x, y, x², y², xy}.
4. The method according to claim 3, characterized in that the local direction tensor is a positive semidefinite symmetric tensor T constructed from the matrix A, the vector b and the characteristic scale σ_d of the second-order polynomial expansion, and the preset area is the 8-neighborhood of each pixel;
wherein determining, according to each local direction tensor, the maximum value among the smaller eigenvalues of the pixels in the preset area corresponding to each pixel comprises:
determining the 8-neighborhood corresponding to each pixel;
obtaining, according to the local direction tensor T corresponding to each pixel, the smaller of the two non-negative eigenvalues λ_1, λ_2 of each pixel in the 8-neighborhood, wherein λ_{1,2} = (1/2)·(t_11 + t_22 ± √((t_11 − t_22)² + 4·t_12²)); and
determining the maximum value among the smaller eigenvalues of the pixels in the 8-neighborhood;
wherein T is a positive semidefinite symmetric matrix, σ_d is the characteristic scale, and t_ij (i, j = 1, 2) are the elements of the positive semidefinite symmetric matrix T.
5. The method according to claim 1, characterized in that converging each initial interest point to an affine interest point and an affine feature region using an affine recursive algorithm comprises:
initializing an identity matrix U^(1), wherein the feature region corresponding to U^(1) is the unit circle centered at the coordinate of the initial interest point;
taking the matrix U^(i) as the current matrix and the initial interest point as the current interest point, and determining the local image domain transformation using the current matrix, wherein i is the current iteration count of the affine recursive algorithm, and when i = 1, U^(i) is U^(1);
determining the characteristic scale corresponding to the local image domain transformation, and searching, according to that characteristic scale, the local image domain transformation for the candidate interest point nearest to the current interest point;
obtaining the direction tensor T^(i) corresponding to the candidate interest point, obtaining the matrix U^(i+1) by applying the U transformation to T^(i), and applying the normalization U^(i+1) = U^(i+1)/λ_1(U^(i+1)) so that λ_1 equals 1;
judging whether the value of λ_2(T^(i))/λ_1(T^(i)) is less than a preset convergence threshold while the current iteration count has not yet reached a preset iteration threshold;
if so, increasing the current iteration count i by 1, taking the matrix U^(i+1) as the current matrix and the candidate interest point as the current interest point, and returning to the step of determining the local image domain transformation using the current matrix;
if not, judging whether the value of λ_2(T^(i))/λ_1(T^(i)) is greater than or equal to the preset convergence threshold, and if so, determining the candidate interest point as an affine interest point and determining the affine feature region corresponding to the affine interest point as x^T U^(i+1) x = 1, wherein U^(i+1) is a positive semidefinite symmetric matrix.
6. A device for obtaining affine local invariant features of an image, characterized by comprising:
a target image acquisition unit, configured to obtain a target image that includes a plurality of pixels;
a transformation model establishment unit, configured to establish the local feature transformation model corresponding to each pixel;
a direction tensor acquisition unit, configured to obtain, according to each local feature transformation model, the local direction tensor corresponding to each pixel;
an eigenvalue determination unit, configured to determine, according to each local direction tensor, the maximum value among the smaller eigenvalues of the pixels in the preset area corresponding to each pixel;
an initial interest point determination unit, configured to determine the pixel corresponding to the maximum value as an initial interest point;
an interest point convergence unit, configured to converge each initial interest point to an affine interest point and an affine feature region using an affine recursive algorithm; and
an invariant feature acquisition unit, configured to obtain the image coordinate and characteristic scale of the affine interest point and the affine feature region corresponding to the affine interest point.
7. The device according to claim 6, characterized in that the target image obtained by the target image acquisition unit is a two-dimensional discrete image.
8. The device according to claim 7, characterized in that the local feature transformation model established by the transformation model establishment unit is a second-order polynomial expansion, namely:
f(x) ≈ r_1 + r_2·x + r_3·y + r_4·x² + r_5·y² + r_6·x·y = x^T·A·x + b^T·x + c;
wherein f(x) is the gray value of the pixel x; (x, y)^T is the image coordinate of the pixel x; A is a 2×2 symmetric matrix characterizing the even-order local features other than the constant term; b is a 2-dimensional vector characterizing the odd-order local features; c is the constant term and equals r_1; and the basis functions are {1, x, y, x², y², xy}.
9. The device according to claim 6, characterized in that the local direction tensor obtained by the direction tensor acquisition unit is a positive semidefinite symmetric tensor T constructed from the matrix A, the vector b and the characteristic scale σ_d of the second-order polynomial expansion, and the preset area used by the eigenvalue determination unit when determining the maximum value among the smaller eigenvalues is the 8-neighborhood of each pixel;
wherein the eigenvalue determination unit comprises:
an 8-neighborhood determination subunit, configured to determine the 8-neighborhood corresponding to each pixel;
a smaller eigenvalue acquisition subunit, configured to obtain, according to the local direction tensor T corresponding to each pixel, the smaller of the two non-negative eigenvalues λ_1, λ_2 of each pixel in the 8-neighborhood, wherein λ_{1,2} = (1/2)·(t_11 + t_22 ± √((t_11 − t_22)² + 4·t_12²)); and
a maximum value determination subunit, configured to determine the maximum value among the smaller eigenvalues of the pixels in the 8-neighborhood;
wherein T is a positive semidefinite symmetric matrix, σ_d is the characteristic scale, and t_ij (i, j = 1, 2) are the elements of the positive semidefinite symmetric matrix T.
10. The device according to claim 6, characterized in that the interest point convergence unit comprises:
an identity matrix initialization subunit, configured to initialize an identity matrix U^(1), wherein the feature region corresponding to U^(1) is the unit circle centered at the coordinate of the initial interest point;
a domain transformation determination subunit, configured to take the matrix U^(i) as the current matrix and the initial interest point as the current interest point, and to determine the local image domain transformation using the current matrix, wherein i is the current iteration count of the affine recursive algorithm, and when i = 1, U^(i) is U^(1);
a candidate interest point determination subunit, configured to determine the characteristic scale corresponding to the local image domain transformation and to search, according to that characteristic scale, the local image domain transformation for the candidate interest point nearest to the current interest point;
an affine matrix acquisition subunit, configured to obtain the direction tensor T^(i) corresponding to the candidate interest point, to obtain the matrix U^(i+1) from T^(i), and to apply the normalization U^(i+1) = U^(i+1)/λ_1(U^(i+1)) so that λ_1 equals 1;
a judgment subunit, configured to judge whether the value of λ_2(T^(i))/λ_1(T^(i)) is less than a preset convergence threshold while the current iteration count has not yet reached a preset iteration threshold, and if so, to trigger a first result subunit, otherwise to trigger a second result subunit;
the first result subunit, configured to increase the current iteration count i by 1, take the matrix U^(i+1) as the current matrix and the initial interest point as the current interest point, and trigger the domain transformation determination subunit to determine the local image domain transformation using the current matrix; and
the second result subunit, configured to judge whether the value of λ_2(T^(i))/λ_1(T^(i)) is greater than or equal to the preset convergence threshold, and if so, to determine the candidate interest point as an affine interest point and determine the affine feature region corresponding to the affine interest point as x^T U^(i+1) x = 1, wherein U^(i+1) is a positive semidefinite symmetric matrix.
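As an illustrative complement to the detection steps restated in claims 1 and 4, the following minimal sketch selects initial interest points from a given field of direction tensor elements t_11, t_12, t_22: it computes the smaller eigenvalue of each pixel's tensor with the closed form of claim 4 and keeps the pixels that attain the maximum within their 8-neighborhood. The per-pixel tensor element maps are assumed to be available; how they are built from the polynomial expansion is not shown here.

```python
import numpy as np

def smaller_eigenvalue(t11, t12, t22):
    """Smaller of the two non-negative eigenvalues of the 2x2 tensor [[t11, t12], [t12, t22]],
    using the closed-form expression of claim 4 (applied element-wise to array inputs)."""
    return 0.5 * (t11 + t22 - np.sqrt((t11 - t22) ** 2 + 4.0 * t12 ** 2))

def initial_interest_points(t11, t12, t22):
    """Initial interest points (claims 1 and 4): pixels whose smaller tensor eigenvalue is the
    maximum within their 8-neighborhood. t11, t12, t22 are per-pixel tensor element maps of
    identical shape."""
    lam2 = smaller_eigenvalue(t11, t12, t22)
    h, w = lam2.shape
    points = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = lam2[y - 1:y + 2, x - 1:x + 2]
            if lam2[y, x] > 0 and lam2[y, x] == window.max():
                points.append((x, y))          # image coordinate of an initial interest point
    return points
```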
CN201410043639.8A 2014-01-29 2014-01-29 Method and device for obtaining affine local invariant features of image Pending CN103745220A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410043639.8A CN103745220A (en) 2014-01-29 2014-01-29 Method and device for obtaining affine local invariant features of image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410043639.8A CN103745220A (en) 2014-01-29 2014-01-29 Method and device for obtaining affine local invariant features of image

Publications (1)

Publication Number Publication Date
CN103745220A true CN103745220A (en) 2014-04-23

Family

ID=50502237

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410043639.8A Pending CN103745220A (en) 2014-01-29 2014-01-29 Method and device for obtaining affine local invariant features of image

Country Status (1)

Country Link
CN (1) CN103745220A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080063285A1 (en) * 2006-09-08 2008-03-13 Porikli Fatih M Detecting Moving Objects in Video by Classifying on Riemannian Manifolds
WO2012074699A1 (en) * 2010-11-29 2012-06-07 Microsoft Corporation Robust recovery of transform invariant low-rank textures
CN103310453A (en) * 2013-06-17 2013-09-18 北京理工大学 Rapid image registration method based on sub-image corner features

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
LIN Rui, et al.: "An affine-invariant interest point detection operator based on the polynomial-expansion local orientation tensor", High Technology Letters *
LIN Rui: "Research on stereo vision SLAM for mobile robots based on image feature points", China Doctoral Dissertations Full-text Database (Information Science and Technology) *
HU Junhua: "Research on local invariant features of images and their applications", China Doctoral Dissertations Full-text Database (Information Science and Technology) *
CAI Hongping, et al.: "A general method for extracting affine-invariant feature regions", Acta Electronica Sinica *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105740870A (en) * 2016-01-30 2016-07-06 湘潭大学 Anti-rotation HDO local feature description method for target robust identification
CN105740870B (en) * 2016-01-30 2019-03-15 Anti-rotation HDO local feature description method for robust target recognition
CN108364013A (en) * 2018-03-15 2018-08-03 苏州大学 Image key points feature descriptor extracting method, system based on the distribution of neighborhood gaussian derivative
CN108960155A (en) * 2018-07-09 2018-12-07 济南大学 Adult Gait extraction and exception analysis method based on Kinect
CN114331863A (en) * 2021-09-09 2022-04-12 国家电网有限公司 Affine transformation-based image smoothing method

Similar Documents

Publication Publication Date Title
US10445616B2 (en) Enhanced phase correlation for image registration
CN109903313B (en) Real-time pose tracking method based on target three-dimensional model
Chaudhury et al. Auto-rectification of user photos
Riklin-Raviv et al. Unlevel-sets: Geometry and prior-based segmentation
Goshtasby Theory and applications of image registration
CN106981077A (en) Infrared image and visible light image registration method based on DCE and LSS
Dambreville et al. Robust 3d pose estimation and efficient 2d region-based segmentation from a 3d shape prior
Lee et al. Skewed rotation symmetry group detection
Cinaroglu et al. A direct approach for human detection with catadioptric omnidirectional cameras
EP2153379A2 (en) Generalized statistical template matching under geometric transformations
CN103745220A (en) Method and device for obtaining affine local invariant features of image
CN114331879A (en) Visible light and infrared image registration method for equalized second-order gradient histogram descriptor
Narváez et al. Point cloud denoising using robust principal component analysis
Cakir et al. Combining feature-based and model-based approaches for robust ellipse detection
CN114170284B (en) Multi-view point cloud registration method based on active landmark point projection assistance
CN109711420B (en) Multi-affine target detection and identification method based on human visual attention mechanism
Bellavia et al. Image orientation with a hybrid pipeline robust to rotations and wide-baselines
CN103824076A (en) Detecting and extracting method and system characterized by image dimension not transforming
JP3863014B2 (en) Object detection apparatus and method
Dambreville et al. A geometric approach to joint 2D region-based segmentation and 3D pose estimation using a 3D shape prior
Kovacs et al. Orientation based building outline extraction in aerial images
Liu et al. Using Retinex for point selection in 3D shape registration
CN104091315B (en) Method and system for deblurring license plate image
Conomis Conics-based homography estimation from invariant points and pole-polar relationships
Dryanovski et al. Real-time pose estimation with RGB-D camera

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20140423