Summary of the Invention
In view of the problems of the prior art, the primary object of the present invention is to provide a target identification method based on contour features that is scale-invariant during the matching process.
To achieve the above object, the invention provides an embodiment of a target identification method based on contour features, comprising the following steps: step 1, establishing a feature database of object template contours; step 2, performing target identification on an image to be detected.
Step 1, establishing the feature database of object template contours, comprises the following steps (1.1) to (1.4):
(1.1) extracting the complete contour of the object template;
(1.2) extracting feature points on the contour of the object template and the center point of the contour;
(1.3) using the feature points and the center point to build distance matrices that describe the contour;
(1.4) computing a distance matrix for every pixel on the contour; the set of computed results is the feature database of the object template contour.
Step 2, performing target identification on the image to be detected, comprises the following steps (2.1) to (2.6):
(2.1) extracting the edges of the image to be detected;
(2.2) extracting feature points on the edges of the image to be detected;
(2.3) computing the feature descriptions formed by the feature points of the image to be detected;
(2.4) matching the feature descriptions of the image to be detected against the features in the object template contour feature database;
(2.5) estimating the center points of the contours of the image to be detected;
(2.6) estimating the contour of the image to be detected from the estimated center points of the contours, the distance relations of the corresponding image edges, and the distance relation between the center point and the contour in the object template contour.
In step (1.1), the complete contour of the object template is extracted with a background-subtraction method. The result is expressed as S = {P_i; i = 1...N}, where S denotes the contour of the object template, P_i denotes a pixel on the contour, and N denotes the number of pixels on the contour.
In step (1.2), the feature points on the contour of the object template and the center point of the contour are extracted as follows. (1) For each point P_i(x_i, y_i) on the contour, compute its corresponding feature point satisfying the following condition: take an arbitrary point P_j(x_j, y_j) on the object template contour and the point P_k(x_k, y_k) with k = 2*j - i, and obtain

    d = |A*x_j + B*y_j + C| / sqrt(A^2 + B^2),

where d denotes the distance from the point P_j to the straight line formed by the points P_i and P_k, and Ax + By + C = 0 is the equation of the line through P_i and P_k, with A = y_k - y_i, B = x_i - x_k and C = x_k*y_i - x_i*y_k. The distance from the point P_i to the point P_k can be expressed as

    d_ik = sqrt((x_i - x_k)^2 + (y_i - y_k)^2).

Let t be the ratio of the two distances d and d_ik. When t >= T, the point P_j closest to P_i is taken as the feature point corresponding to P_i; T is a threshold parameter set according to the different objects. (2) Compute the corresponding feature point in turn for every point on the contour, obtaining the correspondence table C: C = {P_i, P_m; i = 1...N, m ∈ {1, ..., N}}, where P_m is the feature point corresponding to P_i and N denotes the number of pixels on the contour. (3) Compute the center point P_center of the contour from all contour pixels (for example, as their mean).
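The feature-point test of step (1.2) can be sketched as follows; the scan direction along the contour, the wrap-around indexing, and all function and variable names are assumptions of this sketch, not part of the invention:

```python
import math

def feature_point(contour, i, T=0.12):
    """For contour point P_i, scan candidate points P_j with mirror point
    P_k at index k = 2*j - i, and test the bend ratio t = d / d_ik, where
    d is the distance from P_j to the line through P_i and P_k and d_ik is
    the length of that chord.  Returns the index of the first (closest)
    P_j whose ratio reaches the threshold T, or None."""
    N = len(contour)
    xi, yi = contour[i]
    for j in range(i + 1, i + N // 2):          # walk away from P_i
        xj, yj = contour[j % N]
        k = (2 * j - i) % N                     # P_k mirrors P_i about P_j
        xk, yk = contour[k]
        # line through P_i and P_k: A*x + B*y + C = 0
        A, B, C = yk - yi, xi - xk, xk * yi - xi * yk
        d_ik = math.hypot(xi - xk, yi - yk)
        if d_ik == 0:
            continue
        d = abs(A * xj + B * yj + C) / math.hypot(A, B)  # P_j to the chord
        if d / d_ik >= T:                       # bend is sharp enough
            return j % N
    return None
```

On a square contour, points along a straight side give t = 0, and the first P_j whose mirrored chord crosses a corner passes the test.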
Step (1.3), using the feature points and the center point to build distance matrices that describe the contour, specifically comprises the following steps. (1) Compute the description matrix D_i of the object template contour feature database that starts at the point P_i on the contour: use the correspondence table C to find the feature point P_j corresponding to P_i; use the correspondence table C to find the feature point P_k corresponding to P_j; and compute

    D_i = [ d_i,i  d_i,j  d_i,k
            d_j,i  d_j,j  d_j,k
            d_k,i  d_k,j  d_k,k ],

where D_i is the contour description matrix with starting point P_i and end point P_k, and d_i,j, the Euclidean distance between the points P_i and P_j expressed in log space, is d_i,j = log(1 + ||P_i - P_j||_2), the other entries being defined analogously. (2) Compute the relation between each feature point P_i on the contour and the center point P_center, expressed as the vector

    S_i = P_center - P_i, i = 1...N,

where N denotes the number of pixels on the contour.
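The description matrix D_i of step (1.3) can be sketched as a 3x3 matrix of pairwise log-space distances; the zero diagonal and the function names are assumptions of this sketch:

```python
import math

def log_dist(p, q):
    """Pairwise distance expressed in log space, d = log(1 + ||p - q||_2)."""
    return math.log(1.0 + math.hypot(p[0] - q[0], p[1] - q[1]))

def description_matrix(P_i, P_j, P_k):
    """The 3x3 matrix D_i of pairwise log-space distances among the triple
    (P_i, P_j, P_k); the diagonal entries d_i,i etc. come out as log(1) = 0."""
    pts = [P_i, P_j, P_k]
    return [[log_dist(pts[a], pts[b]) for b in range(3)] for a in range(3)]
```

The log expression compresses large distances, which keeps the matrix entries comparable across contour scales before the explicit normalization of step (2.4).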
Step (2.1), extracting the edges of the image to be detected, specifically comprises the following steps: (1) convert the collected image to be detected into a grey-scale image; (2) perform edge extraction on the image to be detected with the Canny operator; (3) connect edges that are close together with an edge-linking algorithm, obtaining the edge set E = {e(i); i = 1...M} of the image to be detected, where M denotes the number of edges.
Step (2.2), extracting feature points on the edges of the image to be detected, specifically comprises the following steps: (1) for any edge e(i) of the image to be detected, compute the corresponding feature point in turn for every point on the edge, obtaining the correspondence table C of the edge e(i); (2) perform the feature-point computation for all edges in the edge set E = {e(i); i = 1...M} of the image to be detected, obtaining a correspondence table C for each edge, E_c = {e_c(i); i = 1...M}, where M denotes the number of edges and the correspondence table C of edge i is expressed as e_c(i) = {P_j, P_m; j = 1...N_i, m ∈ {1, ..., N_i}}, where N_i denotes the number of pixels on edge i and P_m is the feature point corresponding to P_j.
Step (2.3), computing the feature descriptions formed by the feature points of the image to be detected, specifically comprises the following steps. (1) Compute the feature description starting at the point P_m(i) on edge i: use the table e_c(i) in the correspondence table set E_c to find the feature point P_j(i) corresponding to P_m(i); use the correspondence table e_c(i) to find the feature point P_k(i) corresponding to P_j(i); and compute the description matrix on edge i with starting point P_m(i) and end point P_k(i), built from the pairwise distances among P_m(i), P_j(i) and P_k(i) in the same form as D_i in step (1.3), where d_i,j = log(1 + ||P_i - P_j||_2) is the Euclidean distance between the points P_i and P_j expressed in log space. (2) Compute the feature description for each edge in the edge set E = {e(i); i = 1...M} separately, where the matrix beginning at point j on edge i is the feature description matrix of that point, N_i denotes the number of pixels on edge i, and M denotes the number of edges.
Step (2.4) matches the feature descriptions of the image to be detected against the features in the object template contour feature database, mainly by computing the similarity between each extracted feature description matrix of the image to be detected and the description matrices in the object template contour feature database. The main steps are as follows. (1) Let D_e be a feature description matrix of the image to be detected and D_m a description matrix in the object template contour feature database. Normalize the two sides of the diagonal of the matrices D_e and D_m separately: processing D_e gives the normalized matrix D'_e, where D_e(i, j) denotes the Euclidean distance from point i to point j on the contour e of the image to be detected, and processing D_m gives the normalized matrix D'_m, where D_m(i, j) denotes the Euclidean distance from point i to point j on the object template contour m. (2) Compute the similarity of the matrices D'_e and D'_m. For i, j = 1, 2, 3 with i ≠ j, the agreement of the corresponding entries is

    A(i, j) = exp(-(D'_e(i, j) - D'_m(i, j))^2 / σ^2),

where σ is a penalty parameter for the distance error; when i = j, A(i, j) = 1. The value of the similarity is then

    ψ(D'_e, D'_m) = (1/9) * Σ_{i,j} A(i, j).
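A minimal sketch of the similarity computation of step (2.4), assuming sum-normalization of the off-diagonal entries and a Gaussian agreement term averaged over the nine matrix entries; both of these exact forms, and all names, are assumptions of this sketch:

```python
import math

def normalize(D):
    """Scale-normalize a 3x3 description matrix by the sum of its
    off-diagonal entries, so a uniformly scaled contour yields the same
    normalized matrix; the exact normalization rule is an assumption."""
    s = sum(D[a][b] for a in range(3) for b in range(3) if a != b)
    return [[D[a][b] / s if a != b else 0.0 for b in range(3)]
            for a in range(3)]

def similarity(De, Dm, sigma=0.2):
    """Mean entry-wise Gaussian agreement between two normalized matrices,
    with A(i, i) = 1 on the diagonal; identical matrices give 1.0."""
    total = 0.0
    for a in range(3):
        for b in range(3):
            if a == b:
                total += 1.0                      # A(i, i) = 1
            else:
                diff = De[a][b] - Dm[a][b]
                total += math.exp(-diff * diff / (sigma * sigma))
    return total / 9.0
```

Because the normalization cancels a common scale factor, a template matrix and a uniformly rescaled copy of it compare as identical, which is the scale-invariance the method aims at.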
Step (2.5), estimating the center point of the contour of the image to be detected, specifically comprises the following steps. (1) From the similarity value ψ(D'_e, D'_m), judge whether 1 - ψ(D'_e, D'_m) > ψ_τ holds, where ψ_τ is a threshold parameter limiting the similarity. (2) If 1 - ψ(D'_e, D'_m) > ψ_τ, i.e. the feature description matrix D_e of the image to be detected fails to match the description matrix D_m in the object template contour feature database, then discard D_e. (3) If 1 - ψ(D'_e, D'_m) ≤ ψ_τ, i.e. the feature description matrix D_e of the image to be detected matches the description matrix D_m in the object template contour feature database successfully, then estimate, from S_e, S_m and the center-point vectors S_i, the center points respectively corresponding to the three points P_m(i), P_j(i), P_k(i) that form D_e:

    P_center' = P'_i + s * S_i,

where P_i is the arbitrary point on the object template contour, P'_i is the point on the edge of the image to be detected corresponding to P_i, s is the scale between the matched description matrices, and P_center' is the estimated center point. (4) Compute the center points P_center' for the edges of the image to be detected. (5) The center points P_center' of the edges of the image to be detected form a voting map for the center point in the image to be detected. (6) Count the votes for the center point in the image to be detected and cluster the voting results with the mean-shift algorithm; the cluster centers obtained are the center points of the contours of the image to be detected.
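The mean-shift clustering of the center-point votes in step (2.5) can be sketched with a flat kernel in pure Python; the bandwidth value, the convergence tolerance, and the names are assumptions of this sketch:

```python
import math

def mean_shift_mode(votes, start, bandwidth=3.0, iters=50):
    """Shift a point toward the densest region of the center-point votes:
    repeatedly replace the current position with the mean of all votes
    within `bandwidth`, until the shift is negligible.  The converged
    position is one cluster center of the voting map."""
    x, y = start
    for _ in range(iters):
        near = [(vx, vy) for vx, vy in votes
                if math.hypot(vx - x, vy - y) <= bandwidth]
        if not near:
            break
        nx = sum(v[0] for v in near) / len(near)
        ny = sum(v[1] for v in near) / len(near)
        if math.hypot(nx - x, ny - y) < 1e-6:   # converged
            break
        x, y = nx, ny
    return x, y
```

Votes cast by outliers beyond the bandwidth never enter the local mean, so a stray match does not drag the estimated center.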
Step (2.6), estimating the contour of the image to be detected from the estimated center points of the contours of the image to be detected, the distance relations of the corresponding image edges, and the distance relation between the center point and the contour in the object template contour, specifically comprises the following steps. (1) Take the center point P_center(i) of a contour of the image to be detected, where i indexes a target in the image to be detected, and find the series of feature description matrices D_e that voted within the region of radius σ centered at P_center(i), where σ is a radius value set according to the size of the image to be detected and the scene. (2) Compute the scaling relation V(i) between a feature description matrix D_e(i) of the image to be detected and the corresponding description matrix D_m(j) in the object template contour feature database, i.e. from the values of S_m(j) and S_e(i) obtain the ratio

    V(i) = |S_e(i)| / |S_m(j)|,

where e denotes an edge of the image to be detected, i denotes the starting point of the feature description matrix on edge e, m denotes an object template contour, and j denotes the starting point of the corresponding description matrix on contour m in the object template contour feature database. (3) Compute the scaling relations between all feature description matrices D_e of the image to be detected that voted within the region of radius σ centered at P_center(i) and the description matrices D_m in the object template contour feature database, obtaining the set of ratios V = {V(i); i = 1...m}, where m denotes the number of feature description matrices of the image to be detected that voted within the region of radius σ centered at P_center(i). (4) Cluster the ratios in the set V with the mean-shift algorithm, determine from the cluster result the ratio V(i) between the feature description matrices of the image to be detected and the description matrices in the object template contour feature database, and determine from this ratio V(i) the feature description matrices D_e(i) of the image to be detected that satisfy it; the points on the edges that form these feature description matrices, centered at P_center(i), are the points of the contour of the image to be detected that matched the object template contour successfully. (5) According to the ratio V, fill in the unmatched points of the contour of the image to be detected, centered at P_center(i), obtaining the complete estimated contour of the image to be detected.
Compared with the prior art, the present invention solves the scale problem in contour matching, making the contour-matching process scale-invariant, and it is effectively applied to target identification in images.
Embodiment
The specific embodiments of the present invention are described in detail below with reference to the accompanying drawings.
As shown in Figure 1, the target identification method based on contour features of the present invention is used to identify targets in an image to be detected; if there are multiple targets in the image to be detected, the targets in the image are identified one by one. The method comprises the following steps S1 to S2: S1, establishing the feature database of object template contours; S2, performing target identification on the image to be detected.
Step S1 comprises the following steps S11 to S14.
S11, extracting the complete contour of the object template: mainly under a static background, the complete contour of the object template is extracted with a background-subtraction method. The result is expressed as S = {P_i; i = 1...N}, where S denotes the contour of the object template, P_i denotes a pixel on the contour, and N denotes the number of pixels on the contour.
S12, extracting feature points on the contour of the object template and the center point of the contour, comprises the following steps. (1) For each point P_i(x_i, y_i) on the contour, compute its corresponding feature point satisfying the following condition: take an arbitrary point P_j(x_j, y_j) on the object template contour and the point P_k(x_k, y_k) with k = 2*j - i, and obtain

    d = |A*x_j + B*y_j + C| / sqrt(A^2 + B^2),

where d denotes the distance from the point P_j to the straight line formed by the points P_i and P_k, and Ax + By + C = 0 is the equation of the line through P_i and P_k, with A = y_k - y_i, B = x_i - x_k and C = x_k*y_i - x_i*y_k. The distance from the point P_i to the point P_k can be expressed as

    d_ik = sqrt((x_i - x_k)^2 + (y_i - y_k)^2).

Let t be the ratio of the two distances d and d_ik. When t >= T, the point P_j closest to P_i is taken as the feature point corresponding to P_i. T is a threshold parameter set according to the different objects; its setting is related to how curved the target contour is, and T may take the value 0.12. (2) Compute the corresponding feature point in turn for all pixels on the contour, obtaining the correspondence table C: C = {P_i, P_m; i = 1...N, m ∈ {1, ..., N}}, where P_m is the feature point corresponding to P_i and N denotes the number of pixels on the contour. (3) Compute the center point P_center of the contour from all contour pixels (for example, as their mean).
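The center-point computation of step S12(3) can be sketched as the centroid of the contour pixels; treating the center as the plain mean of all contour points is an assumption of this sketch:

```python
def contour_center(contour):
    """The contour center as the mean of all contour pixels, i.e. the
    centroid of the pixel positions (x, y)."""
    n = len(contour)
    return (sum(p[0] for p in contour) / n,
            sum(p[1] for p in contour) / n)
```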
S13, using the feature points and the center point to build distance matrices that describe the contour, comprises the following steps.
(1) Compute the description matrix D_i of the object template contour feature database that starts at the point P_i on the contour: use the correspondence table C to find the feature point P_j corresponding to P_i; use the correspondence table C to find the feature point P_k corresponding to P_j; and compute

    D_i = [ d_i,i  d_i,j  d_i,k
            d_j,i  d_j,j  d_j,k
            d_k,i  d_k,j  d_k,k ],

where D_i is the contour description matrix with starting point P_i and end point P_k, d_i,j = log(1 + ||P_i - P_j||_2) is the Euclidean distance between the points P_i and P_j expressed in log space, d_i,k = log(1 + ||P_i - P_k||_2) is the Euclidean distance between the points P_i and P_k expressed in log space, d_j,k = log(1 + ||P_j - P_k||_2) is the Euclidean distance between the points P_j and P_k expressed in log space, and the other entries of the matrix D_i, such as d_i,i, d_j,i and d_k,i, are defined analogously.
(2) Compute the relation between each feature point P_i on the contour and the center point P_center, i.e. the vector relation between every pixel P_i on the contour and the center point P_center, expressed as the vector

    S_i = P_center - P_i, i = 1...N,

where N denotes the number of pixels on the contour.
S14, computing the distance matrix for all pixels on the contour: the set of the computed results is the feature database of the object template contour. A description matrix in the object template contour feature database is denoted D_m, and D_m is made up of the matrices D_i computed at all pixels on the contour. The feature database for one class of target can contain many training samples, and different classes of things form different feature databases: D = {D_obj; obj = person, horse, car, ...}.
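The layout of the feature database of step S14 can be sketched as one list of description matrices per object class; the dictionary layout and the `describe` callback (standing in for the D_i computation of step S13) are assumptions of this sketch:

```python
def build_feature_database(contours_by_class, describe):
    """One entry per object class ("person", "horse", "car", ...), holding
    the description matrices D_i computed at every contour pixel of every
    training sample.  `describe(contour, i)` is assumed to return the
    description matrix for point i of the given contour."""
    database = {}
    for cls, contours in contours_by_class.items():
        database[cls] = [describe(c, i)
                         for c in contours      # every training sample
                         for i in range(len(c))]  # every contour pixel
    return database
```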
Step S2 comprises the following steps S21 to S26.
S21, extracting the edges of the image to be detected, specifically comprises the following steps: (1) convert the collected image to be detected into a grey-scale image; (2) perform edge extraction on the image to be detected with the Canny operator; (3) connect edges that are close together with an edge-linking algorithm, obtaining the edge set E = {e(i); i = 1...M} of the image to be detected, where M denotes the number of edges.
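The edge-linking of step S21(3) can be sketched as greedily joining edge chains whose endpoints lie within a small gap; the greedy end-to-start strategy and the gap value are assumptions of this sketch:

```python
import math

def link_edges(chains, gap=3.0):
    """Greedily join edge chains (lists of (x, y) points) whose endpoints
    lie within `gap` pixels, so nearby Canny fragments become one edge
    e(i).  Repeats until no more tail-to-head joins are possible."""
    chains = [list(c) for c in chains]
    merged = True
    while merged:
        merged = False
        for a in range(len(chains)):
            for b in range(len(chains)):
                if a == b:
                    continue
                ax, ay = chains[a][-1]          # tail of chain a
                bx, by = chains[b][0]           # head of chain b
                if math.hypot(ax - bx, ay - by) <= gap:
                    chains[a].extend(chains.pop(b))
                    merged = True
                    break
            if merged:
                break
    return chains
```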
S22, extracting feature points on the edges of the image to be detected, is identical to the feature-point extraction on the contour of the object template in step S12 and specifically comprises the following steps: (1) for any edge e(i) of the image to be detected, compute the corresponding feature point in turn for every point on the edge, obtaining the correspondence table C of the edge e(i); (2) perform the feature-point computation for all edges of the edge set E = {e(i); i = 1...M} of the image to be detected, obtaining a correspondence table C for each edge, E_c = {e_c(i); i = 1...M}, where M denotes the number of edges and the correspondence table C of edge i is expressed as e_c(i) = {P_j, P_m; j = 1...N_i, m ∈ {1, ..., N_i}}, where N_i denotes the number of pixels on edge i and P_m is the feature point corresponding to P_j.
S23, computing the feature descriptions formed by the feature points of the image to be detected, uses the same method as building the distance matrices that describe the contour from the feature points and the center point in step S13, and specifically comprises the following steps. (1) Compute the feature description starting at the point P_m(i) on edge i: use the table e_c(i) in the correspondence table set E_c to find the feature point P_j(i) corresponding to P_m(i); use the correspondence table e_c(i) to find the feature point P_k(i) corresponding to P_j(i); and compute the description matrix on edge i with starting point P_m(i) and end point P_k(i), built from the pairwise distances among P_m(i), P_j(i) and P_k(i) in the same form as D_i in step S13, where d_i,j = log(1 + ||P_i - P_j||_2) is the Euclidean distance between the points P_i and P_j expressed in log space. (2) Compute the feature description for each edge in the edge set E = {e(i); i = 1...M} separately, where the matrix beginning at point j on edge i is the feature description matrix of that point, N_i denotes the number of pixels on edge i, and M denotes the number of edges.
S24, matching the feature descriptions of the image to be detected against the features in the object template contour feature database, is realized mainly by computing the similarity between each extracted feature description matrix of the image to be detected and the description matrices in the object template contour feature database. The main steps are as follows. (1) Let D_e be a feature description matrix of the image to be detected and D_m a description matrix in the object template contour feature database. Normalize the two sides of the diagonal of the matrices D_e and D_m separately: processing D_e gives the normalized matrix D'_e, where D_e(i, j) denotes the Euclidean distance from point i to point j on the contour e of the image to be detected, and processing D_m gives the normalized matrix D'_m, where D_m(i, j) denotes the Euclidean distance from point i to point j on the object template contour m. (2) Compute the similarity of the matrices D'_e and D'_m. For i, j = 1, 2, 3 with i ≠ j, the agreement of the corresponding entries is

    A(i, j) = exp(-(D'_e(i, j) - D'_m(i, j))^2 / σ^2),

where σ is a penalty parameter for the distance error, commonly taken as 0.2; when i = j, A(i, j) = 1. The value of the similarity is then

    ψ(D'_e, D'_m) = (1/9) * Σ_{i,j} A(i, j).
S25, estimating the center point of the contour of the image to be detected, specifically comprises the following steps. (1) From the similarity value ψ(D'_e, D'_m), judge whether 1 - ψ(D'_e, D'_m) > ψ_τ holds, where ψ_τ is a threshold parameter limiting the similarity; this threshold parameter may be 0.1. (2) If 1 - ψ(D'_e, D'_m) > ψ_τ, i.e. the feature description matrix D_e of the image to be detected fails to match the description matrix D_m in the object template contour feature database, then discard D_e. (3) If 1 - ψ(D'_e, D'_m) ≤ ψ_τ, i.e. the feature description matrix D_e of the image to be detected matches the description matrix D_m in the object template contour feature database successfully, then estimate, from S_e, S_m and the center-point vectors S_i, the center points respectively corresponding to the three points P_m(i), P_j(i), P_k(i) that form D_e:

    P_center' = P'_i + s * S_i,

where P_i is the arbitrary point on the object template contour, P'_i is the point on the edge of the image to be detected corresponding to P_i, s is the scale between the matched description matrices, and P_center' is the estimated center point. (4) Compute the center points P_center' for the edges of the image to be detected. (5) The center points P_center' of the edges of the image to be detected form a voting map for the center point in the image to be detected. (6) Count the votes for the center point in the image to be detected and cluster the voting results with the mean-shift algorithm; the cluster centers obtained are the center points of the contours of the image to be detected.
S26, estimating the contour of the image to be detected from the estimated center points of the contours of the image to be detected, the distance relations of the corresponding image edges, and the distance relation between the center point and the contour in the object template contour, specifically comprises the following steps. (1) Take the center point P_center(i) of a contour of the image to be detected, where i indexes a target in the image to be detected (because if there are multiple targets in the image to be detected, each target corresponds to one center point, and there may thus be multiple center points), and find the series of feature description matrices D_e that voted within the region of radius σ centered at P_center(i), where σ is a radius value set according to the size of the image to be detected and the scene; it may, for example, be taken as a 5x5 region. (2) Compute the scaling relation V(i) between a feature description matrix D_e(i) of the image to be detected and the corresponding description matrix D_m(j) in the object template contour feature database, i.e. from the values of S_m(j) and S_e(i) obtain the ratio

    V(i) = |S_e(i)| / |S_m(j)|,

where e denotes an edge of the image to be detected, i denotes the starting point of the feature description matrix on edge e, m denotes an object template contour, and j denotes the starting point of the corresponding description matrix on contour m in the object template contour feature database. (3) Compute the scaling relations between all feature description matrices D_e of the image to be detected that voted within the region of radius σ centered at P_center(i) and the description matrices D_m in the object template contour feature database, obtaining the set of ratios V = {V(i); i = 1...m}, where m denotes the number of feature description matrices of the image to be detected that voted within the region of radius σ centered at P_center(i). (4) Cluster the ratios in the set V with the mean-shift algorithm, determine from the cluster result the ratio V(i) between the feature description matrices of the image to be detected and the description matrices in the object template contour feature database, and determine from this ratio V(i) the feature description matrices D_e(i) of the image to be detected that satisfy it; the points on the edges that form these feature description matrices, centered at P_center(i), are the points of the contour of the image to be detected that matched the object template contour successfully. (5) According to the ratio V, fill in the unmatched points of the contour of the image to be detected, centered at P_center(i), obtaining the complete estimated contour of the image to be detected.
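The contour filling of step S26(5) can be sketched as projecting the template contour points through the estimated center at the clustered scale ratio V; the exact projection rule and the names are assumptions of this sketch:

```python
def project_template(template, t_center, d_center, scale):
    """Map each template contour point into the detected image: take its
    vector relative to the template center, scale it by the clustered
    ratio V, and re-anchor it at the estimated center point P_center."""
    dx, dy = d_center                           # estimated P_center(i)
    cx, cy = t_center                           # template contour center
    return [(dx + scale * (x - cx), dy + scale * (y - cy))
            for x, y in template]
```

Matched edge points stay where they were observed; only the gaps are filled with projected template points, so the estimate keeps the measured evidence where it exists.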
A target identification method based on contour features has been described above; while completing target identification, the method also estimates the exact position of the target. The present invention is not limited to the above embodiments: any improvement or change to them that does not depart from the technical solution of the present invention, i.e. that merely applies modifications known to those of ordinary skill in the art, falls within the protection scope of the present invention.