CN105590114A - Image characteristic quantity generation method - Google Patents

Image characteristic quantity generation method

Info

Publication number
CN105590114A
CN105590114A (application CN201510973596.8A)
Authority
CN
China
Prior art keywords
point
image
designated
value
extreme point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201510973596.8A
Other languages
Chinese (zh)
Inventor
马洪明 (Ma Hongming)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201510973596.8A priority Critical patent/CN105590114A/en
Publication of CN105590114A publication Critical patent/CN105590114A/en
Withdrawn legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/48Extraction of image or video features by mapping characteristic values of the pattern into a parameter space, e.g. Hough transformation

Abstract

The invention discloses a method for generating an image feature quantity, mainly addressing the high computational complexity and slow speed of image feature description in image matching, and the matching inaccuracy caused by neighborhood errors. The method comprises the steps of: extracting extreme points with an algorithm; finding the principal direction of each extreme point according to certain rules; performing adjacent diffusion on each extreme point to obtain its boundary, then optimizing and regularizing that boundary to obtain a regular neighborhood of the extreme point; and randomly selecting point pairs within this neighborhood and generating the image feature quantity through simple binary tests. Compared with traditional feature point detection and description, the method computes faster, yields more accurate feature point neighborhoods, and offers better real-time performance; the image feature description is more stable and image matching is more robust.

Description

Method for generating an image feature quantity
Technical field
The present invention relates to the technical field of image processing, and in particular to a method for generating an image feature quantity.
Background technology
With the development of the computer information age, computer vision has become a research focus. To process images with machine intelligence or to perform video tracking, the first task is image recognition, and image feature description is the foundation of image recognition and matching; its importance to this field is evident. Traditional algorithms for image feature point detection and description all have limitations due to the singleness of the features themselves: the algorithms are overly complex, the descriptor dimensionality is high, and neither the robustness of the provided extreme-point neighborhood extraction nor the feature extraction of target images in complex environments achieves good results, so subsequent image matching and recognition accuracy are low. Moreover, the generated feature vectors cost considerable time during image matching; in video object tracking applications in particular, where timing requirements are strict, the description vectors extracted by traditional methods make the whole tracking process very slow and unsuitable for highly real-time scenarios.
Summary of the invention
The object of the invention is to address the errors and the large time cost arising in the above image matching and recognition processes. A method is proposed that extracts extreme points and their directions with an algorithm, obtains an accurate neighborhood of each extreme point, and then generates a binary string for feature description, which greatly simplifies algorithm complexity, reduces time cost, and makes extreme-point matching more robust.
To generate the above image feature quantity, the method of the present invention comprises the following steps:
(1) Obtain the extreme points K(x, y) of image I(x, y);
(2) Obtain the accurate neighborhood around each extreme point K(x, y), and find the principal direction of K(x, y);
(3) Randomly select point pairs in the accurate neighborhood to form a binary string, which serves as the image feature quantity.
Preferably, step (1) of obtaining the extreme points K(x, y) of image I(x, y) is carried out as follows:
(1a) The Hessian matrix of image I(x, y) at scale σ is

$$H(x, y, \sigma) = \begin{pmatrix} L_{xx}(x, y, \sigma) & L_{xy}(x, y, \sigma) \\ L_{xy}(x, y, \sigma) & L_{yy}(x, y, \sigma) \end{pmatrix}$$

where $L_{xx}(x, y, \sigma)$ is the convolution of the image at point (x, y) with the Gaussian second-order partial derivative $\frac{\partial^2}{\partial x^2} g(\sigma)$, $L_{xy}(x, y, \sigma)$ is the convolution with $\frac{\partial^2}{\partial x \partial y} g(\sigma)$, and $L_{yy}(x, y, \sigma)$ is the convolution with $\frac{\partial^2}{\partial y^2} g(\sigma)$;
(1b) Keeping the image size constant, form an image pyramid by increasing the box-filter mask size;
(1c) Denote the responses of the box-filter masks convolved with the image as $D_{xx}$, $D_{xy}$, $D_{yy}$; the approximation of the Hessian determinant is then $\det H = D_{xx}D_{yy} - (0.9 D_{xy})^2$. Compute this approximation at every point; where the value is greater than zero and is a local maximum within its neighborhood, the point is an extreme point K(x, y) of image I(x, y).
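As an illustration of step (1c), the following minimal sketch (in Python, assuming the box-filter responses Dxx, Dxy, Dyy for one pyramid level are already available as 2-D arrays; the function name and threshold parameter are ours, not the patent's) computes the determinant approximation and keeps the positive local maxima:

    import numpy as np
    from scipy.ndimage import maximum_filter

    def extreme_points(Dxx, Dxy, Dyy, threshold=0.0):
        # Approximate Hessian determinant: detH = DxxDyy - (0.9Dxy)^2
        det_h = Dxx * Dyy - (0.9 * Dxy) ** 2
        # Keep points where detH > 0 and detH is maximal in its 3x3 neighborhood
        local_max = maximum_filter(det_h, size=3)
        ys, xs = np.nonzero((det_h > threshold) & (det_h == local_max))
        return list(zip(xs.tolist(), ys.tolist()))  # candidate extreme points K(x, y)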
Preferably, step (2) of obtaining the accurate neighborhood around the extreme point K(x, y) and finding its principal direction is carried out as follows:
(2a) Mark all pixels of image I(x, y) as to-be-expanded, and initialize the expanded pixel set P = {K(x, y)};
(2b) Unconditionally add the n × n points adjacent to K(x, y) into P as the initial point set, where n is an odd number greater than 1;
(2c) Find the boundary set C' of P;
(2d) For each pixel E(x, y) in C', find the set P' = {K'(x, y)} of all pixels that lie within distance θ₁ of it, do not belong to P, and are marked to-be-expanded, i.e. $\sqrt{(E_x - K'_x)^2 + (E_y - K'_y)^2} \le \theta_1$, where θ₁ is an empirically chosen threshold; and the set P'' = {K''(x, y)} of all pixels at distance greater than θ₁ but at most θ₂ that do not belong to P', i.e. $\theta_1 < \sqrt{(E_x - K''_x)^2 + (E_y - K''_y)^2} \le \theta_2$, where $\theta_2 = 5$;
(2e) For each point K'(x, y) in P' and each point K''(x, y) in P'', compute its allowed maximum tolerance δ through the function f(d₁, d₂), where:
f(d₁, d₂): an empirical function fitted from a large amount of data which, given d₁ and d₂, yields the allowed maximum tolerance δ;
d₁: the distance to E(x, y);
d₂: the distance from the point K'(x, y) or K''(x, y) to the extreme point K(x, y);
δ: the tolerance allowed during expansion;
(2f) If $[\mathrm{Value}(E(x, y))] \otimes [\mathrm{Value}(K'(x, y)) + \delta]$ or $[\mathrm{Value}(E(x, y))] \otimes [\mathrm{Value}(K''(x, y)) + \delta]$ holds, add K'(x, y) or K''(x, y) into P and mark the newly joined point as expanded; otherwise remove the point from P' or P''. Here:
$\otimes$: either ">" or "<", determined by the sign of the Laplacian;
Value(E(x, y)): the gray value at coordinate E(x, y) in the original image;
" if=φ, finds superset P to (2g) P '=φ and P, and expansion finishes; Otherwise carry out (2c);
(2h) find out the boundary set C of P;
(2i) Starting from the extreme point coordinate K(x, y), traverse outward in an arbitrary direction and denote the intersection with boundary C as O(x, y);
(2j) Add to queue Q all points T(x, y) in boundary C satisfying $\sqrt{(O_x - T_x)^2 + (O_y - T_y)^2} \le d_{max}$, and mark the corresponding points in C as retrieved, where $d_{max}$ is the maximum allowed length of a "shortcut", $d_{max} = 5$;
(2k) Traverse the points in Q; if a point is continuously adjacent to O(x, y) within distance $d_{max}$, ignore it; otherwise compute the length of the minor arc of C from O(x, y) to T(x, y), denoted l; the area of the region enclosed by that arc and the chord, denoted s; and the straight-line distance from O(x, y) to T(x, y), denoted d; then compute the allowed distance range d' through the empirical function f(l, s), where:
f(l, s): an empirical function fitted from data which, given l and s, yields the allowed distance range d';
(2l) If d < d', insert T(x, y) into a new queue Q';
(2m) From Q', choose the point T(x, y) with the minimum ratio of its distance d to O(x, y) over its allowed distance d'; connect O(x, y) directly to that point and remove the minor arc between them from boundary C;
(2n) Replace O(x, y) with T(x, y) and return to step (2j), until all points of boundary C are marked as retrieved;
(2o) Find the top, bottom, left, and right end points of boundary C: T(x, y), B(x, y), L(x, y), R(x, y);
(2p) Denote the horizontal line through the upper end point T(x, y) as l₁, and the first and last points where l₁ crosses boundary C as T₁(x, y), T₂(x, y). If |T₂ - T₁| ≥ 2, record l₁ as L₁; otherwise take the horizontal line through T(x, y - 1) as the new l₁ and repeat the test until L₁ is found. Likewise, denote the horizontal line through the lower end point B(x, y) as l₂ and its first and last crossings of C as T₃(x, y), T₄(x, y); if |T₄ - T₃| ≥ 2, record l₂ as L₂. In the same manner find the lines L₃, L₄ through the left and right end points;
(2q) The rectangle R determined by the four lines L₁, L₂, L₃, L₄ is the accurate neighborhood;
(2r) In the irregular region formed by the diffusion, traverse the pairwise connections of boundary points and take the longest resulting line segment as the axis; compare the areas of the two parts that the axis separates: if the upper area is larger, take the upper side of the axis as the principal direction of the extreme point; if the lower area is larger, take the lower side.
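The diffusion of steps (2a)-(2g) is, at its core, a breadth-first region growing from K(x, y) under a gray-value tolerance. The sketch below illustrates only that core under stated simplifications: the empirical tolerance f(d₁, d₂) is replaced by a hypothetical constant delta, the two distance rings θ₁, θ₂ are collapsed into the 8-connected neighbors, and the Laplacian-determined comparison ⊗ is replaced by an absolute difference:

    import numpy as np
    from collections import deque

    def grow_neighborhood(img, seed, delta=10.0, max_size=400):
        # Simplified stand-in for steps (2a)-(2g): BFS from the extreme point,
        # admitting 8-neighbors whose gray value stays within `delta` of the
        # current boundary pixel (delta replaces the empirical f(d1, d2)).
        h, w = img.shape
        P = {seed}                          # expanded pixel set P
        frontier = deque([seed])
        while frontier and len(P) < max_size:
            ex, ey = frontier.popleft()     # boundary pixel E(x, y)
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    nx, ny = ex + dx, ey + dy
                    if (nx, ny) in P or not (0 <= nx < w and 0 <= ny < h):
                        continue
                    if abs(float(img[ny, nx]) - float(img[ey, ex])) <= delta:
                        P.add((nx, ny))
                        frontier.append((nx, ny))
        return P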
Preferably, step (3) of randomly selecting point pairs in the accurate neighborhood to form a binary string as the image feature quantity is carried out as follows:
(3a) Randomly choose several test-point pairs within the accurate neighborhood around the image extreme point, and combine the binary test values of these pairs into a binary string, which serves as the feature vector of the image extreme point;
The binary test value is defined as:

$$\tau(p; x, y) = \begin{cases} 1, & p(x) < p(y) \\ 0, & p(x) \ge p(y) \end{cases}$$

where p(x) is the gray value of the image neighborhood P at the point $x = (u, v)^T$;
(3b) When n test-point pairs (x, y) are selected, the binary criterion defines the generated feature vector as an n-dimensional binary string:

$$f_n(p) = \sum_{1 \le i \le n} 2^{i-1} \tau(p; x_i, y_i)$$

where n is an integer multiple of 128;
(3c) Constrain the test-point pairs: find all test-point pairs within an m × m window, and replace the gray value at each test point with the mean gray value of an l × l sub-window around it, where m is an integer greater than or equal to 31 and l is an integer greater than or equal to 5. The concrete steps are as follows:
1) In the accurate neighborhood of a given extreme point, compute the test value τ of each sub-window-based test-point pair;
2) Sort all test values τ from step 1) by their difference from 0.5, forming the vector T;
3) Search progressively:
3aa) Put the first test value τ into the result vector R and remove it from T;
3bb) Take the next test value τ from T and compare it against all test values already in R; if the correlation exceeds a certain threshold, discard it, otherwise add its coordinates into R for generating the image feature quantity;
3cc) Repeat the above steps until R holds n coordinates, forming the final image feature quantity; then increase the correlation threshold and re-examine the correlations of the discarded test values τ, ensuring the final result has low correlation.
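A minimal sketch of this progressive search, under assumptions we state explicitly: tau is a hypothetical 0/1 matrix of test values collected over many training patches (one row per patch, one column per candidate pair), correlation is measured with np.corrcoef, and the threshold relaxation of step 3cc) is modeled by the outer loop:

    import numpy as np

    def select_test_pairs(tau, n=256, corr_threshold=0.2, step=0.05):
        # Step 2): order candidates by how close their mean test value is to 0.5
        order = np.argsort(np.abs(tau.mean(axis=0) - 0.5))
        kept = [order[0]]                   # step 3aa)
        t = corr_threshold
        while len(kept) < n and t <= 1.0:
            for idx in order:               # step 3bb): greedy scan
                if idx in kept:
                    continue
                corr = np.corrcoef(tau[:, kept + [idx]].T)[-1, :-1]
                if np.all(np.abs(corr) < t):
                    kept.append(idx)
                    if len(kept) == n:
                        break
            t += step                       # step 3cc): relax threshold and rescan
        return kept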
Compared with the prior art, the present invention has the following advantages:
1) The extracted feature points are highly stable, remaining invariant to translation, rotation, scale, illumination, and the like;
2) Algorithm complexity is reduced, matching accuracy is higher, and real-time performance is good;
3) Because the neighborhood is obtained by diffusion from the extreme point, it is more accurate than that of existing algorithms;
4) The highly accurate neighborhood is further optimized, making the description of image extreme points more robust;
5) The generated feature descriptor abandons the traditional gradient-histogram description of the region in favor of random binary tests, greatly accelerating descriptor construction;
6) The descriptor takes the form of a binary code string, which is very efficient, so subsequent feature-point matching with the Hamming distance has an obvious advantage (see the sketch below).
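Advantage 6 can be made concrete: for descriptors packed into byte arrays, the Hamming distance is a bitwise XOR followed by a bit count, and a brute-force matcher is only a few lines. This is a generic sketch of the technique, not code from the patent:

    import numpy as np

    def hamming_distance(d1, d2):
        # Number of differing bits between two byte-packed (uint8) descriptors
        return int(np.unpackbits(np.bitwise_xor(d1, d2)).sum())

    def match(descs_a, descs_b):
        # For each descriptor in descs_a, index of its Hamming-nearest neighbor in descs_b
        return [min(range(len(descs_b)), key=lambda j: hamming_distance(d, descs_b[j]))
                for d in descs_a]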
Brief description of the drawings
Fig. 1 is the general flow chart of the image feature quantity generation method of the present invention;
Fig. 2 is the detailed flow chart of obtaining the accurate neighborhood in the method of the present invention;
Fig. 3 is a schematic diagram of the optimization of boundary C in the method of the present invention.
Detailed description of the invention
The present invention is described in further detail below in conjunction with test examples and specific embodiments. This should not be understood as limiting the scope of the above subject matter of the present invention to the following embodiments; all techniques realized based on the content of the present invention fall within the scope of the invention.
The present invention is described below with reference to the accompanying drawings:
Fig. 1 is the general flow chart of the image feature quantity generation method of the present invention; the method comprises the following steps:
(1) Obtain the extreme points K(x, y) of image I(x, y);
(2) Obtain the accurate neighborhood around each extreme point K(x, y), and find the principal direction of K(x, y);
(3) Randomly select point pairs in the accurate neighborhood to form a binary string, which serves as the image feature quantity.
With reference to Fig. 2, the concrete steps of the image feature quantity generation method of the present invention are as follows:
Step 1: extract the extreme points through the algorithm;
(a) The Hessian matrix of image I(x, y) at scale σ is

$$H(x, y, \sigma) = \begin{pmatrix} L_{xx}(x, y, \sigma) & L_{xy}(x, y, \sigma) \\ L_{xy}(x, y, \sigma) & L_{yy}(x, y, \sigma) \end{pmatrix}$$

Here $L_{xx}(x, y, \sigma)$ is the convolution of the image at point (x, y) with the Gaussian second-order partial derivative $\frac{\partial^2}{\partial x^2} g(\sigma)$; $L_{xy}(x, y, \sigma)$ and $L_{yy}(x, y, \sigma)$ are defined similarly, i.e. $L_{xy}(x, y, \sigma)$ is the convolution of the image at (x, y) with $\frac{\partial^2}{\partial x \partial y} g(\sigma)$, and $L_{yy}(x, y, \sigma)$ is the convolution with $\frac{\partial^2}{\partial y^2} g(\sigma)$;
(b) Keeping the image size constant, form an image pyramid by increasing the box-filter mask size; box filtering approximately replaces second-order Gaussian filtering, and it uses the integral image to accelerate the convolution (illustrated by the sketch after step (c));
(c) Denote the responses of the box-filter masks convolved with the image as $D_{xx}$, $D_{xy}$, $D_{yy}$; the approximation of the Hessian determinant is then $\det H = D_{xx}D_{yy} - (0.9 D_{xy})^2$. Compute this approximation; where the value is greater than zero and is a local maximum in the surrounding neighborhood, the point is an extreme point K(x, y) at the corresponding scale;
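Step (b) relies on the integral image to evaluate box filters in constant time. The sketch below shows the generic technique (not the patent's specific masks): after one pass to build the integral image, the sum over any axis-aligned box, and hence each Dxx, Dxy, Dyy response, costs four lookups:

    import numpy as np

    def integral_image(img):
        # S[y, x] = sum of img[0:y+1, 0:x+1]
        return np.cumsum(np.cumsum(img.astype(np.float64), axis=0), axis=1)

    def box_sum(S, x0, y0, x1, y1):
        # Sum of img[y0:y1, x0:x1] from four integral-image lookups
        def at(y, x):
            return S[y - 1, x - 1] if (y > 0 and x > 0) else 0.0
        return at(y1, x1) - at(y0, x1) - at(y1, x0) + at(y0, x0)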
Step 2: obtain the accurate neighborhood around the extreme point and the extreme point's principal direction;
(2a) Mark all image pixels I(x, y) as to-be-expanded, and initialize the expanded pixel set P = {K(x, y)};
(2b) Unconditionally add the n × n points adjacent to K(x, y) into P as the initial point set (n generally takes 3, 5, 7, or 9), to increase the robustness of the neighborhood;
(2c) Find the boundary set C' of P;
(2d) For each pixel E(x, y) in C', find the set P' = {K'(x, y)} of all pixels that lie within distance θ₁ of it (an empirically chosen threshold), do not belong to P, and are marked to-be-expanded, i.e. $\sqrt{(E_x - K'_x)^2 + (E_y - K'_y)^2} \le \theta_1$; and the set P'' = {K''(x, y)} of all pixels at distance greater than θ₁ but at most θ₂ that do not belong to P', i.e. $\theta_1 < \sqrt{(E_x - K''_x)^2 + (E_y - K''_y)^2} \le \theta_2$ (θ₂ generally takes 5);
(2e) For each point K'(x, y) in P' and each point K''(x, y) in P'', compute its allowed maximum tolerance δ through the function f(d₁, d₂), where:
f(d₁, d₂): an empirical function fitted from a large amount of data which, given d₁ and d₂, yields the allowed maximum tolerance δ;
d₁: the distance to E(x, y);
d₂: the distance from the point K'(x, y) or K''(x, y) to the extreme point K(x, y);
δ: the tolerance allowed during expansion.
(2f) If $[\mathrm{Value}(E(x, y))] \otimes [\mathrm{Value}(K'(x, y)) + \delta]$ or $[\mathrm{Value}(E(x, y))] \otimes [\mathrm{Value}(K''(x, y)) + \delta]$ holds, add K'(x, y) or K''(x, y) into P and mark the newly joined point as expanded; otherwise remove the point from P' or P''. Here:
$\otimes$: either ">" or "<", determined by the sign of the Laplacian;
Value(E(x, y)): the gray value at coordinate E(x, y) in the original image.
(2g) If P' = ∅ and P'' = ∅, the superset P has been found and expansion ends; otherwise return to (2c);
(2h) Find the boundary set C of P.
(2i) Starting from the extreme point coordinate K(x, y) obtained in Step 1, traverse outward in an arbitrary direction and denote the intersection with boundary C as O(x, y);
(2j) Add to queue Q all points T(x, y) in boundary C satisfying $\sqrt{(O_x - T_x)^2 + (O_y - T_y)^2} \le d_{max}$, and mark the corresponding points in C as retrieved, where $d_{max}$ is the maximum allowed length of a "shortcut", generally $d_{max} = 5$;
(2k) Traverse the points in Q; if a point is continuously adjacent to O(x, y) within distance $d_{max}$, ignore it; otherwise compute the length of the minor arc of C from O(x, y) to T(x, y), denoted l; the area of the region enclosed by that arc and the chord, denoted s; and the straight-line distance from O(x, y) to T(x, y), denoted d. Compute the allowed distance range d' through the empirical function f(l, s), where:
f(l, s): an empirical function fitted from data which, given l and s, yields the allowed distance range d';
(2l) If d < d', insert T(x, y) into a new queue Q';
(2m) From Q', choose the point T(x, y) with the minimum ratio of its distance d to O(x, y) over its allowed distance d'; connect O(x, y) directly to that point and remove the minor arc between them from boundary C;
(2n) Replace O(x, y) with T(x, y) and return to step (2j); finish when all points of boundary C are marked as retrieved.
(2o) Find the top, bottom, left, and right end points of boundary C: T(x, y), B(x, y), L(x, y), R(x, y);
(2p) Denote the horizontal line through the upper end point T(x, y) as l₁, and the first and last points where l₁ crosses boundary C as T₁(x, y), T₂(x, y). If |T₂ - T₁| ≥ 2, record l₁ as L₁; otherwise take the horizontal line through T(x, y - 1) as the new l₁ and repeat the test until L₁ is found. Likewise, denote the horizontal line through the lower end point B(x, y) as l₂ and its first and last crossings of C as T₃(x, y), T₄(x, y); if |T₄ - T₃| ≥ 2, record l₂ as L₂. In the same manner find the lines L₃, L₄ through the left and right end points;
(2q) The rectangle R determined by the four lines L₁, L₂, L₃, L₄ is the accurate neighborhood;
(2r) In the irregular region formed by the diffusion, traverse the pairwise connections of boundary points and take the longest resulting line segment as the axis; compare the areas of the two parts that the axis separates: if the upper area is larger, take the upper side of the axis as the principal direction of the extreme point; if the lower area is larger, take the lower side.
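A sketch of the principal-direction rule of step (2r), with hypothetical inputs: region is the set of (x, y) pixels produced by the diffusion and boundary its boundary points. The longest pairwise boundary connection serves as the axis, and the side holding the larger pixel count (area) gives the principal direction:

    from itertools import combinations

    def principal_direction(region, boundary):
        # Axis = the longest segment between any two boundary points
        a, b = max(combinations(boundary, 2),
                   key=lambda pq: (pq[0][0] - pq[1][0]) ** 2 + (pq[0][1] - pq[1][1]) ** 2)
        vx, vy = b[0] - a[0], b[1] - a[1]
        # Cross-product sign tells on which side of the axis each pixel lies;
        # pixels exactly on the axis are counted with the lower side here.
        upper = sum(1 for (x, y) in region
                    if vx * (y - a[1]) - vy * (x - a[0]) > 0)
        lower = len(region) - upper
        return ('upper' if upper >= lower else 'lower'), (a, b)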
Step 3: generate the feature descriptor.
(3a) Within the accurate neighborhood P obtained by the above process, randomly choose several point pairs near the extreme point, and combine the binary test values of these pairs into a binary string, which serves as the feature vector descriptor of the extreme point.
The binary test is defined as:

$$\tau(p; x, y) = \begin{cases} 1, & p(x) < p(y) \\ 0, & p(x) \ge p(y) \end{cases}$$

where p(x) is the gray value of the image neighborhood P at the point $x = (u, v)^T$;
(3b) When n test-point pairs (x, y) are selected, a unique binary criterion is defined, and the feature vector descriptor thus generated is an n-dimensional binary string:

$$f_n(p) = \sum_{1 \le i \le n} 2^{i-1} \tau(p; x_i, y_i)$$

where n (= 128, 256, or 512) is chosen as a trade-off among computation cost, storage, and discriminability (a byte-packing sketch of this descriptor follows sub-step 3cc below).
(3c) The correlation between the points of the descriptor vector generated above is large, which increases the difficulty of subsequent matching, so the test-point pairs need to be constrained. The main idea is to use a greedy search to reduce the correlation between pairs; to reduce noise sensitivity, find all point pairs within an m × m window (here 31 × 31 may be taken) and replace the gray value at each point with the mean gray value of an l × l sub-window (here 5 × 5 may be taken). The concrete steps are as follows:
1) In the accurate neighborhood of a given extreme point, compute the test value τ of each sub-window-based test-point pair;
2) Sort all test values τ from step 1) by their difference (distance) from 0.5, forming the vector T;
3) Search progressively:
3aa) Put the first test value τ into the result vector R and remove it from T;
3bb) Take the next test value τ from T and compare it against all test values already in R; if the correlation exceeds a certain threshold, discard it, otherwise add its coordinates into R for generating the image feature quantity;
3cc) Repeat the above steps until R holds n coordinates, forming the final image feature quantity; then increase the correlation threshold and re-examine the correlations of the discarded test values τ, ensuring the final result has low correlation.
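Putting (3a)-(3b) together: once the pairs are selected and the neighborhood smoothed, the descriptor is simply the packed bit string of the tests τ. A minimal sketch, with the patch indexing and the pairs argument as our assumptions rather than the patent's exact layout:

    import numpy as np

    def binary_descriptor(patch, pairs):
        # tau(p; x, y) = 1 if p(x) < p(y), else 0 -- one bit per selected pair
        bits = np.array([1 if patch[y1, x1] < patch[y2, x2] else 0
                         for (x1, y1), (x2, y2) in pairs], dtype=np.uint8)
        # Pack n bits (n a multiple of 128) into n/8 bytes for Hamming matching
        return np.packbits(bits)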
In Fig. 3, all points at distance less than $d_{max}$ from the center point O are found and inserted into queue Q. Point T'' is continuously adjacent to O within distance $d_{max}$, so it is ignored. T and T' are not continuously adjacent to O within $d_{max}$; the ratio of d to the d' computed from minor arc OT and area S is smaller than the corresponding ratio computed from minor arc OT' and area S'. OT is therefore finally connected, and the minor arc OT is removed from boundary C.
The effect of the present invention is further illustrated by experiments with data:
Eight groups of data were randomly sampled from a training set of 10,000 sample pictures from the CMU image library. Table 1 reflects the accuracy (%) of extreme-point matching for each algorithm when the image scale changes:
Group    Present invention    ORB     SIFT
1        97.3                 27.3    93.9
2        95.6                 19.3    93.1
3        98.3                 30.0    95.7
4        92.9                 14.9    89.2
5        95.1                 19.1    93.7
6        97.0                 25.5    93.9
7        94.2                 17.3    92.5
8        97.5                 27.7    94.6
Mean     96.0                 22.6    93.3
Table 1: extreme-point matching accuracy (%) under scale change
The extreme-point matching rate of the proposed algorithm is far higher than that of the traditional ORB algorithm: its matching accuracy is about 96.0%, roughly 73.4 percentage points higher than ORB's, demonstrating the superiority of the invented algorithm in scale invariance; at the same time, its matching accuracy is also slightly higher than SIFT's.
The practicality of the algorithm can be assessed by comparing matching times. Eight groups of test images were chosen at random, as in Table 2; four of the groups come from the scale-invariance experiment above, while the remaining four involve no scale change, only rotation, translation, illumination, and similar variations:
Group    Present invention    ORB     SIFT
1        49.5                 29.3    3157.1
2        45.3                 27.6    2986.8
3        12.2                 6.7     765.3
4        62.6                 32.7    3522.7
5        50.2                 30.8    3271.3
6        51.2                 32.2    3359.3
7        49.9                 30.4    3215.5
8        53.4                 31.0    3367.9
Mean     46.8                 27.7    2955.8
Table 2: matching time (ms)
The average matching time of the present invention is about 46.8 ms, roughly 19.1 ms more than ORB's, which can be considered broadly comparable; the average matching time of SIFT is about 2955.8 ms, about 63.2 times that of the present invention, i.e. the proposed algorithm matches roughly 63.2 times faster than SIFT, demonstrating its speed advantage.
It can thus be seen that the present invention reduces the overall time complexity of image feature-vector generation, simplifies the algorithm, makes the description of the extreme-point neighborhood more accurate and more robust, and makes the description of extreme points simpler and more stable, which greatly benefits image matching.

Claims (4)

1. a generation method for image feature amount, comprises the steps:
(1) obtain the extreme point K (x, y) of image I (x, y);
(2) obtain the accurate neighborhood of image Near The Extreme Point according to extreme point K (x, y), and find out the main side of extreme point K (x, y)To;
(3) in described accurate neighborhood random selected point to composition binary string, as image feature amount.
2. The generation method of an image feature quantity according to claim 1, wherein the step (1) of obtaining the extreme points K(x, y) of image I(x, y) is preferably carried out as follows:
(1a) The Hessian matrix of image I(x, y) at scale σ is

$$H(x, y, \sigma) = \begin{pmatrix} L_{xx}(x, y, \sigma) & L_{xy}(x, y, \sigma) \\ L_{xy}(x, y, \sigma) & L_{yy}(x, y, \sigma) \end{pmatrix}$$

where $L_{xx}(x, y, \sigma)$ is the convolution of the image at point (x, y) with the Gaussian second-order partial derivative $\frac{\partial^2}{\partial x^2} g(\sigma)$, $L_{xy}(x, y, \sigma)$ is the convolution with $\frac{\partial^2}{\partial x \partial y} g(\sigma)$, and $L_{yy}(x, y, \sigma)$ is the convolution with $\frac{\partial^2}{\partial y^2} g(\sigma)$;
(1b) Keeping the image size constant, form an image pyramid by increasing the box-filter mask size;
(1c) Denote the responses of the box-filter masks convolved with the image as $D_{xx}$, $D_{xy}$, $D_{yy}$; the approximation of the Hessian determinant is then $\det H = D_{xx}D_{yy} - (0.9 D_{xy})^2$. Compute this approximation; where the value is greater than zero and is a maximum within its neighborhood, the point is an extreme point K(x, y) of image I(x, y).
3. The generation method of an image feature quantity according to claim 1, wherein the step (2) of obtaining the accurate neighborhood around the extreme point K(x, y) and finding the principal direction of K(x, y) is carried out as follows:
(2a) Mark all pixels of image I(x, y) as to-be-expanded, and initialize the expanded pixel set P = {K(x, y)};
(2b) Unconditionally add the n × n points adjacent to K(x, y) into P as the initial point set, where n is an odd number greater than 1;
(2c) Find the boundary set C' of P;
(2d) For each pixel E(x, y) in C', find the set P' = {K'(x, y)} of all pixels that lie within distance θ₁ of it, do not belong to P, and are marked to-be-expanded, i.e. $\sqrt{(E_x - K'_x)^2 + (E_y - K'_y)^2} \le \theta_1$; and the set P'' = {K''(x, y)} of all pixels at distance greater than θ₁ but at most θ₂ that do not belong to P', i.e. $\theta_1 < \sqrt{(E_x - K''_x)^2 + (E_y - K''_y)^2} \le \theta_2$, where $\theta_2 = 5$;
(2e) For each point K'(x, y) in P' and each point K''(x, y) in P'', compute its allowed maximum tolerance δ through the function f(d₁, d₂), where:
f(d₁, d₂): an empirical function fitted from a large amount of data which, given d₁ and d₂, yields the allowed maximum tolerance δ;
d₁: the distance to E(x, y);
d₂: the distance from the point K'(x, y) or K''(x, y) to the extreme point K(x, y);
δ: the tolerance allowed during expansion;
(2f) If $[\mathrm{Value}(E(x, y))] \otimes [\mathrm{Value}(K'(x, y)) + \delta]$ or $[\mathrm{Value}(E(x, y))] \otimes [\mathrm{Value}(K''(x, y)) + \delta]$ holds, add K'(x, y) or K''(x, y) into P and mark the newly joined point as expanded; otherwise remove the point from P' or P'', where:
$\otimes$: either ">" or "<", determined by the sign of the Laplacian;
Value(E(x, y)): the gray value at coordinate E(x, y) in the original image;
" if=φ, finds superset P to (2g) P '=φ and P, and expansion finishes; Otherwise carry out (2c);
(2h) find out the boundary set C of P;
(2i) Starting from the extreme point coordinate K(x, y), traverse outward in an arbitrary direction and denote the intersection with boundary C as O(x, y);
(2j) Add to queue Q all points T(x, y) in boundary C satisfying $\sqrt{(O_x - T_x)^2 + (O_y - T_y)^2} \le d_{max}$, and mark the corresponding points in C as retrieved, where $d_{max}$ is the maximum allowed length of a "shortcut", $d_{max} = 5$;
(2k) Traverse the points in Q; if a point is continuously adjacent to O(x, y) within distance $d_{max}$, ignore it; otherwise compute the length of the minor arc of C from O(x, y) to T(x, y), denoted l; the area of the region enclosed by that arc and the chord, denoted s; and the straight-line distance from O(x, y) to T(x, y), denoted d; then compute the allowed distance range d' through the empirical function f(l, s),
where:
f(l, s): an empirical function fitted from data which, given l and s, yields the allowed distance range d';
(2l) If d < d', insert T(x, y) into a new queue Q';
(2m) From Q', choose the point T(x, y) with the minimum ratio of its distance d to O(x, y) over its allowed distance d'; connect O(x, y) directly to that point and remove the minor arc between them from boundary C;
(2n) Replace O(x, y) with T(x, y) and return to step (2j), until all points of boundary C are marked as retrieved;
(2o) Find the top, bottom, left, and right end points of boundary C: T(x, y), B(x, y), L(x, y), R(x, y);
(2p) Denote the horizontal line through the upper end point T(x, y) as l₁, and the first and last points where l₁ crosses boundary C as T₁(x, y), T₂(x, y). If |T₂ - T₁| ≥ 2, record l₁ as L₁; otherwise take the horizontal line through T(x, y - 1) as the new l₁ and repeat the test until L₁ is found. Likewise, denote the horizontal line through the lower end point B(x, y) as l₂ and its first and last crossings of C as T₃(x, y), T₄(x, y); if |T₄ - T₃| ≥ 2, record l₂ as L₂. In the same manner find the lines L₃, L₄ through the left and right end points;
(2q) The rectangle R determined by the four lines L₁, L₂, L₃, L₄ is the accurate neighborhood;
(2r) In the irregular region formed by the diffusion, traverse the pairwise connections of boundary points and take the longest resulting line segment as the axis; compare the areas of the two parts that the axis separates: if the upper area is larger, take the upper side of the axis as the principal direction of the extreme point; if the lower area is larger, take the lower side.
4. The generation method of an image feature quantity according to claim 1, wherein the step (3) of randomly selecting point pairs in the accurate neighborhood to form a binary string as the image feature quantity is carried out as follows:
(3a) Randomly choose several test-point pairs within the accurate neighborhood around the image extreme point, and combine the binary test values of these pairs into a binary string, which serves as the feature vector of the image extreme point;
The binary test value is defined as:

$$\tau(p; x, y) = \begin{cases} 1, & p(x) < p(y) \\ 0, & p(x) \ge p(y) \end{cases}$$

where p(x) is the gray value of the image neighborhood P at the point $x = (u, v)^T$;
(3b) When n test-point pairs (x, y) are selected, the binary criterion defines the generated feature vector as an n-dimensional binary string:

$$f_n(p) = \sum_{1 \le i \le n} 2^{i-1} \tau(p; x_i, y_i)$$

where n is an integer multiple of 128;
(3c) Constrain the test-point pairs: find all test-point pairs within an m × m window, and replace the gray value at each test point with the mean gray value of an l × l sub-window around it, where m is an integer greater than or equal to 31 and l is an integer greater than or equal to 5; the concrete steps are as follows:
1) In the accurate neighborhood of a given extreme point, compute the test value τ of each sub-window-based test-point pair;
2) Sort all test values τ from step 1) by their difference from 0.5, forming the vector T;
3) Search progressively:
3aa) Put the first test value τ into the result vector R and remove it from T;
3bb) Take the next test value τ from T and compare it against all test values already in R; if the correlation exceeds a certain threshold, discard it, otherwise add its coordinates into R for generating the image feature quantity;
3cc) Repeat the above steps until R holds n coordinates, forming the final image feature quantity; then increase the correlation threshold and re-examine the correlations of the discarded test values τ, ensuring the final result has low correlation.
CN201510973596.8A 2015-12-22 2015-12-22 Image characteristic quantity generation method Withdrawn CN105590114A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510973596.8A CN105590114A (en) 2015-12-22 2015-12-22 Image characteristic quantity generation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510973596.8A CN105590114A (en) 2015-12-22 2015-12-22 Image characteristic quantity generation method

Publications (1)

Publication Number Publication Date
CN105590114A true CN105590114A (en) 2016-05-18

Family

ID=55929684

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510973596.8A Withdrawn CN105590114A (en) 2015-12-22 2015-12-22 Image characteristic quantity generation method

Country Status (1)

Country Link
CN (1) CN105590114A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1959707A (en) * 2006-12-07 2007-05-09 北京航空航天大学 Image matching method based on pixel jump
CN101650784A (en) * 2009-09-23 2010-02-17 南京大学 Method for matching images by utilizing structural context characteristics
CN103456005A (en) * 2013-08-01 2013-12-18 华中科技大学 Method for matching generalized Hough transform image based on local invariant geometrical characteristics
CN104240231A (en) * 2014-07-08 2014-12-24 大连民族学院 Multi-source image registration based on local structure binary pattern
CN104766084A (en) * 2015-04-10 2015-07-08 南京大学 Nearly copied image detection method based on multi-target matching

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
梅振顺 (Mei Zhenshun) et al.: "Target tracking based on SURF features" (基于SURF特征的目标跟踪), 《中国体视学与图像分析》 (Chinese Journal of Stereology and Image Analysis) *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109238160A (en) * 2018-09-06 2019-01-18 中国科学院力学研究所 The method and device of non-contact measurement large deformation material strain
CN109238160B (en) * 2018-09-06 2019-12-27 中国科学院力学研究所 Method and device for non-contact measurement of strain of large-deformation material
CN116304179A (en) * 2023-05-19 2023-06-23 北京大学 Data processing system for acquiring target video
CN116304179B (en) * 2023-05-19 2023-08-11 北京大学 Data processing system for acquiring target video
CN116824183A (en) * 2023-07-10 2023-09-29 北京大学 Image feature matching method and device based on multiple feature descriptors
CN116824183B (en) * 2023-07-10 2024-03-12 北京大学 Image feature matching method and device based on multiple feature descriptors

Similar Documents

Publication Publication Date Title
Song et al. Richly activated graph convolutional network for action recognition with incomplete skeletons
Newell et al. Associative embedding: End-to-end learning for joint detection and grouping
Li et al. Dbcface: Towards pure convolutional neural network face detection
CN104766084B (en) A kind of nearly copy image detection method of multiple target matching
Abbas et al. Region-based object detection and classification using faster R-CNN
Wan et al. End-to-end integration of a convolution network, deformable parts model and non-maximum suppression
CN106709568A (en) RGB-D image object detection and semantic segmentation method based on deep convolution network
CN110738207A (en) character detection method for fusing character area edge information in character image
CN103810473B (en) A kind of target identification method of human object based on HMM
CN111612807A (en) Small target image segmentation method based on scale and edge information
CN102222341B (en) Motion characteristic point detection method and device, moving target detecting method and device
Zhang et al. Cof-net: A progressive coarse-to-fine framework for object detection in remote-sensing imagery
CN102074015A (en) Two-dimensional image sequence based three-dimensional reconstruction method of target
CN110263731B (en) Single step human face detection system
Li et al. A complex junction recognition method based on GoogLeNet model
CN111833273A (en) Semantic boundary enhancement method based on long-distance dependence
CN110008900A (en) A kind of visible remote sensing image candidate target extracting method by region to target
CN108280844A (en) A kind of video object localization method based on the tracking of region candidate frame
CN111881716A (en) Pedestrian re-identification method based on multi-view-angle generation countermeasure network
CN107944437A (en) A kind of Face detection method based on neutral net and integral image
Fond et al. Facade proposals for urban augmented reality
CN105590114A (en) Image characteristic quantity generation method
Terrail et al. Faster RER-CNN: Application to the detection of vehicles in aerial images
CN113076891A (en) Human body posture prediction method and system based on improved high-resolution network
CN112287906A (en) Template matching tracking method and system based on depth feature fusion

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20160518

WW01 Invention patent application withdrawn after publication