CN104915949B - An image matching method combining point features and line features - Google Patents

An image matching method combining point features and line features Download PDF

Info

Publication number
CN104915949B
CN104915949B CN201510162935.4A
Authority
CN
China
Prior art keywords
point
corner point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201510162935.4A
Other languages
Chinese (zh)
Other versions
CN104915949A (en)
Inventor
王岳环
王康裕
吴明强
张天序
范蓉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology filed Critical Huazhong University of Science and Technology
Priority to CN201510162935.4A priority Critical patent/CN104915949B/en
Publication of CN104915949A publication Critical patent/CN104915949A/en
Application granted granted Critical
Publication of CN104915949B publication Critical patent/CN104915949B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses an image matching method combining point-feature and line-feature descriptors, comprising: (1) extracting corner points from the template image and the real-time image at multiple scales; (2) obtaining the edge sets around the corner points of the real-time image and the template image; (3) computing the ORB-like point-feature descriptors of the finally selected real-time image and template-image corner points obtained in step (1); (4) describing the matching similarity between the edge sets of the real-time image and the template image obtained in step (2) using the Least Trimmed Square-Hausdorff Distance; (5) computing the matching similarity of the ORB-like point-feature descriptors of the real-time image and template-image corner points obtained in step (3); (6) integrating the matching results. The method of the invention first performs preliminary selection on the corner points using their stable point features to obtain a candidate point set, ensuring that the correct position is included, and then screens the candidate point set using the more global line features, which reduces repeated patterns and improves accuracy.

Description

An image matching method combining point features and line features
Technical field
The invention belongs to the technical fields of image matching, computer vision, and digital image processing, and in particular relates to an image matching method combining point features and line features.
Background art
With the rapid development of science and technology, especially computer technology, scene matching and recognition technology has become a key and fundamental technology in the field of information processing, and has wide applications particularly in fields such as aerospace, unmanned aerial vehicle navigation, and missile terminal guidance.
Scene matching recognition is the process of matching the real-time image captured by an aircraft against a template image prepared in advance. The position of the aircraft is determined through image recognition, so that course deviations can be corrected. The template images currently in use are mainly derived from satellite or aerial photographs, while the real-time image is the downward-looking image captured by the airborne camera (or missile-borne camera).
The real-time image is generally an infrared image, while the template image may be a visible-light or infrared image. Because the template image and the real-time image differ in shooting time and imager, the gray-scale texture of the template image generally differs considerably from that of the real-time image. In real scenes, objects in the real-time image are also affected by illumination, shadow, and occlusion by clouds and fog, so their gray-scale texture features can vary widely. In infrared images in particular, the gray-scale features of the same object can change at different times of the day.
Scene matching recognition usually generates, from the template image and prior knowledge of local materials, a binary (or multi-valued) template reflecting the more significant and stable structural features of the template image by means of clustering. Binary (multi-valued) templates are suited to matching in complex scenes and can, in a statistical sense, partially adapt to the influence of gray-scale changes, but they suffer from many repeated patterns and insufficient descriptive power.
According to the data structures involved in the computation, image matching and recognition algorithms fall into two research directions: feature-based matching and recognition methods and region-based matching and recognition methods. Well-known point-feature matching methods include the scale-invariant feature transform (SIFT), Speeded-Up Robust Features (SURF), and ORB (Oriented FAST and Rotated BRIEF). SIFT-like algorithms are suited to describing texture-rich images; in a simple binary template the texture information is sparse and the lines are simple, so relatively few corner points can be extracted, and because SIFT-like algorithms rely on local gradients when finding the principal orientation, inaccurate orientation localization weakens the descriptive power of the descriptor.
Commonly used edge-feature extraction methods include the Roberts edge detection operator, the Laplacian of Gaussian (LoG) operator, the Canny operator, and the Hough transform. Edge matching algorithms have developed relatively slowly; compared with corner matching, current edge matching algorithms are easily affected by noise and generally cannot balance matching precision against noise suppression, and edge extraction in complex scenes may be unstable, affecting the matching result.
Region-based matching and recognition, also known as template matching, is a common image matching approach. The most common region-based matching methods include the gray-scale cross-correlation algorithm and the mutual-information matching algorithm, which mainly use the gray-scale information of a region, or some transform of it, for matching and recognition. These methods are mature, stable, and reliable, and are widely used in fields such as pattern recognition and aircraft navigation. However, region-based matching and recognition requires strong prior knowledge in the selection of the region, and the computation is time-consuming when the template is large.
Summary of the invention
The present invention proposes an image matching method combining point-feature and line-feature descriptors, the purpose of which is to screen the multiple candidate matching points produced by the point features using the line features of the corner-point neighborhoods, thereby obtaining better matching and localization capability in complex scenes.
To achieve the purpose of the present invention, the invention provides an image matching method combining point features and line-feature descriptors, comprising the following steps:
(1) Extract corner points from the template image and the real-time image at multiple scales
Scale pyramid images are built for the template image and the real-time image respectively. When building the scale pyramid, the image size is kept constant, and pyramid layers of different scales are obtained by changing the Gaussian blur size and the size of the window used to compute the Haar responses. In each pyramid layer, the local curvature ρ of each pixel is computed using the Hessian matrix, and the local curvature ρ of each pixel is compared with that of the α points in its scale neighborhood (usually the 26 points of the 3-scale neighborhood, i.e. α is usually taken as 26). If the local curvature ρ of a pixel is the maximum or minimum among these α points, it is retained as an initially selected corner point.
The initially selected corner points are screened according to their local curvature ρ using dual thresholds (t1 and t2, with t1 < t2) to obtain the finally selected corner points. In the screening process, corner points whose local curvature ρ is less than the threshold t1 are directly discarded, and corner points whose ρ is greater than the threshold t2 are directly retained; a corner point whose ρ is greater than t1 but less than t2 is retained if a sufficiently long edge can be extracted in its neighborhood (indicating that the corner-point neighborhood describes enough structural features), and is otherwise discarded.
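The two-stage selection above (non-maximum constraint on the local curvature, then dual-threshold screening) can be sketched as follows. This is an illustrative simplification and not the patent's implementation: it compares each pixel only against its in-layer neighbors rather than the full 26-point scale neighborhood, and the hypothetical callback `has_long_edge` stands in for the edge test of the t1..t2 branch.

```python
import numpy as np

def select_corners(curvature, t1=30.0, t2=400.0, has_long_edge=None):
    """Two-stage corner selection: non-maximum constraint on a per-pixel
    local-curvature map, then dual-threshold screening (t1 < t2)."""
    h, w = curvature.shape
    rho = np.abs(curvature)
    corners = []
    for y in range(h):
        for x in range(w):
            win = curvature[max(0, y - 1):y + 2, max(0, x - 1):x + 2]
            # non-maximum constraint: keep only local extrema of curvature
            if curvature[y, x] not in (win.max(), win.min()):
                continue
            if rho[y, x] < t1:                      # below t1: discard
                continue
            if rho[y, x] > t2:                      # above t2: keep directly
                corners.append((y, x))
            elif has_long_edge and has_long_edge(y, x):
                corners.append((y, x))              # t1..t2: needs edge support
    return corners
```

With `has_long_edge` left as `None`, only corners above t2 survive, mirroring the "directly retained" branch.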
(2) Obtain the edge sets around the corner points of the real-time image and the template image
The finally selected corner points obtained in step (1) are traversed, and edges are extracted and grown starting from the β neighborhood of each corner point in the real-time image and the template image (the β neighborhood is usually taken as the radius-10s neighborhood, where s is the pixel unit of the scale at which the corner point lies), giving the edge set around the corner point. When obtaining the edge sets, pixels belonging to an edge are labeled n (n indicating that the pixel belongs to the n-th edge curve found), while processed pixels considered not to belong to an edge are labeled NotEdge. If a corner-point neighborhood contains a pixel already labeled n, the n-th edge curve is directly added to that corner point's edge set without growing again from that pixel. Processed pixels are not processed again.
The extracted edge-point coordinates are stored curve by curve in linked lists, which are indexed by an array M (indexed by the n described above). The edge set around a corner point is represented by a linked list whose values are the indices, in the array M, of the edge curves it contains.
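The storage layout just described (per-curve lists indexed by an array M, pixels labeled with their curve index n) might be sketched as follows. The class and method names are illustrative, and plain Python lists stand in for the linked lists; the NotEdge label for processed non-edge pixels is omitted for brevity.

```python
class EdgeStore:
    """Curves live in the array M (`self.curves`); each pixel on a curve
    is labelled with the curve index n it belongs to."""

    def __init__(self):
        self.curves = []          # array M: curves[n] = list of (y, x) points
        self.label = {}           # pixel -> curve index n

    def add_curve(self, pixels):
        n = len(self.curves)      # n-th edge curve found so far
        self.curves.append(list(pixels))
        for p in pixels:
            self.label[p] = n
        return n

    def edge_set_around(self, corner, radius):
        """Indices of curves with a pixel inside the corner's neighbourhood;
        the whole curve joins the corner's edge set (no regrowing)."""
        cy, cx = corner
        found = {n for (y, x), n in self.label.items()
                 if abs(y - cy) <= radius and abs(x - cx) <= radius}
        return sorted(found)
```

A corner's edge set is then just the sorted list of curve indices, matching the "linked list of indices into M" representation above.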
(3) Compute the point-feature descriptors of the corner points
The ORB-like point-feature descriptors of the finally selected corner points of the real-time image and the template image obtained in step (1) are computed using formula (1) or (2). In the process of computing the ORB-like point-feature descriptor, a certain number of patch pairs (i.e. sets of two patches) are first selected in the corner-point neighborhood, the gray-value sum of the pixels in each patch is computed, the gray-value sums of the two patches of each pair are compared, and the comparison result is encoded. The codes of all patch-pair comparison results constitute the ORB-like point-feature descriptor of the corner point (512 patch pairs can be selected in the manner recommended by the standard ORB algorithm; all patches are taken at the scale of the corner point). The encoding of each patch pair is given by the following formula:
τi = 01 if p(x) − p(y) > t, τi = 10 if p(y) − p(x) > t, and τi = 00 otherwise (1)
where t is a threshold chosen according to the contrast requirement on the real-time image, and p(x) and p(y) denote the gray-value sums of the pixels in the selected patches. If only the presence of contrast matters (as in infrared images), the codes 01 and 10 can be merged into 1, and the formula becomes:
τi = 1 if |p(x) − p(y)| > t, and τi = 0 otherwise (2)
Each patch pair i yields one code τi, and the codes of the K patch-pair comparison results constitute the point-feature descriptor of the corner point, [τ0, τ1, …, τK−1].
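A minimal sketch of the descriptor under formula (2): the gray-value sums over a patch pair are compared against a threshold t and encoded as one bit. The patch-pair layout here is hypothetical (each pair is given as two offsets plus a patch size), not the standard ORB sampling pattern of 512 pairs at the corner's scale.

```python
import numpy as np

def orb_like_descriptor(img, corner, pairs, t=5.0):
    """Binary patch-pair descriptor per formula (2): tau_i = 1 when the
    gray-value sums of the two patches of pair i differ by more than t.
    `pairs` is a list of ((dy0, dx0), (dy1, dx1), h) giving two patch
    offsets relative to the corner and the patch side length h."""
    cy, cx = corner

    def patch_sum(oy, ox, h):
        # gray-value sum p(.) over an h x h patch
        return float(img[cy + oy:cy + oy + h, cx + ox:cx + ox + h].sum())

    desc = []
    for (o0, o1, h) in pairs:
        px, py = patch_sum(*o0, h), patch_sum(*o1, h)
        desc.append(1 if abs(px - py) > t else 0)   # formula (2)
    return desc
```

Formula (1)'s three-way code (01/10/00) would be obtained by keeping the sign of p(x) − p(y) instead of its absolute value.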
(4) Compute the edge-set matching similarity
The matching similarity between the edge sets of the real-time image and the template image obtained in step (2) is described using the Least Trimmed Square-Hausdorff Distance (LTS-HD). LTS-HD sorts the minimum coordinate distances from each point to the other point set (measured by Euclidean distance) from small to large, and takes the average of the first h of them as the one-way Hausdorff distance.
Let point set A be the set of pixels in the edge set around a corner point of the real-time image, and point set B the set of pixels in the edge set around a corner point of the template image, with pixel counts Na and Nb respectively. Compute the minimum coordinate distance from every point in A to the points in B and sort these distances from small to large to obtain the ordered set X, where dB(ai)(i) denotes that the minimum distance from point ai in A to the points in B ranks i-th among the minimum distances of all points in A.
Similarly, compute the minimum coordinate distance from every point in B to the points in A and sort from small to large to obtain the ordered set Y, where dA(bi)(i) denotes that the minimum distance from point bi in B to the points in A ranks i-th among the minimum distances of all points in B.
Let K = f1 × Na and L = f2 × Nb, where f1 and f2 are proportionality coefficients on the pixel counts Na and Nb satisfying 0 < f1 < 1 and 0 < f2 < 1. The one-way distances hLTS(A, B) and hLTS(B, A) between A and B are defined as:
hLTS(A, B) = (1/K) Σi=1..K dB(ai)(i) (3)
hLTS(B, A) = (1/L) Σi=1..L dA(bi)(i) (4)
The LTS-HD distance is then defined as:
H(A, B) = max[hLTS(A, B), hLTS(B, A)] (5)
It can be seen that the LTS-HD distance formula controls the number of points participating in the computation through the two coefficients f1 and f2, filtering out points with especially large minimum-distance values; this not only eliminates the influence of gross mismatched points but also provides stronger suppression of Gaussian noise.
Specifically, the coordinates of the points in each edge set are converted to coordinates with the corner point as origin. When computing the matched point pairs from point set A to point set B, the linked lists describing the edge points in A are traversed to obtain edge-point coordinates; then, around the corresponding coordinate position p in the image containing B, edge points are searched from near to far within the r × r neighborhood. If an edge point exists in the r × r neighborhood, the distance between that edge point's coordinates and position p is taken as the minimum distance; if no edge point is found in the neighborhood, the minimum distance is set to 2r to lower the similarity evaluation. The same computation is used for the minimum distances of the matched point pairs from B to A. After the point-pair minimum distances are obtained, the LTS-HD distance H between the two point sets is computed according to formulas (3), (4) and (5), and the matching similarity LM of the edge sets (i.e. the line features) is then defined as:
LM = 1 − H(A, B)/(2r) (6)
(5) Compute the point-feature matching similarity
The matching similarity of the ORB-like point-feature descriptors of the real-time image and template-image corner points obtained in step (3) is computed by formula (7). Let the real-time image point-feature descriptor be [a0, a1, …, aK−1] and the template-image point-feature descriptor be [b0, b1, …, bK−1]; if ai = bi (i = 0, 1, …, K−1), the i-th dimension of the point-feature descriptors is considered to match. Count the total number m of matching dimensions. The matching similarity PM of the point features is described by the ratio of the number of matching dimensions m to the total descriptor dimension K:
PM = m/K (7)
Specifically, formula (8) stores the point-feature descriptor [a0, a1, …, aK−1] in a variable W; in practice, because machine variable types have limited bit widths, W can be represented by a simple bitwise combination of several variables. A bitwise XOR is applied to the two variables W storing the point-feature descriptors; the bits that are 0 in the result then represent the dimensions whose matching results are consistent. If W is represented by a combination of several byte variables (char type), the number of 0 bits in the result can be obtained directly by table lookup.
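The bitwise evaluation of formula (7) might be sketched as follows, with the descriptors packed bytewise and a 256-entry zero-bit lookup table as described above (function and table names are illustrative):

```python
# Zero-bit lookup table for one byte: ZEROS[b] = number of 0 bits in b.
# After XOR, a 0 bit means the corresponding descriptor dimension agrees.
ZEROS = [8 - bin(b).count("1") for b in range(256)]

def point_similarity(wa, wb):
    """PM = m / K per formula (7), for descriptors stored as byte lists."""
    K = 8 * len(wa)                               # total dimensions
    m = sum(ZEROS[a ^ b] for a, b in zip(wa, wb))  # matching dimensions
    return m / K
```

On the worked example later in the text (first bytes 0x80 vs 0x40, remainder zero), the XOR of the first bytes is 0xC0 with six 0 bits, and the other seven byte pairs each contribute eight, giving m = 62 over K = 64 dimensions.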
(6) Integrate the matching results
The point-feature matching similarities and line-feature matching similarities obtained in steps (4) and (5) are integrated: the point-feature matching similarity is required to exceed the threshold tp and the line-feature matching similarity to exceed the threshold tl; the N corner points with the largest point-feature matching similarities form the candidate matching point set; these candidate matching points are then evaluated using the line features, and the result with the highest line-feature matching similarity is taken as the final matching corner point.
For corner-point pairs whose point-feature matching similarity exceeds the threshold tp, the corresponding positions in the real-time image are checked according to the edge sets contained around the template-image corner points; if an edge has not been extracted at such a position but an edge can be extracted in the neighborhood of the corresponding position in the real-time image, this part of the edge is also used for computing the line-feature matching similarity.
The inferred target position is obtained from the spatial relationship between the final matching corner point and the target point, and this inferred target position is added to the candidate target point set. All final matching corner points are handled in the same way, their inferred target positions being added to the candidate target point set. Nearest-neighbor clustering is applied to the candidate target point set, and the center of the class with the most members is selected as the final target position. If clustering fails, the point with the highest point-feature matching similarity in the candidate target point set is selected directly as the final target position.
The nearest-neighbor clustering algorithm first randomly chooses K points from the data set as the initial cluster centers, then computes the distance of each sample to the cluster centers and assigns each sample to the class of the cluster center nearest to it. The mean of the data objects of each newly formed cluster gives the new cluster centers; if the cluster centers do not change between two consecutive iterations, the sample adjustment has finished and the clustering criterion function has converged. In each iteration it is examined whether the classification of each sample is correct; if not, it is adjusted, and after all samples have been adjusted the cluster centers are updated and the next iteration begins.
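The clustering procedure described above is standard K-means. A minimal sketch follows; the function name is illustrative, and a fixed iteration cap is added as a safeguard beyond the convergence test in the text.

```python
import random

def nn_cluster(points, k, iters=20, seed=0):
    """Plain K-means as described: random initial centres, assign each
    sample to its nearest centre, recompute centres as cluster means,
    stop when the centres no longer change; returns the centre of the
    largest cluster (the final target position)."""
    rng = random.Random(seed)
    centres = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign the sample to the class of its nearest cluster centre
            j = min(range(k),
                    key=lambda i: (p[0] - centres[i][0]) ** 2
                                + (p[1] - centres[i][1]) ** 2)
            clusters[j].append(p)
        # new centres: mean of each newly formed cluster
        new = [tuple(sum(c) / len(c) for c in zip(*cl)) if cl else centres[i]
               for i, cl in enumerate(clusters)]
        if new == centres:          # centres unchanged: converged
            break
        centres = new
    sizes = [len(cl) for cl in clusters]
    return centres[sizes.index(max(sizes))]
```

With two well-separated groups of candidate target points, the returned centre is the mean of the larger group, which is the "class with the most members" rule of the integration step.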
Compared with the prior art, the present invention has the following beneficial effects:
(1) Preliminary selection using the stable point features of the corner points first obtains a candidate point set, ensuring that the correct position is included; the candidate point set is then screened using the more global line features, which reduces repeated patterns and improves accuracy.
(2) Dual-threshold screening of the corner points retains corner points with larger local curvature or richer edge features, so the corner-extraction process takes both point features and line features into account.
(3) Corner points are extracted automatically and matching combines point features and line features, which saves the manual region-selection process of region-matching algorithms while obtaining good matching results.
Brief description of the drawings
Fig. 1 is the overall flow chart of the image matching method combining point features and line features of the present invention;
Fig. 2 illustrates the preliminary extraction of corner points according to the non-maximum constraint in the present invention;
Fig. 3 illustrates edge extraction and growing in a corner-point neighborhood, and the data structure used to store the curves, in an embodiment of the present invention;
Fig. 4 illustrates a patch pair used to generate the ORB-like point-feature descriptor in an embodiment of the present invention;
Fig. 5 is a matching schematic diagram of corner-point neighborhood edge sets in an embodiment of the present invention.
Detailed description of the embodiments
In order to make the purpose, technical scheme, and advantages of the present invention clearer, the present invention is further elaborated below in conjunction with the drawings and embodiments. It should be understood that the specific embodiments described here are merely illustrative of the present invention and are not intended to limit it. In addition, the technical features involved in the embodiments of the invention described below can be combined with each other as long as they do not conflict.
Point features use local gray-scale information and are relatively stable, but owing to the limited size of the local window they are prone to repeated patterns; line features are more global and have stronger structure-discriminating capability, but edge extraction may be unstable in complex scenes. Preliminary selection using the stable point features of the corner points first obtains a candidate point set, ensuring that the correct position is included; the candidate point set is then screened using the more global line features, which reduces repeated patterns and improves accuracy. In the edge-based screening process, according to the point-feature matching relationship between the real-time image and the template-image corner points, the edges in the template image can be used as hints for edge extraction and matching in the real-time image.
As shown in Fig. 1, the invention provides an image matching method combining point features and line features, the specific steps of which are as follows:
(1) Extract corner points from the template image and the real-time image at multiple scales
Scale pyramid images are built for the template image and the real-time image respectively. When building the scale pyramid, the image size is kept constant, and pyramid layers of different scales are obtained by changing the Gaussian blur size and the size of the window used to compute the Haar responses. In each pyramid layer, the local curvature ρ of each pixel is computed using the Hessian matrix, and the local curvature ρ of each pixel is compared with that of the α points in its scale neighborhood (usually the 26 points of the 3-scale neighborhood, i.e. α is usually taken as 26). If the local curvature ρ of a pixel is the maximum or minimum among these α points, it is retained as an initially selected corner point.
As shown in Fig. 2, the point at the position marked × is compared in magnitude with the 26 points of its 3-dimensional neighborhood, implementing the non-maximum constraint.
The initially selected corner points are screened according to their local curvature ρ using dual thresholds (t1 and t2, with t1 < t2) to obtain the finally selected corner points. In the screening process, corner points whose local curvature ρ is less than the threshold t1 are directly discarded, and corner points whose ρ is greater than the threshold t2 are directly retained; a corner point whose ρ is greater than t1 but less than t2 is retained if a sufficiently long edge can be extracted in its neighborhood (indicating that the corner-point neighborhood describes enough structural features), and is otherwise discarded.
Threshold value t1 can set smaller to retain enough information, be set to 30 here, and threshold value t2 can be with Setting must ensure that the angle point extracted is more stable than larger, and 400 are set to here (when extracting angle point in opencv 100) hessian default threshold is.Here it is considered that angle point neighborhood during in the presence of more than more than 2 10 length in pixels edge Enough structural strengths can be described, are retained when the local curvature ρ of the angle point is more than threshold value t1.
(2) Obtain the edge sets around the corner points of the real-time image and the template image
The finally selected corner points obtained in step (1) are traversed, and edges are extracted and grown starting from the β neighborhood of each corner point in the real-time image and the template image (the β neighborhood is usually taken as the radius-10s neighborhood, where s is the pixel unit of the scale at which the corner point lies), giving the edge set around the corner point. When obtaining the edge sets, pixels belonging to an edge are labeled n (n indicating that the pixel belongs to the n-th edge curve found), while processed pixels considered not to belong to an edge are labeled NotEdge. If a corner-point neighborhood contains a pixel already labeled n, the n-th edge curve is directly added to that corner point's edge set without growing again from that pixel. Processed pixels are not processed again.
Edges are extracted and grown in the template image and the real-time image using the Canny operator. For the real-time image, Canny, Sobel, or faster edge-extraction algorithms can be used. Constraints can be imposed on the length of the extracted edges and on the distance of the edge pixels from the corner point.
It should be pointed out that other operators can also be used here, such as the Sobel, Prewitt, Roberts, or Laplacian operators; however, experimental tests show that, comparatively, the Canny operator extracts the actual edges more accurately.
The extracted edge-point coordinates are stored curve by curve in linked lists, which are indexed by an array M (indexed by the n described above). The edge set around a corner point is represented by a linked list whose values are the indices, in the array M, of the edge curves it contains.
As shown in Fig. 3, edges are extracted and grown in the neighborhood of point P, giving the curves l0 and l1; the linked lists list0 and list1 storing curves l0 and l1 are then created, the pixels on curve l0 are labeled m, and the pixels on curve l1 are labeled n. Pointers to the head nodes of the linked lists list0 and list1 are stored at positions m and n of the index array M.
(3) Compute the point-feature descriptors of the corner points
The ORB-like point-feature descriptors of the finally selected corner points of the real-time image and the template image obtained in step (1) are computed using formula (1) or (2). In the process of computing the ORB-like point-feature descriptor, a certain number of patch pairs (i.e. sets of two patches) are first selected in the corner-point neighborhood, the gray-value sum of the pixels in each patch is computed, the gray-value sums of the two patches of each pair are compared, and the comparison result is encoded. The codes of all patch-pair comparison results constitute the ORB-like point-feature descriptor of the corner point (512 pairs of 9 × 9-pixel patches can be selected in the manner recommended by the standard ORB algorithm; all patches are taken at the scale of the corner point).
The encoding of each patch pair is given by the following formula:
τi = 01 if p(x) − p(y) > t, τi = 10 if p(y) − p(x) > t, and τi = 00 otherwise (1)
where t is a threshold chosen according to the contrast requirement on the real-time image, and p(x) and p(y) denote the gray-value sums of the pixels in the selected patches. If only the presence of contrast matters (as in infrared images), the codes 01 and 10 can be merged into 1, and the formula becomes:
τi = 1 if |p(x) − p(y)| > t, and τi = 0 otherwise (2)
Each patch pair i yields one code τi, and the codes of the K patch-pair comparison results constitute the point-feature descriptor of the corner point, [τ0, τ1, …, τK−1].
As shown in Fig. 4, which represents one of the patch pairs selected in the corner-point neighborhood, the gray values in the two patches are compared and the value of this patch pair in the point-feature descriptor is computed using formula (1) or (2). For example, if the required contrast is 0.2, t is chosen as 0.2 × min(p(x), p(y)); to save computation cost, t can also be chosen as a fixed value, for example taken directly as 405 (i.e. 9 × 9 × 5).
(4) Compute the edge-set matching similarity
The matching similarity between the edge sets of the real-time image and the template image obtained in step (2) is described according to formula (5) using the Least Trimmed Square-Hausdorff Distance (LTS-HD). LTS-HD sorts the minimum coordinate distances from each point to the other point set (measured by Euclidean distance) from small to large, and takes the average of the first h of them as the one-way Hausdorff distance.
Specifically, the coordinates of the points in each edge set are converted to coordinates with the corner point as origin. When computing the matched point pairs from point set A to point set B, the linked lists describing the edge points in A are traversed to obtain edge-point coordinates; then, around the corresponding coordinate position p in the image containing B, edge points are searched from near to far within the r × r neighborhood. If an edge point exists in the r × r neighborhood, the distance between that edge point's coordinates and position p is taken as the minimum distance; if no edge point is found in the neighborhood, the minimum distance is set to 2r to lower the similarity evaluation. The same computation is used for the minimum distances of the matched point pairs from B to A. After the point-pair minimum distances are obtained, the LTS-HD distance between the two point sets is computed according to formulas (3), (4) and (5), and the matching similarity of the edge sets (i.e. the line features) is computed according to formula (6).
For example, suppose a corner point in the real-time image has coordinates (140, 420), and an edge point in its surrounding edge set has coordinates (200, 400); the relative coordinates of that edge point with respect to the corner point are then (60, −20). When matching against a template-image corner point, in the coordinate system with that template corner point as origin, the array marking the template-image edge points is searched from near to far within the r × r neighborhood around position (60, −20). If an edge point exists, the obtained distance is used to compute the LTS-HD distance between the two point sets according to formulas (3), (4) and (5); if no edge point exists, the distance 2r is used for computing the LTS-HD distance. Here r depends on the real-time requirement and the tolerance of position error, and is chosen as 5.
(5) Compute the point-feature matching similarity
The matching similarity of the ORB-like point-feature descriptors of the real-time image and template-image corner points obtained in step (3) is computed by formula (7). Let the real-time image point-feature descriptor be [a0, a1, …, aK−1] and the template-image point-feature descriptor be [b0, b1, …, bK−1]; if ai = bi (i = 0, 1, …, K−1), the i-th dimension of the point-feature descriptors is considered to match. Count the total number m of matching dimensions. The matching similarity of the point features is described using formula (7). Here K is chosen as 512 according to the standard ORB recommendation.
For example, store 8 integer variables (32) of 512 dimensions real-time figure point feature description for [0x80,0x40, 0x00,0x00,0x00,0x00,0x00,0x00], Prototype drawing point feature description son integer variable for [0x40,0x40,0x00, 0x00,0x00,0x00,0x00,0x00], after step-by-step XOR, obtain [0xc0,0x00,0x00,0x00,0x00,0x00,0x00, 0x00].By step-by-step right-shift operation, least-significant byte is taken every time, 64 variables can be obtained, and (256 are tabled look-up according to these variables The array of dimension) how many 0 (0 represents that the dimension matching result is consistent) in the binary representations of these variables can be directly obtained. Such as 0x01 is tabled look-up and understands that it includes 70,0x03 is tabled look-up and understands that it includes 60.
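The XOR-and-lookup counting in this example can be sketched as below. The helper names and the 32-bit word layout are illustrative assumptions; only the table-lookup idea comes from the text:

```python
# 256-entry table: number of 0 bits in each byte value
ZERO_BITS = [8 - bin(v).count("1") for v in range(256)]

def matching_dimensions(desc_a, desc_b):
    """Count matching descriptor dimensions: XOR the 32-bit words,
    then sum the zero-bit counts of each byte via the lookup table."""
    m = 0
    for wa, wb in zip(desc_a, desc_b):
        x = (wa ^ wb) & 0xFFFFFFFF  # a 0 bit marks a matching dimension
        for _ in range(4):          # four bytes per 32-bit word
            m += ZERO_BITS[x & 0xFF]
            x >>= 8
    return m

def point_similarity(desc_a, desc_b):
    """PM = m / K, with K the total number of compared bits (formula (7))."""
    return matching_dimensions(desc_a, desc_b) / (32 * len(desc_a))
```

For the two example descriptors above, the first XOR word 0xc0 contributes 6 + 8 + 8 + 8 = 30 matching dimensions and each remaining zero word contributes 32.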
(6) Matching result integration
The point feature matching similarity and the line feature matching similarity obtained in step (4) and step (5) are integrated. The point feature matching similarity is required to exceed threshold tp and the line feature matching similarity to exceed threshold tl; the N corners with the highest point feature matching similarity form a candidate matching point set, these candidate matching points are then evaluated with the line feature, and the result with the highest line feature descriptor similarity is taken as the final matching corner. Here tp is taken as 0.6 (requiring that the patch pairs whose comparison results agree account for more than 60% of the total number of patches in the point feature). tl is taken as 0.4 (assuming the probability that an edge point neighborhood contains a corresponding edge point is 0.5, the expected LTS-HD distance can be roughly estimated as 6; the line feature matching similarity LM obtained from formula (6) is then 0.4, so 0.4 is used as tl). N is taken as 5.
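The threshold-and-select integration just described can be sketched as follows. This is an illustrative sketch only; the candidate tuple representation (corner id, point similarity, line similarity) and the function name are our own assumptions:

```python
def select_final_match(candidates, t_p=0.6, t_l=0.4, n=5):
    """Pick the final matching corner for one template corner.

    candidates: list of (corner_id, point_sim, line_sim) tuples.
    Returns the chosen corner_id, or None if no candidate passes
    both thresholds.
    """
    # keep only candidates passing both the point and line thresholds
    passed = [c for c in candidates if c[1] > t_p and c[2] > t_l]
    if not passed:
        return None
    # take the N best by point similarity ...
    top_n = sorted(passed, key=lambda c: c[1], reverse=True)[:n]
    # ... then pick the one with the highest line feature similarity
    return max(top_n, key=lambda c: c[2])[0]
```

With the defaults tp = 0.6, tl = 0.4 and N = 5 from the text, a candidate with point similarity 0.5 is rejected even if its line similarity is very high, matching the two-stage screening the method describes.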
For corner pairs whose point feature matching similarity exceeds threshold tp, the corresponding positions in the real-time image are checked according to the edge set contained around the template corner; if no edge was extracted at a position, edges are extracted in the neighborhood of the corresponding position in the real-time image, and this part of the edge is used when computing the line feature matching similarity.
As shown in Fig. 5, point P is a corner and the circle is its neighborhood. Fig. 5(a) shows the edge set contained in the template image neighborhood; point s is a point on an edge. Fig. 5(b) shows the edges near the corresponding real-time image neighborhood; the dashed line represents an edge that failed to be extracted. Because of discontinuous edges, the edge set contained in the real-time corner neighborhood includes only the part grown from within that neighborhood. When computing the one-way Hausdorff distance from the edge point set around a template corner to the edge point set around a real-time corner, after the matched point pairs have been preliminarily determined with the point features, the complete edges in the template image are used to guide edge extraction in the regions of the corresponding real-time image neighborhood where no edge was extracted. For example, for point s in Fig. 5(a), a near-to-far search for edge points is carried out in the r×r neighborhood of the corresponding position in the real-time image (the square above point s in Fig. 5(b) represents this neighborhood). If an edge point exists in the r×r neighborhood, the distance between the coordinates of that edge point and position p is taken as the minimum distance; if no edge point is found, the minimum distance is set to 2r to lower the similarity score. The one-way Hausdorff distance is then computed according to formula (3) or (4).
The inferred target point position is obtained from the spatial relationship between a final matching corner and the target point, and the inferred target point position is added to the candidate target point set. All final matching corners are processed with the same method, each adding an inferred target point position to the candidate target point set. The candidate target point set is clustered with the nearest-neighbor clustering algorithm, and the class containing the most points is selected as the final target point position. If clustering fails, the point with the highest point feature matching similarity in the candidate target point set is used directly as the final target point position.
For example, among the real-time image corners whose matching similarity with template corner Q exceeds threshold tp, the N corners with the highest similarity form the point set [p0,p1,…,pN-1]; the corners in this set are matched using the line feature, and the result pi with the highest line feature matching similarity is taken as the matching result of template corner Q in the real-time image.
The candidate target point set is clustered, and the class containing the most points is selected as the final target point position. If clustering fails, the point with the highest point feature matching similarity in the candidate target point set is used directly as the final target point position.
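The clustering of the candidate target point set (described in claim 8 as a k-means style procedure) can be sketched roughly as below. The 2-D tuple point representation, the value of k, the iteration cap and the function name are illustrative assumptions:

```python
import random

def nearest_neighbor_cluster(points, k=2, iters=20, seed=0):
    """k-means style clustering: pick k random centers, assign each
    point to its nearest center, recompute centers as cluster means,
    and stop when the centers no longer change. Returns the mean of
    the largest cluster as the final target position."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: (p[0] - centers[i][0]) ** 2
                                + (p[1] - centers[i][1]) ** 2)
            clusters[j].append(p)
        new_centers = [
            (sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
            if c else centers[i]
            for i, c in enumerate(clusters)
        ]
        if new_centers == centers:  # converged: no center moved
            break
        centers = new_centers
    largest = max(clusters, key=len)  # class with the most points wins
    return (sum(x for x, _ in largest) / len(largest),
            sum(y for _, y in largest) / len(largest))
```

Four inferred positions clustered tightly around one location outvote a single outlier, which is the intended effect of taking the most populous class.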
As will be readily understood by those skilled in the art, the foregoing is merely a preferred embodiment of the present invention and is not intended to limit it; any modifications, equivalent substitutions and improvements made within the spirit and principles of the invention shall all fall within the scope of protection of the present invention.

Claims (10)

1. An image matching method combining point features and line feature descriptors, characterized in that the method comprises the following steps:
(1) extracting corners from the template image and the real-time image at multiple scales: keeping the image size unchanged, obtaining the images of each layer at different scales by changing the Gaussian blur size and the size of the window used to compute the Haar responses; in each layer of the pyramid image, computing the local curvature ρ of each pixel using the Hessian matrix, and comparing the local curvature of each pixel with the local curvature ρ of the α points in its scale neighborhood; if the local curvature ρ of the pixel is the maximum or minimum among these α points, retaining it as an initially selected corner; screening the initially selected corners with dual thresholds according to their local curvature ρ to obtain the finally selected corners;
(2) obtaining the edge sets around the corners of the real-time image and the template image: traversing the finally selected corners obtained in step (1), extracting and growing edges starting from the β neighborhood of each real-time image and template image corner, and obtaining the edge set around each corner;
(3) computing the class-ORB point feature descriptors of the finally selected real-time image and template image corners obtained in step (1): selecting a number of patch pairs, computing the sum of the gray values of the pixels in each patch, comparing the gray value sums of each patch pair, and encoding according to the comparison result; the codes of all patch comparison results constitute the class-ORB point feature descriptor of the corner;
(4) line feature matching computation: describing the matching similarity of the edge sets of the real-time image and the template image obtained in step (2) using the least trimmed squares Hausdorff (LTS-HD) distance;
(5) point feature matching computation: computing the matching similarity of the class-ORB point feature descriptors of the real-time image and template image corners obtained in step (3);
(6) matching result integration: integrating the point feature matching similarity and the line feature matching similarity obtained in step (4) and step (5), requiring the point feature matching similarity to exceed threshold tp and the line feature matching similarity to exceed threshold tl; forming a candidate matching point set from the N corners with the highest point feature matching similarity, then evaluating these candidate matching points with the line feature, and taking the result with the highest line feature matching similarity as the final matching corner.
2. The image matching method as claimed in claim 1, characterized in that in step (1) the initially selected corners are screened with dual thresholds according to their local curvature ρ to obtain the finally selected corners, specifically comprising:
the dual thresholds are t1 and t2, with t1 < t2; a corner whose local curvature ρ is less than threshold t1 is directly discarded, and a corner whose local curvature ρ is greater than threshold t2 is directly retained; for a corner whose local curvature ρ is greater than t1 but less than t2, if the number of edges longer than b pixels that can be extracted in its neighborhood exceeds a, the corner is retained, otherwise it is discarded.
3. The image matching method as claimed in claim 1 or 2, characterized in that during the acquisition of the edge sets in step (2), a pixel belonging to an edge is labeled n, where n indicates that the pixel belongs to the n-th edge curve found, and a pixel that has been processed but is considered not to belong to an edge is labeled NotEdge; if the neighborhood of a corner contains a pixel labeled n, the n-th edge curve is directly incorporated into the edge set of that corner and growing no longer starts from that pixel; pixels that have been processed are not processed again.
4. The image matching method as claimed in claim 1 or 2, characterized in that each patch pair in step (3) is encoded as shown in the following formula:
τ(p; x, y) = 01, if p(x) < p(y) − t
             00, if |p(x) − p(y)| ≤ t
             10, if p(x) > p(y) + t
wherein t is a threshold selected according to the required real-time image contrast, and p(x) and p(y) denote the gray value sums of the pixels in the selected patches;
for images where only significant contrast is considered, codes 01 and 10 are merged into 1, i.e. the formula becomes:
τ(p; x, y) = 1, if |p(x) − p(y)| > t
             0, if |p(x) − p(y)| ≤ t
each patch pair i yields one code τi, and the codes of the K patch comparison results constitute the point feature descriptor of the corner, [τ0, τ1, …, τK-1].
5. The image matching method as claimed in claim 1 or 2, characterized in that computing the least trimmed squares Hausdorff distance in step (4) specifically comprises:
computing the minimum coordinate distance from every point in point set A to the points in point set B, and sorting these distances in ascending order to obtain the ordered set X; wherein point set A is the set of pixels in the edge set around a corner of the real-time image, point set B is the set of pixels in the edge set around a corner of the template image, and their pixel counts are Na and Nb respectively; dB(ai)(i) denotes that the minimum distance from point ai in A to the points in B ranks i-th among the minimum distances of all points in A;
X = {dB(a1)(1), dB(a2)(2), dB(a3)(3), …, dB(aNa)(Na)}
similarly computing the minimum coordinate distance from every point in point set B to the points in point set A, and sorting in ascending order to obtain the ordered set Y; wherein dA(bi)(i) denotes that the minimum distance from point bi in B to the points in A ranks i-th among the minimum distances of all points in B;
Y = {dA(b1)(1), dA(b2)(2), dA(b3)(3), …, dA(bNb)(Nb)}
letting K = f1 × Na and L = f2 × Nb, where f1 and f2 are the proportionality coefficients for the pixel counts Na and Nb, satisfying 0 < f1 < 1 and 0 < f2 < 1; the one-way distances hLTS(A, B) and hLTS(B, A) between A and B are defined as:
hLTS(A, B) = (1/K) Σi=1..K dB(ai)(i)
hLTS(B, A) = (1/L) Σi=1..L dA(bi)(i)
the least trimmed squares Hausdorff distance is defined as:
H(A, B) = max[hLTS(A, B), hLTS(B, A)].
6. The image matching method as claimed in claim 1 or 2, characterized in that computing the matching similarity of the point feature descriptors in step (5) specifically comprises:
letting the real-time image point feature descriptor be [a0, a1, …, aK-1] and the template image point feature descriptor be [b0, b1, …, bK-1]; if ai = bi, where i = 0, 1, …, K-1, the i-th dimension of the two descriptors is considered to match;
counting the total number m of matching dimensions; the ratio of the number m of matching dimensions to the total number K of descriptor dimensions describes the matching similarity PM of the point features:
PM = m / K.
7. The image matching method as claimed in claim 1 or 2, characterized in that step (6) specifically comprises:
for corner pairs whose point feature matching similarity exceeds threshold tp, checking the corresponding positions in the real-time image according to the edge set contained around the template corner; if no edge was extracted at a position, extracting edges in the neighborhood of the corresponding position in the real-time image and using this part of the edge to compute the line feature matching similarity;
obtaining the inferred target point position from the spatial relationship between the final matching corner and the target point, and adding the inferred target point position to the candidate target point set;
processing all final matching corners with the same method, adding the inferred target point positions to the candidate target point set; clustering the candidate target point set with the nearest-neighbor clustering algorithm, and selecting the class containing the most points as the final target point position; if clustering fails, directly using the point with the highest point feature matching similarity in the candidate target point set as the final target point position.
8. The image matching method as claimed in claim 7, characterized in that the nearest-neighbor clustering algorithm specifically comprises: first randomly choosing K points from the data set as initial cluster centers, then computing the distance from each sample to the cluster centers and assigning each sample to the class of its nearest cluster center; computing the mean of the data objects of each newly formed cluster to obtain new cluster centers; if the cluster centers do not change between two successive iterations, the sample adjustment has finished and the clustering criterion function has converged; in each iteration, whether each sample is classified correctly is examined; if not, the sample is adjusted, and after all samples have been adjusted, the cluster centers are updated and the next iteration begins.
9. The image matching method as claimed in claim 1 or 2, characterized in that in step (2) the edges are extracted and grown using the Canny, Sobel, Prewitt, Roberts or Laplacian operator.
10. The image matching method as claimed in claim 2, characterized in that the value of α in step (1) is 26.
CN201510162935.4A 2015-04-08 2015-04-08 A kind of image matching method of combination point feature and line feature Expired - Fee Related CN104915949B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510162935.4A CN104915949B (en) 2015-04-08 2015-04-08 A kind of image matching method of combination point feature and line feature

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510162935.4A CN104915949B (en) 2015-04-08 2015-04-08 A kind of image matching method of combination point feature and line feature

Publications (2)

Publication Number Publication Date
CN104915949A CN104915949A (en) 2015-09-16
CN104915949B true CN104915949B (en) 2017-09-29

Family

ID=54084987

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510162935.4A Expired - Fee Related CN104915949B (en) 2015-04-08 2015-04-08 A kind of image matching method of combination point feature and line feature

Country Status (1)

Country Link
CN (1) CN104915949B (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105427263A (en) * 2015-12-21 2016-03-23 努比亚技术有限公司 Method and terminal for realizing image registering
CN106909877B (en) * 2016-12-13 2020-04-14 浙江大学 Visual simultaneous mapping and positioning method based on dotted line comprehensive characteristics
CN111368126B (en) * 2017-02-13 2022-06-07 哈尔滨理工大学 Image retrieval-oriented generation method
CN107341802B (en) * 2017-07-19 2021-02-09 无锡信捷电气股份有限公司 Corner sub-pixel positioning method based on curvature and gray scale compounding
CN107437097B (en) * 2017-07-28 2020-06-09 南京航空航天大学 Two-stage local contour matching method based on angular point description
CN108961240A (en) * 2018-07-03 2018-12-07 北京邮电大学 Destructor circuit board relic recognition methods
CN111178366B (en) * 2018-11-12 2023-07-25 杭州萤石软件有限公司 Mobile robot positioning method and mobile robot
CN109582795B (en) * 2018-11-30 2021-01-05 奇安信科技集团股份有限公司 Data processing method, device, system and medium based on full life cycle
CN111311673B (en) * 2018-12-12 2023-11-03 北京京东乾石科技有限公司 Positioning method and device and storage medium
CN109766943B (en) * 2019-01-10 2020-08-21 哈尔滨工业大学(深圳) Template matching method and system based on global perception diversity measurement
CN109767442B (en) * 2019-01-15 2020-09-04 上海海事大学 Remote sensing image airplane target detection method based on rotation invariant features
CN109871908A (en) * 2019-04-11 2019-06-11 上海电机学院 Paper fractional statistics system and its application method based on smart phone
CN110334560B (en) * 2019-07-16 2023-04-07 山东浪潮科学研究院有限公司 Two-dimensional code positioning method and device
CN113128516B (en) * 2020-01-14 2024-04-05 北京京东乾石科技有限公司 Edge extraction method and device
CN111474535B (en) * 2020-03-18 2022-03-15 广东省智能机器人研究院 Mobile robot global positioning method based on characteristic thermodynamic diagram
CN112233133B (en) * 2020-10-29 2023-04-14 上海电力大学 Power plant high-temperature pipeline defect detection and segmentation method based on OTSU and area growth method
CN112348837B (en) * 2020-11-10 2023-06-09 中国兵器装备集团自动化研究所 Point-line detection fusion object edge detection method and system
CN113111212B (en) * 2021-04-01 2024-05-17 广东拓斯达科技股份有限公司 Image matching method, device, equipment and storage medium
CN113743423A (en) * 2021-09-08 2021-12-03 浙江云电笔智能科技有限公司 Intelligent temperature monitoring method and system
CN114414605B (en) * 2021-11-25 2023-10-24 上海精测半导体技术有限公司 Method for acquiring actual pixel size of charged particle beam scanning imaging equipment
CN114187267B (en) * 2021-12-13 2023-07-21 沭阳县苏鑫冲压件有限公司 Stamping part defect detection method based on machine vision
CN114581376B (en) * 2022-01-31 2023-03-24 南通摩瑞纺织有限公司 Automatic sorting method and system for textile silkworm cocoons based on image recognition

Citations (2)

Publication number Priority date Publication date Assignee Title
CN103236050A (en) * 2013-05-06 2013-08-07 电子科技大学 Auxiliary bank note and worn coin reestablishing method based on graph clustering
CN103679636A (en) * 2013-12-23 2014-03-26 江苏物联网研究发展中心 Rapid image splicing method based on point and line features

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
KR100656859B1 (en) * 2005-12-23 2006-12-13 학교법인 포항공과대학교 Simultaneous location and mapping method using supersonic wave sensor and vision sensor
JP4752918B2 (en) * 2009-01-16 2011-08-17 カシオ計算機株式会社 Image processing apparatus, image collation method, and program

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
CN103236050A (en) * 2013-05-06 2013-08-07 电子科技大学 Auxiliary bank note and worn coin reestablishing method based on graph clustering
CN103679636A (en) * 2013-12-23 2014-03-26 江苏物联网研究发展中心 Rapid image splicing method based on point and line features

Non-Patent Citations (4)

Title
Fusing Points and Lines for High Performance Tracking; Edward Rosten et al.; Tenth IEEE International Conference on Computer Vision; Oct. 21, 2005; pp. 1-8 *
ORB: an efficient alternative to SIFT or SURF; Ethan Rublee et al.; IEEE International Conference on Computer Vision; Nov. 13, 2011; pp. 1-8 *
Research on Multi-source Remote Sensing Image Registration Based on SIFT Point Features and Canny Edge Feature Matching; Wang Wantong et al.; Computer Science; Jul. 2011; vol. 38, no. 7; pp. 287-289 *
Research on Feature-based Image Matching Algorithms; Zheng Gang; China Masters' Theses Full-text Database, Information Science and Technology; Jul. 15, 2012; pp. 1-72 *

Also Published As

Publication number Publication date
CN104915949A (en) 2015-09-16

Similar Documents

Publication Publication Date Title
CN104915949B (en) A kind of image matching method of combination point feature and line feature
CN112800964B (en) Remote sensing image target detection method and system based on multi-module fusion
US9619733B2 (en) Method for generating a hierarchical structured pattern based descriptor and method and device for recognizing object using the same
EP3101594A1 (en) Saliency information acquisition device and saliency information acquisition method
CN107145829B (en) Palm vein identification method integrating textural features and scale invariant features
CN104504365A (en) System and method for smiling face recognition in video sequence
CN103778436B (en) A kind of pedestrian&#39;s attitude detecting method based on image procossing
CN109101981B (en) Loop detection method based on global image stripe code in streetscape scene
CN106650580B (en) Goods shelf quick counting method based on image processing
CN103136520A (en) Shape matching and target recognition method based on PCA-SC algorithm
JP6997369B2 (en) Programs, ranging methods, and ranging devices
CN115713694B (en) Land mapping information management method
CN108229500A (en) A kind of SIFT Mismatching point scalping methods based on Function Fitting
CN109583493A (en) A kind of credit card detection and digit recognition method based on deep learning
CN110084743B (en) Image splicing and positioning method based on multi-flight-zone initial flight path constraint
CN106340010A (en) Corner detection method based on second-order contour difference
CN107705323A (en) A kind of level set target tracking method based on convolutional neural networks
Alsmadi et al. Pattern matching in Rotated Images Using Genetic Algorithm
CN105069459B (en) One kind is directed to High Resolution SAR Images type of ground objects extracting method
Wang et al. Accurate playground localisation based on multi-feature extraction and cascade classifier in optical remote sensing images
CN111523342A (en) Two-dimensional code detection and correction method in complex scene
CN107392211A (en) The well-marked target detection method of the sparse cognition of view-based access control model
JP6403201B2 (en) Image feature registration apparatus, method, and program
Kang et al. GuidedMixup: an efficient mixup strategy guided by saliency maps
CN108897747A (en) A kind of brand logo similarity comparison method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170929
