CN104915949A - Image matching method combining point features and line features - Google Patents

Image matching method combining point features and line features

Info

Publication number
CN104915949A
CN104915949A CN201510162935.4A
Authority
CN
China
Prior art keywords
point
corner point
edge
real-time image
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510162935.4A
Other languages
Chinese (zh)
Other versions
CN104915949B (en)
Inventor
王岳环
王康裕
吴明强
张天序
范蓉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology filed Critical Huazhong University of Science and Technology
Priority to CN201510162935.4A priority Critical patent/CN104915949B/en
Publication of CN104915949A publication Critical patent/CN104915949A/en
Application granted granted Critical
Publication of CN104915949B publication Critical patent/CN104915949B/en
Expired - Fee Related
Anticipated expiration

Abstract

The invention discloses an image matching method combining point feature and line feature descriptors. The method comprises the following steps: (1) extracting corner points from the template image and the real-time image at multiple scales; (2) obtaining the edge sets around the corner points of the real-time image and the template image; (3) computing ORB-like point feature descriptors for the finally selected real-time image and template image corner points obtained in step 1; (4) using the least trimmed square Hausdorff distance to describe the matching similarity of the real-time image and template image edge sets obtained in step 2; (5) computing the matching similarity of the ORB-like point feature descriptors of the real-time image and template image corner points obtained in step 3; (6) integrating the matching results. In the method, a stable point feature is first used to pre-select the corner points, yielding a candidate point set that contains the correct position; a more global line feature is then used to screen the candidate point set, which reduces repeated patterns and improves the correct-match rate.

Description

An image matching method combining point features and line features
Technical field
The invention belongs to the fields of image matching, computer vision and digital image processing, and specifically relates to an image matching method combining point features and line features.
Background technology
With the rapid development of science and technology, especially computer technology, scene matching recognition has become an important and fundamental technology in the field of information processing, with wide application prospects in fields such as aerospace, unmanned aerial vehicle navigation and missile terminal guidance.
Scene matching recognition is the process of matching a real-time image captured by an aircraft against a template image prepared in advance. The position of the aircraft is determined through image recognition, so that course deviation can be corrected. The template image currently in use is mainly derived from satellite or aerial photographs, while the real-time image is a downward-looking image captured by an airborne (or missile-borne) camera.
The real-time image is generally an infrared image, while the template image may be a visible-light or infrared image. Because the template image differs from the real-time image in capture time and imager, its gray-scale texture usually differs considerably from that of the real-time image. In real scenes, objects in the real-time image are also affected by illumination, shadow, and occlusion by cloud and fog, so their gray-scale texture features can change greatly. In infrared images in particular, the gray-scale features of the same object can vary even across different periods of a single day.
In scene matching recognition, a binary (or multi-valued) template reflecting the relatively significant and stable structural features of the template image can usually be generated by clustering, based on the template image and partial prior knowledge about the materials in the scene. Applying binary (multi-valued) templates to matching in complex scenes eliminates the influence of gray-scale changes, but repeated patterns occur more often and, in a statistical sense, the descriptive power is partly insufficient.
According to the data structures involved in the computation, image matching and recognition algorithms follow two research directions: feature-based methods and region-based methods. Well-known point-feature-based matching methods include the scale-invariant feature transform (SIFT), speeded-up robust features (SURF), and the oriented BRIEF algorithm (ORB). SIFT-like algorithms are suited to describing texture-rich images; in a simple binary template the texture information is sparse, the lines are simple, and only a small number of corner points can be extracted. Moreover, SIFT-like algorithms rely on local gradients to find the principal orientation, and an inaccurate principal orientation degrades the descriptive power of the descriptor.
Common edge feature extraction methods include the Roberts edge detection operator, the Laplacian of Gaussian (LoG) operator, the Canny operator and the Hough transform. Edge matching algorithms have developed more slowly than corner matching. Current edge matching algorithms are easily affected by noise, usually cannot balance matching accuracy against noise suppression, and edge extraction in complex scenes may be unstable, which affects the matching result.
Region-based matching and recognition methods, also called template matching methods, are common image matching algorithms. The most frequently used region-based matching methods include the gray-scale cross-correlation algorithm and the mutual-information matching algorithm, which mainly use the gray-scale information of a region, or some transform of it, for matching and recognition. These methods are relatively mature, stable and reliable, and are widely used in fields such as pattern recognition and aircraft navigation. However, region-based matching relies on strong prior knowledge in the selection of the region, and when the template is large the computation is time-consuming.
Summary of the invention
The present invention proposes an image matching method combining point feature and line feature descriptors. Its purpose is to use the line feature of the corner neighborhood to screen the multiple candidate matching points produced by the point feature, so as to obtain better matching and localization capability in complex scenes.
To achieve the object of the present invention, the invention provides an image matching method combining point feature and line feature descriptors, comprising the following steps:
(1) Extracting corner points from the template image and the real-time image at multiple scales
Scale pyramids are built for the template image and the real-time image respectively. When building the scale pyramid, the image size is kept unchanged, and the pyramid image of each layer at a different scale is obtained by changing the size of the Gaussian blur and the size of the window used to compute the Haar response. In each pyramid layer, the local curvature ρ of each pixel is computed with the Hessian matrix and compared with the local curvature ρ of the α points in its scale neighborhood (usually the 26 points of the 3-dimensional scale neighborhood are used, i.e. α is usually taken as 26). If the local curvature ρ of the pixel is the maximum or minimum among these α points, the pixel is retained as an initially selected corner point.
The initially selected corner points are screened according to their local curvature ρ using two thresholds (t1 and t2, with t1 < t2) to obtain the finally selected corner points. In the screening, a corner point whose local curvature ρ is less than threshold t1 is discarded directly, and a corner point whose local curvature ρ is greater than threshold t2 is retained directly. For a corner point whose local curvature ρ is greater than t1 but less than t2, if a sufficiently long edge can be extracted in its neighborhood (indicating that the corner neighborhood describes enough structural features), the corner point is retained; otherwise it is discarded.
(2) Obtaining the edge sets around the corner points of the real-time image and the template image
The finally selected corner points obtained in step (1) are traversed, and edges are extracted and grown starting from the β neighborhoods of the real-time image and template image corner points respectively (the β neighborhood is usually taken as a neighborhood of radius 10s, where s is the pixel unit of the scale at which the corner point lies), giving the edge set around each corner point. During the construction of the edge sets, a pixel belonging to an edge is labeled n (n denotes the n-th edge curve found, to which this pixel belongs), and a pixel that has been processed but judged not to belong to an edge is labeled NotEdge. If a corner neighborhood contains a pixel labeled n, the n-th edge curve is directly incorporated into the edge set of that corner point and no further growing starts from that pixel. Pixels that have already been processed are not processed again.
The extracted edge point coordinates are stored in linked lists, one curve per list, and an array M (indexed by the label n above) is used to index the lists. The edge set around a corner point is represented by a linked list whose values are the indices in array M of the edge curves it contains.
(3) Computing the point feature descriptors of the corner points
The ORB-like point feature descriptors of the finally selected real-time image and template image corner points obtained in step (1) are described with formula (1) or (2). To compute the ORB-like point feature descriptor, a number of patch pairs (i.e. sets of two patches) are first selected in the corner neighborhood; the sum of the gray values of the pixels in each patch is computed, the gray-value sums of each patch pair are compared, and the comparison result is encoded. The codes of all patch-pair comparisons form the ORB-like point feature descriptor of the corner point (512 patch pairs can be selected following the recommendation of the standard ORB algorithm; all patches are taken at the scale of the corner point). The encoding for each patch pair is given by the following formula:
τ(p; x, y) = 01, if p(x) < p(y) − t  (darker)
             00, if |p(x) − p(y)| ≤ t  (similar)
             10, if p(x) > p(y) + t  (brighter)        (1)
where t is a threshold selected according to the required contrast of the real-time image, and p(x) and p(y) denote the gray-value sums of the pixels in the selected patches. If only the material contrast is considered (for example in infrared images), codes 01 and 10 can be merged into 1, i.e. the formula becomes:
τ(p; x, y) = 1, if |p(x) − p(y)| > t  (different)
             0, if |p(x) − p(y)| ≤ t  (similar)        (2)
Each patch pair i yields a code τ_i, and the codes of the K patch-pair comparisons form the point feature descriptor [τ_0, τ_1, ..., τ_{K−1}] of the corner point.
(4) Computing the edge-set matching similarity
The least trimmed square Hausdorff distance (LTS-HD) is used to describe the matching similarity of the edge sets of the real-time image and the template image obtained in step (2). LTS-HD sorts the minimum point-to-set coordinate distances (measured with the Euclidean distance) in ascending order and takes the mean of the first h of them as the one-directional Hausdorff distance.
Let point set A be the set of pixels in the edge set around a corner point of the real-time image, and point set B the set of pixels in the edge set around a corner point of the template image, with pixel counts N_A and N_B respectively. For every point in A, the minimum coordinate distance to the points in B is computed, and these distances are sorted in ascending order to give the ordered set X, where d_B(a_i)^(i) denotes the minimum distance from point a_i in A to the points in B, ranked i-th among the minimum distances of all points in A.
X = { d_B(a_1)^(1), d_B(a_2)^(2), d_B(a_3)^(3), ..., d_B(a_{N_A})^(N_A) }
Similarly, for every point in B the minimum coordinate distance to the points in A is computed, and these distances are sorted in ascending order to give the ordered set Y, where d_A(b_i)^(i) denotes the minimum distance from point b_i in B to the points in A, ranked i-th among the minimum distances of all points in B.
Y = { d_A(b_1)^(1), d_A(b_2)^(2), d_A(b_3)^(3), ..., d_A(b_{N_B})^(N_B) }
Let K = f_1 × N_A and L = f_2 × N_B, where f_1 and f_2 are the truncation coefficients corresponding to the pixel counts N_A and N_B and satisfy 0 < f_1 < 1 and 0 < f_2 < 1. The one-directional distances h_LTS(A, B) and h_LTS(B, A) between A and B are defined as:
h_LTS(A, B) = (1 / K) × Σ_{i=1..K} d_B(a)^(i)        (3)
h_LTS(B, A) = (1 / L) × Σ_{i=1..L} d_A(b)^(i)        (4)
The LTS-HD distance is then defined as:
H(A, B) = max[ h_LTS(A, B), h_LTS(B, A) ]        (5)
It can be seen that the LTS-HD formula controls, through the two coefficients f_1 and f_2, the number of points taking part in the computation and filters out exceptionally large minimum-distance values; this not only eliminates the influence of severely mismatched points but also gives a strong ability to suppress Gaussian noise.
Concretely, the coordinates of the points in each edge set are converted to coordinates with the corner point as the origin. When computing the matched point pairs from point set A to point set B, the linked list describing the edge points of A is traversed to obtain the coordinates of each edge point, and an edge point is then searched, from near to far, in the r × r neighborhood around the corresponding coordinate position p in the image containing B. If an edge point exists in the r × r neighborhood, the distance between the coordinates of that edge point and position p is taken as the minimum distance; if no edge point is found in the neighborhood, the minimum distance is set to 2r to lower the similarity score. The same method is used when computing the minimum distances of the matched point pairs from B to A. After the minimum distances of the point pairs are obtained, the LTS-HD distance H between the two point sets is computed according to formulas (3), (4) and (5) in turn, and the matching similarity LM of the edge sets (i.e. of the line feature) is defined as:
LM = (2r − H) / (2r)        (6)
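For illustration, the following is a minimal NumPy sketch of the truncated one-directional distances of formulas (3)/(4), the LTS-HD distance (5) and the line similarity LM (6), assuming the per-point minimum distances have already been obtained by the neighborhood search described above; the function names and the default truncation ratios f1 = f2 = 0.8 are assumptions made for the sketch, not values fixed by the invention.

```python
import numpy as np

def lts_one_way(min_dists, f):
    """Truncated mean of the smallest f-fraction of the minimum distances, formulas (3)/(4)."""
    d = np.sort(np.asarray(min_dists, dtype=float))
    k = max(1, int(f * len(d)))           # K = f * N_A (or L = f * N_B), at least one point
    return d[:k].mean()

def line_similarity(dists_a_to_b, dists_b_to_a, f1=0.8, f2=0.8, r=5):
    """LTS-HD distance (5) and line-feature matching similarity LM (6)."""
    h_ab = lts_one_way(dists_a_to_b, f1)  # h_LTS(A, B)
    h_ba = lts_one_way(dists_b_to_a, f2)  # h_LTS(B, A)
    H = max(h_ab, h_ba)                   # formula (5)
    return (2 * r - H) / (2 * r)          # formula (6)
```

Because unmatched points are assigned the fallback value 2r, LM stays within [0, 1], with 1 meaning every considered edge point has an exact counterpart.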
(5) Computing the point feature matching similarity
The matching similarity of the ORB-like point feature descriptors of the real-time image and template image corner points obtained in step (3) is computed with formula (7). Let the real-time image point feature descriptor be [a_0, a_1, ..., a_{K−1}] and the template image point feature descriptor be [b_0, b_1, ..., b_{K−1}]; whenever a_i = b_i (i = 0, 1, ..., K−1), the i-th dimension of the descriptors is considered to match. The total number m of matching dimensions is counted, and the ratio of m to the total number of dimensions K of the descriptor is used as the point feature matching similarity PM:
PM = m / K        (7)
Concretely, a variable W is used to store the point feature descriptor [a_0, a_1, ..., a_{K−1}] according to formula (8); in practice, because a machine variable has a limited number of bits, W is represented by the bitwise combination of several variables. The two variables W storing the point feature descriptors are XOR-ed bit by bit; the bits equal to 0 in the result indicate the dimensions on which the matching results agree. If W is represented by a combination of several byte variables (char type), the number of such bits in each byte of the result can be obtained directly by table lookup.
W = Σ_{i=0..K−1} a_i × 2^i        (8)
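As a sketch of this bit-level computation (formulas (7) and (8)): the binary descriptor is packed into bytes, the two packed descriptors are XOR-ed, and a 256-entry table gives the number of zero bits per byte. NumPy is used here for brevity and the function names are illustrative.

```python
import numpy as np

# 256-entry lookup table: number of zero bits in each possible byte value
ZERO_BITS = np.array([8 - bin(v).count("1") for v in range(256)], dtype=np.int32)

def pack_descriptor(bits):
    """Pack the binary descriptor [a_0, ..., a_{K-1}] into bytes (formula (8), split over bytes)."""
    return np.packbits(np.asarray(bits, dtype=np.uint8))

def point_similarity(packed_a, packed_b, K=512):
    """PM = m / K (formula (7)); a zero bit in the XOR result marks a matching dimension."""
    x = np.bitwise_xor(packed_a, packed_b)
    m = int(ZERO_BITS[x].sum())             # zero-bit count via table lookup, one byte at a time
    m -= 8 * len(packed_a) - K              # drop padding bits (zero when K is a multiple of 8)
    return m / K
```

With K = 512 the descriptor packs exactly into 64 bytes, so the padding correction is zero.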
(6) Integrating the matching results
The point feature matching similarity obtained in step (5) and the line feature matching similarity obtained in step (4) are combined. The point feature matching similarity is required to exceed a threshold t_p and the line feature matching similarity to exceed a threshold t_l. The N corner points with the largest point feature matching similarity form the candidate matching point set; these candidate matching points are then evaluated with the line feature, and the candidate with the highest line feature matching similarity is taken as the final matching corner point.
For corner point pairs whose point feature matching similarity exceeds the threshold t_p, the corresponding position in the real-time image is checked against the edge set around the template image corner point; if no edge has been extracted at that position, edges are extracted in the neighborhood of the corresponding position in the real-time image, and this part of the edge is used in computing the line feature matching similarity.
According to the spatial relation between the final matching corner point and the target point, an inferred target position is obtained and added to the candidate target point set. All final matching corner points are processed in the same way, and the inferred target positions are added to the candidate target point set. Nearest-neighbor clustering is then performed on the candidate target point set, and the cluster with the most members is selected to give the final target position. If the clustering fails, the point with the highest point feature matching similarity in the candidate target point set is taken directly as the final target position.
The nearest-neighbor clustering algorithm first randomly selects K points from the data set as the initial cluster centers, then computes the distance from each sample to the cluster centers and assigns each sample to the class of the nearest cluster center. The mean of the data objects of each newly formed cluster gives the new cluster center; if the cluster centers do not change between two consecutive iterations, the sample adjustment has finished and the clustering criterion function has converged. In each iteration, the assignment of every sample is checked; if it is incorrect, it is adjusted, and after all samples have been adjusted, the cluster centers are updated and the next iteration begins.
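The clustering step can be sketched as follows with a plain K-means-style loop in NumPy; the number of clusters k, the assignment to the nearest center and the convergence test by unchanged centers follow the description above, while the function name, iteration cap and random initialisation details are illustrative assumptions.

```python
import numpy as np

def nn_cluster(points, k, max_iter=100, seed=0):
    """K-means-style nearest-neighbour clustering of the inferred target positions."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    centers = pts[rng.choice(len(pts), size=k, replace=False)]      # random initial cluster centers
    for _ in range(max_iter):
        dists = np.linalg.norm(pts[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)                               # assign each sample to its nearest center
        new_centers = np.array([pts[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)]) # recompute centers as cluster means
        if np.allclose(new_centers, centers):                       # unchanged centers -> converged
            break
        centers = new_centers
    return labels, centers
```

The final target position would then be taken from the largest cluster, falling back to the candidate with the highest point feature matching similarity if the clustering does not yield a usable result.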
Compared with the prior art, the present invention has the following beneficial effects:
(1) The stable point feature is first used to pre-select the corner points, yielding a candidate point set that is guaranteed to contain the correct position; the more global line feature is then used to screen the candidate point set, which reduces repeated patterns and improves the correct-match rate.
(2) Two thresholds are used to screen the corner points, retaining corner points with large local curvature or relatively rich edge features, so the corner extraction process takes both point features and line features into account.
(3) Corner points are extracted automatically and matched by combining point features and line features, which avoids the manual region-selection step of region-based matching algorithms while still achieving good matching results.
Brief description of the drawings
Fig. 1 is the overall flow chart of the image matching method combining point features and line features according to the present invention;
Fig. 2 illustrates the preliminary corner extraction by non-maximum suppression in the present invention;
Fig. 3 shows edge extraction and growing in a corner neighborhood and the data structure used to store the curves in an embodiment of the present invention;
Fig. 4 shows a patch pair used to generate the ORB-like point feature descriptor in an embodiment of the present invention;
Fig. 5 is a schematic diagram of matching the edge sets of corner neighborhoods in an embodiment of the present invention.
Detailed description of the embodiments
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention and are not intended to limit it. In addition, the technical features involved in the embodiments described below may be combined with each other as long as they do not conflict.
Point features use local gray-scale information and are relatively stable, but because of the limited size of the local window, repeated patterns occur easily; line features are more global and have stronger structure-discriminating ability, but edge extraction may be unstable in complex scenes. Therefore the stable point feature is first used to pre-select the corner points, yielding a candidate point set that is guaranteed to contain the correct position; the more global line feature is then used to screen the candidate point set, which reduces repeated patterns and improves the correct-match rate. During the edge-based screening, according to the point-feature matching relation between the real-time image and template image corner points, the edges in the template image can be used as hints to guide edge extraction and matching in the real-time image.
As shown in Fig. 1, the invention provides an image matching method combining point features and line features, with the following concrete steps:
(1) Extracting corner points from the template image and the real-time image at multiple scales
Scale pyramids are built for the template image and the real-time image respectively. When building the scale pyramid, the image size is kept unchanged, and the pyramid image of each layer at a different scale is obtained by changing the size of the Gaussian blur and the size of the window used to compute the Haar response. In each pyramid layer, the local curvature ρ of each pixel is computed with the Hessian matrix and compared with the local curvature ρ of the α points in its scale neighborhood (usually the 26 points of the 3-dimensional scale neighborhood are used, i.e. α is usually taken as 26). If the local curvature ρ of the pixel is the maximum or minimum among these α points, the pixel is retained as an initially selected corner point.
As shown in Fig. 2, the position marked × is compared in magnitude with the 26 points of its 3-dimensional scale neighborhood to perform non-maximum suppression.
The initially selected corner points are screened according to their local curvature ρ using two thresholds (t1 and t2, with t1 < t2) to obtain the finally selected corner points. In the screening, a corner point whose local curvature ρ is less than threshold t1 is discarded directly, and a corner point whose local curvature ρ is greater than threshold t2 is retained directly. For a corner point whose local curvature ρ is greater than t1 but less than t2, if a sufficiently long edge can be extracted in its neighborhood (indicating that the corner neighborhood describes enough structural features), the corner point is retained; otherwise it is discarded.
Threshold t1 can be set relatively small to retain abundant information and is set to 30 here; threshold t2 can be set relatively large to ensure that the extracted corner points are more stable and is set to 400 here (the default Hessian threshold used for corner extraction in OpenCV is 100). A corner neighborhood is considered to describe enough structural strength when it contains more than two edges longer than 10 pixels; such a corner point is retained as long as its local curvature ρ is greater than threshold t1.
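A compact sketch of this multi-scale corner selection is given below (Python with OpenCV and NumPy); `responses` would be the list of response maps computed over the pyramid scales. The patent computes the local curvature ρ with the Hessian matrix using Haar-wavelet windows; this sketch approximates it with the determinant of the Hessian from Gaussian-smoothed second derivatives, applies the thresholds to the response magnitude, and the `has_long_edges` callback standing in for the "more than two edges longer than 10 pixels" test is a hypothetical helper, not something defined by the patent.

```python
import cv2
import numpy as np

def hessian_response(img, sigma):
    """Determinant-of-Hessian response at one scale (image size kept, only the blur changes)."""
    blur = cv2.GaussianBlur(img.astype(np.float32), (0, 0), sigma)
    dxx = cv2.Sobel(blur, cv2.CV_32F, 2, 0)
    dyy = cv2.Sobel(blur, cv2.CV_32F, 0, 2)
    dxy = cv2.Sobel(blur, cv2.CV_32F, 1, 1)
    return dxx * dyy - dxy ** 2                          # used here as the local curvature rho

def select_corners(responses, t1=30, t2=400, has_long_edges=None):
    """Non-maximum suppression over the 26-point scale neighbourhood, then dual-threshold screening."""
    stack = np.stack(responses)                          # shape: (scales, H, W)
    corners = []
    for s in range(1, stack.shape[0] - 1):
        for y in range(1, stack.shape[1] - 1):
            for x in range(1, stack.shape[2] - 1):
                cube = stack[s - 1:s + 2, y - 1:y + 2, x - 1:x + 2]
                rho = stack[s, y, x]
                if rho != cube.max() and rho != cube.min():
                    continue                             # not an extremum among its 26 neighbours
                if abs(rho) < t1:
                    continue                             # weak response: discard
                if abs(rho) >= t2 or (has_long_edges and has_long_edges(x, y, s)):
                    corners.append((x, y, s))            # strong, or weaker but with long edges nearby
    return corners
```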
(2) Obtaining the edge sets around the corner points of the real-time image and the template image
The finally selected corner points obtained in step (1) are traversed, and edges are extracted and grown starting from the β neighborhoods of the real-time image and template image corner points respectively (the β neighborhood is usually taken as a neighborhood of radius 10s, where s is the pixel unit of the scale at which the corner point lies), giving the edge set around each corner point. During the construction of the edge sets, a pixel belonging to an edge is labeled n (n denotes the n-th edge curve found, to which this pixel belongs), and a pixel that has been processed but judged not to belong to an edge is labeled NotEdge. If a corner neighborhood contains a pixel labeled n, the n-th edge curve is directly incorporated into the edge set of that corner point and no further growing starts from that pixel. Pixels that have already been processed are not processed again.
Edges are extracted from the template image and the real-time image with the Canny operator and then grown. For the real-time image, the Canny or Sobel operator, or a faster edge extraction algorithm, may be used, and constraints may be added on the extracted edge length and on the distance between edge pixels and the corner point.
It should be noted that other operators could also be used here, such as the Sobel, Prewitt, Roberts or Laplacian operator, but experimental tests show that, by comparison, the Canny operator extracts the actual edges more accurately.
The extracted edge point coordinates are stored in linked lists, one curve per list, and an array M (indexed by the label n above) is used to index the lists. The edge set around a corner point is represented by a linked list whose values are the indices in array M of the edge curves it contains.
As shown in Fig. 3, edges are extracted and grown in the neighborhood of point P, giving curves l_0 and l_1; linked lists list0 and list1 are then created to store curves l_0 and l_1 respectively, the pixels on curve l_0 are labeled m, and the pixels on curve l_1 are labeled n. The positions with indices m and n in array M then store pointers to the head nodes of linked lists list0 and list1.
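A minimal, self-contained sketch of this labelling-and-indexing scheme is given below, assuming a binary Canny edge map is already available. Python dictionaries and lists stand in for the array M and the linked lists of Fig. 3, a square window approximates the circular radius-10s neighbourhood, and the breadth-first curve tracer is an illustrative stand-in for the patent's edge growing.

```python
import numpy as np
from collections import deque

NOT_EDGE = -1      # label for pixels processed but judged not to belong to an edge

def grow_curve(edge_map, labels, x0, y0, label):
    """Trace the connected edge curve containing (x0, y0) and label its pixels (8-connected BFS)."""
    h, w = edge_map.shape
    curve, queue = [], deque([(x0, y0)])
    labels[y0, x0] = label
    while queue:
        x, y = queue.popleft()
        curve.append((x, y))
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                nx, ny = x + dx, y + dy
                if 0 <= nx < w and 0 <= ny < h and edge_map[ny, nx] and labels[ny, nx] == 0:
                    labels[ny, nx] = label
                    queue.append((nx, ny))
    return curve

def collect_edge_sets(edge_map, corners, radius=10):
    """Grow edge curves from each corner neighbourhood and index them as in Fig. 3."""
    labels = np.zeros(edge_map.shape, dtype=np.int32)    # 0 = unprocessed, n > 0 = n-th curve
    M = {}                                               # curve label n -> list of (x, y) edge points
    corner_edges = {}                                    # corner -> labels of the curves in its edge set
    next_label = 1
    h, w = edge_map.shape
    for (cx, cy, s) in corners:
        r, found = int(radius * s), set()
        for y in range(max(0, cy - r), min(h, cy + r + 1)):
            for x in range(max(0, cx - r), min(w, cx + r + 1)):
                if labels[y, x] > 0:
                    found.add(labels[y, x])              # curve already traced: just reference it
                elif labels[y, x] == 0 and edge_map[y, x]:
                    M[next_label] = grow_curve(edge_map, labels, x, y, next_label)
                    found.add(next_label)
                    next_label += 1
                elif labels[y, x] == 0:
                    labels[y, x] = NOT_EDGE              # processed, but no edge at this pixel
        corner_edges[(cx, cy, s)] = found
    return M, corner_edges
```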
(3) Computing the point feature descriptors of the corner points
The ORB-like point feature descriptors of the finally selected real-time image and template image corner points obtained in step (1) are described with formula (1) or (2). To compute the ORB-like point feature descriptor, a number of patch pairs (i.e. sets of two patches) are first selected in the corner neighborhood; the sum of the gray values of the pixels in each patch is computed, the gray-value sums of each patch pair are compared, and the comparison result is encoded. The codes of all patch-pair comparisons form the ORB-like point feature descriptor of the corner point (512 pairs of patches of 9 × 9 pixels can be selected following the recommendation of the standard ORB algorithm; all patches are taken at the scale of the corner point).
The encoding for each patch pair is given by the following formula:
τ(p; x, y) = 01, if p(x) < p(y) − t  (darker)
             00, if |p(x) − p(y)| ≤ t  (similar)
             10, if p(x) > p(y) + t  (brighter)        (1)
where t is a threshold selected according to the required contrast of the real-time image, and p(x) and p(y) denote the gray-value sums of the pixels in the selected patches. If only the material contrast is considered (for example in infrared images), codes 01 and 10 can be merged into 1, i.e. the formula becomes:
τ(p; x, y) = 1, if |p(x) − p(y)| > t  (different)
             0, if |p(x) − p(y)| ≤ t  (similar)        (2)
Each patch pair i yields a code τ_i, and the codes of the K patch-pair comparisons form the point feature descriptor [τ_0, τ_1, ..., τ_{K−1}] of the corner point.
As shown in Fig. 4, for a patch pair selected in the corner neighborhood, the gray values of the two patches are compared and formula (1) or (2) is used to compute the value of this patch pair in the point feature descriptor. For example, if the required contrast is 0.2, then t is chosen as 0.2 × min(p(x), p(y)); to save computation, t can also be chosen as a fixed value, for instance 405 (i.e. 9 × 9 × 5).
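A sketch of the binary (formula (2)) variant of this descriptor follows, with 512 patch pairs of 9 × 9 pixels and a fixed threshold t = 405. The patch-pair layout is generated randomly here purely for illustration, whereas the patent takes it from the standard ORB recommendation; an integral image is used so each patch sum costs four lookups, and the corner is assumed to lie far enough from the image border.

```python
import numpy as np

def make_pattern(n_pairs=512, radius=15, patch=9, seed=0):
    """Randomly chosen patch-pair offsets inside the corner neighbourhood (illustrative layout)."""
    rng = np.random.default_rng(seed)
    lo, hi = -radius, radius - patch
    return rng.integers(lo, hi + 1, size=(n_pairs, 4))   # (x1, y1, x2, y2) top-left offsets

def patch_sum(integral, x, y, patch=9):
    """Sum of grey values of the patch whose top-left corner is (x, y), via the integral image."""
    return (integral[y + patch, x + patch] - integral[y, x + patch]
            - integral[y + patch, x] + integral[y, x])

def orb_like_descriptor(img, corner, pattern, t=405, patch=9):
    """Binary descriptor of formula (2): bit = 1 if the two patch sums differ by more than t."""
    integral = np.pad(np.cumsum(np.cumsum(img.astype(np.int64), 0), 1), ((1, 0), (1, 0)))
    cx, cy = corner
    bits = []
    for x1, y1, x2, y2 in pattern:
        p1 = patch_sum(integral, cx + x1, cy + y1, patch)
        p2 = patch_sum(integral, cx + x2, cy + y2, patch)
        bits.append(1 if abs(int(p1) - int(p2)) > t else 0)
    return np.array(bits, dtype=np.uint8)
```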
(4) Computing the edge-set matching similarity
Using the least trimmed square Hausdorff distance (LTS-HD), the matching similarity of the edge sets of the real-time image and the template image obtained in step (2) is described according to formula (5). LTS-HD sorts the minimum point-to-set coordinate distances (measured with the Euclidean distance) in ascending order and takes the mean of the first h of them as the one-directional Hausdorff distance.
Concretely, the coordinates of the points in each edge set are converted to coordinates with the corner point as the origin. When computing the matched point pairs from point set A to point set B, the linked list describing the edge points of A is traversed to obtain the coordinates of each edge point, and an edge point is then searched, from near to far, in the r × r neighborhood around the corresponding coordinate position p in the image containing B. If an edge point exists in the r × r neighborhood, the distance between the coordinates of that edge point and position p is taken as the minimum distance; if no edge point is found in the neighborhood, the minimum distance is set to 2r to lower the similarity score. The same method is used when computing the minimum distances of the matched point pairs from B to A. After the minimum distances of the point pairs are obtained, the LTS-HD distance between the two point sets is computed according to formulas (3), (4) and (5) in turn, and the matching similarity of the edge sets (i.e. of the line feature) is computed according to formula (6).
For example, if a corner point of the real-time image has coordinates (140, 420) and an edge point in its surrounding edge set has coordinates (200, 400), the relative coordinates of this edge point with respect to the corner point are (60, −20). When matching against a template image corner point, in the coordinate system with that template corner point as the origin, the array marking the template image edge points is searched from near to far, in the r × r neighborhood around position (60, −20), for an edge point of the template image. If an edge point exists, the resulting distance is used to compute the LTS-HD distance between the two point sets according to formulas (3), (4) and (5); if there is no edge point, the distance 2r is used in the LTS-HD computation instead. Here r is chosen according to the real-time requirement and the tolerance to position error, and is taken as 5.
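The per-point minimum distance fed into the LTS-HD sketch given earlier can be obtained with a simple window search like the following; the boolean `edge_mask` marking template edge points in corner-relative array coordinates is an assumed representation, and the brute-force scan over the r × r window returns the same minimum as the near-to-far search described above.

```python
import numpy as np

def min_edge_distance(rel_pt, edge_mask, origin, r=5):
    """Distance to the nearest template edge point in the r x r window around rel_pt, else 2*r."""
    ox, oy = origin                              # array position of the corner (the relative origin)
    px, py = ox + rel_pt[0], oy + rel_pt[1]      # position corresponding to the real-time edge point
    best = float(2 * r)                          # fallback when no edge point is found in the window
    half = r // 2
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            x, y = px + dx, py + dy
            if 0 <= y < edge_mask.shape[0] and 0 <= x < edge_mask.shape[1] and edge_mask[y, x]:
                best = min(best, float(np.hypot(dx, dy)))
    return best
```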
(5) Computing the point feature matching similarity
The matching similarity of the ORB-like point feature descriptors of the real-time image and template image corner points obtained in step (3) is computed with formula (7). Let the real-time image point feature descriptor be [a_0, a_1, ..., a_{K−1}] and the template image point feature descriptor be [b_0, b_1, ..., b_{K−1}]; whenever a_i = b_i (i = 0, 1, ..., K−1), the i-th dimension is considered to match. The total number m of matching dimensions is counted, and formula (7) describes the matching similarity of the point feature. Here K is chosen as 512 following the recommendation of standard ORB.
For example, suppose the 512-dimensional real-time image point feature descriptor is stored in integer variables as [0x80, 0x40, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00] and the template image point feature descriptor as [0x40, 0x40, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00]; after a bitwise XOR the result is [0xC0, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00]. By repeatedly shifting right and taking the least significant byte, 64 byte values are obtained, and a table lookup (an array of 256 entries) directly gives how many 0 bits each of these values contains (a 0 bit means that the matching result of that dimension is consistent). For example, the table lookup shows that 0x01 contains seven 0 bits and 0x03 contains six 0 bits.
(6) Integrating the matching results
The point feature matching similarity obtained in step (5) and the line feature matching similarity obtained in step (4) are combined. The point feature matching similarity is required to exceed the threshold t_p and the line feature matching similarity to exceed the threshold t_l. The N corner points with the largest point feature matching similarity form the candidate matching point set; these candidate matching points are then evaluated with the line feature, and the candidate with the highest similarity under the line feature description is taken as the final matching corner point. Here t_p is taken as 0.6 (i.e. the number of patch pairs whose comparison results agree must account for more than 60% of the total number of patch pairs). t_l, the threshold on the line feature matching similarity LM obtained from formula (6), is taken as 0.4: assuming the probability that an edge point has a corresponding edge point in its neighborhood is 0.5, the expected LTS-HD distance can be roughly estimated as 6 (on the order of 0.5 × 2r), and formula (6) then gives a line feature matching similarity of 0.4, so 0.4 is used as t_l. N is taken as 5.
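A sketch of this decision rule with t_p = 0.6, t_l = 0.4 and N = 5 is given below; `point_sim` and `line_sim` stand for the point and line similarity computations sketched earlier, and the way a template corner is paired with candidate real-time corners is illustrative rather than prescribed by the patent.

```python
def match_corner(template_corner, rt_corners, point_sim, line_sim,
                 t_p=0.6, t_l=0.4, N=5):
    """Point-feature pre-selection followed by line-feature screening (step (6))."""
    # keep real-time corners whose point feature similarity exceeds t_p
    scored = [(point_sim(template_corner, c), c) for c in rt_corners]
    passed = [(pm, c) for pm, c in scored if pm > t_p]
    if not passed:
        return None
    # candidate set: the N corners with the largest point feature similarity
    candidates = sorted(passed, key=lambda t: t[0], reverse=True)[:N]
    # evaluate the candidates with the line feature and keep the best one above t_l
    lm_best, best = max(((line_sim(template_corner, c), c) for _, c in candidates),
                        key=lambda t: t[0])
    return best if lm_best > t_l else None
```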
For corner point pairs whose point feature matching similarity exceeds the threshold t_p, the corresponding position in the real-time image is checked against the edge set around the template image corner point; if no edge has been extracted at that position, edges are extracted in the neighborhood of the corresponding position in the real-time image, and this part of the edge is used in computing the line feature matching similarity.
As shown in Fig. 5, point P is a corner point and the circle is its neighborhood. Fig. 5(a) shows the edge set contained in the template image neighborhood, and point s is a point on an edge. Fig. 5(b) shows the edges near the real-time image neighborhood; the dashed lines represent edges that failed to be extracted. Because of the broken edges, the edge set contained in the real-time image corner neighborhood only includes the part grown from within that neighborhood. When computing the one-directional Hausdorff distance from the edge point set around the template image corner point to the edge point set around the real-time image corner point, after matched point pairs have been preliminarily determined with the point feature, the complete edges in the template image are used to guide edge extraction in the regions of the corresponding real-time image neighborhood where no edges have been extracted. For example, for point s in Fig. 5(a), an edge point is searched from near to far in the r × r neighborhood of the corresponding position in the real-time image (the box above point s in the figure). If an edge point exists in the r × r neighborhood, the distance between its coordinates and position p is taken as the minimum distance; if no edge point is found in the neighborhood, the minimum distance is set to 2r to lower the similarity score. The one-directional Hausdorff distance is then computed according to formula (3) or (4).
According to the spatial relation between the final matching corner point and the target point, an inferred target position is obtained and added to the candidate target point set. All final matching corner points are processed in the same way, and the inferred target positions are added to the candidate target point set. Nearest-neighbor clustering is then performed on the candidate target point set, and the cluster with the most members is selected to give the final target position. If the clustering fails, the point with the highest point feature matching similarity in the candidate target point set is taken directly as the final target position.
For example, among the real-time image corner points whose matching similarity with template image corner point Q exceeds the threshold t_p, the N corner points with the largest similarity form the point set [p_0, p_1, ..., p_{N−1}]; the line feature is used to match the corner points in this set, and the result p_i with the highest line feature matching similarity is taken as the matching result of template image corner point Q in the real-time image.
The candidate target point set is clustered and the cluster with the most members is selected to give the final target position. If the clustering fails, the point with the highest point feature matching similarity in the candidate target point set is taken directly as the final target position.
Those skilled in the art will readily understand that the above are only preferred embodiments of the present invention and are not intended to limit it; any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (10)

1. An image matching method combining point feature and line feature descriptors, characterized in that the method comprises the following steps:
(1) extracting corner points from the template image and the real-time image at multiple scales: keeping the image size unchanged, the image of each layer at a different scale is obtained by changing the size of the Gaussian blur and the size of the window used to compute the Haar response; in each pyramid layer, the local curvature ρ of each pixel is computed with the Hessian matrix and compared with the local curvature ρ of the α points in its scale neighborhood; if the local curvature ρ of the pixel is the maximum or minimum among these α points, the pixel is retained as an initially selected corner point; the initially selected corner points are screened according to their local curvature ρ using two thresholds to obtain the finally selected corner points;
(2) obtaining the edge sets around the corner points of the real-time image and the template image: traversing the finally selected corner points obtained in step (1), and extracting and growing edges starting from the β neighborhoods of the real-time image and template image corner points respectively, to obtain the edge set around each corner point;
(3) computing the ORB-like point feature descriptors of the finally selected real-time image and template image corner points obtained in step (1): selecting a number of patch pairs, computing the sum of the gray values of the pixels in each patch, comparing the gray-value sums of each patch pair, and encoding the comparison result; the codes of all patch-pair comparisons form the ORB-like point feature descriptor of the corner point;
(4) edge matching computation: describing the matching similarity of the edge sets of the real-time image and the template image obtained in step (2) with the least trimmed square Hausdorff distance;
(5) point feature matching computation: computing the matching similarity of the ORB-like point feature descriptors of the real-time image and template image corner points obtained in step (3);
(6) integrating the matching results: combining the point feature matching similarity obtained in step (5) and the line feature matching similarity obtained in step (4), requiring the point feature matching similarity to exceed a threshold t_p and the line feature matching similarity to exceed a threshold t_l; the N corner points with the largest point feature matching similarity form the candidate matching point set; these candidate matching points are then evaluated with the line feature, and the candidate with the highest line feature matching similarity is taken as the final matching corner point.
2. The method of claim 1, characterized in that screening the initially selected corner points according to their local curvature ρ with two thresholds in step (1) to obtain the finally selected corner points specifically comprises:
the two thresholds are t1 and t2, with t1 < t2; a corner point whose local curvature ρ is less than threshold t1 is discarded directly, a corner point whose local curvature ρ is greater than threshold t2 is retained directly, and a corner point whose local curvature ρ is greater than t1 but less than t2 is retained if a sufficiently long edge can be extracted in its neighborhood, and otherwise discarded.
3. The method of claim 1 or 2, characterized in that, in the process of obtaining the edge sets in step (2), a pixel belonging to an edge is labeled n, where n denotes the n-th edge curve found to which this pixel belongs, and a pixel that has been processed but judged not to belong to an edge is labeled NotEdge; if a corner neighborhood contains a pixel labeled n, the n-th edge curve is directly incorporated into the edge set of that corner point and no further growing starts from that pixel; pixels that have already been processed are not processed again.
4. The method of claim 1 or 2, characterized in that the encoding for each patch pair in step (3) is given by the following formula:
τ(p; x, y) = 01, if p(x) < p(y) − t  (darker)
             00, if |p(x) − p(y)| ≤ t  (similar)
             10, if p(x) > p(y) + t  (brighter)
where t is a threshold selected according to the required contrast of the real-time image, and p(x) and p(y) denote the gray-value sums of the pixels in the selected patches;
for an image in which only the material contrast is considered, codes 01 and 10 are merged into 1, i.e. the formula becomes:
τ(p; x, y) = 1, if |p(x) − p(y)| > t  (different)
             0, if |p(x) − p(y)| ≤ t  (similar)
each patch pair i yields a code τ_i, and the codes of the K patch-pair comparisons form the point feature descriptor [τ_0, τ_1, ..., τ_{K−1}] of the corner point.
5. The method of claim 1 or 2, characterized in that computing the least trimmed square Hausdorff distance in step (4) specifically comprises:
computing, for every point in point set A, the minimum coordinate distance to the points in point set B, and sorting these distances in ascending order to obtain the ordered set X, where point set A is the set of pixels in the edge set around a corner point of the real-time image, point set B is the set of pixels in the edge set around a corner point of the template image, their pixel counts are N_A and N_B respectively, and d_B(a_i)^(i) denotes the minimum distance from point a_i in A to the points in B, ranked i-th among the minimum distances of all points in A;
X = { d_B(a_1)^(1), d_B(a_2)^(2), d_B(a_3)^(3), ..., d_B(a_{N_A})^(N_A) }
similarly computing, for every point in point set B, the minimum coordinate distance to the points in point set A, and sorting these distances in ascending order to obtain the ordered set Y, where d_A(b_i)^(i) denotes the minimum distance from point b_i in B to the points in A, ranked i-th among the minimum distances of all points in B;
Y = { d_A(b_1)^(1), d_A(b_2)^(2), d_A(b_3)^(3), ..., d_A(b_{N_B})^(N_B) }
letting K = f_1 × N_A and L = f_2 × N_B, where f_1 and f_2 are the truncation coefficients corresponding to the pixel counts N_A and N_B and satisfy 0 < f_1 < 1 and 0 < f_2 < 1; the one-directional distances h_LTS(A, B) and h_LTS(B, A) between A and B are defined as:
h_LTS(A, B) = (1 / K) × Σ_{i=1..K} d_B(a)^(i)
h_LTS(B, A) = (1 / L) × Σ_{i=1..L} d_A(b)^(i)
the least trimmed square Hausdorff distance is defined as:
H(A, B) = max[ h_LTS(A, B), h_LTS(B, A) ].
6. The method of claim 1 or 2, characterized in that computing the matching similarity of the point feature descriptors in step (5) specifically comprises:
letting the real-time image point feature descriptor be [a_0, a_1, ..., a_{K−1}] and the template image point feature descriptor be [b_0, b_1, ..., b_{K−1}]; whenever a_i = b_i (i = 0, 1, ..., K−1), the i-th dimension of the descriptors is considered to match;
counting the total number m of matching dimensions; the ratio of the number of matching dimensions m to the total number of dimensions K of the point feature descriptor is used as the point feature matching similarity PM:
PM = m / K.
7. The method of claim 1 or 2, characterized in that step (6) specifically comprises:
for corner point pairs whose point feature matching similarity exceeds the threshold t_p, checking the corresponding position in the real-time image against the edge set around the template image corner point; if no edge has been extracted at that position, extracting edges in the neighborhood of the corresponding position in the real-time image and using this part of the edge in computing the line feature matching similarity;
obtaining an inferred target position according to the spatial relation between the final matching corner point and the target point, and adding the inferred target position to the candidate target point set;
processing all final matching corner points in the same way and adding the inferred target positions to the candidate target point set; performing nearest-neighbor clustering on the candidate target point set and selecting the cluster with the most members to give the final target position; if the clustering fails, directly taking the point with the highest point feature matching similarity in the candidate target point set as the final target position.
8. The method of claim 7, characterized in that the nearest-neighbor clustering algorithm specifically comprises: first randomly selecting K points from the data set as the initial cluster centers, then computing the distance from each sample to the cluster centers and assigning each sample to the class of the nearest cluster center; the mean of the data objects of each newly formed cluster gives the new cluster center; if the cluster centers do not change between two consecutive iterations, the sample adjustment has finished and the clustering criterion function has converged; in each iteration, the assignment of every sample is checked; if it is incorrect, it is adjusted, and after all samples have been adjusted, the cluster centers are updated and the next iteration begins.
9. The method of claim 1 or 2, characterized in that in step (2) the edges are extracted and grown using a Canny, Sobel, Prewitt, Roberts or Laplacian operator.
10. The method of claim 2, characterized in that the value of α in step (1) is 26.
CN201510162935.4A 2015-04-08 2015-04-08 Image matching method combining point features and line features Expired - Fee Related CN104915949B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510162935.4A CN104915949B (en) 2015-04-08 2015-04-08 Image matching method combining point features and line features

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510162935.4A CN104915949B (en) 2015-04-08 2015-04-08 Image matching method combining point features and line features

Publications (2)

Publication Number Publication Date
CN104915949A true CN104915949A (en) 2015-09-16
CN104915949B CN104915949B (en) 2017-09-29

Family

ID=54084987

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510162935.4A Expired - Fee Related CN104915949B (en) 2015-04-08 2015-04-08 Image matching method combining point features and line features

Country Status (1)

Country Link
CN (1) CN104915949B (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105427263A (en) * 2015-12-21 2016-03-23 努比亚技术有限公司 Method and terminal for realizing image registering
CN106844733A (en) * 2017-02-13 2017-06-13 哈尔滨理工大学 Based on the image search method that words tree information fusion is combined with Hausdorff distance
CN106909877A (en) * 2016-12-13 2017-06-30 浙江大学 A kind of vision based on dotted line comprehensive characteristics builds figure and localization method simultaneously
CN107341802A (en) * 2017-07-19 2017-11-10 无锡信捷电气股份有限公司 It is a kind of based on curvature and the compound angular-point sub-pixel localization method of gray scale
CN107437097A (en) * 2017-07-28 2017-12-05 南京航空航天大学 A kind of two benches local configuration matching process based on corner description
CN109582795A (en) * 2018-11-30 2019-04-05 北京奇安信科技有限公司 Data processing method, equipment, system and medium based on Life cycle
CN109766943A (en) * 2019-01-10 2019-05-17 哈尔滨工业大学(深圳) A kind of template matching method and system based on global perception diversity measurement
CN109767442A (en) * 2019-01-15 2019-05-17 上海海事大学 A kind of remote sensing images Aircraft Targets detection method based on invariable rotary feature
CN109871908A (en) * 2019-04-11 2019-06-11 上海电机学院 Paper fractional statistics system and its application method based on smart phone
CN110148133A (en) * 2018-07-03 2019-08-20 北京邮电大学 Circuit board relic image-recognizing method based on characteristic point and its structural relation
CN110334560A (en) * 2019-07-16 2019-10-15 济南浪潮高新科技投资发展有限公司 A kind of two dimensional code localization method and device
WO2020098532A1 (en) * 2018-11-12 2020-05-22 杭州萤石软件有限公司 Method for positioning mobile robot, and mobile robot
CN111311673A (en) * 2018-12-12 2020-06-19 北京京东尚科信息技术有限公司 Positioning method and device and storage medium
CN111474535A (en) * 2020-03-18 2020-07-31 广东省智能机器人研究院 Mobile robot global positioning method based on characteristic thermodynamic diagram
CN112233133A (en) * 2020-10-29 2021-01-15 上海电力大学 Power plant high-temperature pipeline defect detection and segmentation method based on OTSU and region growing method
CN112348837A (en) * 2020-11-10 2021-02-09 中国兵器装备集团自动化研究所 Object edge detection method and system based on point-line detection fusion
CN113111212A (en) * 2021-04-01 2021-07-13 广东拓斯达科技股份有限公司 Image matching method, device, equipment and storage medium
CN113128516A (en) * 2020-01-14 2021-07-16 北京京东乾石科技有限公司 Edge extraction method and device
CN113743423A (en) * 2021-09-08 2021-12-03 浙江云电笔智能科技有限公司 Intelligent temperature monitoring method and system
CN114187267A (en) * 2021-12-13 2022-03-15 沭阳县苏鑫冲压件有限公司 Stamping part defect detection method based on machine vision
CN114414605A (en) * 2021-11-25 2022-04-29 上海精测半导体技术有限公司 Method for acquiring actual pixel size of charged particle beam scanning imaging equipment
CN114581376A (en) * 2022-01-31 2022-06-03 南通摩瑞纺织有限公司 Automatic sorting method and system for textile silkworm cocoons based on image recognition

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100656859B1 (en) * 2005-12-23 2006-12-13 학교법인 포항공과대학교 Simultaneous location and mapping method using supersonic wave sensor and vision sensor
US20100182480A1 (en) * 2009-01-16 2010-07-22 Casio Computer Co., Ltd. Image processing apparatus, image matching method, and computer-readable recording medium
CN103236050A (en) * 2013-05-06 2013-08-07 电子科技大学 Auxiliary bank note and worn coin reestablishing method based on graph clustering
CN103679636A (en) * 2013-12-23 2014-03-26 江苏物联网研究发展中心 Rapid image splicing method based on point and line features

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
EDWARD ROSTEN et al.: "Fusing Points and Lines for High Performance Tracking", Tenth IEEE International Conference on Computer Vision *
ETHAN RUBLEE et al.: "ORB: An Efficient Alternative to SIFT or SURF", IEEE International Conference on Computer Vision *
王万同 et al.: "Multi-source Remote Sensing Image Registration Based on SIFT Point Features and Canny Edge Feature Matching", Computer Science (计算机科学) *
郑刚: "Research on Feature-Based Image Matching Algorithms", China Master's Theses Full-text Database, Information Science and Technology (中国优秀硕士学位论文全文数据库 信息科技辑) *

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105427263A (en) * 2015-12-21 2016-03-23 努比亚技术有限公司 Method and terminal for realizing image registering
CN106909877B (en) * 2016-12-13 2020-04-14 浙江大学 Visual simultaneous mapping and positioning method based on dotted line comprehensive characteristics
CN106909877A (en) * 2016-12-13 2017-06-30 浙江大学 A kind of vision based on dotted line comprehensive characteristics builds figure and localization method simultaneously
CN106844733A (en) * 2017-02-13 2017-06-13 哈尔滨理工大学 Based on the image search method that words tree information fusion is combined with Hausdorff distance
CN106844733B (en) * 2017-02-13 2020-04-03 哈尔滨理工大学 Image retrieval method based on combination of vocabulary tree information fusion and Hausdorff distance
CN107341802A (en) * 2017-07-19 2017-11-10 无锡信捷电气股份有限公司 It is a kind of based on curvature and the compound angular-point sub-pixel localization method of gray scale
CN107341802B (en) * 2017-07-19 2021-02-09 无锡信捷电气股份有限公司 Corner sub-pixel positioning method based on curvature and gray scale compounding
CN107437097A (en) * 2017-07-28 2017-12-05 南京航空航天大学 A kind of two benches local configuration matching process based on corner description
CN107437097B (en) * 2017-07-28 2020-06-09 南京航空航天大学 Two-stage local contour matching method based on angular point description
CN110148133A (en) * 2018-07-03 2019-08-20 北京邮电大学 Circuit board relic image-recognizing method based on characteristic point and its structural relation
WO2020098532A1 (en) * 2018-11-12 2020-05-22 杭州萤石软件有限公司 Method for positioning mobile robot, and mobile robot
CN109582795A (en) * 2018-11-30 2019-04-05 北京奇安信科技有限公司 Data processing method, equipment, system and medium based on Life cycle
CN111311673A (en) * 2018-12-12 2020-06-19 北京京东尚科信息技术有限公司 Positioning method and device and storage medium
CN111311673B (en) * 2018-12-12 2023-11-03 北京京东乾石科技有限公司 Positioning method and device and storage medium
CN109766943B (en) * 2019-01-10 2020-08-21 哈尔滨工业大学(深圳) Template matching method and system based on global perception diversity measurement
CN109766943A (en) * 2019-01-10 2019-05-17 哈尔滨工业大学(深圳) A kind of template matching method and system based on global perception diversity measurement
CN109767442A (en) * 2019-01-15 2019-05-17 上海海事大学 A kind of remote sensing images Aircraft Targets detection method based on invariable rotary feature
CN109767442B (en) * 2019-01-15 2020-09-04 上海海事大学 Remote sensing image airplane target detection method based on rotation invariant features
CN109871908A (en) * 2019-04-11 2019-06-11 上海电机学院 Paper fractional statistics system and its application method based on smart phone
CN110334560A (en) * 2019-07-16 2019-10-15 济南浪潮高新科技投资发展有限公司 A kind of two dimensional code localization method and device
CN110334560B (en) * 2019-07-16 2023-04-07 山东浪潮科学研究院有限公司 Two-dimensional code positioning method and device
CN113128516A (en) * 2020-01-14 2021-07-16 北京京东乾石科技有限公司 Edge extraction method and device
CN113128516B (en) * 2020-01-14 2024-04-05 北京京东乾石科技有限公司 Edge extraction method and device
CN111474535A (en) * 2020-03-18 2020-07-31 广东省智能机器人研究院 Mobile robot global positioning method based on characteristic thermodynamic diagram
CN112233133A (en) * 2020-10-29 2021-01-15 上海电力大学 Power plant high-temperature pipeline defect detection and segmentation method based on OTSU and region growing method
CN112348837B (en) * 2020-11-10 2023-06-09 中国兵器装备集团自动化研究所 Point-line detection fusion object edge detection method and system
CN112348837A (en) * 2020-11-10 2021-02-09 中国兵器装备集团自动化研究所 Object edge detection method and system based on point-line detection fusion
WO2022205611A1 (en) * 2021-04-01 2022-10-06 广东拓斯达科技股份有限公司 Image matching method and apparatus, and device and storage medium
CN113111212A (en) * 2021-04-01 2021-07-13 广东拓斯达科技股份有限公司 Image matching method, device, equipment and storage medium
CN113743423A (en) * 2021-09-08 2021-12-03 浙江云电笔智能科技有限公司 Intelligent temperature monitoring method and system
CN114414605A (en) * 2021-11-25 2022-04-29 上海精测半导体技术有限公司 Method for acquiring actual pixel size of charged particle beam scanning imaging equipment
CN114414605B (en) * 2021-11-25 2023-10-24 上海精测半导体技术有限公司 Method for acquiring actual pixel size of charged particle beam scanning imaging equipment
CN114187267A (en) * 2021-12-13 2022-03-15 沭阳县苏鑫冲压件有限公司 Stamping part defect detection method based on machine vision
CN114187267B (en) * 2021-12-13 2023-07-21 沭阳县苏鑫冲压件有限公司 Stamping part defect detection method based on machine vision
CN114581376A (en) * 2022-01-31 2022-06-03 南通摩瑞纺织有限公司 Automatic sorting method and system for textile silkworm cocoons based on image recognition

Also Published As

Publication number Publication date
CN104915949B (en) 2017-09-29

Similar Documents

Publication Publication Date Title
CN104915949A (en) Image matching algorithm of bonding point characteristic and line characteristic
US10319107B2 (en) Remote determination of quantity stored in containers in geographical region
US11416710B2 (en) Feature representation device, feature representation method, and program
US9633282B2 (en) Cross-trained convolutional neural networks using multimodal images
CN111353512B (en) Obstacle classification method, obstacle classification device, storage medium and computer equipment
Yu et al. VLASE: Vehicle localization by aggregating semantic edges
CN103366181A (en) Method and device for identifying scene integrated by multi-feature vision codebook
JP6866095B2 (en) Learning device, image identification device, learning method, image identification method and program
CN109101981B (en) Loop detection method based on global image stripe code in streetscape scene
Dewan et al. Learning a local feature descriptor for 3d lidar scans
JP6798860B2 (en) Boundary line estimation device
CN104200228A (en) Recognizing method and system for safety belt
US20160110627A1 (en) System and method for describing image outlines
CN108960267A (en) System and method for model adjustment
CN103353941A (en) Natural marker registration method based on viewpoint classification
JP2019185787A (en) Remote determination of containers in geographical region
Zhao et al. Learning probabilistic coordinate fields for robust correspondences
CN112364881B (en) Advanced sampling consistency image matching method
Yu et al. Learning bipartite graph matching for robust visual localization
CN109615695B (en) Automatic conversion method from space photo outside house to roof CAD drawing
Wu et al. A vision-based indoor positioning method with high accuracy and efficiency based on self-optimized-ordered visual vocabulary
KR20160148806A (en) Object Detecter Generation Method Using Direction Information, Object Detection Method and Apparatus using the same
CN115601791A (en) Unsupervised pedestrian re-identification method based on Multiformer and outlier sample re-distribution
Hassan et al. A deep learning framework for automatic airplane detection in remote sensing satellite images
KR102449031B1 (en) Method for indoor localization using deep learning

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170929