CN103473565A - Image matching method and device - Google Patents


Info

Publication number: CN103473565A (granted and published as CN103473565B)
Application number: CN201310374229.7A
Authority: CN (China)
Prior art keywords: point, image, characteristic, characteristic point, subpoint
Original language: Chinese (zh)
Inventors: 李静 (Li Jing), 冯琦 (Feng Qi)
Applicant and current assignee: Huawei Technologies Co Ltd (the listed assignee may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Legal status: Granted; Expired - Fee Related (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses an image matching method and device. The image matching method comprises the steps of: acquiring an initial feature point set matched between a first image and a second image; determining in turn, based on the matching relationship of each feature point pair in the initial feature point set, the projection point in the second image of a reference point preset in the first image, to obtain multiple projection points; determining, among the multiple projection points, at least one first projection point whose distance from a first position, the position in the second image where the projection points are densest, is greater than a preset threshold; removing the feature point pairs corresponding to the first projection points from the initial feature point set; and determining the matching relationship between the first image and the second image according to the remaining feature point pairs in the initial feature point set. The image matching method improves the accuracy and reliability of image matching.

Description

Image matching method and device
Technical field
The present invention relates to the technical field of image processing, and in particular to an image matching method and device.
Background art
Image matching technology is applied in many fields, such as image registration, target recognition, face recognition, and medical image processing. However, no matter which image matching technique is adopted, false matches cannot be avoided entirely, so the matched point set contains a number of falsely matched points.
To determine the matching relationship between two images, the transformation model parameters between them must be computed; when the matched point set contains many false matches, the estimation accuracy of those parameters suffers, which in turn affects the matching relationship determined between the images. Algorithms such as random sample consensus (RANSAC) and the least median of squares method are currently used to estimate the image transformation model parameters and can reduce the influence of false matches on the determined matching relationship. However, when the false matches are numerous, parameter estimation with these algorithms still has low accuracy, which degrades the reliability of the determined matching relationship.
Summary of the invention
In view of this, the present invention provides an image matching method and device; the method improves the matching precision between images and thus the reliability of image matching.
To achieve the above objective, a first aspect of the present invention provides an image matching method, comprising:
obtaining an initial feature point set matched between a first image and a second image, wherein the initial feature point set comprises multiple feature point pairs;
determining in turn, based on the matching relationship of each feature point pair, the projection point in the second image of a reference point preset in the first image, thereby obtaining multiple projection points;
determining, among the multiple projection points, at least one first projection point whose distance from a first position, the position in the second image where the projection points are densest, is greater than a preset threshold;
removing the feature point pairs corresponding to the first projection points from the initial feature point set, and determining the matching relationship between the first image and the second image according to the feature point pairs remaining in the initial feature point set.
With reference to the first aspect, in a first possible implementation, determining in turn, based on the matching relationship of each feature point pair, the projection point of the preset reference point of the first image in the second image comprises:
obtaining the feature point information of the first feature point and the second feature point of each pair, wherein the first feature point is the feature point in the first image, the second feature point is the feature point in the second image, and the feature point information comprises the position, scale, and principal direction of the feature point in the image it belongs to;
determining the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, according to their feature point information and the position of the reference point in the first image.
With reference to the first possible implementation of the first aspect, in a second possible implementation, determining the projection point of the reference point in the second image comprises:
computing, according to the feature point information of the first feature point, the first distance between the first feature point and the reference point, and the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point;
computing the scale factor of the second feature point relative to the first feature point according to the scales of the two feature points;
computing, based on the matching relationship between the first feature point and the second feature point, the position of the projection point of the reference point in the second image according to the position and principal direction of the second feature point, the angle, the first distance, and the scale factor.
With reference to the second possible implementation of the first aspect, in a third possible implementation, computing the position of the projection point of the reference point in the second image comprises:
computing the position P(X_{r,kj}, Y_{r,kj}) of the projection point of the reference point in the second image as follows:
X_{r,kj} = x_{kj} + S_{ri,kj} * d_{r,ri} * cos(w_{kj} + Δφ_{r,ri});
Y_{r,kj} = y_{kj} + S_{ri,kj} * d_{r,ri} * sin(w_{kj} + Δφ_{r,ri});
where x_{kj} and y_{kj} are the abscissa and ordinate of the second feature point in the second image; S_{ri,kj} is the scale factor; d_{r,ri} is the first distance; w_{kj} is the principal direction of the second feature point; and Δφ_{r,ri} is the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point.
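The projection of the reference point can be sketched in Python as follows; the dictionary keys ('x', 'y', 's', 'w') are illustrative assumptions, and d_{r,ri} and Δφ_{r,ri} are derived from the first feature point's information as the second possible implementation describes.

```python
import math

def project_reference(ref, p1, p2):
    """Project a reference point of image 1 into image 2 via one matched pair.

    ref    -- (x, y) of the reference point in the first image
    p1, p2 -- matched first/second feature points as dicts with keys
              'x', 'y' (position), 's' (scale), 'w' (principal direction, rad)
    Returns the projected position (X, Y) in the second image.
    """
    # d_{r,ri}: first distance, from the first feature point to the reference point
    d = math.hypot(ref[0] - p1['x'], ref[1] - p1['y'])
    # Δφ_{r,ri}: angle between the vector (p1 -> ref) and p1's principal direction
    dphi = math.atan2(ref[1] - p1['y'], ref[0] - p1['x']) - p1['w']
    # S_{ri,kj}: scale factor of the second feature point relative to the first
    s = p2['s'] / p1['s']
    # X_{r,kj} = x_kj + S * d * cos(w_kj + Δφ); Y_{r,kj} analogously with sin
    x = p2['x'] + s * d * math.cos(p2['w'] + dphi)
    y = p2['y'] + s * d * math.sin(p2['w'] + dphi)
    return x, y
```

For an identity match the projection reproduces the reference point, and rotating the second feature point's principal direction rotates the projection accordingly.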
With reference to the first aspect or any of the first to third possible implementations of the first aspect, in a fourth possible implementation, determining, among the multiple projection points, the at least one first projection point whose distance from the first position, where the projection points in the second image are densest, is greater than the preset threshold comprises:
counting the number of projection points at each coordinate position in the second image, computing with a kernel density estimation method the density within a specified kernel window width at each coordinate position, and determining the first position as the position where the density is maximal.
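A minimal sketch of this density step, assuming a Gaussian kernel evaluated over a regular grid; the bandwidth and grid step are illustrative values, and the patent does not prescribe a particular kernel.

```python
import math

def densest_position(points, bandwidth=5.0, grid_step=1.0):
    """Return the grid position where the Gaussian kernel density of the
    projection points is maximal (the 'first position')."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    best, best_density = None, -1.0
    gx = min(xs)
    while gx <= max(xs):
        gy = min(ys)
        while gy <= max(ys):
            # Sum of Gaussian kernel contributions from every projection point
            dens = sum(math.exp(-((gx - x) ** 2 + (gy - y) ** 2)
                                / (2.0 * bandwidth ** 2))
                       for x, y in points)
            if dens > best_density:
                best_density, best = dens, (gx, gy)
            gy += grid_step
        gx += grid_step
    return best
```

A tight cluster of projections dominates an isolated outlier, so the returned position lands inside the cluster.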
With reference to the first aspect or any of the first to fourth possible implementations of the first aspect, in a fifth possible implementation, determining the matching relationship between the first image and the second image according to the feature point pairs remaining in the initial feature point set comprises:
performing transformation model parameter estimation on the remaining feature point pairs in the initial feature point set using a random sample consensus algorithm, and determining the matching relationship between the first image and the second image according to the estimated transformation model parameters.
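The RANSAC step can be illustrated with a toy version that estimates a pure translation rather than the full transformation model the patent refers to; for translation, a single matched pair suffices as a minimal sample.

```python
import random

def ransac_translation(pairs, n_iter=200, inlier_tol=2.0, seed=0):
    """Toy RANSAC over point pairs [((x1, y1), (x2, y2)), ...]:
    hypothesise a translation from one random pair, count inliers,
    and keep the hypothesis with the largest consensus set."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.choice(pairs)   # minimal sample: one pair
        dx, dy = x2 - x1, y2 - y1                # hypothesised translation
        inliers = [p for p in pairs
                   if abs(p[0][0] + dx - p[1][0]) <= inlier_tol
                   and abs(p[0][1] + dy - p[1][1]) <= inlier_tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (dx, dy), inliers
    return best_model, best_inliers
```

With four consistent pairs and one outlier, the consensus translation is recovered and the outlier is excluded from the inlier set.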
A second aspect of the present invention provides an image matching device, comprising:
a feature point pair acquiring unit, configured to obtain an initial feature point set matched between a first image and a second image, wherein the initial feature point set comprises multiple feature point pairs;
a projection point determining unit, configured to determine in turn, based on the matching relationship of each feature point pair, the projection point in the second image of a reference point preset in the first image, thereby obtaining multiple projection points;
a projection point screening unit, configured to determine, among the multiple projection points, at least one first projection point whose distance from a first position, the position in the second image where the projection points are densest, is greater than a preset threshold;
a feature point pair removing unit, configured to remove the feature point pairs corresponding to the first projection points from the initial feature point set;
a matching unit, configured to determine the matching relationship between the first image and the second image according to the feature point pairs remaining in the initial feature point set after the pairs corresponding to the first projection points have been removed.
With reference to the second aspect, in a first possible implementation, the projection point determining unit comprises:
an information acquiring unit, configured to obtain the feature point information of the first feature point and the second feature point of each pair, wherein the first feature point is the feature point in the first image, the second feature point is the feature point in the second image, and the feature point information comprises the position, scale, and principal direction of the feature point in the image it belongs to;
a projection point determining subunit, configured to determine the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, according to their feature point information and the position of the reference point in the first image.
With reference to the first possible implementation of the second aspect, in a second possible implementation, the projection point determining subunit comprises:
a first computing unit, configured to compute, according to the feature point information of the first feature point, the first distance between the first feature point and the reference point, and the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point;
a second computing unit, configured to compute the scale factor of the second feature point relative to the first feature point according to the scales of the two feature points;
a projection point computing unit, configured to compute, based on the matching relationship between the first feature point and the second feature point, the position of the projection point of the reference point in the second image according to the position and principal direction of the second feature point, the angle, the first distance, and the scale factor.
With reference to the second aspect or the first or second possible implementation of the second aspect, in a third possible implementation, the projection point screening unit comprises:
a projection point screening subunit, configured to count the number of projection points at each coordinate position in the second image, compute with a kernel density estimation method the density within a specified kernel window width at each coordinate position, and determine the first position as the position where the density is maximal.
With reference to the second aspect or any of the first to third possible implementations of the second aspect, in a fourth possible implementation, the matching unit comprises:
a matching subunit, configured to perform, using a random sample consensus algorithm, transformation model parameter estimation on the feature point pairs remaining in the initial feature point set after the pairs corresponding to the first projection points have been removed, and to determine the matching relationship between the first image and the second image according to the estimated transformation model parameters.
As can be seen from the above technical solution, after obtaining the initial feature point set matched between the first image to be matched and the second image, the present invention determines, based on the matching relationship of each feature point pair in the set, the projection point in the second image of a reference point preset in the first image, and then determines the first projection points whose distance from the densest position of projection points in the second image is greater than a preset threshold. Since the projection determined from a correctly matched pair should fall within a specified range, the pairs corresponding to the first projection points are falsely matched pairs. Determining the matching relationship between the first image and the second image using the pairs that remain in the initial feature point set after the pairs corresponding to the first projection points have been removed improves the precision and reliability of image matching.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required in the embodiments or the description of the prior art are briefly introduced below. Apparently, the accompanying drawings described below show only some embodiments of the present invention, and persons of ordinary skill in the art may further derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of an image matching method according to an embodiment of the present invention;
Fig. 2 is a schematic flowchart of an image matching method according to another embodiment of the present invention;
Fig. 3 is a schematic structural diagram of an image matching device according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of an image matching device according to another embodiment of the present invention;
Fig. 5 is a schematic structural diagram of a computing node according to an embodiment of the present invention.
Embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings in the embodiments. Apparently, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
The embodiments of the present invention provide an image matching method to improve the precision of image matching.
Referring to Fig. 1, which shows a schematic flowchart of one embodiment of the image matching method of the present invention, the method comprises:
Step 101: obtain an initial feature point set matched between a first image and a second image.
The initial feature point set comprises multiple feature point pairs, and each pair comprises two feature points that have a matching relationship, one from the first image and one from the second image. The pairs in the initial feature point set are matched by an existing feature search and matching algorithm, so the set may contain falsely matched pairs.
The first image and the second image are the two images to be matched. Image matching maps two images of different resolutions, different positions (for example after translation or rotation), and different scales onto each other and determines the corresponding matching relationship between them. The matching relationship between two images can be determined through the transformation between them, for example what rotation, translation, and scaling relate the two images. To determine this relationship, one of the two images can serve as the reference image in the matching process and the other as the image to be matched against it. In the embodiments of the present application, the first image is the reference image and the second image is the image to be matched against it.
Step 102: based in turn on the matching relationship of each feature point pair, determine the projection point in the second image of a reference point preset in the first image, obtaining multiple projection points.
The reference point is a point selected in advance in the first image, that is, the reference image, and can be any point in the first image. Optionally, the reference point is the central point of the first image.
For any feature point pair in the initial feature point set, on the premise that its two feature points are genuinely matching points of the first image and the second image, the projection position of the reference point of the first image on the second image is determined. For the multiple pairs, this yields multiple projection points of the reference point in the second image, whose positions may coincide or may be different points.
Step 103: among the multiple projection points, determine at least one first projection point whose distance from a first position, the position in the second image where the projection points are densest, is greater than a preset threshold.
Since the reference point is a fixed point in the first image, and the matching relationship between two given images is likewise fixed, the projections of the reference point in the second image should fall on the same position point whenever the feature point pairs are correctly matched. Hence, the projection positions determined from the correctly matched pairs in the initial feature point set should coincide at one point in the second image, whereas the projection positions determined from falsely matched pairs should be scattered over different positions in the second image.
In practice, the process of determining the projection points carries a certain error, so the projection positions determined from correctly matched pairs do not all fall exactly on the same position point, but they should still be distributed within a certain range. Therefore, the densest position of projection points in the second image can be determined as the first position, and the projection points whose distance from the first position is within the preset threshold are taken as projections determined from correctly matched pairs.
Accordingly, if the distance between a projection point and the first position is greater than the preset threshold, the feature point pair corresponding to that projection point is determined to be a falsely matched pair. Here, if the projection point of the reference point in the second image is determined based on the matching relationship of a feature point pair, that pair is the pair corresponding to the projection point.
For convenience, a projection point whose distance from the first position is greater than the preset threshold is called a first projection point in this application; when the projection determined from the matching relationship of some pair is a first projection point, that pair is a falsely matched pair.
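The classification above amounts to a simple distance filter over the projections; the list-of-pairs representation below is an illustrative choice, not a structure prescribed by the patent.

```python
import math

def remove_false_matches(pairs, projections, first_pos, threshold):
    """Keep only the feature point pairs whose projection point lies within
    `threshold` of the first position; pairs whose projection is a first
    projection point (farther than the threshold) are dropped."""
    kept = []
    for pair, (px, py) in zip(pairs, projections):
        if math.hypot(px - first_pos[0], py - first_pos[1]) <= threshold:
            kept.append(pair)
    return kept
```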
Step 104: remove the feature point pairs corresponding to the first projection points from the initial feature point set, and determine the matching relationship between the first image and the second image according to the remaining feature point pairs in the set.
Because the pairs corresponding to the first projection points are falsely matched pairs, the application removes these false matches from the initial feature point set and analyses the matching relationship between the first image and the second image using the pairs that remain after the removal.
In the embodiments of the present application, once the initial feature point set matched between the first image and the second image is obtained, the pairs in it are not used directly to analyse the matching relationship between the two images. Instead, relying on the principle that, when the matching relationship between two images is fixed, the projections of a fixed point of one image into the other image should fall within a certain range, the projection of the preset reference point of the first image in the second image is determined from the matching relationship of each pair in turn. Among the projection points obtained, the first projection points, whose distance from the densest position in the second image is greater than the preset threshold, are then determined, and the pairs corresponding to them are classified as false matches. Analysing the matching relationship between the first image and the second image with the falsely matched pairs removed from the initial feature point set improves the precision and reliability of image matching.
When determining the matching relationship between the first image and the second image according to the remaining pairs in the initial feature point set, any existing transformation model parameter estimation may be used, for example the least median of squares method or the M-estimators algorithm, after which the matching relationship between the two images is determined. In the course of the parameter estimation, these algorithms further remove false matches that may still exist among the remaining pairs, which further improves the accuracy of the determined matching relationship.
Optionally, considering that the random sample consensus (RANSAC) algorithm has a high accuracy for parameter estimation, the RANSAC algorithm may be used to perform transformation model parameter estimation on the remaining pairs in the initial feature point set and to determine the matching relationship between the first image and the second image according to the estimated transformation model parameters.
Because RANSAC is a non-deterministic algorithm, it estimates the model parameters through iteration and produces a reasonable result only with a certain probability; raising that probability requires more iterations. Therefore, when the initial feature point set contains many false matches, transformation model parameter estimation with RANSAC may need many iterations, the computation becomes large, and real-time requirements cannot be met. With the method of the present invention, the falsely matched pairs corresponding to the first projection points are removed first and the RANSAC algorithm is applied afterwards, which markedly reduces the number of iterations and the computation of the parameter estimation and meets the real-time requirement.
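The effect of removing false matches first can be made concrete with the standard RANSAC iteration bound (textbook analysis, not stated in the patent): to succeed with probability p given inlier fraction w and minimal sample size s, about N = log(1 - p) / log(1 - w^s) iterations are needed, so raising w by discarding false matches cuts N sharply.

```python
import math

def ransac_iterations(inlier_ratio, sample_size, confidence=0.99):
    """Standard RANSAC iteration bound N = log(1 - p) / log(1 - w**s)."""
    return math.ceil(math.log(1.0 - confidence) /
                     math.log(1.0 - inlier_ratio ** sample_size))
```

For a 4-point minimal sample at 99% confidence, half inliers needs 72 iterations while 90% inliers needs only 5, which is the saving the pre-filtering step aims at.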
It should be understood that, in any embodiment of the present invention, the matched initial feature point set is obtained in the same way as an existing matched point set, for example through operations such as feature point extraction and feature point matching, which is not repeated here.
Optionally, multi-scale invariant feature detection may be performed on the first image and the second image respectively, extracting a feature point set to be matched from the first image and a corresponding feature point set to be matched from the second image. For example, the scale-invariant feature transform (SIFT) algorithm, the speeded up robust features (SURF) algorithm, or the random ferns algorithm may be used for the multi-scale invariant feature detection. After the feature point sets to be matched of the first image and the second image are obtained, the matching relationship between the feature points of the two sets can be computed, for example by matching feature point pairs with a feature search and matching algorithm, yielding the initial feature point set comprising multiple feature point pairs.
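One common feature search and matching scheme is nearest-neighbour matching with Lowe's ratio test; the sketch below works on plain descriptor tuples rather than real SIFT output, and the 0.8 ratio is the usual SIFT default, not a value from the patent.

```python
def match_descriptors(desc1, desc2, ratio=0.8):
    """Match each descriptor in desc1 to its nearest neighbour in desc2,
    accepting the match only if the nearest squared distance is clearly
    smaller than the second nearest (Lowe's ratio test)."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    matches = []
    for i, d1 in enumerate(desc1):
        scored = sorted((dist2(d1, d2), j) for j, d2 in enumerate(desc2))
        # ratio test on squared distances, hence ratio ** 2
        if len(scored) >= 2 and scored[0][0] < (ratio ** 2) * scored[1][0]:
            matches.append((i, scored[0][1]))
        elif len(scored) == 1:
            matches.append((i, scored[0][1]))
    return matches
```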
Optionally, in any embodiment of the present invention, in order to determine the projection position of the reference point in the second image according to the matching relationship between the first feature point and the second feature point of each pair, the feature point information of each feature point of every pair may be obtained; this information comprises the position, scale, and principal direction of the feature point in the image it belongs to. For convenience, the feature point of a pair that belongs to the first image is called the first feature point, and the feature point that belongs to the second image is called the second feature point. For example, the feature point information of the first feature point comprises its coordinate position, scale, and principal direction in the first image. The position, scale, and principal direction of a feature point in its image can be determined in the same way as in existing methods. For example, when the principal direction of a feature point is computed with the SIFT algorithm, a region of a specified size near the feature point can be chosen, the gradient histogram of the region computed, and the direction of the histogram peak taken as the principal direction of the feature point.
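The histogram-peak step just described can be sketched as follows; the 36 bins of 10 degrees are the usual SIFT choice, and the flat (unweighted) magnitude accumulation is a simplification of the full SIFT scheme.

```python
import math

def principal_direction(gradients, n_bins=36):
    """Build a gradient orientation histogram over a region around a key
    point (gradients given as (gx, gy) samples) and return the direction
    of the histogram peak, in radians."""
    hist = [0.0] * n_bins
    for gx, gy in gradients:
        mag = math.hypot(gx, gy)                  # gradient magnitude
        ang = math.atan2(gy, gx) % (2 * math.pi)  # orientation in [0, 2π)
        hist[int(ang / (2 * math.pi) * n_bins) % n_bins] += mag
    peak = max(range(n_bins), key=lambda b: hist[b])
    return (peak + 0.5) * 2 * math.pi / n_bins    # centre of the peak bin
```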
This feature point information may be calculated in the course of determining the feature point pairs between the two images, and then retrieved directly when determining the projection point of the reference point in the second image.
Of course, there may be various ways of determining the projection point of the reference point according to the feature point information of a pair and the matching relationship between the first feature point and the second feature point of the pair.
Referring to Fig. 2, which shows a schematic flowchart of another embodiment of an image matching method of the present invention, this embodiment provides a preferred way of determining the projection point. The method of this embodiment comprises:
Step 201: Obtain the initial feature point set matched from the first image and the second image.
The initial feature point set comprises a plurality of feature point pairs.
This step is similar to the corresponding description in the preceding embodiment and is not repeated here.
Step 202: Obtain the feature point information of the first feature point and the second feature point of each feature point pair.
For any feature point pair, the first feature point is the feature point of the pair that belongs to the first image, and the second feature point is the feature point of the pair that belongs to the second image. The feature point information comprises the position, scale and principal direction of the feature point in its image.
Step 203: According to the feature point information of the first feature point, calculate the first distance between the first feature point and the reference point preset in the first image, and the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point.
Once the position of the first feature point and the position of the reference point are determined, the first distance between them can be calculated.
For convenience, let a feature point pair in this embodiment consist of a first feature point P_ri and a second feature point P_kj. If the position of the reference point in the first image is (x_r, y_r) and the position of the first feature point in the first image is (x_ri, y_ri), the first distance d_r,ri is:
d_r,ri = sqrt((x_ri − x_r)² + (y_ri − y_r)²)   (Formula 1)
Similarly, the vector from the first feature point to the reference point can be determined from their positions, giving the direction vector, and hence the angle Δφ_r,ri between this direction vector and the principal direction of the first feature point.
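Step 203 can be sketched as follows (a minimal illustration under the notation above; the function name and the radian convention for angles are assumptions, not part of the patent):

```python
import math

def distance_and_angle(feat_pos, feat_dir, ref_pos):
    """Return the first distance d_r,ri (Formula 1) and the angle between
    the direction vector from the first feature point to the reference
    point and the feature point's principal direction, in radians."""
    dx = ref_pos[0] - feat_pos[0]
    dy = ref_pos[1] - feat_pos[1]
    d = math.hypot(dx, dy)  # Euclidean distance, Formula 1
    # Angle of the direction vector minus the principal direction,
    # normalised into (-pi, pi].
    delta = math.atan2(dy, dx) - feat_dir
    delta = math.atan2(math.sin(delta), math.cos(delta))
    return d, delta
```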
Step 204: According to the scale of the first feature point and the scale of the second feature point, calculate the scaling factor of the second feature point relative to the first feature point.
This scaling factor equals the ratio of the scale of the second feature point to the scale of the first feature point, i.e. the scaling factor is:
S_ri,kj = σ_kj / σ_ri
where σ_ri is the scale of the first feature point and σ_kj is the scale of the second feature point.
Step 205: Based on the matching relationship between the first feature point and the second feature point, and according to the position of the second feature point, the principal direction of the second feature point, the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point, the first distance and the scaling factor, calculate the position of the projection point of the reference point in the second image.
From the parameters calculated in the steps above, the position of the projection point of the reference point in the second image, as implied by the matching relationship of this feature point pair, can be determined.
Specifically, for the matching relationship of this feature point pair, the position P(X_r,kj, Y_r,kj) of the projection point of the reference point in the second image is calculated as follows:
X_r,kj = x_kj + S_ri,kj · d_r,ri · cos(w_kj + Δφ_r,ri)   (Formula 2)
Y_r,kj = y_kj + S_ri,kj · d_r,ri · sin(w_kj + Δφ_r,ri)   (Formula 3)
where x_kj and y_kj are respectively the abscissa and ordinate of the second feature point in the second image; S_ri,kj is the scaling factor; d_r,ri is the first distance; w_kj is the principal direction of the second feature point; and Δφ_r,ri is the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point.
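Formulas 1 to 3 combine into the following sketch of step 205 (an illustrative implementation under the stated notation; angles are assumed to be in radians and the function name is hypothetical):

```python
import math

def project_reference(ref, p1, dir1, scale1, p2, dir2, scale2):
    """Project the reference point into the second image using one matched
    feature pair: measure the reference point's distance and angular offset
    relative to the first feature point, then scale and rotate that offset
    into the second image's frame (Formulas 1-3)."""
    d = math.hypot(ref[0] - p1[0], ref[1] - p1[1])           # Formula 1
    dphi = math.atan2(ref[1] - p1[1], ref[0] - p1[0]) - dir1  # angle offset
    s = scale2 / scale1                                       # scaling factor
    x = p2[0] + s * d * math.cos(dir2 + dphi)                 # Formula 2
    y = p2[1] + s * d * math.sin(dir2 + dphi)                 # Formula 3
    return x, y
```

For a correctly matched pair this projection lands near the same point regardless of which pair is used, which is why the densest cluster of projections identifies the consistent matches.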
Step 206: From the plurality of projection points, determine at least one first projection point whose distance from the first position where the projection points are densest in the second image is greater than a preset threshold.
Step 207: Remove the feature point pairs corresponding to the first projection points from the initial feature point set, and determine the matching relationship between the first image and the second image according to the feature point pairs remaining in the initial feature point set.
The operations of steps 206 and 207 may refer to the corresponding description of the preceding embodiment and are not repeated here.
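Steps 206 and 207 might be sketched as follows; the rounding-based estimate of the densest position and the threshold value are illustrative assumptions, not the kernel density estimation the patent describes later:

```python
import math
from collections import Counter

def filter_pairs(pairs, projections, threshold):
    """Keep only the feature point pairs whose projection point lies within
    `threshold` of the densest projection position, here approximated
    crudely as the most frequent rounded coordinate."""
    rounded = Counter((round(x), round(y)) for x, y in projections)
    first_pos = rounded.most_common(1)[0][0]  # densest position estimate
    kept = [
        pair for pair, (x, y) in zip(pairs, projections)
        if math.dist((x, y), first_pos) <= threshold
    ]
    return kept, first_pos
```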
In this embodiment of the application, since the feature point information (the position, scale and principal direction of the two feature points of each pair) can be calculated in the course of matching the feature point pairs, no separate computation of feature point information is required, which avoids increasing the amount of calculation. Moreover, with this feature point information, determining the projection points of the reference point requires little computation, so the projection point positions can be determined quickly; mismatched feature points can then be removed according to the projection point positions, improving the speed of removing mismatched points.
Optionally, in any of the above embodiments, after the positions of the plurality of projection points of the reference point in the second image are obtained, the first position where the projection points are densest in the second image may be determined as follows: count the number of projection points at each coordinate position in the second image, calculate, by kernel density estimation, the density within a specified kernel window width at each coordinate position, and determine the first position where the density is maximal. For example, an accumulator may be used to count the number of projection points at each coordinate position in the second image: whenever the position of the projection point of the reference point in the second image is determined from the matching relationship of the two feature points of one pair, the accumulator increments the value at the position of the current projection point by one, until the plurality of projection points has been obtained and the value at each position, i.e. the number of projection points at each position, is determined.
The idea of the kernel density estimation method is to take the ratio of the number of samples within the kernel window to the total number of samples as the density, thereby obtaining the kernel window position where the density is maximal.
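The accumulator-plus-kernel-window idea can be sketched as follows (a square uniform kernel and integer accumulator bins are illustrative assumptions; the patent does not fix a particular kernel):

```python
from collections import Counter

def densest_position(points, window=3):
    """Count projection points per integer coordinate (the accumulator),
    score each occupied position by the fraction of all samples falling
    inside a square kernel window centred on it, and return the position
    with maximal density."""
    acc = Counter((round(x), round(y)) for x, y in points)
    total = sum(acc.values())
    half = window // 2

    def density(cx, cy):
        inside = sum(
            n for (x, y), n in acc.items()
            if abs(x - cx) <= half and abs(y - cy) <= half
        )
        return inside / total  # samples in window / total samples

    return max(acc, key=lambda p: density(*p))
```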
Corresponding to the method of the present invention, the present invention also provides an image matching apparatus. Referring to Fig. 3, which shows a schematic structural diagram of an embodiment of an image matching apparatus of the present invention, the apparatus of this embodiment may comprise: a feature point pair acquiring unit 301, a projection point determining unit 302, a projection point screening unit 303, a feature point pair removing unit 304 and a matching unit 305.
The feature point pair acquiring unit 301 is configured to obtain the initial feature point set matched from the first image and the second image, wherein the initial feature point set comprises a plurality of feature point pairs.
The projection point determining unit 302 is configured to determine, successively based on the matching relationship of each feature point pair, the projection point in the second image of the reference point preset in the first image, to obtain a plurality of projection points.
The projection point screening unit 303 is configured to determine, from the plurality of projection points, at least one first projection point whose distance from the first position where the projection points are densest in the second image is greater than a preset threshold.
The feature point pair removing unit 304 is configured to remove the feature point pairs corresponding to the first projection points from the initial feature point set.
The matching unit 305 is configured to determine the matching relationship between the first image and the second image according to the feature point pairs remaining in the initial feature point set after the feature point pairs corresponding to the first projection points have been removed.
Optionally, the feature point pair acquiring unit is specifically configured to perform multi-scale invariant feature detection on the first image and the second image to obtain the feature point sets to be matched in the first image and the second image respectively, and to calculate the matching relationship between the feature points in the two feature point sets to be matched, obtaining the matched initial feature point set.
Optionally, the matching unit 305 may comprise: a matching subunit, configured to use the random sample consensus (RANSAC) algorithm to perform transformation model parameter estimation on the feature point pairs remaining in the initial feature point set after the feature point pairs corresponding to the first projection points have been removed, and to determine the matching relationship between the first image and the second image according to the estimated transformation model parameters.
Referring to Fig. 4, which shows a schematic structural diagram of another embodiment of an image matching apparatus of the present invention, the difference from the preceding apparatus embodiment is that:
In this embodiment, the projection point determining unit 302 may comprise:
an information acquiring unit 3021, configured to obtain the feature point information of the first feature point and the second feature point of each feature point pair, wherein the first feature point is the feature point of the pair in the first image, the second feature point is the feature point of the pair in the second image, and the feature point information comprises the position, scale and principal direction of the feature point in its image;
a projection point determining subunit 3022, configured to determine the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the feature point information of the first feature point and the second feature point and the position of the reference point in the first image.
Further, the projection point determining subunit may comprise:
a first calculating unit, configured to calculate, according to the feature point information of the first feature point, the first distance between the first feature point and the reference point, and the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point;
a second calculating unit, configured to calculate, according to the scale of the first feature point and the scale of the second feature point, the scaling factor of the second feature point relative to the first feature point;
a projection point calculating unit, configured to calculate the position of the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the position of the second feature point, the principal direction of the second feature point, the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point, the first distance and the scaling factor.
Specifically, the projection point calculating unit calculates the position P(X_r,kj, Y_r,kj) of the projection point of the reference point in the second image as follows:
X_r,kj = x_kj + S_ri,kj · d_r,ri · cos(w_kj + Δφ_r,ri);
Y_r,kj = y_kj + S_ri,kj · d_r,ri · sin(w_kj + Δφ_r,ri);
where x_kj and y_kj are respectively the abscissa and ordinate of the second feature point in the second image; S_ri,kj is the scaling factor; d_r,ri is the first distance; w_kj is the principal direction of the second feature point; and Δφ_r,ri is the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point.
Optionally, in any of the above embodiments, the projection point screening unit may comprise:
a projection point screening subunit, configured to count the number of projection points at each coordinate position in the second image, calculate, by kernel density estimation, the density within a specified kernel window width at each coordinate position in the second image, and determine the first position where the density is maximal.
In another aspect, the present invention also provides a computing node. Referring to Fig. 5, which shows a schematic structural diagram of an embodiment of a computing node of the present invention, the computing node may be a host server with computing capability, a personal computer (PC), a portable computer, a terminal, or the like. The computing node 500 of this embodiment may comprise: a memory 501, a processor 502, a communication interface 503 and a communication bus 504.
The memory 501, the processor 502 and the communication interface 503 communicate with one another through the communication bus 504.
The communication interface 503 is used for communicating with network elements, such as nodes in a shared storage system or other network terminals.
The memory 501 is used for storing the initial feature point set matched from the first image and the second image, and for storing information including the program run by the processor 502, wherein the initial feature point set comprises a plurality of feature point pairs. The memory may comprise a high-speed RAM memory and may also comprise a non-volatile memory.
The processor 502 is configured to determine, successively based on the matching relationship of each feature point pair, the projection point in the second image of the reference point preset in the first image, to obtain a plurality of projection points; determine, from the plurality of projection points, at least one first projection point whose distance from the first position where the projection points are densest in the second image is greater than a preset threshold; and remove the feature point pairs corresponding to the first projection points from the initial feature point set, and determine the matching relationship between the first image and the second image according to the feature point pairs remaining in the initial feature point set.
The processor may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the identical or similar parts the embodiments may refer to one another. Since the apparatus disclosed in an embodiment corresponds to the method disclosed in an embodiment, its description is relatively brief, and the relevant parts may refer to the description of the method.
Those skilled in the art will further appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the composition and steps of each example have been described above generally in terms of function. Whether these functions are performed in hardware or software depends on the particular application and the design constraints of the technical solution. Skilled persons may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of the present invention.
The steps of the method or algorithm described in connection with the embodiments disclosed herein may be implemented directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), internal memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the technical field.
The above description of the disclosed embodiments enables those skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (12)

1. An image matching method, characterized by comprising:
obtaining an initial feature point set matched from a first image and a second image, wherein the initial feature point set comprises a plurality of feature point pairs;
determining, successively based on the matching relationship of each feature point pair, the projection point in the second image of a reference point preset in the first image, to obtain a plurality of projection points;
determining, from the plurality of projection points, at least one first projection point whose distance from a first position where the projection points are densest in the second image is greater than a preset threshold;
removing the feature point pair corresponding to the first projection point from the initial feature point set, and determining the matching relationship between the first image and the second image according to the feature point pairs remaining in the initial feature point set.
2. The method according to claim 1, characterized in that the determining, successively based on the matching relationship of each feature point pair, the projection point in the second image of the reference point preset in the first image comprises:
obtaining feature point information of the first feature point and the second feature point of each feature point pair, wherein the first feature point is the feature point of the pair in the first image and the second feature point is the feature point of the pair in the second image; the feature point information comprises the position, scale and principal direction of the feature point in its image;
determining the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the feature point information of the first feature point and the second feature point and the position of the reference point in the first image.
3. The method according to claim 2, characterized in that the determining the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the feature point information of the first feature point and the second feature point and the position of the reference point in the first image comprises:
calculating, according to the feature point information of the first feature point, a first distance between the first feature point and the reference point, and the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point;
calculating, according to the scale of the first feature point and the scale of the second feature point, a scaling factor of the second feature point relative to the first feature point;
calculating the position of the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the position of the second feature point, the principal direction of the second feature point, the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point, the first distance and the scaling factor.
4. The method according to claim 3, characterized in that the calculating the position of the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the position of the second feature point, the principal direction of the second feature point, the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point, the first distance and the scaling factor comprises:
calculating the position P(X_r,kj, Y_r,kj) of the projection point of the reference point in the second image as follows:
X_r,kj = x_kj + S_ri,kj · d_r,ri · cos(w_kj + Δφ_r,ri);
Y_r,kj = y_kj + S_ri,kj · d_r,ri · sin(w_kj + Δφ_r,ri);
wherein x_kj and y_kj are respectively the abscissa and ordinate of the second feature point in the second image; S_ri,kj is the scaling factor; d_r,ri is the first distance; w_kj is the principal direction of the second feature point; and Δφ_r,ri is the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point.
5. The method according to any one of claims 1 to 4, characterized in that the determining, from the plurality of projection points, at least one first projection point whose distance from the first position where the projection points are densest in the second image is greater than the preset threshold comprises:
counting the number of projection points at each coordinate position in the second image, calculating, by kernel density estimation, the density within a specified kernel window width at each coordinate position in the second image, and determining the first position where the density is maximal.
6. The method according to any one of claims 1 to 5, characterized in that the determining the matching relationship between the first image and the second image according to the feature point pairs remaining in the initial feature point set comprises:
using the random sample consensus (RANSAC) algorithm to perform transformation model parameter estimation on the feature point pairs remaining in the initial feature point set, and determining the matching relationship between the first image and the second image according to the estimated transformation model parameters.
7. The method according to any one of claims 1 to 6, characterized in that the obtaining the initial feature point set matched from the first image and the second image comprises:
performing multi-scale invariant feature detection on the first image and the second image to obtain the feature point sets to be matched in the first image and the second image respectively;
calculating the matching relationship between the feature points in the feature point sets to be matched of the first image and the second image respectively, to obtain the matched initial feature point set.
8. An image matching apparatus, characterized by comprising:
a feature point pair acquiring unit, configured to obtain an initial feature point set matched from a first image and a second image, wherein the initial feature point set comprises a plurality of feature point pairs;
a projection point determining unit, configured to determine, successively based on the matching relationship of each feature point pair, the projection point in the second image of a reference point preset in the first image, to obtain a plurality of projection points;
a projection point screening unit, configured to determine, from the plurality of projection points, at least one first projection point whose distance from a first position where the projection points are densest in the second image is greater than a preset threshold;
a feature point pair removing unit, configured to remove the feature point pair corresponding to the first projection point from the initial feature point set;
a matching unit, configured to determine the matching relationship between the first image and the second image according to the feature point pairs remaining in the initial feature point set after the feature point pair corresponding to the first projection point has been removed.
9. The apparatus according to claim 8, characterized in that the projection point determining unit comprises:
an information acquiring unit, configured to obtain feature point information of the first feature point and the second feature point of each feature point pair, wherein the first feature point is the feature point of the pair in the first image and the second feature point is the feature point of the pair in the second image; the feature point information comprises the position, scale and principal direction of the feature point in its image;
a projection point determining subunit, configured to determine the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the feature point information of the first feature point and the second feature point and the position of the reference point in the first image.
10. The apparatus according to claim 8, characterized in that the projection point determining subunit comprises:
a first calculating unit, configured to calculate, according to the feature point information of the first feature point, the first distance between the first feature point and the reference point, and the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point;
a second calculating unit, configured to calculate, according to the scale of the first feature point and the scale of the second feature point, the scaling factor of the second feature point relative to the first feature point;
a projection point calculating unit, configured to calculate the position of the projection point of the reference point in the second image based on the matching relationship between the first feature point and the second feature point, and according to the position of the second feature point, the principal direction of the second feature point, the angle between the direction vector from the first feature point to the reference point and the principal direction of the first feature point, the first distance and the scaling factor.
11. The apparatus according to any one of claims 8 to 10, characterized in that the projection point screening unit comprises:
a projection point screening subunit, configured to count the number of projection points at each coordinate position in the second image, calculate, by kernel density estimation, the density within a specified kernel window width at each coordinate position in the second image, and determine the first position where the density is maximal.
12. The apparatus according to any one of claims 8 to 11, characterized in that the matching unit comprises:
a matching subunit, configured to use the random sample consensus (RANSAC) algorithm to perform transformation model parameter estimation on the feature point pairs remaining in the initial feature point set after the feature point pair corresponding to the first projection point has been removed, and to determine the matching relationship between the first image and the second image according to the estimated transformation model parameters.
CN201310374229.7A 2013-08-23 2013-08-23 Image matching method and device Expired - Fee Related CN103473565B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310374229.7A CN103473565B (en) 2013-08-23 2013-08-23 Image matching method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310374229.7A CN103473565B (en) 2013-08-23 2013-08-23 Image matching method and device

Publications (2)

Publication Number Publication Date
CN103473565A true CN103473565A (en) 2013-12-25
CN103473565B CN103473565B (en) 2017-04-26

Family

ID=49798409

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310374229.7A Expired - Fee Related CN103473565B (en) 2013-08-23 2013-08-23 Image matching method and device

Country Status (1)

Country Link
CN (1) CN103473565B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007334795A (en) * 2006-06-19 2007-12-27 Sony Corp Information processor and information processing method, and program
CN101556692A (en) * 2008-04-09 2009-10-14 西安盛泽电子有限公司 Image mosaic method based on neighborhood Zernike pseudo-matrix of characteristic points
CN101984463A (en) * 2010-11-02 2011-03-09 中兴通讯股份有限公司 Method and device for synthesizing panoramic image
CN102005047A (en) * 2010-11-15 2011-04-06 无锡中星微电子有限公司 Image registration system and method thereof
US20110134221A1 (en) * 2009-12-07 2011-06-09 Samsung Electronics Co., Ltd. Object recognition system using left and right images and method
CN102256111A (en) * 2011-07-17 2011-11-23 西安电子科技大学 Multi-channel panoramic video real-time monitoring system and method
JP2011259044A (en) * 2010-06-04 2011-12-22 Panasonic Corp Image processing device and image processing method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG BO ET AL.: "An Improved Harris Feature Point Matching Algorithm", Computer Systems & Applications *

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107148632A (en) * 2014-04-24 2017-09-08 河谷控股Ip有限责任公司 Robust feature for the target identification based on image is recognized
CN107148632B (en) * 2014-04-24 2020-10-27 河谷控股Ip有限责任公司 Robust feature recognition for image-based object recognition
CN105554441B (en) * 2014-10-23 2019-11-19 韩华泰科株式会社 For being registrated the device and method of image
CN105095344B (en) * 2015-05-29 2018-08-17 南昌正业科技有限公司 A kind of modeling method and device of space-time loop data structure
CN105095344A (en) * 2015-05-29 2015-11-25 广东正业科技股份有限公司 Method and device for modeling time and space ring data structure
CN105205494A (en) * 2015-08-31 2015-12-30 小米科技有限责任公司 Similar picture identification method and device
CN105205494B (en) * 2015-08-31 2018-12-11 小米科技有限责任公司 Similar pictures recognition methods and device
CN106878691A (en) * 2015-12-14 2017-06-20 现代自动车株式会社 Image projection device and its control method
US10838601B2 (en) 2016-06-08 2020-11-17 Huawei Technologies Co., Ltd. Processing method and terminal
WO2017210908A1 (en) * 2016-06-08 2017-12-14 华为技术有限公司 Processing method and terminal
AU2016409676B2 (en) * 2016-06-08 2020-01-30 Huawei Technologies Co., Ltd. Processing method and terminal
CN106779055B (en) * 2017-01-10 2019-06-21 北京邮电大学 Image characteristic extracting method and device
CN106779055A (en) * 2017-01-10 2017-05-31 北京邮电大学 Image characteristic extracting method and device
CN108182457A (en) * 2018-01-30 2018-06-19 百度在线网络技术(北京)有限公司 For generating the method and apparatus of information
CN108182457B (en) * 2018-01-30 2022-01-28 百度在线网络技术(北京)有限公司 Method and apparatus for generating information
CN109186625A (en) * 2018-10-24 2019-01-11 北京奥特贝睿科技有限公司 Intelligent vehicle carries out pinpoint method and system using mixing sampling filter
CN109186625B (en) * 2018-10-24 2020-05-05 北京奥特贝睿科技有限公司 Method and system for accurately positioning intelligent vehicle by using hybrid sampling filtering
WO2020259365A1 (en) * 2019-06-27 2020-12-30 Oppo广东移动通信有限公司 Image processing method and device, and computer-readable storage medium
CN111010496A (en) * 2019-12-24 2020-04-14 维沃移动通信(杭州)有限公司 Image processing method and electronic equipment
CN111083456A (en) * 2019-12-24 2020-04-28 成都极米科技股份有限公司 Projection correction method, projection correction device, projector and readable storage medium
CN111010496B (en) * 2019-12-24 2022-07-08 维沃移动通信(杭州)有限公司 Image processing method and electronic equipment
CN111083456B (en) * 2019-12-24 2023-06-16 成都极米科技股份有限公司 Projection correction method, apparatus, projector, and readable storage medium

Also Published As

Publication number Publication date
CN103473565B (en) 2017-04-26

Similar Documents

Publication Publication Date Title
CN103473565A (en) Image matching method and device
Tardif Non-iterative approach for fast and accurate vanishing point detection
Novatnack et al. Scale-dependent/invariant local 3D shape descriptors for fully automatic registration of multiple sets of range images
Qin et al. Query adaptive similarity for large scale object retrieval
US9092697B2 (en) Image recognition system and method for identifying similarities in different images
CN104331899B (en) A kind of SAR image registration method and device
CN107368820B (en) Refined gesture recognition method, device and equipment
CN104392241A (en) Mixed regression-based head pose estimation method
TW201710989A (en) System and method for determining whether a product image includes a logo pattern
CN106340010B (en) A kind of angular-point detection method based on second order profile difference
CN103559476A (en) Fingerprint matching method and device thereof
CN106462773B (en) Use the pattern recognition system and method for GABOR function
CN103345760A (en) Method for automatically generating mark points of object shape template of medical image
CN113095187A (en) Examination paper correction method based on image feature matching alignment
CN106155540A (en) Electronic brush pen form of a stroke or a combination of strokes treating method and apparatus
CN103823887A (en) Based on low-order overall situation geometry consistency check error match detection method
CN105447869A (en) Particle swarm optimization algorithm based camera self-calibration method and apparatus
CN111161348B (en) Object pose estimation method, device and equipment based on monocular camera
Cui et al. Global propagation of affine invariant features for robust matching
CN112288813B (en) Pose estimation method based on multi-view vision measurement and laser point cloud map matching
CN114332809A (en) Image identification method and device, electronic equipment and storage medium
Xu et al. A statistical method for peak localization in Hough space by analysing butterflies
CN109978043B (en) Target detection method and device
Geng et al. SANet: A novel segmented attention mechanism and multi-level information fusion network for 6D object pose estimation
Wu et al. A hierarchical motion trajectory signature descriptor

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170426

Termination date: 20180823

CF01 Termination of patent right due to non-payment of annual fee