CN110738222A - Image matching method and device, computer equipment and storage medium - Google Patents


Info

Publication number
CN110738222A
CN110738222A (application CN201810790418.5A; granted publication CN110738222B)
Authority
CN
China
Prior art keywords
image, matched, matching, feature points, obtaining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810790418.5A
Other languages
Chinese (zh)
Other versions
CN110738222B (en)
Inventor
余坚毅
张天桥
魏恺言
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SHENZHEN ZHAORI TECHNOLOGY Co Ltd
Original Assignee
SHENZHEN ZHAORI TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SHENZHEN ZHAORI TECHNOLOGY Co Ltd
Priority to CN201810790418.5A
Publication of CN110738222A
Application granted
Publication of CN110738222B
Legal status: Active (current)
Anticipated expiration: not listed

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/22: Matching criteria, e.g. proximity measures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/40: Extraction of image or video features
    • G06V10/46: Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462: Salient features, e.g. scale invariant feature transforms [SIFT]

Abstract

The application relates to an image matching method. The method obtains SIFT feature points of an image to be matched and SIFT feature points of a reference image corresponding to the image to be matched; obtains a best matching point pair from the two sets of SIFT feature points; obtains the position, in the reference image, of the matching feature point in the best matching point pair; derives from that position a comparison region for the image to be matched within the reference image; and matches the image to be matched against the comparison region to obtain a matching result.

Description

Image matching method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of image recognition technology, and in particular to an image matching method and apparatus, a computer device, and a storage medium.
Background
With the rapid development of science and technology, image sensors are widely used in fields such as news media, national defense, and healthcare, and can dynamically, rapidly, and accurately acquire observation images of various targets.
As image volumes grow rapidly, so do the demands placed on image-data processing. In image matching, for example, traditional matching methods suffer from low matching accuracy.
Disclosure of Invention
In view of the above, it is necessary to provide an image matching method, apparatus, computer device, and storage medium capable of improving matching accuracy.
An image matching method, the method comprising:
acquiring SIFT feature points of an image to be matched and SIFT feature points of a reference image corresponding to the image to be matched;
obtaining a best matching point pair according to the SIFT feature points of the image to be matched and the SIFT feature points of the reference image, wherein the best matching point pair comprises a matching feature point of the image to be matched in the reference image;
obtaining the position, in the reference image, of the matching feature point in the best matching point pair, and obtaining a comparison region of the image to be matched in the reference image according to that position;
and matching the image to be matched according to the comparison region to obtain a matching result of the image to be matched.
In one embodiment, matching the image to be matched according to the comparison region to obtain the matching result of the image to be matched includes:
obtaining matching point pairs according to the SIFT feature points of the image to be matched and the SIFT feature points of the comparison region;
and, when the number of matching point pairs is greater than a preset number of point pairs, obtaining a result that the image to be matched matches the image corresponding to the comparison region.
In one embodiment, obtaining matching point pairs according to the SIFT feature points of the image to be matched and the SIFT feature points of the comparison region includes:
calculating Euclidean distances between the SIFT feature points of the image to be matched and the SIFT feature points of the comparison region;
obtaining the minimum Euclidean distance and the next-closest (second-smallest) Euclidean distance among those distances;
and, when the ratio of the minimum Euclidean distance to the next-closest Euclidean distance is smaller than a preset ratio, obtaining a matching point pair, the matching point pair comprising a SIFT feature point of the image to be matched and its matching point in the comparison region.
In one embodiment, after the matching point pairs are obtained according to the SIFT feature points of the image to be matched and the SIFT feature points of the comparison region, the method further includes:
obtaining the angle of each matching point pair;
removing matching point pairs whose angles fall outside a preset angle interval, to obtain updated matching point pairs;
in this case, obtaining the result that the image to be matched matches the comparison region when the number of matching point pairs is greater than the preset number includes:
obtaining a result that the image to be matched matches the image corresponding to the comparison region when the number of updated matching point pairs is greater than the preset number of point pairs.
In one embodiment, obtaining the best matching point pair according to the SIFT feature points of the image to be matched and the SIFT feature points of the reference image includes:
calculating Euclidean distances between the SIFT feature points of the image to be matched and the SIFT feature points of the reference image;
and obtaining the minimum Euclidean distance among them, and taking the two feature points corresponding to the minimum Euclidean distance as the best matching point pair.
In one embodiment, obtaining the comparison region of the image to be matched in the reference image according to the position of the matching feature point in the reference image includes:
obtaining the comparison region according to the position of the matching feature point in the reference image, the position in the image to be matched of the SIFT feature point corresponding to the matching feature point, and the size of the image to be matched.
In one embodiment, acquiring the SIFT feature points of the image to be matched and the SIFT feature points of the corresponding reference image includes:
acquiring the image to be matched and the reference image corresponding to it;
and performing Gaussian blur processing on the image to be matched and the reference image respectively to obtain their SIFT feature points, wherein the data type used in the Gaussian blur processing is integer.
An image matching apparatus, the apparatus comprising:
a feature point acquisition module, configured to acquire SIFT feature points of an image to be matched and SIFT feature points of a reference image corresponding to the image to be matched;
a best matching point pair obtaining module, configured to obtain a best matching point pair according to the SIFT feature points of the image to be matched and the SIFT feature points of the reference image, the best matching point pair comprising a matching feature point of the image to be matched in the reference image;
a comparison region determining module, configured to obtain the position, in the reference image, of the matching feature point in the best matching point pair, and to obtain a comparison region of the image to be matched in the reference image according to that position;
and a matching module, configured to match the image to be matched according to the comparison region to obtain a matching result of the image to be matched.
A computer device comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, performs the steps of:
acquiring SIFT feature points of an image to be matched and SIFT feature points of a reference image corresponding to the image to be matched;
obtaining a best matching point pair according to the SIFT feature points of the image to be matched and the SIFT feature points of the reference image, wherein the best matching point pair comprises a matching feature point of the image to be matched in the reference image;
obtaining the position, in the reference image, of the matching feature point in the best matching point pair, and obtaining a comparison region of the image to be matched in the reference image according to that position;
and matching the image to be matched according to the comparison region to obtain a matching result of the image to be matched.
A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of:
acquiring SIFT feature points of an image to be matched and SIFT feature points of a reference image corresponding to the image to be matched;
obtaining a best matching point pair according to the SIFT feature points of the image to be matched and the SIFT feature points of the reference image, wherein the best matching point pair comprises a matching feature point of the image to be matched in the reference image;
obtaining the position, in the reference image, of the matching feature point in the best matching point pair, and obtaining a comparison region of the image to be matched in the reference image according to that position;
and matching the image to be matched according to the comparison region to obtain a matching result of the image to be matched.
With the image matching method, apparatus, computer device, and storage medium described above, a best matching point pair is obtained from the SIFT feature points of the image to be matched and those of the corresponding reference image, the best matching point pair comprising a matching feature point of the image to be matched in the reference image. The position of that matching feature point in the reference image is obtained, a comparison region of the image to be matched in the reference image is derived from that position, and the image to be matched is matched against the comparison region to obtain a matching result. By taking the best matching point pair as a basis and searching only the comparison region of the reference image, the matching range is narrowed, the probability of mismatching is reduced, and matching accuracy is improved.
Drawings
FIG. 1 is a diagram of an application environment of an image matching method in one embodiment;
FIG. 2 is a flowchart of an image matching method in one embodiment;
FIG. 3 is a flowchart of the step of obtaining the matching result of the image to be matched according to the comparison region, in one embodiment;
FIG. 4 is a flowchart of the step of obtaining matching point pairs according to the SIFT feature points of the image to be matched and those of the comparison region, in one embodiment;
FIG. 5 is a flowchart of an image matching method in another embodiment;
FIG. 6 is a flowchart of the step of obtaining the best matching point pair according to the SIFT feature points of the image to be matched and those of the reference image, in one embodiment;
FIG. 7 is a flowchart of an image matching method in a further embodiment;
FIG. 8 is a block diagram of an image matching apparatus in one embodiment;
FIG. 9 is a diagram of the internal structure of a computer device in one embodiment.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the application is described in further detail below with reference to the drawings and embodiments.
The image matching method provided by the application is described here as applied to a paper-pattern anti-counterfeiting system, which identifies the authenticity of an input paper pattern; the method can be applied in the environment shown in FIG. 1. A paper-pattern collecting terminal 102 communicates with a server 104 through a network. The terminal 102 obtains an input paper-pattern image and sends it to the server 104. The server 104 obtains SIFT feature points of the input paper-pattern image and of its corresponding reference image, and obtains a best matching point pair from them, the pair comprising a feature point of the input image and its matching feature point in the reference image. From the position of the matching feature point in the reference image, the server obtains a comparison region of the input image in the reference image, then matches the input image against the comparison region to obtain a matching result. The terminal 102 may be, but is not limited to, a personal computer, notebook computer, smartphone, tablet computer, or portable counterfeit-verification device; the server 104 may be implemented as an independent server or as a cluster of servers.
In one embodiment, as shown in FIG. 2, an image matching method is provided. Taking its application to the server in FIG. 1 as an example, the method includes the following steps:
step 202, acquiring SIFT feature points of an image to be matched and SIFT feature points of a reference image corresponding to the image to be matched.
The image to be matched is an image input by a user for which a matching result is required. The reference image is a standard image with a known result, stored in advance in a database, that corresponds to the image to be matched. For example, in a paper-pattern anti-counterfeiting system, the image to be matched corresponds to a bill of unknown authenticity, and the reference image corresponds to genuine bill data.
SIFT feature points are extracted as follows. The image is sampled at different sampling intervals to obtain a series of images of different sizes, which form a pyramid-shaped hierarchy, largest at the bottom and smallest at the top. The images on each level of the pyramid form a group, and each level is blurred with Gaussian kernels of different parameters so that it contains several Gaussian-blurred images. Each pixel is then compared with the other pixels in the same layer and with the corresponding pixels in the adjacent layers above and below, and local grayscale maxima and minima are extracted as candidate feature points. Finally, low-contrast candidates and candidates located on edges are filtered out.
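As a rough illustration of the candidate-point step above, the following NumPy sketch blurs an image at several scales, takes differences of adjacent blurred images (a difference-of-Gaussians stack), and keeps pixels that are extrema of their 3x3x3 neighborhood with sufficient contrast. The sigma values and contrast threshold are illustrative assumptions, not values from the patent; a single octave is used and the edge-point filtering step is omitted for brevity.

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian blur with reflect padding (pure NumPy)."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-(x ** 2) / (2.0 * sigma ** 2))
    kernel /= kernel.sum()
    pad = np.pad(img.astype(float), radius, mode="reflect")
    # horizontal pass, then vertical pass
    rows = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="valid"), 1, pad)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="valid"), 0, rows)

def dog_candidates(img, sigmas=(1.0, 1.6, 2.6, 4.1), contrast=2.0):
    """Candidate keypoints: 3x3x3 extrema of the DoG stack above a contrast floor."""
    blurred = [gaussian_blur(img, s) for s in sigmas]
    dogs = np.stack([b - a for a, b in zip(blurred, blurred[1:])])
    cands = []
    n_layers, h, w = dogs.shape
    for k in range(1, n_layers - 1):
        for i in range(1, h - 1):
            for j in range(1, w - 1):
                cube = dogs[k - 1:k + 2, i - 1:i + 2, j - 1:j + 2]
                v = dogs[k, i, j]
                # low-contrast filtering plus extremum test over the 27-pixel cube
                if abs(v) > contrast and (v >= cube.max() or v <= cube.min()):
                    cands.append((i, j))
    return cands
```

A blob roughly matching one of the intermediate scales produces a candidate at its center; real SIFT additionally downsamples per octave and rejects edge responses via the Hessian.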
And 204, obtaining an optimal matching point pair according to the SIFT feature points of the image to be matched and the SIFT feature points of the reference image, wherein the optimal matching point pair comprises matching feature points of the image to be matched in the reference image.
The best matching point pair is the pair consisting of a SIFT feature point of the image to be matched and a SIFT feature point of the reference image with the highest similarity. The similarity between each SIFT feature point of the image to be matched and each SIFT feature point of the reference image is computed, and the most similar pair is taken as the best matching point pair. Specifically, the Euclidean distance between each pair of SIFT feature points may be used, where the Euclidean distance is the true distance between two points in m-dimensional space, or the natural length of a vector. The Euclidean distance reflects feature similarity: the smaller the distance, the greater the similarity, and the larger the distance, the smaller the similarity. The SIFT feature point of the image to be matched and the SIFT feature point of the reference image with the minimum Euclidean distance are taken as the best matching point pair.
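As a sketch (not the patent's implementation), the minimum-distance search over two descriptor sets can be vectorized with NumPy, assuming each descriptor is a row of a 2-D float array:

```python
import numpy as np

def best_matching_pair(desc_a, desc_b):
    """Return indices (i, j) of the most similar descriptor pair
    (minimum Euclidean distance), plus that distance."""
    # pairwise squared distances via |a - b|^2 = |a|^2 + |b|^2 - 2 a.b
    d2 = (np.sum(desc_a ** 2, axis=1)[:, None]
          + np.sum(desc_b ** 2, axis=1)[None, :]
          - 2.0 * desc_a @ desc_b.T)
    d2 = np.maximum(d2, 0.0)  # guard against tiny negative rounding errors
    i, j = np.unravel_index(np.argmin(d2), d2.shape)
    return int(i), int(j), float(np.sqrt(d2[i, j]))
```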
And step 206, obtaining the position of the matching feature point in the best matching point pair in the reference image, and obtaining the comparison area of the image to be matched in the reference image according to the position of the matching feature point in the reference image.
The matching feature point is the feature point of the best matching point pair that lies in the reference image. Its position in the reference image is obtained and combined with the position of the corresponding feature point in the image to be matched, so that a region a preset multiple of the size of the image to be matched can be cropped from the reference image as the comparison region.
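A minimal sketch of the cropping step. The `scale` factor is an assumption (the patent says only that the region is a multiple of the query size); the crop is placed so the matched reference point aligns with the corresponding point of the query image, with the margin split evenly, and is clamped to the reference bounds:

```python
import numpy as np

def comparison_region(reference, ref_pt, query_pt, query_shape, scale=2.0):
    """Crop a (scale * query-size) comparison region from `reference`,
    centred so ref_pt aligns with query_pt's position in the query image."""
    rh, rw = reference.shape[:2]
    qh, qw = query_shape
    h, w = int(qh * scale), int(qw * scale)
    # top-left corner: matched point keeps its query-image offset, plus half the margin
    top = ref_pt[0] - query_pt[0] - (h - qh) // 2
    left = ref_pt[1] - query_pt[1] - (w - qw) // 2
    top = max(0, min(top, rh - h))    # clamp inside the reference image
    left = max(0, min(left, rw - w))
    return reference[top:top + h, left:left + w], (top, left)
```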
And 208, matching the images to be matched according to the comparison area to obtain a matching result of the images to be matched.
The image to be matched is compared with the image corresponding to the comparison region. If the two match, a result indicating a match is obtained; if they do not match, a result indicating no match is obtained.
In this image matching method, the SIFT feature points of the image to be matched and of its corresponding reference image are obtained; a best matching point pair, comprising a matching feature point of the image to be matched in the reference image, is derived from them; the position of that matching feature point in the reference image is obtained; a comparison region is derived from that position; and the matching result is obtained by matching against the comparison region. Because the comparison region found from the best matching point pair restricts matching to part of the reference image, the matching range is narrowed, the probability of mismatching is reduced, and matching accuracy is improved.
In one embodiment, as shown in FIG. 3, matching the image to be matched according to the comparison region to obtain its matching result includes: step 302, obtaining matching point pairs according to the SIFT feature points of the image to be matched and the SIFT feature points of the comparison region; and step 304, obtaining a result that the image to be matched matches the comparison region when the number of matching point pairs is greater than a preset number of point pairs.
For example, during ticket verification, the image to be matched is a paper-pattern region image acquired in real time by the ticket checker, and the corresponding region must be found by SIFT matching within the verification paper-pattern region stored when the ticket was issued. Because the stored verification region is large, SIFT matching inevitably produces mismatched points that cannot all be eliminated; if the area to be searched can be reduced, however, the probability of mismatching drops.
In one embodiment, as shown in FIG. 4, obtaining matching point pairs according to the SIFT feature points of the image to be matched and those of the comparison region includes: step 402, calculating the Euclidean distances between the SIFT feature points of the image to be matched and the SIFT feature points of the comparison region; step 404, obtaining the minimum and the next-closest Euclidean distances among them; and step 406, obtaining a matching point pair when the ratio of the minimum to the next-closest Euclidean distance is smaller than a preset ratio, the matching point pair comprising a SIFT feature point of the image to be matched and its matching point in the comparison region.
The Euclidean distance is the true distance between two points in m-dimensional space, or the natural length of a vector; in two-dimensional and three-dimensional space it is the actual distance between the two points. For points A(x1, y1) and B(x2, y2) in two-dimensional space:
d(A, B) = sqrt((x1 - x2)^2 + (y1 - y2)^2)
For points A(x1, y1, z1) and B(x2, y2, z2) in three-dimensional space:
d(A, B) = sqrt((x1 - x2)^2 + (y1 - y2)^2 + (z1 - z2)^2)
In general, the distance between two points in m-dimensional space is the square root of the sum of their squared coordinate differences.
The similarity of two feature points is compared via the Euclidean distance: the ratio of the minimum Euclidean distance to the next-closest Euclidean distance is computed and compared with the preset ratio to decide whether the points form a matching point pair.
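The ratio screening above can be sketched as follows. The default ratio of 0.8 is an assumption for illustration; the patent only requires the ratio to be below a preset value:

```python
import numpy as np

def ratio_test_matches(desc_q, desc_r, ratio=0.8):
    """Keep a query point only if its nearest reference descriptor is
    clearly closer than the second nearest (ratio test)."""
    matches = []
    for qi, q in enumerate(desc_q):
        d = np.sqrt(np.sum((desc_r - q) ** 2, axis=1))  # distances to all references
        order = np.argsort(d)
        nearest, second = order[0], order[1]
        if d[nearest] < ratio * d[second]:
            matches.append((qi, int(nearest)))
    return matches
```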
In one embodiment, as shown in FIG. 5, after the matching point pairs are obtained according to the SIFT feature points of the image to be matched and those of the comparison region, the method further includes obtaining the angle of each matching point pair, and, at step 508, removing the matching point pairs whose angles are not within the preset angle interval to obtain updated matching point pairs. Obtaining the result that the image to be matched matches the comparison region when the number of matching point pairs exceeds the preset number then becomes: step 510, obtaining that result when the number of updated matching point pairs is greater than the preset number of point pairs.
In the paper-pattern anti-counterfeiting system, the paper-pattern image captured by the anti-counterfeiting instrument shows no obvious scaling, and rotation can be corrected using the bill edges, so mismatched points can be eliminated from the overall trend of the matching points. For example, when more than 80% of the matching point pairs that survive the screening by the ratio of the minimum to the next-closest Euclidean distance are correct, the angle of each pair's connecting line can be obtained and pairs whose angle is not within the preset range eliminated. In particular, the connecting lines of correct matches are roughly parallel to the dominant matching direction, so pairs whose descriptors are close but whose connecting line forms a large angle with the dominant direction are eliminated.
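A sketch of this angle-consistency filter, using the median connecting-line angle as the dominant direction and a 10-degree window; both choices are illustrative assumptions, as the patent does not specify them:

```python
import math

def filter_by_angle(pairs, max_dev_deg=10.0):
    """pairs: list of ((qx, qy), (rx, ry)) matched point pairs.
    Drop pairs whose connecting-line angle deviates from the dominant
    (median) angle by more than max_dev_deg degrees."""
    if not pairs:
        return []
    angles = [math.degrees(math.atan2(r[1] - q[1], r[0] - q[0]))
              for q, r in pairs]
    main = sorted(angles)[len(angles) // 2]  # median angle as dominant direction
    # wrap the angular difference into [-180, 180) before comparing
    return [p for p, a in zip(pairs, angles)
            if abs((a - main + 180.0) % 360.0 - 180.0) <= max_dev_deg]
```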
In one embodiment, as shown in FIG. 6, obtaining the best matching point pair according to the SIFT feature points of the image to be matched and those of the reference image includes: step 602, calculating the Euclidean distances between the SIFT feature points of the image to be matched and the SIFT feature points of the reference image; and step 604, obtaining the minimum Euclidean distance among them and taking the two feature points corresponding to it as the best matching point pair.
The Euclidean distances between all feature points of the image to be matched and all feature points of the reference image are computed, and the two feature points with the minimum distance are taken as the best matching point pair. Based on the position in the reference image of the matching feature point in this pair and the position of the corresponding SIFT feature point in the image to be matched, a comparison region whose size is determined by the size of the image to be matched is cropped from the reference image. Matching points are then sought within this comparison region, which shrinks the search area and reduces the risk of mismatching.
In one embodiment, as shown in FIG. 7, acquiring the SIFT feature points of the image to be matched and of the corresponding reference image includes: step 702, acquiring the image to be matched and its corresponding reference image; and step 704, performing Gaussian blur processing on the image to be matched and the reference image respectively to obtain their SIFT feature points, wherein the data type used in the Gaussian blur processing is integer.
The SIFT algorithm searches for key points in different scale spaces, and the scale space is built using Gaussian blur; the Gaussian convolution kernel is the only transformation kernel that realizes scale transformation. Gaussian blur is a type of image filter: a blur template is computed from the Gaussian function and convolved with the original image to blur it.
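The patent states that the Gaussian blur uses integer data. One common way to realize that is fixed-point arithmetic: scale the kernel weights to integers, convolve in integer arithmetic, then shift the result back. The sketch below is one such scheme (8-bit fixed point, separable passes); it is an assumption, not necessarily the patent's exact implementation:

```python
import numpy as np

def integer_gaussian_blur(img, sigma, shift=8):
    """Fixed-point separable Gaussian blur: weights scaled by 2**shift,
    integer convolution, rounded shift back."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1, dtype=np.float64)
    k = np.exp(-(x ** 2) / (2.0 * sigma ** 2))
    ik = np.round(k / k.sum() * (1 << shift)).astype(np.int64)
    ik[radius] += (1 << shift) - ik.sum()  # force weights to sum to exactly 2**shift
    pad = np.pad(img.astype(np.int64), radius, mode="edge")
    rows = np.apply_along_axis(lambda r: np.convolve(r, ik, mode="valid"), 1, pad)
    out = np.apply_along_axis(lambda c: np.convolve(c, ik, mode="valid"), 0, rows)
    # two passes scale by 2**(2*shift); shift back with rounding
    return (out + (1 << (2 * shift - 1))) >> (2 * shift)
```

Because the weights are forced to sum to a power of two, flat regions pass through unchanged, and the shift back replaces a floating-point division.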
It should be understood that although the steps in the flowcharts of FIGS. 2-7 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the order of execution is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in FIGS. 2-7 may comprise multiple sub-steps or stages that are not necessarily performed at the same moment but may be performed at different times; their order of execution is likewise not necessarily sequential, and they may be performed in turn or alternately with other steps or with sub-steps or stages of other steps.
In one embodiment, for a paper pattern collected by the paper-pattern anti-counterfeiting system, it is necessary to search the verification paper-pattern region for a corresponding region. Because the verification paper-pattern region is large, SIFT matching inevitably produces mismatched points that cannot all be eliminated; if the area of the verification region can be reduced, the probability of mismatching drops.
In one embodiment, as shown in FIG. 8, an image matching apparatus is provided, including a feature point acquisition module 802, a best matching point pair obtaining module 804, a comparison region determining module 806, and a matching module 808. The feature point acquisition module is configured to acquire SIFT feature points of an image to be matched and SIFT feature points of a reference image corresponding to the image to be matched.
In one embodiment, the matching module includes a matching point pair obtaining unit configured to obtain matching point pairs according to the SIFT feature points of the image to be matched and those of the comparison region, and an image matching unit configured to obtain a result that the image to be matched matches the comparison region when the number of matching point pairs is greater than a preset number of point pairs.
In one embodiment, the matching point pair obtaining unit includes a Euclidean distance calculating unit configured to calculate the Euclidean distances between each SIFT feature point of the image to be matched and each SIFT feature point of the comparison region, a distance screening unit configured to obtain the minimum and next-closest Euclidean distances among them, and a ratio screening unit configured to obtain a matching point pair when the ratio of the minimum to the next-closest Euclidean distance is smaller than a preset ratio, the matching point pair comprising a SIFT feature point of the image to be matched and its matching point in the comparison region.
In one embodiment, the matching point pair obtaining unit further includes an angle obtaining unit configured to obtain the angle of each matching point pair, and an updating unit configured to remove matching point pairs whose angles are not within the preset angle interval to obtain updated matching point pairs; the image matching unit is then configured to obtain a result that the image to be matched matches the comparison region when the number of updated matching point pairs is greater than the preset number of point pairs.
In one embodiment, the best matching point pair obtaining module includes a feature point distance calculating unit configured to calculate the Euclidean distances between the SIFT feature points of the image to be matched and those of the reference image, and a minimum distance obtaining unit configured to obtain the minimum Euclidean distance among them and take the two feature points corresponding to it as the best matching point pair.
In one embodiment, the comparison region determining module is configured to obtain the comparison region of the image to be matched in the reference image according to the position of the matching feature point in the reference image, the position of the SIFT feature point corresponding to the matching feature point in the image to be matched, and the size of the image to be matched.
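One plausible reading of this computation: place a window the size of the image to be matched in the reference image so that the matched feature point and its counterpart coincide, then clamp the window to the reference bounds. A hedged sketch; all names, the clamping policy, and the coordinate conventions are assumptions, not the patent's specification:

```python
def comparison_region(ref_pos, img_pos, img_size, ref_size):
    """Return (x0, y0, x1, y1): a window of img_size placed in the
    reference image so that ref_pos aligns with img_pos, clamped so the
    window stays inside the reference image."""
    rx, ry = ref_pos      # matching feature point in the reference image
    ix, iy = img_pos      # corresponding SIFT point in the image to be matched
    w, h = img_size       # size of the image to be matched
    rw, rh = ref_size     # size of the reference image
    x0 = min(max(rx - ix, 0), max(rw - w, 0))  # top-left x, clamped
    y0 = min(max(ry - iy, 0), max(rh - h, 0))  # top-left y, clamped
    return x0, y0, x0 + w, y0 + h
```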
In one embodiment, the feature point extraction module includes an image acquisition unit configured to acquire an image to be matched and its corresponding reference image, and a data processing unit configured to perform Gaussian blur processing on the image to be matched and the reference image respectively to obtain their SIFT feature points, where the data type used in the Gaussian blur processing is integer data.
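The requirement that the Gaussian blur use integer data can be met with a fixed-point separable kernel. A sketch, assuming grayscale input; the 1-4-6-4-1 binomial kernel is a standard integer approximation of a Gaussian and is an assumption here, not a kernel specified by the patent:

```python
import numpy as np

def gaussian_blur_int(img, kernel=(1, 4, 6, 4, 1)):
    """Separable Gaussian-like blur using only integer arithmetic: the
    binomial kernel approximates a Gaussian, and dividing each pass by
    the kernel sum keeps every intermediate value an integer."""
    k = np.array(kernel, dtype=np.int64)
    s = int(k.sum())                      # 16 for the 1-4-6-4-1 kernel
    img = img.astype(np.int64)
    pad = len(kernel) // 2
    # Horizontal pass; borders are handled by edge replication.
    padded = np.pad(img, pad, mode='edge')
    h = np.zeros_like(img)
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            h[r, c] = int(np.dot(padded[r + pad, c:c + len(kernel)], k)) // s
    # Vertical pass over the horizontally blurred result.
    padded = np.pad(h, pad, mode='edge')
    out = np.zeros_like(img)
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            out[r, c] = int(np.dot(padded[r:r + len(kernel), c + pad], k)) // s
    return out
```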
For the specific definition of the image matching device, reference may be made to the definition of the image matching method above; details are not repeated here. The modules in the image matching device may be implemented wholly or partially in software, hardware, or a combination of the two. Each module may be embedded in hardware form in, or be independent of, a processor in the computer device, or be stored in software form in a memory of the computer device, so that the processor can invoke and execute the operations corresponding to each module.
In one embodiment, a computer device is provided, which may be a server whose internal structure may be as shown in FIG. 9. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus, where the processor provides computing and control capabilities.
Those skilled in the art will appreciate that the architecture shown in FIG. 9 is merely a block diagram of some of the structures associated with the disclosed aspects and does not limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than shown, combine certain components, or arrange components differently.
In one embodiment, a computer device is provided, including a memory and a processor, the memory storing a computer program. When executing the computer program, the processor implements the following steps: obtaining SIFT feature points of an image to be matched and SIFT feature points of a reference image corresponding to the image to be matched; obtaining a best matching point pair from the SIFT feature points of the image to be matched and the SIFT feature points of the reference image, the best matching point pair including a matching feature point of the image to be matched in the reference image; obtaining the position of the matching feature point in the reference image; obtaining a comparison region of the image to be matched in the reference image according to that position; and matching the image to be matched against the comparison region to obtain a matching result of the image to be matched.
In one embodiment, when executing the computer program, the processor further implements the following steps: obtaining SIFT feature points of the image to be matched and of the reference image; obtaining a best matching point pair from them; obtaining the comparison region of the image to be matched in the reference image according to the position of the matching feature point of the best matching point pair in the reference image; obtaining matching point pairs from the SIFT feature points of the image to be matched and the SIFT feature points of the comparison region; and, when the number of matching point pairs is greater than a preset number of point pairs, obtaining an image matching result between the image to be matched and the comparison region.
In one embodiment, when executing the computer program, the processor further implements the following steps: obtaining SIFT feature points of the image to be matched and of the reference image; obtaining a best matching point pair from them; obtaining the position of the matching feature point of the best matching point pair in the reference image; obtaining the comparison region of the image to be matched in the reference image according to that position; calculating the Euclidean distances between the SIFT feature points of the image to be matched and the SIFT feature points of the comparison region; obtaining the minimum Euclidean distance and the next-closest Euclidean distance among them; accepting a matching point pair when the ratio of the minimum Euclidean distance to the next-closest Euclidean distance is smaller than a preset ratio; and, when the number of matching point pairs is greater than a preset number of point pairs, obtaining an image matching result between the image to be matched and the comparison region.
In one embodiment, when executing the computer program, the processor further implements the following steps: obtaining SIFT feature points of the image to be matched and of the reference image; obtaining a best matching point pair from them; obtaining the comparison region of the image to be matched in the reference image according to the position of the matching feature point of the best matching point pair in the reference image; obtaining matching point pairs from the SIFT feature points of the image to be matched and the SIFT feature points of the comparison region; obtaining the angle of each matching point pair; eliminating the matching point pairs whose angles are not within a preset angle interval to obtain updated matching point pairs; and, when the number of updated matching point pairs is greater than a preset number of point pairs, obtaining an image matching result between the image to be matched and the comparison region.
In one embodiment, when executing the computer program, the processor further implements the following steps: obtaining SIFT feature points of the image to be matched and of the reference image; calculating the Euclidean distances between the SIFT feature points of the image to be matched and the SIFT feature points of the reference image; obtaining the minimum Euclidean distance among them and taking the two feature points corresponding to it as the best matching point pair; obtaining the position of the matching feature point of the best matching point pair in the reference image; obtaining the comparison region of the image to be matched in the reference image according to that position; and matching the image to be matched against the comparison region to obtain a matching result of the image to be matched.
In one embodiment, when executing the computer program, the processor further implements the following steps: obtaining SIFT feature points of the image to be matched and of the reference image; obtaining a best matching point pair from them; obtaining the comparison region of the image to be matched in the reference image according to the position of the matching feature point in the reference image, the position of the SIFT feature point corresponding to the matching feature point in the image to be matched, and the size of the image to be matched; and matching the image to be matched against the comparison region to obtain a matching result of the image to be matched.
In one embodiment, when executing the computer program, the processor further implements the following steps: acquiring an image to be matched and a reference image corresponding to it; performing Gaussian blur processing on the image to be matched and the reference image respectively to obtain their SIFT feature points, where the data type used in the Gaussian blur processing is integer data; obtaining a best matching point pair from the two sets of SIFT feature points, the best matching point pair including a matching feature point of the image to be matched in the reference image; obtaining the position of that matching feature point in the reference image; obtaining the comparison region of the image to be matched in the reference image according to that position; and matching the image to be matched against the comparison region to obtain a matching result of the image to be matched.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored. When executed by a processor, the computer program implements the following steps: obtaining SIFT feature points of an image to be matched and SIFT feature points of a reference image corresponding to the image to be matched; obtaining a best matching point pair from the SIFT feature points of the image to be matched and the SIFT feature points of the reference image, the best matching point pair including a matching feature point of the image to be matched in the reference image; obtaining the position of the matching feature point in the reference image; obtaining a comparison region of the image to be matched in the reference image according to that position; and matching the image to be matched against the comparison region to obtain a matching result of the image to be matched.
In one embodiment, when executed by a processor, the computer program further implements the following steps: obtaining SIFT feature points of the image to be matched and of the reference image; obtaining a best matching point pair from them; obtaining the comparison region of the image to be matched in the reference image according to the position of the matching feature point of the best matching point pair in the reference image; obtaining matching point pairs from the SIFT feature points of the image to be matched and the SIFT feature points of the comparison region; and, when the number of matching point pairs is greater than a preset number of point pairs, obtaining an image matching result between the image to be matched and the comparison region.
In one embodiment, when executed by a processor, the computer program further implements the following steps: obtaining SIFT feature points of the image to be matched and of the reference image; obtaining a best matching point pair from them; obtaining the position of the matching feature point of the best matching point pair in the reference image; obtaining the comparison region of the image to be matched in the reference image according to that position; calculating the Euclidean distances between the SIFT feature points of the image to be matched and the SIFT feature points of the comparison region; obtaining the minimum Euclidean distance and the next-closest Euclidean distance among them; accepting a matching point pair when the ratio of the minimum Euclidean distance to the next-closest Euclidean distance is smaller than a preset ratio; and, when the number of matching point pairs is greater than a preset number of point pairs, obtaining an image matching result between the image to be matched and the comparison region.
In one embodiment, when executed by a processor, the computer program further implements the following steps: obtaining SIFT feature points of the image to be matched and of the reference image; obtaining a best matching point pair from them; obtaining the comparison region of the image to be matched in the reference image according to the position of the matching feature point of the best matching point pair in the reference image; obtaining matching point pairs from the SIFT feature points of the image to be matched and the SIFT feature points of the comparison region; obtaining the angle of each matching point pair; eliminating the matching point pairs whose angles are not within a preset angle interval to obtain updated matching point pairs; and, when the number of updated matching point pairs is greater than a preset number of point pairs, obtaining an image matching result between the image to be matched and the comparison region.
In one embodiment, when executed by a processor, the computer program further implements the following steps: obtaining SIFT feature points of the image to be matched and of the reference image; calculating the Euclidean distances between the SIFT feature points of the image to be matched and the SIFT feature points of the reference image; obtaining the minimum Euclidean distance among them and taking the two feature points corresponding to it as the best matching point pair; obtaining the position of the matching feature point of the best matching point pair in the reference image; obtaining the comparison region of the image to be matched in the reference image according to that position; and matching the image to be matched against the comparison region to obtain a matching result of the image to be matched.
In one embodiment, when executed by a processor, the computer program further implements the following steps: obtaining SIFT feature points of the image to be matched and of the reference image; obtaining a best matching point pair from them; obtaining the comparison region of the image to be matched in the reference image according to the position of the matching feature point in the reference image, the position of the SIFT feature point corresponding to the matching feature point in the image to be matched, and the size of the image to be matched; and matching the image to be matched against the comparison region to obtain a matching result of the image to be matched.
In one embodiment, when executed by a processor, the computer program further implements the following steps: acquiring an image to be matched and a reference image corresponding to it; performing Gaussian blur processing on the image to be matched and the reference image respectively to obtain their SIFT feature points, where the data type used in the Gaussian blur processing is integer data; obtaining a best matching point pair from the two sets of SIFT feature points, the best matching point pair including a matching feature point of the image to be matched in the reference image; obtaining the position of that matching feature point in the reference image; obtaining the comparison region of the image to be matched in the reference image according to that position; and matching the image to be matched against the comparison region to obtain a matching result of the image to be matched.
It will be understood by those of ordinary skill in the art that all or part of the processes of the methods in the above embodiments may be implemented by a computer program stored in a non-volatile computer-readable storage medium; when executed, the program may include the processes of the above method embodiments. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations are described; however, any combination of these technical features that involves no contradiction should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. A person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within its protection scope. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An image matching method, the method comprising:
acquiring SIFT feature points of an image to be matched and SIFT feature points of a reference image corresponding to the image to be matched;
obtaining an optimal matching point pair according to the SIFT feature points of the image to be matched and the SIFT feature points of the reference image, wherein the optimal matching point pair comprises matching feature points of the image to be matched in the reference image;
obtaining the position of a matching feature point in the best matching point pair in the reference image, and obtaining a comparison area of the image to be matched in the reference image according to the position of the matching feature point in the reference image;
and matching the images to be matched according to the comparison area to obtain the matching result of the images to be matched.
2. The method according to claim 1, wherein the matching the image to be matched according to the comparison region to obtain the matching result of the image to be matched comprises:
obtaining a matching point pair according to the SIFT feature point of the image to be matched and the SIFT feature point of the comparison region;
and when the number of the matching point pairs is larger than the number of preset point pairs, obtaining the result of image matching between the image to be matched and the image corresponding to the comparison area.
3. The method of claim 2, wherein obtaining matching point pairs according to the SIFT feature points of the image to be matched and the SIFT feature points of the comparison region comprises:
calculating Euclidean distances between the SIFT feature points of the image to be matched and the SIFT feature points of the comparison region;
acquiring the minimum Euclidean distance and the next-closest Euclidean distance among the Euclidean distances;
and when the ratio of the minimum Euclidean distance to the next-closest Euclidean distance is smaller than a preset ratio, obtaining a matching point pair, wherein the matching point pair comprises the SIFT feature point of the image to be matched and the matching point of that SIFT feature point in the comparison area.
4. The method according to claim 2, wherein the obtaining of the matching point pair according to the SIFT feature points of the image to be matched and the SIFT feature points of the comparison region further comprises:
obtaining the angle of each matching point pair;
eliminating the matching point pairs whose angles are not within a preset angle interval, to obtain updated matching point pairs;
when the number of the matching point pairs is greater than the number of preset point pairs, obtaining a result of image matching between the image to be matched and the comparison area, including:
and when the number of the updated matching point pairs is larger than the number of preset point pairs, obtaining the result of image matching between the image to be matched and the image corresponding to the comparison area.
5. The method of claim 1, wherein obtaining a best matching point pair according to the SIFT feature points of the image to be matched and the SIFT feature points of the reference image comprises:
calculating Euclidean distances between the SIFT feature points of the image to be matched and the SIFT feature points of the reference image;
and acquiring the minimum Euclidean distance in the Euclidean distances, and taking the two feature points corresponding to the minimum Euclidean distance as the best matching point pair.
6. The method according to claim 1, wherein obtaining the comparison region of the image to be matched in the reference image according to the position of the matching feature point in the reference image comprises:
and obtaining a comparison area of the image to be matched in the reference image according to the position of the matching feature point in the reference image, the position of the SIFT feature point corresponding to the matching feature point in the image to be matched and the size of the image to be matched.
7. The method according to any of claims 1 to 6, wherein the obtaining the SIFT feature points of the image to be matched and the SIFT feature points of the reference image corresponding to the image to be matched comprises:
acquiring an image to be matched and a reference image corresponding to the image to be matched;
and respectively carrying out Gaussian blur processing on the image to be matched and the reference image to obtain SIFT feature points of the image to be matched and SIFT feature points of the reference image, wherein the data type in the Gaussian blur processing is integer data.
8. An image matching apparatus, the apparatus comprising:
the characteristic point acquisition module is used for acquiring SIFT characteristic points of the image to be matched and SIFT characteristic points of a reference image corresponding to the image to be matched;
the optimal matching point pair obtaining module is used for obtaining an optimal matching point pair according to the SIFT feature points of the image to be matched and the SIFT feature points of the reference image, and the optimal matching point pair comprises matching feature points of the image to be matched in the reference image;
a comparison region determining module, configured to obtain a position of a matching feature point in the best matching point pair in the reference image, and obtain a comparison region of the image to be matched in the reference image according to the position of the matching feature point in the reference image;
and the matching module is used for matching the images to be matched according to the comparison area to obtain the matching result of the images to be matched.
9. A computer device, comprising a memory and a processor, the memory storing a computer program, wherein the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 7.
CN201810790418.5A 2018-07-18 2018-07-18 Image matching method and device, computer equipment and storage medium Active CN110738222B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810790418.5A CN110738222B (en) 2018-07-18 2018-07-18 Image matching method and device, computer equipment and storage medium


Publications (2)

Publication Number Publication Date
CN110738222A true CN110738222A (en) 2020-01-31
CN110738222B CN110738222B (en) 2022-12-06

Family

ID=69234895

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810790418.5A Active CN110738222B (en) 2018-07-18 2018-07-18 Image matching method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110738222B (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101819680A * 2010-05-12 2010-09-01 Shanghai Jiao Tong University Detection method of picture matching point pair
CN102034114A * 2010-12-03 2011-04-27 Tianjin Polytechnic University Characteristic point detection-based template matching tracing method
CN102722731A * 2012-05-28 2012-10-10 Nanjing University of Aeronautics and Astronautics Efficient image matching method based on improved scale invariant feature transform (SIFT) algorithm
CN105069636A * 2015-08-11 2015-11-18 Shenzhen Zhaori Technology Co., Ltd. Material anti-counterfeiting method and material anti-counterfeiting system
WO2016062159A1 * 2014-10-20 2016-04-28 NetEase (Hangzhou) Network Co., Ltd. Image matching method and platform for testing of mobile phone applications
CN106023187A * 2016-05-17 2016-10-12 Northwestern Polytechnical University Image registration method based on SIFT feature and angle relative distance


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JIA Wenxiao et al., "Image registration algorithm based on SIFT and improved RANSAC", Computer Engineering and Applications *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111680549A (en) * 2020-04-28 2020-09-18 肯维捷斯(武汉)科技有限公司 Paper pattern recognition method
CN112766264A (en) * 2021-01-25 2021-05-07 广州互联网法院 Picture comparison method, electronic device and computer readable storage medium
CN115205562A (en) * 2022-07-22 2022-10-18 四川云数赋智教育科技有限公司 Random test paper registration method based on feature points
CN115205562B (en) * 2022-07-22 2023-03-14 四川云数赋智教育科技有限公司 Random test paper registration method based on feature points
CN117236566A (en) * 2023-11-10 2023-12-15 山东顺发重工有限公司 Whole-process visual flange plate package management system
CN117236566B (en) * 2023-11-10 2024-02-06 山东顺发重工有限公司 Whole-process visual flange plate package management system

Also Published As

Publication number Publication date
CN110738222B (en) 2022-12-06

Similar Documents

Publication Publication Date Title
Kho et al. Cancelable fingerprint template design with randomized non-negative least squares
CN108230383B (en) Hand three-dimensional data determination method and device and electronic equipment
CN110738222A (en) Image matching method and device, computer equipment and storage medium
US8306288B2 (en) Automatic identification of fingerprint inpainting target areas
CN110738236B (en) Image matching method and device, computer equipment and storage medium
WO2020143325A1 (en) Electronic document generation method and device
CN111340077B (en) Attention mechanism-based disparity map acquisition method and device
CN110147708B (en) Image data processing method and related device
JP2011511340A (en) Feature-based signature for image identification
CN110909809A (en) Card image identification method based on deep learning
Park et al. Rotation and scale invariant upsampled log-polar fourier descriptor for copy-move forgery detection
CN112132812B (en) Certificate verification method and device, electronic equipment and medium
CN111009001A (en) Image registration method, device, equipment and storage medium
El-Abed et al. Quality assessment of image-based biometric information
JP6526494B2 (en) INFORMATION PROCESSING PROGRAM AND INFORMATION PROCESSING DEVICE
CN104268550A (en) Feature extraction method and device
CN113191189A (en) Face living body detection method, terminal device and computer readable storage medium
CN112348008A (en) Certificate information identification method and device, terminal equipment and storage medium
CN115689947A (en) Image sharpening method, system, electronic device and storage medium
CN108304838B (en) Picture information identification method and terminal
Moussa et al. FRoTeMa: Fast and Robust Template Matching
CN114049491A (en) Fingerprint segmentation model training method, fingerprint segmentation device, fingerprint segmentation equipment and fingerprint segmentation medium
CN112329528A (en) Fingerprint input method and device, storage medium and electronic equipment
Song et al. The method of shape recognition based on V-system
CN107886100B (en) BRIEF feature descriptor based on rank array

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant