CN111950563A - Image matching method and device and computer readable storage medium - Google Patents

Image matching method and device and computer readable storage medium

Info

Publication number
CN111950563A
CN111950563A
Authority
CN
China
Prior art keywords
detected
color
feature
descriptor
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202010582626.3A
Other languages
Chinese (zh)
Inventor
张伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei Lianbao Information Technology Co Ltd
Original Assignee
Hefei Lianbao Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hefei Lianbao Information Technology Co Ltd filed Critical Hefei Lianbao Information Technology Co Ltd
Priority to CN202010582626.3A priority Critical patent/CN111950563A/en
Publication of CN111950563A publication Critical patent/CN111950563A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/757Matching configurations of points or features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/46Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462Salient features, e.g. scale invariant feature transforms [SIFT]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an image matching method, an image matching device and a computer readable storage medium, wherein the image matching method comprises the following steps: extracting characteristic points to be detected of an image to be detected in a gray scale space; generating a feature descriptor to be detected and a color descriptor to be detected corresponding to the feature points to be detected according to the extracted feature points to be detected; acquiring contrast characteristic points in a contrast image and a contrast characteristic descriptor and a contrast color descriptor corresponding to the contrast characteristic points; and determining whether the feature points to be detected and the comparison feature points are matched or not according to the feature descriptors to be detected, the color descriptors to be detected, the comparison feature descriptors and the comparison color descriptors. In this way, a color descriptor for the feature points is added on the basis of the original feature descriptor, and image matching is carried out accurately along the two dimensions of the feature descriptor and the color descriptor, thereby further improving the accuracy of image matching.

Description

Image matching method and device and computer readable storage medium
Technical Field
The present invention relates to the field of computer vision, and in particular, to an image matching method and apparatus, and a computer-readable storage medium.
Background
Current image matching usually matches the respective feature points of the image to be detected and the contrast image. Because an image is likely to contain a plurality of similar feature points, the accuracy of such matching is not high.
Disclosure of Invention
The embodiment of the invention provides an image matching method, an image matching device and a computer readable storage medium, which have the technical effect of improving the image matching accuracy.
One aspect of the present invention provides an image matching method, including: extracting characteristic points to be detected of an image to be detected in a gray scale space; generating a feature descriptor to be detected and a color descriptor to be detected corresponding to the feature point to be detected according to the extracted feature point to be detected; acquiring contrast characteristic points in a contrast image and a contrast characteristic descriptor and a contrast color descriptor corresponding to the contrast characteristic points; and determining whether the feature points to be detected and the comparison feature points are matched or not according to the feature descriptors to be detected, the color descriptors to be detected, the comparison feature descriptors and the comparison color descriptors.
In an implementation manner, the generating, according to the extracted feature points to be detected, a color descriptor to be detected corresponding to the feature points to be detected includes: determining a detection area in the image to be detected by taking the characteristic points to be detected as a base point; according to the determined detection area, generating a brightness value for representing the brightness of the detection area, a color range value for representing the color distribution range of the detection area and a color symmetry value for representing the color distribution symmetry of the detection area; and generating a color descriptor to be detected corresponding to the feature point to be detected according to the generated brightness value, the generated color range value and the generated color symmetry value.
In an implementation manner, the determining whether the feature points to be detected and the comparison feature points are matched according to the feature descriptor to be detected, the color descriptor to be detected, the comparison feature descriptor, and the comparison color descriptor includes: obtaining a first similarity value according to the to-be-detected feature descriptor and the comparison feature descriptor; obtaining a second similarity value according to the color descriptor to be detected and the contrast color descriptor; and determining whether the feature points to be detected and the comparison feature points are matched or not according to the first similarity value and the second similarity value obtained by calculation.
In an implementation manner, the determining whether the feature point to be detected and the comparison feature point are matched according to the calculated first similarity value and the second similarity value includes: weighting and adding the first similarity value and the second similarity value to obtain a total similarity value; and if the total similarity value is smaller than a first preset threshold value, determining that the feature points to be detected are matched with the comparison feature points.
In an embodiment, before determining that the feature points to be detected and the comparison feature points match, the method further includes: obtaining the distance between the current characteristic point to be detected and other characteristic points to be detected in the image to be detected to obtain a first distance value; obtaining the distance between the current contrast characteristic point and other contrast characteristic points in the contrast image to obtain a second distance value; and if the difference value between the first distance value and the second distance value is within a second preset threshold value, determining that the feature point to be detected is matched with the comparison feature point.
Another aspect of the present invention provides an image matching apparatus, comprising: the image characteristic acquisition module is used for extracting characteristic points to be detected of the image to be detected in a gray scale space; the image descriptor generation module to be detected is used for generating a feature descriptor to be detected and a color descriptor to be detected, which correspond to the feature point to be detected, according to the extracted feature point to be detected; the contrast image characteristic acquisition module is used for acquiring contrast characteristic points in a contrast image and a contrast characteristic descriptor and a contrast color descriptor corresponding to the contrast characteristic points; and the image matching module is used for determining whether the feature points to be detected and the contrast feature points are matched or not according to the feature descriptors to be detected, the color descriptors to be detected, the contrast feature descriptors and the contrast color descriptors.
In an implementation manner, the to-be-detected image descriptor generating module is specifically configured to: determining a detection area in the image to be detected by taking the characteristic points to be detected as a base point; according to the determined detection area, generating a brightness value for representing the brightness of the detection area, a color range value for representing the color distribution range of the detection area and a color symmetry value for representing the color distribution symmetry of the detection area; and generating a color descriptor to be detected corresponding to the feature point to be detected according to the generated brightness value, the generated color range value and the generated color symmetry value.
In an embodiment, the image matching module is specifically configured to: obtain a first similarity value according to the to-be-detected feature descriptor and the comparison feature descriptor; obtain a second similarity value according to the color descriptor to be detected and the contrast color descriptor; and determine whether the feature points to be detected and the comparison feature points are matched or not according to the first similarity value and the second similarity value obtained by calculation.
In an embodiment, the image matching module is further specifically configured to: weighting and adding the first similarity value and the second similarity value to obtain a total similarity value; and if the total similarity value is smaller than a first preset threshold value, determining that the feature points to be detected are matched with the comparison feature points.
Another aspect of the invention provides a computer-readable storage medium comprising a set of computer-executable instructions which, when executed, perform any of the image matching methods described above.
In the embodiment of the invention, a color descriptor for the feature points is added on the basis of the original feature descriptor, and image matching is carried out accurately along the two dimensions of the feature descriptor and the color descriptor, thereby further improving the accuracy of image matching.
Drawings
The above and other objects, features and advantages of exemplary embodiments of the present invention will become readily apparent from the following detailed description read in conjunction with the accompanying drawings. Several embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
in the drawings, the same or corresponding reference numerals indicate the same or corresponding parts.
FIG. 1 is a schematic diagram of an implementation flow of an image matching method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a flowchart of an embodiment of an image matching method;
fig. 3 is a schematic structural diagram of an image matching apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a schematic flow chart illustrating an implementation of an image matching method according to an embodiment of the present invention.
As shown in fig. 1, an aspect of the present invention provides an image matching method, including:
step 101, extracting characteristic points to be detected of an image to be detected in a gray scale space;
102, generating a characteristic descriptor to be detected and a color descriptor to be detected corresponding to the characteristic point to be detected according to the extracted characteristic point to be detected;
103, acquiring contrast characteristic points in the contrast image, and a contrast characteristic descriptor and a contrast color descriptor corresponding to the contrast characteristic points;
and step 104, determining whether the feature points to be detected and the contrast feature points are matched or not according to the feature descriptors to be detected, the color descriptors to be detected, the contrast feature descriptors and the contrast color descriptors.
In this embodiment, the extraction of the feature points to be detected of the image to be detected in the gray scale space is specifically as follows: the image to be detected is first converted into the gray scale space through color graying processing, and the feature points of the image to be detected are extracted using the feature point extraction step of an existing ORB (Oriented FAST and Rotated BRIEF) algorithm or SIFT (Scale-Invariant Feature Transform) algorithm. In this embodiment, the ORB algorithm is preferably used to extract the feature points of the image to be detected; compared with the SIFT algorithm, the ORB algorithm has the advantage of high computational efficiency. Note that the image in this embodiment includes both still images and frames of moving video.
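The graying step above can be sketched as follows — a minimal Python/NumPy illustration using the common ITU-R BT.601 luma weights (the patent does not specify which graying formula is used; the function name is illustrative), with the subsequent ORB extraction indicated only in a comment:

```python
import numpy as np

def to_gray(rgb):
    """Convert an (H, W, 3) RGB image to gray scale using the
    ITU-R BT.601 luma weights (one common graying choice)."""
    rgb = np.asarray(rgb, dtype=np.float64)
    return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]

# Feature points would then be extracted from the gray image, e.g. with
# OpenCV: cv2.ORB_create().detectAndCompute(gray_as_uint8, None)
```

The weights sum to 1, so a uniform image keeps its intensity after graying.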
And then generating a feature descriptor to be detected and a color descriptor to be detected corresponding to the feature point to be detected according to the extracted feature point to be detected, wherein the specific process of generating the feature descriptor to be detected corresponding to the feature point to be detected is as follows: the existing ORB algorithm or SIFT algorithm is used for generating the feature descriptors to be detected according to the feature points to be detected, and the ORB algorithm is preferably used for generating the feature descriptors to be detected according to the feature points to be detected in the embodiment.
And then, acquiring contrast characteristic points in the contrast image, and a contrast characteristic descriptor and a contrast color descriptor corresponding to the contrast characteristic points, wherein the contrast image is a standard image, and acquiring the contrast characteristic descriptor and the contrast color descriptor from the contrast image in the same way for image matching with the image to be detected.
And finally, determining whether the feature points to be detected and the comparison feature points are matched or not according to the feature descriptors to be detected, the color descriptors to be detected, the comparison feature descriptors and the comparison color descriptors.
In this way, a color descriptor for the feature points is added on the basis of the original feature descriptor, and image matching is carried out accurately along the two dimensions of the feature descriptor and the color descriptor, thereby further improving the accuracy of image matching.
The method can be applied in particular to product image-label detection, for example matching the paper image label on a notebook computer to judge whether the attached image label is wrong. If the content of the image label is wrong, the label type can be replaced in time at a later stage; if the attaching position of the image label is wrong, the offset angle can further be calculated from the current attaching position, and the attaching equipment can finally be angle-adjusted according to that offset angle.
In an implementation manner, generating a to-be-detected color descriptor corresponding to the to-be-detected feature point according to the extracted to-be-detected feature point includes:
determining a detection area in the image to be detected by taking the characteristic points to be detected as base points;
according to the determined detection area, generating a brightness value for representing the brightness degree of the detection area, a color range value for representing the color distribution range of the detection area and a color symmetry value for representing the color distribution symmetry of the detection area;
and generating a color descriptor to be detected corresponding to the characteristic point to be detected according to the generated brightness value, the generated color range value and the generated color symmetry value.
In this embodiment, the specific process of generating the color descriptor to be detected corresponding to the feature point to be detected in the above steps is as follows:
the feature points to be detected are used as base points to determine the detection area in the image to be detected, and in this embodiment, it is particularly preferable to select a 31 × 31 pixel range around the feature points to be detected as the detection area.
Then, according to the determined detection area, generating a brightness value for representing the brightness of the detection area, a color range value for representing the color distribution range of the detection area, and a color symmetry value for representing the color distribution symmetry of the detection area, specifically:
the detection area is mapped into an RGB image and the RGB image is divided into a plurality of color channels.
Then the first-order color moment μ_t is calculated; the concrete formula is as follows:

\[ \mu_t = \frac{1}{N} \sum_{k=1}^{N} p_{t,k} \]

wherein μ_t represents the average of all pixels on the t-th color channel, reflecting the brightness of the image; N represents the total number of pixels in the detection area; and p_{t,k} denotes the k-th pixel value on the t-th color channel.
Then the second-order color moment σ_t is calculated; σ_t is the standard deviation of all pixels on the t-th color channel and reflects the color distribution range of the image. The concrete formula is as follows:

\[ \sigma_t = \left( \frac{1}{N} \sum_{k=1}^{N} \left( p_{t,k} - \mu_t \right)^2 \right)^{1/2} \]
This is followed by the third-order color moment s_t; s_t is the skewness of all pixels on the t-th color channel and reflects the symmetry of the image's color distribution:

\[ s_t = \left( \frac{1}{N} \sum_{k=1}^{N} \left( p_{t,k} - \mu_t \right)^3 \right)^{1/3} \]
In the specific calculation, the detection area is denoted C and its corresponding color descriptor v_{C,color}, and the RGB image is divided into a red (R) channel, a green (G) channel and a blue (B) channel;

thus v_{C,color} = [μ_R, σ_R, s_R, μ_G, σ_G, s_G, μ_B, σ_B, s_B] is taken as the color descriptor to be detected.
In an implementation manner, determining whether the feature point to be detected and the comparison feature point are matched according to the feature descriptor to be detected, the color descriptor to be detected, the comparison feature descriptor, and the comparison color descriptor includes:
obtaining a first similarity value according to the feature descriptor to be detected and the comparison feature descriptor;
obtaining a second similarity value according to the color descriptor to be detected and the contrast color descriptor;
and determining whether the feature points to be detected and the comparison feature points are matched or not according to the first similarity value and the second similarity value obtained by calculation.
In this example, the feature descriptor to be detected is denoted v_{C,rbrief}, the contrast feature descriptor v_{M,rbrief}, and the contrast color descriptor obtained from the contrast image v_{M,color}, where M denotes the detection area in the contrast image.
Since the feature descriptor to be detected and the comparison feature descriptor are vectors, the specific calculation process of the first similarity value is to calculate a distance value between the feature descriptor to be detected and the comparison feature descriptor, and the smaller the distance value, the higher the similarity is.
Similarly, the distance value between the color descriptor to be detected and the contrast color descriptor is calculated, and the smaller the distance value is, the higher the color similarity is.
And then determining whether the feature points to be detected and the comparison feature points are matched or not according to the first similarity value and the second similarity value obtained by calculation.
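The two distance computations can be sketched as follows — Hamming distance for the binary rBRIEF descriptors and Euclidean distance for the color-moment vectors. This pairing is a common choice for these descriptor types; the patent itself does not name the specific metrics:

```python
import numpy as np

def hamming_distance(d1, d2):
    """Distance between two binary (uint8-packed) feature descriptors,
    e.g. 32-byte rBRIEF vectors -- smaller means more similar."""
    return int(np.unpackbits(np.bitwise_xor(d1, d2)).sum())

def color_distance(c1, c2):
    """Euclidean distance between two 9-element color-moment descriptors."""
    c1 = np.asarray(c1, dtype=float)
    c2 = np.asarray(c2, dtype=float)
    return float(np.linalg.norm(c1 - c2))
```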
In an implementation manner, determining whether the feature point to be detected and the comparison feature point are matched according to the calculated first similarity value and the second similarity value includes:
weighting and adding the first similarity value and the second similarity value to obtain a total similarity value;
and if the total similarity value is smaller than a first preset threshold value, determining that the feature points to be detected are matched with the comparison feature points.
In this embodiment, after the first similarity value and the second similarity value are obtained, the first similarity value and the second similarity value are multiplied by the corresponding weight values, and the two calculated results are added to obtain a total similarity value;
and judging that the obtained total similarity value is compared with a first preset threshold value in numerical value, and if the total similarity value is smaller than the first preset threshold value, indicating that the similarity of the two feature points is high, determining that the feature points to be detected are matched with the comparison feature points. On the contrary, if the total similarity value is not less than the first preset threshold, it is determined that the feature point to be detected and the comparison feature point are not matched.
In an embodiment, before determining that the feature points to be detected and the comparison feature points match, the method further includes:
obtaining the distance between the current characteristic point to be detected and other characteristic points to be detected in the image to be detected to obtain a first distance value;
obtaining the distance between the current contrast characteristic point and other contrast characteristic points in the contrast image to obtain a second distance value;
and if the difference value between the first distance value and the second distance value is within a second preset threshold value, determining that the feature point to be detected is matched with the comparison feature point.
In this embodiment, to further improve matching accuracy, before the feature points to be detected and the comparison feature points are determined to match, the distances between the current feature point to be detected and each of the other feature points to be detected in the image to be detected are calculated, and these pairwise distances are summed (or weighted and summed) to obtain a first distance value. Similarly, the distances between the current contrast feature point and the other contrast feature points in the contrast image are calculated and summed (or weighted and summed) in the same way to obtain a second distance value.
And then, subtracting the first distance value from the second distance value to obtain a difference value, and if the difference value is within a second preset threshold value, determining that the feature point to be detected is matched with the comparison feature point.
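The spatial fine-matching check described in the two paragraphs above can be sketched as follows (names are illustrative; the plain-sum variant is shown, without the optional weighting):

```python
import numpy as np

def spatial_check(pt, others, cmp_pt, cmp_others, tol):
    """Fine matching: sum the distances from the current point to all
    other feature points in its own image, do the same in the contrast
    image, and keep the match when the two sums differ by at most the
    second preset threshold `tol`."""
    d1 = sum(np.hypot(pt[0] - q[0], pt[1] - q[1]) for q in others)
    d2 = sum(np.hypot(cmp_pt[0] - q[0], cmp_pt[1] - q[1]) for q in cmp_others)
    return abs(d1 - d2) <= tol
```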
Therefore, the spatial position features are added on the basis of the gray level features and the color features, and the feature points are further accurately matched under the matching of the three features, so that mismatching is reduced.
Fig. 2 is a schematic diagram of a specific implementation flow of an image matching method according to an embodiment of the present invention.
As shown in fig. 2, the image to be detected and the contrast image are first obtained, and the feature points in each image are detected and extracted using the oFAST component of the ORB algorithm.

Then the feature points in the image to be detected and those in the contrast image are described in the gray scale space using the rBRIEF component of the ORB algorithm, yielding the feature descriptor to be detected and the contrast feature descriptor respectively; the feature points of each image are then color-described to obtain the color descriptor to be detected and the contrast color descriptor respectively.
And then respectively carrying out similarity calculation on the corresponding feature descriptors and color descriptors in the image to be detected and the contrast image.
And performing weighted addition on the two obtained calculation results to obtain a total similarity value, and performing rough matching according to the total similarity value.
And then further carrying out fine matching on the feature points according to the distance features between the feature points and other feature points in the corresponding image.
After the characteristic points are matched, the position and the offset angle of the image to be detected are calculated by utilizing an ORB algorithm, so that the position of the image to be detected can be adjusted subsequently.
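One way to sketch the offset-angle step is a least-squares rigid (Kabsch/Procrustes) fit over the matched point pairs. The patent does not detail this computation, so the following is an illustrative stand-in, and all names are assumptions:

```python
import numpy as np

def offset_angle(src_pts, dst_pts):
    """Estimate the in-plane rotation angle (degrees) that best maps
    the matched contrast-image points onto the detected-image points,
    via a least-squares (Kabsch) fit of dst ~= R @ src."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    # Center both point sets before fitting the rotation.
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    u, _, vt = np.linalg.svd(src_c.T @ dst_c)
    r = vt.T @ u.T
    if np.linalg.det(r) < 0:        # guard against a reflection solution
        vt[-1, :] *= -1
        r = vt.T @ u.T
    return float(np.degrees(np.arctan2(r[1, 0], r[0, 0])))
```

The returned angle could then drive the angle adjustment of the attaching equipment mentioned earlier.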
Fig. 3 is a schematic structural diagram of an image matching apparatus according to an embodiment of the present invention.
As shown in fig. 3, another aspect of the present invention provides an image matching apparatus, comprising:
the image feature acquisition module 201 is configured to extract feature points to be detected of an image to be detected in a gray scale space;
the to-be-detected image descriptor generating module 202 is configured to generate a to-be-detected feature descriptor and a to-be-detected color descriptor corresponding to the to-be-detected feature point according to the extracted to-be-detected feature point;
a contrast image feature obtaining module 203, configured to obtain contrast feature points in a contrast image, and a contrast feature descriptor and a contrast color descriptor corresponding to the contrast feature points;
and the image matching module 204 is configured to determine whether the feature points to be detected and the comparison feature points are matched according to the feature descriptor to be detected, the color descriptor to be detected, the comparison feature descriptor and the comparison color descriptor.
In this embodiment, the feature points to be detected of the image to be detected in the gray scale space are first extracted by the to-be-detected image feature acquisition module 201, specifically as follows: the image to be detected is first converted into the gray scale space through color graying processing, and the feature points of the image to be detected are extracted using the feature point extraction step of an existing ORB (Oriented FAST and Rotated BRIEF) algorithm or SIFT (Scale-Invariant Feature Transform) algorithm. In this embodiment, the ORB algorithm is preferably used to extract the feature points of the image to be detected; compared with the SIFT algorithm, the ORB algorithm has the advantage of high computational efficiency. Note that the image in this embodiment includes both still images and frames of moving video.
Then, the image descriptor generation module 202 to be detected generates a feature descriptor to be detected and a color descriptor to be detected corresponding to the feature point to be detected according to the extracted feature point to be detected, where the specific process of generating the feature descriptor to be detected corresponding to the feature point to be detected is as follows: the existing ORB algorithm or SIFT algorithm is used for generating the feature descriptors to be detected according to the feature points to be detected, and the ORB algorithm is preferably used for generating the feature descriptors to be detected according to the feature points to be detected in the embodiment.
Then, the contrast feature points in the contrast image, and the contrast feature descriptors and the contrast color descriptors corresponding to the contrast feature points are obtained by the contrast image feature obtaining module 203, where the contrast image is a standard image, and the contrast feature descriptors and the contrast color descriptors are obtained from the contrast image in the same manner as above for image matching with the image to be detected.
And finally, determining whether the feature points to be detected and the contrast feature points are matched or not through the image matching module 204 according to the feature descriptors to be detected, the color descriptors to be detected, the contrast feature descriptors and the contrast color descriptors.
In this way, a color descriptor for the feature points is added on the basis of the original feature descriptor, and image matching is carried out accurately along the two dimensions of the feature descriptor and the color descriptor, thereby further improving the accuracy of image matching.
The device can be applied in particular to product image-label detection, for example matching the paper image label on a notebook computer to judge whether the attached image label is wrong. If the content of the image label is wrong, the label type can be replaced in time at a later stage; if the attaching position of the image label is wrong, the offset angle can further be calculated from the current attaching position, and the attaching equipment can finally be angle-adjusted according to that offset angle.
In an implementation manner, the to-be-detected image descriptor generating module 202 is specifically configured to:
determining a detection area in the image to be detected by taking the characteristic points to be detected as base points;
according to the determined detection area, generating a brightness value for representing the brightness of the detection area, a color range value for representing the color distribution range of the detection area, and a color symmetry value for representing the color distribution symmetry of the detection area;
and generating a color descriptor to be detected corresponding to the characteristic point to be detected according to the generated brightness value, the generated color range value and the generated color symmetry value.
In this embodiment, the specific process of the to-be-detected image descriptor generation module 202 for generating the to-be-detected color descriptor corresponding to the to-be-detected feature point is as follows:
the feature points to be detected are used as base points to determine the detection area in the image to be detected, and in this embodiment, it is particularly preferable to select a 31 × 31 pixel range around the feature points to be detected as the detection area.
Then, according to the determined detection area, generating a brightness value for representing the brightness of the detection area, a color range value for representing the color distribution range of the detection area, and a color symmetry value for representing the color distribution symmetry of the detection area, specifically:
the detection area is mapped into an RGB image and the RGB image is divided into a plurality of color channels.
Then the first-order color moment μ_t is calculated, with the concrete formula:

    μ_t = (1/N) · Σ_{k=1}^{N} p_{t,k}

where μ_t is the mean of all pixels on the t-th color channel, reflecting the brightness of the image; N is the total number of pixels in the detection area; and p_{t,k} is the value of the k-th pixel on the t-th color channel.
Then the second-order color moment σ_t is calculated; σ_t is the standard deviation (square root of the variance) of all pixels on the t-th color channel, reflecting the color distribution range of the image. The concrete formula is:

    σ_t = ( (1/N) · Σ_{k=1}^{N} (p_{t,k} − μ_t)² )^{1/2}
This is followed by the third-order color moment s_t; s_t is the skewness of all pixels on the t-th color channel, reflecting the symmetry of the image color distribution:

    s_t = ( (1/N) · Σ_{k=1}^{N} (p_{t,k} − μ_t)³ )^{1/3}
In the specific calculation, the detection area is denoted C and its color descriptor v_{C,color}, and the RGB image is divided into a red R channel, a green G channel and a blue B channel; thus

    v_{C,color} = [μ_R, σ_R, s_R, μ_G, σ_G, s_G, μ_B, σ_B, s_B]

is taken as the color descriptor to be detected.
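A minimal NumPy sketch of this nine-element color-moment descriptor (the patch extraction and the signed cube root used for the skewness term are assumptions about details the text leaves open):

```python
import numpy as np

def color_descriptor(patch):
    """Color-moment descriptor [mu_R, sigma_R, s_R, mu_G, sigma_G, s_G, mu_B, sigma_B, s_B].

    patch: H x W x 3 array, e.g. the 31 x 31 RGB detection area around a feature point.
    """
    desc = []
    for t in range(3):  # R, G, B channels
        p = patch[:, :, t].astype(np.float64).ravel()
        mu = p.mean()                               # first-order moment: brightness
        sigma = np.sqrt(((p - mu) ** 2).mean())     # second-order moment: spread
        m3 = ((p - mu) ** 3).mean()
        s = np.sign(m3) * np.abs(m3) ** (1.0 / 3.0)  # third-order moment: skewness
        desc += [mu, sigma, s]
    return np.array(desc)

# A uniform gray patch has zero spread and zero skewness on every channel.
flat = np.full((31, 31, 3), 128, dtype=np.uint8)
print(color_descriptor(flat))
```

The signed cube root keeps s_t real when the third central moment is negative, which the bare ^{1/3} notation in the formula glosses over.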
In an implementation, the image matching module 204 is specifically configured to:
obtaining a first similarity value according to the feature descriptor to be detected and the comparison feature descriptor;
obtaining a second similarity value according to the color descriptor to be detected and the contrast color descriptor;
and determining whether the feature points to be detected and the comparison feature points are matched or not according to the first similarity value and the second similarity value obtained by calculation.
In this example, the feature descriptor to be detected is denoted v_{C,rbrief}, the comparison feature descriptor v_{M,rbrief}, and the contrast color descriptor obtained from the contrast image v_{M,color}, where M denotes the detection area in the contrast image.
Since the feature descriptor to be detected and the comparison feature descriptor are vectors, the first similarity value is computed as the distance between them; the smaller the distance value, the higher the similarity.
Similarly, the distance value between the color descriptor to be detected and the contrast color descriptor is calculated, and the smaller the distance value is, the higher the color similarity is.
And then determining whether the feature points to be detected and the comparison feature points are matched or not according to the first similarity value and the second similarity value obtained by calculation.
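The text only says "distance value"; as an illustration, Hamming distance for the binary rBRIEF vectors and Euclidean distance for the nine-element color-moment vectors are natural choices (both are assumptions, not stated in the patent):

```python
import numpy as np

def hamming_distance(d1, d2):
    """Bit-level distance between two packed 32-byte ORB descriptors."""
    return int(np.unpackbits(np.bitwise_xor(d1, d2)).sum())

def euclidean_distance(c1, c2):
    """Distance between two 9-element color-moment descriptors."""
    return float(np.linalg.norm(np.asarray(c1, float) - np.asarray(c2, float)))

a = np.array([0b10110000] + [0] * 31, dtype=np.uint8)
b = np.zeros(32, dtype=np.uint8)
print(hamming_distance(a, b))  # 3 differing bits
print(euclidean_distance([3, 0, 0, 0, 0, 0, 0, 0, 0],
                         [0, 4, 0, 0, 0, 0, 0, 0, 0]))  # 5.0
```

In both cases a smaller value means a better match, matching the "smaller distance, higher similarity" convention in the text.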
In an implementation, the image matching module 204 is further specifically configured to:
weighting and adding the first similarity value and the second similarity value to obtain a total similarity value;
and if the total similarity value is smaller than a first preset threshold value, determining that the feature points to be detected are matched with the comparison feature points.
In this embodiment, after the first similarity value and the second similarity value are obtained, each is multiplied by its corresponding weight value, and the two products are added to obtain the total similarity value;
and judging that the obtained total similarity value is compared with a first preset threshold value in numerical value, and if the total similarity value is smaller than the first preset threshold value, indicating that the similarity of the two feature points is high, determining that the feature points to be detected are matched with the comparison feature points. On the contrary, if the total similarity value is not less than the first preset threshold, it is determined that the feature point to be detected and the comparison feature point are not matched.
Another aspect of the invention provides a computer-readable storage medium comprising a set of computer-executable instructions which, when executed, perform any of the image matching methods described above.
In an embodiment of the present invention, a computer-readable storage medium includes a set of computer-executable instructions, which when executed, are configured to extract feature points to be detected in a gray scale space of an image to be detected; generating a characteristic descriptor to be detected and a color descriptor to be detected corresponding to the characteristic point to be detected according to the extracted characteristic point to be detected; acquiring contrast characteristic points in the contrast image, and a contrast characteristic descriptor and a contrast color descriptor corresponding to the contrast characteristic points; and determining whether the characteristic points to be detected and the contrast characteristic points are matched or not according to the characteristic descriptor to be detected, the color descriptor to be detected, the contrast characteristic descriptor and the contrast color descriptor.
In this way, a color descriptor for the feature points is added on top of the original feature descriptor, and image matching is carried out along both dimensions, feature descriptor and color descriptor, which further improves the accuracy of image matching.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. An image matching method, characterized in that the method comprises:
extracting characteristic points to be detected of an image to be detected in a gray scale space;
generating a feature descriptor to be detected and a color descriptor to be detected corresponding to the feature point to be detected according to the extracted feature point to be detected;
acquiring contrast characteristic points in a contrast image and a contrast characteristic descriptor and a contrast color descriptor corresponding to the contrast characteristic points;
and determining whether the feature points to be detected and the comparison feature points are matched or not according to the feature descriptors to be detected, the color descriptors to be detected, the comparison feature descriptors and the comparison color descriptors.
2. The method according to claim 1, wherein the generating a to-be-detected color descriptor corresponding to the to-be-detected feature point according to the extracted to-be-detected feature point comprises:
determining a detection area in the image to be detected by taking the characteristic points to be detected as a base point;
according to the determined detection area, generating a brightness value for representing the brightness of the detection area, a color range value for representing the color distribution range of the detection area and a color symmetry value for representing the color distribution symmetry of the detection area;
and generating a color descriptor to be detected corresponding to the feature point to be detected according to the generated brightness value, the generated color range value and the generated color symmetry value.
3. The method according to claim 1, wherein the determining whether the feature points to be detected and the comparison feature points are matched according to the feature descriptor to be detected, the color descriptor to be detected, the comparison feature descriptor and the comparison color descriptor comprises:
obtaining a first similarity value according to the to-be-detected feature descriptor and the comparison feature descriptor;
obtaining a second similarity value according to the color descriptor to be detected and the contrast color descriptor;
and determining whether the feature points to be detected and the comparison feature points are matched or not according to the first similarity value and the second similarity value obtained by calculation.
4. The method according to claim 3, wherein determining whether the feature points to be detected and the comparison feature points are matched according to the calculated first similarity value and the second similarity value comprises:
weighting and adding the first similarity value and the second similarity value to obtain a total similarity value;
and if the total similarity value is smaller than a first preset threshold value, determining that the feature points to be detected are matched with the comparison feature points.
5. The method according to claim 4, wherein before determining that the feature points to be detected and the comparison feature points match, the method further comprises:
obtaining the distance between the current characteristic point to be detected and other characteristic points to be detected in the image to be detected to obtain a first distance value;
obtaining the distance between the current contrast characteristic point and other contrast characteristic points in the image to be contrasted to obtain a second distance value;
and if the difference value between the first distance value and the second distance value is within a second preset threshold value, determining that the feature point to be detected is matched with the comparison feature point.
6. An image matching apparatus, characterized in that the apparatus comprises:
the image characteristic acquisition module is used for extracting characteristic points to be detected of the image to be detected in a gray scale space;
the image descriptor generation module to be detected is used for generating a feature descriptor to be detected and a color descriptor to be detected, which correspond to the feature point to be detected, according to the extracted feature point to be detected;
the contrast image characteristic acquisition module is used for acquiring contrast characteristic points in a contrast image and a contrast characteristic descriptor and a contrast color descriptor corresponding to the contrast characteristic points;
and the image matching module is used for determining whether the feature points to be detected and the contrast feature points are matched or not according to the feature descriptors to be detected, the color descriptors to be detected, the contrast feature descriptors and the contrast color descriptors.
7. The apparatus of claim 6, wherein the to-be-detected image descriptor generating module is specifically configured to:
determining a detection area in the image to be detected by taking the characteristic points to be detected as a base point;
according to the determined detection area, generating a brightness value for representing the brightness of the detection area, a color range value for representing the color distribution range of the detection area and a color symmetry value for representing the color distribution symmetry of the detection area;
and generating a color descriptor to be detected corresponding to the feature point to be detected according to the generated brightness value, the generated color range value and the generated color symmetry value.
8. The apparatus of claim 6, wherein the image matching module is specifically configured to:
obtaining a first similarity value according to the to-be-detected feature descriptor and the comparison feature descriptor;
obtaining a second similarity value according to the color descriptor to be detected and the contrast color descriptor;
and determining whether the feature points to be detected and the comparison feature points are matched or not according to the first similarity value and the second similarity value obtained by calculation.
9. The apparatus of claim 8, wherein the image matching module is further specifically configured to:
weighting and adding the first similarity value and the second similarity value to obtain a total similarity value;
and if the total similarity value is smaller than a first preset threshold value, determining that the feature points to be detected are matched with the comparison feature points.
10. A computer-readable storage medium comprising a set of computer-executable instructions that, when executed, perform the image matching method of any of claims 1-5.
CN202010582626.3A 2020-06-23 2020-06-23 Image matching method and device and computer readable storage medium Withdrawn CN111950563A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010582626.3A CN111950563A (en) 2020-06-23 2020-06-23 Image matching method and device and computer readable storage medium


Publications (1)

Publication Number Publication Date
CN111950563A true CN111950563A (en) 2020-11-17

Family

ID=73337856

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010582626.3A Withdrawn CN111950563A (en) 2020-06-23 2020-06-23 Image matching method and device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111950563A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112766264A * 2021-01-25 2021-05-07 广州互联网法院 Picture comparison method, electronic device and computer readable storage medium
CN112766264B * 2021-01-25 2024-06-07 广州互联网法院 Picture comparison method, electronic device and computer readable storage medium
CN113762289A * 2021-09-30 2021-12-07 广州理工学院 Image matching system based on ORB algorithm and matching method thereof
CN114782724A * 2022-06-17 2022-07-22 联宝(合肥)电子科技有限公司 Image matching method and device, electronic equipment and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20201117