CN116385738A - Image feature point matching method and device, electronic equipment and storage medium - Google Patents

Info

Publication number: CN116385738A
Authority: CN (China)
Prior art keywords: feature, image, points, feature point, point
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CN202111559539.7A
Other languages: Chinese (zh)
Inventor: 秦勇
Current Assignee: Hangzhou Hikrobot Co Ltd (the listed assignees may be inaccurate)
Original Assignee: Hangzhou Hikrobot Co Ltd
Application filed by Hangzhou Hikrobot Co Ltd

Landscapes

  • Image Analysis (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The application provides an image feature point matching method and apparatus, an electronic device, and a storage medium, which can improve the accuracy of feature point matching. The matching method of image feature points comprises the following steps: acquiring a first image and a second image; extracting feature points of the first image and feature points of the second image, where the features of each feature point include a bright-dark attribute that describes the result of comparing the gray value of the feature point with the gray values of its surrounding pixels; determining feature descriptors of the feature points of the first image and feature descriptors of the feature points of the second image; and performing feature matching judgment on feature point pairs with the same bright-dark attribute according to the feature descriptors of the feature points of the first image and the feature descriptors of the feature points of the second image, to obtain matched feature point pairs, where each feature point pair with the same bright-dark attribute comprises one feature point of the first image and one feature point of the second image.

Description

Image feature point matching method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a method and an apparatus for matching image feature points, an electronic device, and a storage medium.
Background
With the rapid development of computer vision, application scenarios such as image matching and image stitching all involve feature point matching.
At present, a feature point matching scheme generally determines feature descriptors of feature points and performs feature point matching according to the degree of match between the feature descriptors. A feature descriptor may represent the features of the pixels surrounding a feature point.
However, the matching accuracy of existing feature point matching schemes still needs to be improved.
Disclosure of Invention
The application provides a matching method and device for image feature points, electronic equipment and a storage medium, and the accuracy of feature point matching can be improved.
According to one aspect of the present application, there is provided a matching method of image feature points, including:
acquiring a first image and a second image;
extracting feature points of a first image and feature points of a second image, wherein the feature of each feature point comprises a bright-dark attribute, and the bright-dark attribute of each feature point is used for describing a comparison result of the gray value of each feature point and the gray values of surrounding pixel points;
determining feature descriptors of feature points of the first image and feature descriptors of feature points of the second image;
and carrying out feature matching judgment on feature point pairs with the same brightness and darkness attributes according to feature descriptors of feature points of the first image and feature descriptors of feature points of the second image to obtain matched feature point pairs, wherein each feature point pair with the same brightness and darkness attributes comprises one feature point of the first image and one feature point of the second image.
In some embodiments, for each feature point in the first image and the second image, when the gray value of the feature point is higher than the gray values of surrounding pixels, the bright-dark attribute of the feature point indicates that its gray value is higher than the gray values of the surrounding pixels; and when the gray value of the feature point is lower than the gray values of surrounding pixels, the bright-dark attribute of the feature point indicates that its gray value is lower than the gray values of the surrounding pixels.
In some embodiments, the performing of the feature matching judgment on the feature point pairs with the same bright-dark attribute according to the feature descriptors of the feature points of the first image and the feature descriptors of the feature points of the second image to obtain the matched feature point pairs includes:
judging, for each feature point pair composed of one feature point of the first image and one feature point of the second image, whether the bright-dark attributes of the two feature points in the pair are consistent;
when the bright-dark attributes of the two feature points are consistent, determining the similarity of the feature descriptors of the two feature points;
when the bright-dark attributes of the two feature points are inconsistent, performing no similarity judgment on the feature descriptors of the two feature points;
and determining the matched feature point pairs between the first image and the second image according to each determined similarity.
In some embodiments, the determining of the matched feature point pairs between the first image and the second image according to each determined similarity includes:
for each feature point of the first image, determining the maximum similarity among the similarities related to that feature point, and taking the feature point pair corresponding to the maximum similarity as a matched feature point pair.
According to one aspect of the present application, there is provided a matching apparatus for image feature points, including:
an image acquisition unit that acquires a first image and a second image;
a feature point extraction unit that extracts feature points of the first image and feature points of the second image, wherein a feature of each feature point includes a bright-dark attribute, and the bright-dark attribute of each feature point is used to describe a result of comparing a gray value of the feature point with gray values of surrounding pixel points;
a description information generation unit that determines a feature descriptor of a feature point of the first image and a feature descriptor of a feature point of the second image;
and a feature point matching unit that performs feature matching judgment on feature point pairs with the same bright-dark attribute according to the feature descriptors of the feature points of the first image and the feature descriptors of the feature points of the second image, to obtain matched feature point pairs, where each feature point pair with the same bright-dark attribute comprises one feature point of the first image and one feature point of the second image.
In some embodiments, for each feature point in the first image and the second image, when the gray value of the feature point is higher than the gray values of surrounding pixels, the bright-dark attribute of the feature point indicates that its gray value is higher than the gray values of the surrounding pixels; and when the gray value of the feature point is lower than the gray values of surrounding pixels, the bright-dark attribute of the feature point indicates that its gray value is lower than the gray values of the surrounding pixels.
In some embodiments, the feature point matching unit performs the feature matching judgment on the feature point pairs with the same bright-dark attribute according to the feature descriptors of the feature points of the first image and the feature descriptors of the feature points of the second image in the following manner, to obtain the matched feature point pairs:
judging, for each feature point pair composed of one feature point of the first image and one feature point of the second image, whether the bright-dark attributes of the two feature points in the pair are consistent;
when the bright-dark attributes of the two feature points are consistent, determining the similarity of the feature descriptors of the two feature points;
when the bright-dark attributes of the two feature points are inconsistent, performing no similarity judgment on the feature descriptors of the two feature points;
and determining the matched feature point pairs between the first image and the second image according to each determined similarity.
In some embodiments, the feature point matching unit determines the matched feature point pairs between the first image and the second image according to each determined similarity in the following manner:
for each feature point of the first image, determining the maximum similarity among the similarities related to that feature point, and taking the feature point pair corresponding to the maximum similarity as a matched feature point pair.
According to one aspect of the present application, there is provided an electronic device comprising:
a memory;
a processor;
a program stored in the memory and configured to be executed by the processor, the program comprising instructions for performing a matching method of image feature points.
According to an aspect of the present application, there is provided a storage medium storing a program comprising instructions, characterized in that the instructions, when executed by an electronic device, cause the electronic device to perform a matching method of image feature points.
In summary, according to the matching scheme of the embodiment of the application, by determining the bright-dark attribute of each feature point and performing the feature point matching judgment only on point pairs between the first image and the second image with the same bright-dark attribute, point pairs that have different bright-dark attributes but highly similar feature descriptors are prevented from being misjudged as matched feature point pairs, so the accuracy of feature point matching can be improved.
In addition, because feature point matching is avoided for point pairs with different bright-dark attributes, the matching scheme of the embodiment of the application saves computation during feature point matching, so the efficiency of feature point matching can also be improved.
Drawings
FIG. 1 illustrates a schematic diagram of a feature point no-match condition according to some embodiments of the present application;
FIG. 2 illustrates a schematic diagram of an application scenario according to some embodiments of the present application;
FIG. 3 illustrates a flow chart of a method 300 of matching image feature points according to some embodiments of the present application;
FIG. 4 illustrates a flow chart of a method 400 for feature point matching based on light and dark attributes according to some embodiments of the present application;
FIG. 5 illustrates a schematic diagram of an image feature point matching apparatus 500 according to some embodiments of the present application;
fig. 6 illustrates a schematic diagram of an electronic device according to some embodiments of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail below by referring to the accompanying drawings and examples.
In some application scenarios, a feature point matching scheme may perform feature point matching according to the feature descriptors of feature points. A feature descriptor may represent the features of the pixels surrounding a feature point and may be represented, for example, by a one-dimensional feature vector. When the similarity of the feature descriptors of two feature points is high, the feature point matching scheme may take the two feature points as a matched feature point pair. However, a conventional feature point matching scheme may produce the following mismatch: the two feature points comprise a bright point and a dark point whose feature descriptors have high similarity, so the bright point and the dark point are mismatched. Here, a bright point is a feature point whose gray value is higher than that of the surrounding pixels, and a dark point is a feature point whose gray value is lower than that of the surrounding pixels. As shown in FIG. 1, points P1 and P2 are feature points extracted from two images to be matched, and their feature descriptors have high similarity. Therefore, during feature point matching, a scheme that only compares the consistency of feature descriptors considers P1 and P2 a matched feature point pair. However, as can be seen from FIG. 1, P1 is a dark point and P2 is a bright point, so the feature point pair comprising P1 and P2 is a false match.
Therefore, the embodiment of the application provides a characteristic point matching scheme, which can avoid characteristic matching of dark points and bright points and improve the accuracy of characteristic point matching. The proposed feature point matching scheme is described in detail below with reference to fig. 2 and 3.
Fig. 2 illustrates a schematic diagram of an application scenario according to some embodiments of the present application.
As shown in fig. 2, the application scenario shows a first image 110 and a second image 120, as well as an electronic device 130. The electronic device 130 may perform feature point matching on the first image 110 and the second image 120. The electronic device 130 is not limited in this application, and may be, for example, a notebook computer, a server, a network hard disk recorder, or any other device capable of performing feature point matching processing.
Fig. 3 illustrates a flow chart of a method 300 of matching image feature points according to some embodiments of the present application. The matching method 300 may be performed, for example, by the electronic device 130.
As shown in fig. 3, in step S301, a first image and a second image are acquired.
In step S302, feature points of the first image and feature points of the second image are extracted. Wherein the features of each feature point include a bright-dark attribute. The light-dark attribute of each feature point is used to describe the result of comparing the gray value of the feature point with the gray values of surrounding pixel points.
In step S303, feature descriptors of feature points of the first image and feature descriptors of feature points of the second image are determined. Here, the feature descriptors may represent features of pixel points around the feature points, and may be represented by one-dimensional feature vectors, for example.
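As a hedged illustration of what a feature descriptor in step S303 might look like (an assumption, not the patent's prescribed descriptor), a one-dimensional vector can be built from a magnitude-weighted histogram of gradient orientations around the feature point; the neighborhood radius and bin count below are arbitrary choices, and the rotation of the neighborhood's coordinate axes to the dominant gradient direction is omitted for brevity:

```python
import numpy as np

def gradient_orientation_descriptor(gray, x, y, radius=8, bins=16):
    """Build a 1-D descriptor from gradient orientations around (x, y).

    A simplified stand-in for step S303: compute image gradients in a
    square neighborhood, histogram their orientations weighted by
    gradient magnitude, and L2-normalize the histogram.
    """
    gray = np.asarray(gray, dtype=np.float64)
    h, w = gray.shape
    x0, x1 = max(1, x - radius), min(h - 1, x + radius + 1)
    y0, y1 = max(1, y - radius), min(w - 1, y + radius + 1)
    patch = gray[x0 - 1:x1 + 1, y0 - 1:y1 + 1]
    # Central-difference gradients inside the neighborhood.
    gx = (patch[2:, 1:-1] - patch[:-2, 1:-1]) / 2.0
    gy = (patch[1:-1, 2:] - patch[1:-1, :-2]) / 2.0
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx)  # orientation in [-pi, pi]
    hist, _ = np.histogram(ang, bins=bins, range=(-np.pi, np.pi), weights=mag)
    norm = np.linalg.norm(hist)
    return hist / norm if norm > 0 else hist
```

For a pure horizontal ramp image, nearly all gradient mass falls into a single orientation bin, so the normalized descriptor is close to a unit basis vector.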
In step S304, feature matching judgment is performed on feature point pairs with the same brightness and darkness attribute according to the feature descriptors of the feature points of the first image and the feature descriptors of the feature points of the second image, so as to obtain matched feature point pairs. Wherein each of the pairs of feature points having the same brightness attribute includes one feature point of the first image and one feature point of the second image. Here, when the light and dark attributes of the point pair formed by the feature point of the first image and the feature point of the second image are not identical, the step S304 does not perform the feature matching judgment on the point pair.
In summary, according to the matching method 300 of the embodiment of the application, by determining the bright-dark attribute of each feature point and performing the feature point matching judgment only on point pairs between the first image and the second image with the same bright-dark attribute, point pairs that have different bright-dark attributes but highly similar feature descriptors are prevented from being misjudged as matched feature point pairs, so the accuracy of feature point matching can be improved.
In addition, because feature point matching is avoided for point pairs with different bright-dark attributes, the matching method 300 of the embodiment of the application saves computation during feature point matching, so the efficiency of feature point matching can also be improved. Furthermore, the matched feature point pairs determined by the embodiment of the application can be used for operations such as image matching or image stitching.
In some embodiments, the features of the feature points extracted in step S302 further include a coordinate position, scale information, and a gradient direction. Here, in order to extract the feature points, the embodiment of the application may upsample or downsample the first image and the second image to form a scale space. The scale space comprises a plurality of image layers of different scales, and the scale information of a feature point indicates the image layer in which the feature point is located. In step S303, the coordinate axes of the neighborhood of a feature point may be rotated to coincide with the gradient direction of the feature point, and a feature descriptor may then be generated according to the gradient directions of the pixels in the neighborhood.
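As an illustrative sketch of the scale space mentioned above (a simplification, not the patent's prescribed construction), an image pyramid can be built by repeated factor-of-two downsampling; the stride-based subsampling and the number of levels are assumptions, and production detectors typically Gaussian-blur each level before subsampling:

```python
import numpy as np

def build_scale_space(image, num_levels=4):
    """Build a coarse image pyramid as a stand-in for the scale space.

    Each level halves the resolution by striding.  SIFT-style detectors
    additionally blur within each octave, which is omitted here.
    """
    levels = [np.asarray(image, dtype=np.float64)]
    for _ in range(num_levels - 1):
        levels.append(levels[-1][::2, ::2])  # keep every second row/column
    return levels
```

A feature point's scale information would then record which of these levels it was detected in.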
In some embodiments, for any feature point of the first image and the second image, when the gray value of the feature point is higher than the gray values of surrounding pixels, its bright-dark attribute indicates that the gray value is higher than that of the surrounding pixels; such a feature point may be referred to as a bright point. The surrounding pixels may be regarded as the pixels, other than the feature point itself, in a neighborhood centered on the feature point. Conversely, when the gray value of the feature point is lower than the gray values of surrounding pixels, its bright-dark attribute indicates that the gray value is lower than that of the surrounding pixels; such a feature point may be referred to as a dark point.
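A minimal sketch of how the bright-dark attribute could be computed, assuming a square neighborhood and a comparison against the mean gray value of the surrounding pixels (the patent does not mandate this particular neighborhood shape or statistic):

```python
import numpy as np

def light_dark_attribute(gray, x, y, radius=3):
    """Return +1 for a bright point, -1 for a dark point.

    Compares the gray value at (x, y) with the mean gray value of the
    surrounding pixels in a (2*radius+1)-square neighborhood, clamped
    at the image border.  The neighborhood size and the mean-based
    comparison are illustrative assumptions.
    """
    h, w = gray.shape
    x0, x1 = max(0, x - radius), min(h, x + radius + 1)
    y0, y1 = max(0, y - radius), min(w, y + radius + 1)
    patch = gray[x0:x1, y0:y1].astype(np.float64)
    # Mean of the surrounding pixels, excluding the center point itself.
    total = patch.sum() - float(gray[x, y])
    surround_mean = total / (patch.size - 1)
    return 1 if float(gray[x, y]) > surround_mean else -1
```

Two feature points would then be candidates for matching only when this attribute agrees.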
In some embodiments, step S304 may be implemented as method 400.
As shown in fig. 4, in step S401, for each of the feature point pairs composed of the feature point of the first image and the feature point of the second image, it is determined whether or not the brightness attribute of both feature points in each of the feature point pairs is identical.
When it is determined in step S401 that the bright-dark attributes of the two feature points are consistent, the method 400 may perform step S402 to determine the similarity of the feature descriptors of the two feature points. For example, when both feature points are bright points (or both are dark points), the bright-dark attributes of the two feature points are determined to be consistent. Step S402 may determine the similarity of the feature descriptors based on, for example, a similarity measure such as the Mahalanobis distance or the Euclidean distance, which is not limited in this application.
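A minimal sketch of the similarity computation in step S402, assuming the Euclidean distance between one-dimensional descriptors and an illustrative mapping of distance to similarity (the patent equally permits other measures, e.g. the Mahalanobis distance):

```python
import numpy as np

def descriptor_similarity(desc_a, desc_b):
    """Similarity of two 1-D feature descriptors via Euclidean distance.

    Mapping distance d to similarity 1 / (1 + d) is an assumed choice;
    any monotonically decreasing mapping of the distance would serve.
    """
    d = np.linalg.norm(np.asarray(desc_a, dtype=np.float64)
                       - np.asarray(desc_b, dtype=np.float64))
    return 1.0 / (1.0 + d)
```

Identical descriptors yield the maximum similarity of 1.0, and the similarity decreases toward 0 as the descriptors diverge.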
In step S403, pairs of feature points that match between the first image and the second image are determined according to the determined similarity.
For example, for each feature point of the first image, step S403 determines the maximum similarity from the similarities related to that feature point, and takes the feature point pair corresponding to the maximum similarity as a matched feature point pair. For example, step S403 may sort the similarities related to each feature point (i.e., the similarities involving that feature point determined in step S402) and take the feature point pair corresponding to the greatest similarity as a matched pair. Suppose one feature point of the first image is Pa1, and the feature points of the second image whose bright-dark attribute is consistent with that of Pa1 are Pb1, Pb2, ..., Pb10. The similarities between Pa1 and Pb1 through Pb10 are S1 through S10, respectively. When step S403 determines that S5 is the maximum similarity related to Pa1, the feature point pair (Pa1, Pb5) corresponding to S5 can be determined as a matched feature point pair. In addition, when it is determined in step S401 that the bright-dark attributes of the two feature points are inconsistent, the method 400 executes step S404, in which no similarity judgment is performed on the feature descriptors of the two feature points.
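The flow of steps S401 through S404 can be sketched as follows; the (attribute, descriptor) record format, the distance-based similarity, and the helper's name are assumptions rather than the patent's prescribed implementation:

```python
import numpy as np

def match_feature_points(points_a, points_b):
    """Match feature points between two images, following method 400.

    Each point is an (attribute, descriptor) tuple, where attribute is
    +1 (bright) or -1 (dark).  Descriptor similarity is evaluated only
    for pairs whose bright-dark attributes agree (steps S401/S402/S404);
    each point of the first image is then paired with the second-image
    point of maximum similarity (step S403).
    """
    matches = []
    for i, (attr_a, desc_a) in enumerate(points_a):
        best_j, best_sim = None, -1.0
        for j, (attr_b, desc_b) in enumerate(points_b):
            if attr_a != attr_b:  # S401/S404: skip inconsistent pairs
                continue
            d = np.linalg.norm(np.asarray(desc_a, dtype=np.float64)
                               - np.asarray(desc_b, dtype=np.float64))
            sim = 1.0 / (1.0 + d)  # S402: similarity (assumed mapping)
            if sim > best_sim:
                best_j, best_sim = j, sim
        if best_j is not None:  # S403: keep the maximum-similarity pair
            matches.append((i, best_j))
    return matches
```

Note how a bright point is never compared against a dark point, even if their descriptors happen to be nearly identical.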
In summary, according to the method 400 of the embodiment of the application, by performing the similarity judgment and the feature point matching judgment only on point pairs with consistent bright-dark attributes (a single pair comprising one feature point of the first image and one feature point of the second image), point pairs that have different bright-dark attributes but highly similar feature descriptors are prevented from being misjudged as matched feature point pairs, so the accuracy of feature point matching can be improved.
Fig. 5 illustrates a schematic diagram of a matching device 500 for image feature points according to some embodiments of the present application. The matching apparatus 500 may be deployed in the electronic device 130, for example.
As shown in fig. 5, the matching apparatus 500 includes: an image acquisition unit 501, a feature point extraction unit 502, a description information generation unit 503, and a feature point matching unit 504.
Wherein the image acquisition unit 501 acquires a first image and a second image.
The feature point extraction unit 502 extracts feature points of the first image and feature points of the second image. Wherein the features of each feature point include a bright-dark attribute. The light-dark attribute of each feature point is used to describe the result of comparing the gray value of the feature point with the gray values of surrounding pixel points.
The description information generation unit 503 determines feature descriptors of feature points of the first image and feature descriptors of feature points of the second image.
And a feature point matching unit 504, configured to perform feature matching judgment on feature point pairs with identical brightness and darkness attributes according to feature descriptors of feature points of the first image and feature descriptors of feature points of the second image, so as to obtain matched feature point pairs, where each feature point pair with identical brightness and darkness attributes includes one feature point of the first image and one feature point of the second image.
In summary, according to the matching device 500 of the embodiment of the application, by determining the bright-dark attribute of each feature point and performing the feature point matching judgment only on point pairs between the first image and the second image with the same bright-dark attribute, point pairs that have different bright-dark attributes but highly similar feature descriptors are prevented from being misjudged as matched feature point pairs, so the accuracy of feature point matching can be improved.
In addition, because feature point matching is avoided for point pairs with different bright-dark attributes, the matching device 500 of the embodiment of the application saves computation during feature point matching, so the efficiency of feature point matching can also be improved.
In some embodiments, for each feature point in the first image and the second image, when the gray value of the feature point is higher than the gray values of surrounding pixels, the bright-dark attribute of the feature point indicates that its gray value is higher than the gray values of the surrounding pixels; and when the gray value of the feature point is lower than the gray values of surrounding pixels, the bright-dark attribute of the feature point indicates that its gray value is lower than the gray values of the surrounding pixels.
In some embodiments, for each feature point pair composed of the feature point of the first image and the feature point of the second image, the feature point matching unit 504 determines whether the light-dark attributes of the two feature points in each feature point pair are identical. Upon determining that the light and dark attributes of the two feature points coincide, the feature point matching unit 504 determines the similarity of feature descriptors of the two feature points. Based on each of the determined similarities, the feature point matching unit 504 determines a pair of feature points that match between the first image and the second image. In addition, when it is determined that the light-dark attributes of the two feature points are inconsistent, the feature point matching unit 504 does not make a similarity judgment for the feature descriptors of the two feature points.
In some embodiments, for each feature point of the first image, the feature point matching unit 504 determines the maximum similarity from the similarities related to each feature point, and takes the feature point pair corresponding to the maximum similarity as the matched feature point pair.
In addition, more specific embodiments of the matching device 500 are similar to the matching method 300, and will not be described herein.
Fig. 6 illustrates a schematic diagram of an electronic device according to some embodiments of the present application. As shown in fig. 6, the electronic device includes one or more processors (CPUs) 602, a communication module 604, a memory 606, a user interface 610, and a communication bus 608 for interconnecting these components.
The processor 602 may receive and transmit data via the communication module 604 to enable network communication and/or local communication.
The user interface 610 includes an output device 612 and an input device 614.
Memory 606 may be a high-speed random access memory such as DRAM, SRAM, DDR RAM, or other random access solid state storage devices; or non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices.
Memory 606 stores a set of instructions executable by processor 602, including:
an operating system 616 including programs for handling various basic system services and for performing hardware related tasks;
applications 618 including various programs for implementing the schemes described above. Such a program can implement the process flows of the examples described above, such as the matching method 300 of image feature points.
In addition, each embodiment of the present application may be implemented by a data processing program executed by a data processing apparatus such as a computer. Obviously, the data processing program constitutes the invention. A data processing program is typically stored in a storage medium and is executed either by reading the program directly out of the storage medium or by installing or copying the program into a storage device (such as a hard disk and/or a memory) of the data processing apparatus. Therefore, such a storage medium also constitutes the present invention. The storage medium may use any type of recording means, such as a paper storage medium (e.g., paper tape), a magnetic storage medium (e.g., floppy disk, hard disk, flash memory), an optical storage medium (e.g., CD-ROM), or a magneto-optical storage medium (e.g., MO).
The present application also discloses a nonvolatile storage medium in which a program is stored. The program comprises instructions that, when executed by a processor, cause an electronic device to perform a matching method 300 of image feature points according to the present application.
In addition, the method steps described herein may be implemented not only by data processing programs but also by hardware, such as logic gates, switches, application-specific integrated circuits (ASICs), programmable logic controllers, and embedded microcontrollers. Such hardware capable of implementing the method described herein may also constitute the present application.
The foregoing description of the preferred embodiments of the present invention is not intended to limit the invention to the precise form disclosed, and any modifications, equivalents, and variations which fall within the spirit and principles of the invention are intended to be included within the scope of the present invention.

Claims (10)

1. A method for matching image feature points, comprising:
acquiring a first image and a second image;
extracting feature points of a first image and feature points of a second image, wherein the feature of each feature point comprises a bright-dark attribute, and the bright-dark attribute of each feature point is used for describing a comparison result of the gray value of each feature point and the gray values of surrounding pixel points;
determining feature descriptors of feature points of the first image and feature descriptors of feature points of the second image;
and carrying out feature matching judgment on feature point pairs with the same brightness and darkness attributes according to feature descriptors of feature points of the first image and feature descriptors of feature points of the second image to obtain matched feature point pairs, wherein each feature point pair with the same brightness and darkness attributes comprises one feature point of the first image and one feature point of the second image.
2. The matching method of claim 1, wherein,
for each feature point in the first image and the second image, when the gray value of the feature point is higher than the gray values of its surrounding pixel points, the bright-dark attribute of the feature point indicates that its gray value is higher than the gray values of the surrounding pixel points;
and when the gray value of the feature point is lower than the gray values of its surrounding pixel points, the bright-dark attribute of the feature point indicates that its gray value is lower than the gray values of the surrounding pixel points.
3. The matching method according to claim 1, wherein performing feature matching judgment on the feature point pairs having the same bright-dark attribute according to the feature descriptors of the feature points of the first image and the feature descriptors of the feature points of the second image, to obtain the matched feature point pairs, comprises:
for each feature point pair formed by a feature point of the first image and a feature point of the second image, judging whether the bright-dark attributes of the two feature points in the feature point pair are consistent;
when the bright-dark attributes of the two feature points are consistent, determining the similarity of the feature descriptors of the two feature points;
when the bright-dark attributes of the two feature points are inconsistent, not performing similarity judgment on the feature descriptors of the two feature points;
and determining the matched feature point pairs between the first image and the second image according to each determined similarity.
4. The matching method as claimed in claim 3, wherein determining the matched feature point pairs between the first image and the second image according to each determined similarity comprises:
for each feature point of the first image, determining the maximum similarity among the similarities involving that feature point, and taking the feature point pair corresponding to the maximum similarity as a matched feature point pair.
5. An image feature point matching apparatus, comprising:
an image acquisition unit that acquires a first image and a second image;
a feature point extraction unit that extracts feature points of the first image and feature points of the second image, wherein the features of each feature point comprise a bright-dark attribute, and the bright-dark attribute of each feature point describes the result of comparing the gray value of that feature point with the gray values of its surrounding pixel points;
a description information generation unit that determines a feature descriptor of a feature point of the first image and a feature descriptor of a feature point of the second image;
and a feature point matching unit that performs feature matching judgment on feature point pairs having the same bright-dark attribute according to the feature descriptors of the feature points of the first image and the feature descriptors of the feature points of the second image, to obtain matched feature point pairs, wherein each feature point pair having the same bright-dark attribute comprises one feature point of the first image and one feature point of the second image.
6. The matching apparatus according to claim 5, wherein,
for each feature point in the first image and the second image, when the gray value of the feature point is higher than the gray values of its surrounding pixel points, the bright-dark attribute of the feature point indicates that its gray value is higher than the gray values of the surrounding pixel points;
and when the gray value of the feature point is lower than the gray values of its surrounding pixel points, the bright-dark attribute of the feature point indicates that its gray value is lower than the gray values of the surrounding pixel points.
7. The matching device as set forth in claim 5, wherein the feature point matching unit performs feature matching judgment on the feature point pairs having the same bright-dark attribute according to the feature descriptors of the feature points of the first image and the feature descriptors of the feature points of the second image, to obtain the matched feature point pairs, in the following manner:
for each feature point pair formed by a feature point of the first image and a feature point of the second image, judging whether the bright-dark attributes of the two feature points in the feature point pair are consistent;
when the bright-dark attributes of the two feature points are consistent, determining the similarity of the feature descriptors of the two feature points;
when the bright-dark attributes of the two feature points are inconsistent, not performing similarity judgment on the feature descriptors of the two feature points;
and determining the matched feature point pairs between the first image and the second image according to each determined similarity.
8. The matching device according to claim 7, wherein the feature point matching unit determines the matched feature point pairs between the first image and the second image according to each determined similarity in the following manner:
for each feature point of the first image, determining the maximum similarity among the similarities involving that feature point, and taking the feature point pair corresponding to the maximum similarity as a matched feature point pair.
9. An electronic device, comprising:
a memory;
a processor;
a program stored in the memory and configured to be executed by the processor, the program comprising instructions for performing the image feature point matching method of any one of claims 1 to 4.
10. A storage medium storing a program, the program comprising instructions that, when executed by an electronic device, cause the electronic device to perform the image feature point matching method of any one of claims 1 to 4.
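The matching flow of claims 1 through 4 can be illustrated with a short Python sketch. Note that this is only a minimal illustration under assumptions not stated in the claims: the bright-dark attribute is derived from a FAST-style center-versus-surround comparison with an illustrative threshold, the attribute is encoded as +1/-1/0, and the feature descriptors are assumed to be binary vectors whose similarity is the fraction of equal bits. The function names `brightness_attribute` and `match_points` are hypothetical, not from the patent.

```python
import numpy as np

def brightness_attribute(img, y, x, radius=3, thresh=10):
    """Illustrative bright-dark attribute: compare the gray value of the
    point (y, x) with the mean gray value of its surrounding pixels."""
    patch = img[y - radius:y + radius + 1, x - radius:x + radius + 1].astype(int)
    center = int(img[y, x])
    surround = (patch.sum() - center) / (patch.size - 1)
    if center > surround + thresh:
        return 1   # brighter than surroundings
    if center < surround - thresh:
        return -1  # darker than surroundings
    return 0       # neither clearly brighter nor darker

def match_points(desc1, attr1, desc2, attr2):
    """For each feature point of the first image, consider only candidates
    from the second image with the same bright-dark attribute (claim 3),
    and keep the pair with maximum descriptor similarity (claim 4)."""
    matches = []
    for i, (d1, a1) in enumerate(zip(desc1, attr1)):
        best_j, best_sim = -1, -1.0
        for j, (d2, a2) in enumerate(zip(desc2, attr2)):
            if a1 != a2:
                continue  # inconsistent attributes: skip similarity judgment
            sim = float(np.mean(d1 == d2))  # fraction of equal descriptor bits
            if sim > best_sim:
                best_sim, best_j = sim, j
        if best_j >= 0:
            matches.append((i, best_j, best_sim))
    return matches

# Small illustration with hand-made binary descriptors and attributes.
desc1 = [np.array([1, 1, 0, 0]), np.array([0, 0, 1, 1])]
attr1 = [1, -1]
desc2 = [np.array([1, 1, 0, 1]), np.array([0, 0, 1, 1])]
attr2 = [1, -1]
matches = match_points(desc1, attr1, desc2, attr2)
```

Restricting the candidate set to same-attribute pairs both prunes the similarity computations and rejects matches between a bright corner and a dark corner that may happen to have similar descriptors, which is the accuracy benefit the application claims.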
CN202111559539.7A 2021-12-20 2021-12-20 Image feature point matching method and device, electronic equipment and storage medium Pending CN116385738A (en)


Publications (1)

Publication Number Publication Date
CN116385738A true CN116385738A (en) 2023-07-04


Similar Documents

Publication Publication Date Title
Abdel-Basset et al. 2-Levels of clustering strategy to detect and locate copy-move forgery in digital images
US20210295114A1 (en) Method and apparatus for extracting structured data from image, and device
US20190114777A1 (en) Systems and methods for edge points based monocular visual slam
CN107209942B (en) Object detection method and image retrieval system
US9025863B2 (en) Depth camera system with machine learning for recognition of patches within a structured light pattern
US11714921B2 (en) Image processing method with ash code on local feature vectors, image processing device and storage medium
AU2012202352A1 (en) Method, system and apparatus for determining a hash code representing a portion of an image
CN110941989A (en) Image verification method, image verification device, video verification method, video verification device, equipment and storage medium
CN113378710A (en) Layout analysis method and device for image file, computer equipment and storage medium
Gómez-Silva et al. Transferring learning from multi-person tracking to person re-identification
CN110135428B (en) Image segmentation processing method and device
CN112445926B (en) Image retrieval method and device
WO2019100348A1 (en) Image retrieval method and device, and image library generation method and device
Jia et al. Autosplice: A text-prompt manipulated image dataset for media forensics
Huang et al. Robust simultaneous localization and mapping in low‐light environment
Liao et al. Multi-scale saliency features fusion model for person re-identification
US11514702B2 (en) Systems and methods for processing images
Kang et al. Combining random forest with multi-block local binary pattern feature selection for multiclass head pose estimation
Matusiak et al. Unbiased evaluation of keypoint detectors with respect to rotation invariance
CN116385738A (en) Image feature point matching method and device, electronic equipment and storage medium
Machiraju et al. A dataset generation framework for evaluating megapixel image classifiers and their explanations
Jia et al. An adaptive framework for saliency detection
CA3070701C (en) Systems and methods for processing images
CN117292304B (en) Multimedia data transmission control method and system
Li et al. Image splicing localization using superpixel segmentation and noise level estimation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination