CN117197712A - ORB feature matching algorithm optimization method and device and computer equipment - Google Patents

ORB feature matching algorithm optimization method and device and computer equipment

Info

Publication number
CN117197712A
Authority
CN
China
Prior art keywords: target, points, orb, determining, feature point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311135217.9A
Other languages
Chinese (zh)
Inventor
肖冀
谭时顺
张家铭
郑可
成涛
徐鸿宇
程瑛颖
邹波
万树伟
于千傲
何珉
周峰
胡建明
刘寒
谢宏
苏渝
刘拧滔
雷婧颖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
State Grid Chongqing Electric Power Co Marketing Service Center
State Grid Corp of China SGCC
Original Assignee
State Grid Chongqing Electric Power Co Marketing Service Center
State Grid Corp of China SGCC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by State Grid Chongqing Electric Power Co Marketing Service Center, State Grid Corp of China SGCC filed Critical State Grid Chongqing Electric Power Co Marketing Service Center
Priority to CN202311135217.9A priority Critical patent/CN117197712A/en
Publication of CN117197712A publication Critical patent/CN117197712A/en
Pending legal-status Critical Current


Classifications

    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 - Road transport of goods or passengers
    • Y02T10/10 - Internal combustion engine [ICE] based vehicles
    • Y02T10/40 - Engine management systems

Landscapes

  • Image Analysis (AREA)

Abstract

The application discloses an ORB feature matching algorithm optimization method and device and computer equipment, relates to the technical field of computer vision, and solves the problem that existing visual SLAM image matching algorithms produce a high false matching rate of image feature points, which reduces their positioning accuracy. The method comprises the following steps: determining a plurality of feature point pairs with an ORB algorithm based on an acquired image frame to be matched and a target image frame; calculating the cosine similarity corresponding to each feature point pair based on the plurality of feature point pairs and selecting the optimal cosine similarity among the plurality of cosine similarities; matching the image frame to be matched with the target image frame by a Hamming distance feature matching method based on the optimal cosine similarity to obtain a plurality of target feature point pairs; and performing mismatching elimination on the plurality of target feature point pairs with a RANSAC algorithm, so that a plurality of optimal feature points are determined in the image frame to be matched.

Description

ORB feature matching algorithm optimization method and device and computer equipment
Technical Field
The application relates to the technical field of computer vision, and in particular to an ORB feature matching algorithm optimization method and apparatus, a computer device and a computer-readable storage medium.
Background
In recent decades, mobile robot technology has been developing toward intelligent autonomous mapping and autonomous navigation. It not only makes a great contribution to the development of many industrial production and service industries, but also has important applications in industry, medicine, agriculture, construction and even the military, and is one of the application fields with the greatest potential in the 21st century. Research on autonomous navigation technology for mobile robots is therefore of great significance.
A core technology of the mobile robot is the visual SLAM image matching algorithm. In applied engineering, the mobile robot first acquires external environment images through its on-board camera, and the configured visual SLAM image matching algorithm performs visual simultaneous localization and mapping on the acquired images to construct an environment map. However, the applicant has realized that during visual simultaneous localization and mapping, feature points must be matched across multiple external environment images; because the feature points differ between images, the visual SLAM image matching algorithm can produce a high false matching rate of image feature points, which reduces its positioning accuracy.
Disclosure of Invention
In view of this, the present application provides an ORB feature matching algorithm optimization method and apparatus, a computer device and a computer-readable storage medium, aiming to solve the problem that the existing visual SLAM image matching algorithm produces a high false matching rate of image feature points and thus reduced positioning accuracy.
According to a first aspect of the present application, there is provided a method for optimizing an ORB feature matching algorithm, comprising:
acquiring an image frame to be matched and a target image frame adjacent to the image frame to be matched, and determining a plurality of feature point pairs by adopting an ORB algorithm for feature point extraction and description based on the image frame to be matched and the target image frame;
calculating the cosine similarity corresponding to each feature point pair based on the feature point pairs to obtain a plurality of cosine similarities, and selecting the optimal cosine similarity from the cosine similarities;
matching the image frames to be matched with the target image frames by adopting a Hamming distance feature matching method based on the optimal cosine similarity to obtain a plurality of target feature point pairs;
and carrying out mismatching elimination on the target feature point pairs by adopting a random consistency sampling RANSAC algorithm, and determining a plurality of optimal feature points in the image frames to be matched.
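As a rough, non-limiting sketch of the overall flow of the four steps above, the Python snippet below uses OpenCV's ORB detector, Hamming-distance brute-force matching and RANSAC-based homography estimation. The file names and parameter values are assumptions for illustration, and the cosine-similarity refinement described above is simplified to sorting by Hamming distance before RANSAC.

```python
# Rough OpenCV-based sketch of the pipeline (hypothetical image paths and parameters).
import cv2
import numpy as np

frame_to_match = cv2.imread("frame_k.png", cv2.IMREAD_GRAYSCALE)       # image frame to be matched
target_frame = cv2.imread("frame_k_plus_1.png", cv2.IMREAD_GRAYSCALE)  # adjacent target image frame

orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(frame_to_match, None)  # FAST corners + rotated BRIEF descriptors
kp2, des2 = orb.detectAndCompute(target_frame, None)

# Hamming-distance matching of the binary descriptors (coarse matching).
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# Mismatch elimination with RANSAC on a homography model.
src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
optimal_points = [kp1[m.queryIdx] for m, keep in zip(matches, inlier_mask.ravel()) if keep]
```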
Optionally, the determining a plurality of feature point pairs based on the image frame to be matched and the target image frame using an ORB algorithm for feature point extraction and description includes:
extracting a plurality of first FAST corner points from the image frame to be matched by using a FAST algorithm included in the ORB algorithm, and extracting a plurality of second FAST corner points from the target image frame by using the FAST algorithm, wherein the number of the plurality of first FAST corner points is the same as the number of the plurality of second FAST corner points;
Calculating a first BRIEF descriptor corresponding to each first FAST corner point in the plurality of first FAST corner points by using a BRIEF algorithm included in the ORB algorithm, and calculating a second BRIEF descriptor corresponding to each second FAST corner point in the plurality of second FAST corner points by using the BRIEF algorithm;
determining a plurality of first ORB feature points based on the plurality of first BRIEF descriptors, and determining a plurality of second ORB feature points based on the plurality of second BRIEF descriptors;
and matching the plurality of first ORB characteristic points with the plurality of second ORB characteristic points by adopting a Hamming distance characteristic matching method to obtain a plurality of characteristic point pairs.
Optionally, the extracting, by using a FAST algorithm included in the ORB algorithm, a plurality of first FAST corner points from the image frame to be matched, and extracting, by using the FAST algorithm, a plurality of second FAST corner points from the target image frame includes:
dividing the image frame to be matched into a plurality of first image blocks, wherein each image block in the plurality of first image blocks comprises a plurality of first pixel points;
setting the brightness of each first pixel point in the plurality of first pixel points as first target brightness, determining a first target threshold value, calculating the sum value between the first target threshold value and the first target brightness to obtain first highest brightness, and calculating the difference value between the first target brightness and the first target threshold value to obtain first lowest brightness;
The following is performed for each first pixel: selecting a plurality of first target pixel points on a circumference taking a first preset value as a radius by taking the first pixel points as a center, determining first current brightness of each first target pixel point in the plurality of first target pixel points, and determining the first pixel points as first characteristic points if the first current brightness of the first target pixel points with continuous preset number on the circumference is determined to be larger than first highest brightness or smaller than first lowest brightness;
obtaining a plurality of first characteristic points corresponding to the plurality of first image blocks, balancing the plurality of first characteristic points by adopting a non-maximum suppression method, determining a plurality of first target characteristic points in the plurality of first characteristic points, and taking the plurality of first target characteristic points as the plurality of first FAST corner points; the method comprises the steps of,
dividing the target image frame into a plurality of second image blocks, wherein each second image block in the plurality of second image blocks comprises a plurality of second pixel points;
setting the brightness of each second pixel point in the plurality of second pixel points as second target brightness, determining a second target threshold value, calculating the sum value between the second target threshold value and the second target brightness to obtain second highest brightness, and calculating the difference value between the second target brightness and the second target threshold value to obtain second lowest brightness;
The following is performed for each second pixel: selecting a plurality of second target pixel points on a circumference taking a second preset value as a radius by taking the second pixel points as a center, determining second current brightness of each second target pixel point in the plurality of second target pixel points, and determining the second pixel points as second characteristic points if the second current brightness of the second target pixel points with continuous preset number on the circumference is determined to be larger than second highest brightness or smaller than second lowest brightness;
and obtaining a plurality of second characteristic points corresponding to the plurality of second image blocks, balancing the plurality of second characteristic points by adopting the non-maximum suppression method, determining a plurality of second target characteristic points in the plurality of second characteristic points, and taking the plurality of second target characteristic points as the plurality of second FAST corner points.
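A minimal sketch of the brightness test described above, for a single candidate pixel of a grayscale image stored as a NumPy array. The 16 circle offsets (the standard radius-3 circle), the default run length n and the absence of border checks are assumptions for illustration; the candidate pixel is assumed to lie at least 3 pixels from the image border.

```python
import numpy as np

# Standard 16-point circle of radius 3 around the candidate pixel (assumption).
CIRCLE_OFFSETS = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2), (1, 3),
                  (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1), (-2, -2), (-1, -3)]

def is_fast_corner(img: np.ndarray, x: int, y: int, threshold: int, n: int = 12) -> bool:
    center = int(img[y, x])                                # target brightness I_p of the candidate pixel
    high, low = center + threshold, center - threshold     # highest / lowest brightness bounds
    ring = [int(img[y + dy, x + dx]) for dx, dy in CIRCLE_OFFSETS]
    ring = ring + ring                                     # duplicate so a run may wrap around the circle
    brighter = darker = 0
    for v in ring:
        brighter = brighter + 1 if v > high else 0
        darker = darker + 1 if v < low else 0
        if brighter >= n or darker >= n:                   # n consecutive points all brighter or all darker
            return True
    return False
```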
Optionally, the determining the plurality of first ORB feature points based on the plurality of first BRIEF descriptors and determining the plurality of second ORB feature points based on the plurality of second BRIEF descriptors includes:
performing the following operation on each first image block of the plurality of first image blocks: determining the mass center and the geometric center of a first image block, connecting the geometric center with the mass center by taking the geometric center as an origin, obtaining a direction vector pointing to the mass center from the geometric center, determining at least one coordinate corresponding to at least one first BRIEF descriptor included in the first image block, determining the direction corresponding to each first BRIEF descriptor according to the mass center and the coordinate corresponding to each first BRIEF descriptor by taking the direction vector as a reference, constructing an image pyramid, and forming a first ORB characteristic point corresponding to each first BRIEF descriptor based on the image pyramid and the direction corresponding to each first BRIEF descriptor;
Obtaining a plurality of first ORB feature points based on the first ORB feature points corresponding to each first BRIEF descriptor in each first image block; the method comprises the steps of,
performing the following on each of the plurality of second image blocks: determining the mass center and the geometric center of a second image block, connecting the geometric center with the mass center by taking the geometric center as an origin, obtaining a direction vector pointing to the mass center from the geometric center, determining at least one coordinate corresponding to at least one second BRIEF descriptor included in the second image block, determining the direction corresponding to each second BRIEF descriptor according to the coordinate corresponding to the mass center and each second BRIEF descriptor by taking the direction vector as a reference, constructing an image pyramid, and forming a second ORB characteristic point corresponding to each second BRIEF descriptor based on the direction corresponding to the image pyramid and each second BRIEF descriptor;
and obtaining the plurality of second ORB feature points based on the second ORB feature points corresponding to each second BRIEF descriptor in each second image block.
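The orientation assignment described above can be sketched as follows, assuming the moments are computed over a square grayscale image block centered on the feature point; the block size and coordinate convention are assumptions for illustration.

```python
import numpy as np

def patch_orientation(patch: np.ndarray) -> float:
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0      # geometric center O of the block
    xs, ys = xs - cx, ys - cy                  # coordinates relative to O
    m00 = patch.sum()                          # zeroth-order moment
    m10 = (xs * patch).sum()                   # first-order moments
    m01 = (ys * patch).sum()
    # Direction of the vector from the geometric center O to the centroid C.
    return float(np.arctan2(m01, m10))
```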
Optionally, the matching the plurality of first ORB feature points with the plurality of second ORB feature points by using a hamming distance feature matching method to obtain the plurality of feature point pairs includes:
The following operations are performed on each first ORB feature point of the plurality of first ORB feature points included in the image frame to be matched: calculating the Hamming distance between the first ORB feature point and each of the plurality of second ORB feature points included in the target image frame, determining at least one target Hamming distance smaller than a Hamming distance threshold value among the plurality of Hamming distances, determining at least one second target ORB feature point corresponding to the at least one target Hamming distance among the plurality of second ORB feature points, calculating the cosine similarity between each second target ORB feature point in the at least one second target ORB feature point and the first ORB feature point to obtain a plurality of cosine similarities, determining the optimal cosine similarity among the plurality of cosine similarities, determining a similarity matching value corresponding to each second target ORB feature point based on the optimal cosine similarity and the cosine similarity corresponding to each second target ORB feature point to obtain at least one similarity matching value, querying the maximum similarity matching value among the at least one similarity matching value, and combining the second target ORB feature point corresponding to the maximum similarity matching value with the first ORB feature point;
And obtaining the feature point pairs based on a feature point pair corresponding to each first ORB feature point.
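A sketch of the per-feature-point matching loop just described. The Hamming distance threshold, the weights and the exact way the Hamming distance and cosine similarity are combined into a similarity matching value (formula 4 in the detailed description) are assumptions here; a simple weighted score is used for illustration.

```python
import numpy as np

def hamming(d1: np.ndarray, d2: np.ndarray) -> int:
    # d1, d2: binary ORB descriptors packed as uint8 arrays (e.g. 32 bytes = 256 bits).
    return int(np.unpackbits(np.bitwise_xor(d1, d2)).sum())

def match_one(desc1, theta1, second_feats, dist_threshold=64, w_dist=0.5, w_cos=0.5):
    # second_feats: list of (descriptor, direction) tuples for the second ORB feature points.
    candidates = []
    for j, (desc2, theta2) in enumerate(second_feats):
        h = hamming(desc1, desc2)
        if h < dist_threshold:                       # keep only close Hamming distances
            cos_sim = np.cos(theta1 - theta2)        # cosine similarity of the two directions
            candidates.append((j, h, cos_sim))
    if not candidates:
        return None
    best_cos = max(max(c for _, _, c in candidates), 1e-6)   # "optimal" cosine similarity among candidates
    n_bits = desc1.size * 8
    # Assumed weighted similarity matching value combining the two cues.
    scores = [(j, w_dist * (1 - h / n_bits) + w_cos * (c / best_cos)) for j, h, c in candidates]
    return max(scores, key=lambda t: t[1])[0]        # index of the best-matching second feature point
```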
Optionally, the calculating the cosine similarity corresponding to each feature point pair based on the feature point pairs to obtain a plurality of cosine similarities, and selecting an optimal cosine similarity from the plurality of cosine similarities includes:
performing the following operation for each of the plurality of feature point pairs: determining directions of two feature points included in a feature point pair, and calculating cosine similarity corresponding to the feature point pair based on the directions of the two feature points;
obtaining a plurality of cosine similarities based on the cosine similarities corresponding to each feature point pair, calculating the difference value between every two cosine similarities in the plurality of cosine similarities to obtain a plurality of first difference values, calculating the difference value between every two first difference values in the plurality of first difference values to obtain a plurality of second difference values, inquiring the difference value with the highest occurrence frequency in the plurality of second difference values to be used as a group of second target difference values, and determining the cosine similarity corresponding to the group of second target difference values to be the optimal cosine similarity.
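A sketch of the optimal-cosine-similarity selection described above. Since the way the most frequent second-order difference maps back to a single cosine similarity is not spelled out, the rounding precision and the final selection rule are assumptions; the sketch keeps one of the contributing cosine similarities per difference and returns the one that appears most often with the modal second-order difference.

```python
from collections import Counter
from itertools import combinations

def optimal_cosine_similarity(cos_sims, precision=3):
    # First-order differences between every two cosine similarities (keep one contributor).
    first = [(round(a - b, precision), a) for a, b in combinations(cos_sims, 2)]
    # Second-order differences between every two first-order differences.
    second = [(round(d1 - d2, precision), a1) for (d1, a1), (d2, _) in combinations(first, 2)]
    modal_diff, _ = Counter(d for d, _ in second).most_common(1)[0]   # most frequent value
    linked = [a for d, a in second if d == modal_diff]
    return Counter(linked).most_common(1)[0][0]
```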
Optionally, the performing mismatching elimination on the plurality of target feature point pairs by using a random consistency sampling RANSAC algorithm, determining a plurality of optimal feature points in the image frame to be matched includes:
Randomly selecting four target characteristic point pairs from the plurality of target characteristic point pairs, and calculating the four target characteristic point pairs by adopting a four-point method to obtain an initialization homography matrix;
randomly selecting four updated target feature point pairs for a plurality of times in the plurality of target feature point pairs, calculating the four updated target feature point pairs selected each time by adopting the four-point method to obtain a plurality of updated homography matrixes, calculating the error between each updated homography matrix in the plurality of updated homography matrixes and the initialized homography matrix to obtain a plurality of errors, determining at least one target error with the error smaller than an error threshold value in the plurality of errors, determining a plurality of target ORB feature point pairs corresponding to the at least one target error in the plurality of target feature point pairs, determining a plurality of inner points corresponding to the plurality of target ORB feature point pairs in the image frame to be matched, and counting the number of the plurality of inner points until the number of the plurality of inner points meets a preset condition;
and taking the plurality of interior points as the plurality of optimal characteristic points.
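The RANSAC-based elimination described above can be sketched as follows, assuming the target feature point pairs are given as two Nx2 point arrays. The per-point reprojection error used here in place of the error between homography matrices, the thresholds and the stopping condition are assumptions for illustration.

```python
import numpy as np
import cv2

def ransac_inliers(src_pts, dst_pts, err_threshold=3.0, min_inlier_ratio=0.6, max_iters=1000):
    n = len(src_pts)
    rng = np.random.default_rng(0)
    best_mask = np.zeros(n, dtype=bool)
    src64 = src_pts.astype(np.float64)
    dst64 = dst_pts.astype(np.float64)
    for _ in range(max_iters):
        idx = rng.choice(n, size=4, replace=False)                  # four-point sample
        H, _ = cv2.findHomography(src64[idx], dst64[idx], 0)        # four-point (DLT) estimate
        if H is None:
            continue
        proj = cv2.perspectiveTransform(src64.reshape(-1, 1, 2), H).reshape(-1, 2)
        err = np.linalg.norm(proj - dst64, axis=1)
        mask = err < err_threshold                                   # interior points for this model
        if mask.sum() > best_mask.sum():
            best_mask = mask
        if best_mask.sum() >= min_inlier_ratio * n:                  # preset condition on the inlier count
            break
    return best_mask   # True where the feature point pair is kept as an optimal feature point
```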
According to a second aspect of the present application, there is provided an apparatus for optimizing an ORB feature matching algorithm, comprising:
The determining module is used for acquiring an image frame to be matched and a target image frame adjacent to the image frame to be matched, and determining a plurality of feature point pairs by adopting an ORB algorithm for feature point extraction and description based on the image frame to be matched and the target image frame;
the selecting module is used for calculating the cosine similarity corresponding to each characteristic point pair based on the characteristic point pairs to obtain a plurality of cosine similarities, and selecting the optimal cosine similarity from the cosine similarities;
the matching module is used for matching the image frames to be matched with the target image frames by adopting a Hamming distance feature matching method based on the optimal cosine similarity to obtain a plurality of target feature point pairs;
and the elimination module is used for carrying out mismatching elimination on the target feature point pairs by adopting a random consistency sampling RANSAC algorithm, and determining a plurality of optimal feature points in the image frames to be matched.
According to a third aspect of the present application there is provided a computer device comprising a memory storing a computer program and a processor implementing the steps of the method of any of the first aspects described above when the computer program is executed by the processor.
According to a fourth aspect of the present application there is provided a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of the method of any of the first aspects described above.
By means of the above technical scheme, the application provides an ORB feature matching algorithm optimization method and apparatus, a computer device and a computer-readable storage medium. A plurality of feature point pairs are first determined from the acquired image frame to be matched and the target image frame by adopting the ORB algorithm for feature point extraction and description; the cosine similarity corresponding to each feature point pair is then calculated based on the plurality of feature point pairs to obtain a plurality of cosine similarities, and the optimal cosine similarity is selected among them; based on the optimal cosine similarity, the image frame to be matched is matched with the target image frame by adopting a Hamming distance feature matching method to obtain a plurality of target feature point pairs; and mismatching elimination is performed on the plurality of target feature point pairs by adopting a random consistency sampling RANSAC algorithm, so that a plurality of optimal feature points are determined in the image frame to be matched. In the application, a plurality of feature point pairs are determined through the ORB algorithm, the cosine similarity corresponding to each feature point pair is calculated, and the optimal cosine similarity is selected among the cosine similarities; based on the optimal cosine similarity, the image frame to be matched and the target image frame are accurately matched by the Hamming distance feature matching method, that is, non-compliant feature points are eliminated by means of the cosine similarity; after the accurate matching, the non-compliant feature points among the plurality of target feature points are further eliminated by the RANSAC algorithm, and the optimal feature points are determined.
The foregoing is only an overview of the technical solution of the present application; it is provided so that the technical means of the present application can be understood more clearly and implemented in accordance with the contents of the specification, and so that the above and other objects, features and advantages of the present application become more readily apparent.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the application. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
FIG. 1 shows a flowchart of an optimization method of an ORB feature matching algorithm provided by an embodiment of the present application;
FIG. 2 is a schematic diagram of selecting target pixel points around a pixel in an optimization method of an ORB feature matching algorithm according to another embodiment of the present application;
FIG. 3 is a schematic structural diagram of an optimization device of an ORB feature matching algorithm according to an embodiment of the present application;
fig. 4 shows a schematic device structure of a computer device according to an embodiment of the present application.
Detailed Description
Various aspects and features of the present application are described herein with reference to the accompanying drawings.
It should be understood that various modifications may be made to the embodiments of the application herein. Therefore, the above description should not be taken as limiting, but merely as exemplification of the embodiments. Other modifications within the scope and spirit of the application will occur to persons of ordinary skill in the art.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the application and, together with a general description of the application given above, and the detailed description of the embodiments given below, serve to explain the principles of the application.
These and other characteristics of the application will become apparent from the following description of a preferred form of embodiment, given as a non-limiting example, with reference to the accompanying drawings.
It is also to be understood that, although the application has been described with reference to some specific examples, those skilled in the art can certainly realize many other equivalent forms of the application.
The above and other aspects, features and advantages of the present application will become more apparent in light of the following detailed description when taken in conjunction with the accompanying drawings.
Specific embodiments of the present application will be described hereinafter with reference to the accompanying drawings; however, it is to be understood that the disclosed embodiments are merely exemplary of the application, which can be embodied in various forms. Well-known and/or repeated functions and constructions are not described in detail to avoid obscuring the application in unnecessary detail. Therefore, specific structural and functional details disclosed herein are not intended to be limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present application in virtually any appropriately detailed structure.
The specification may use the word "in one embodiment," "in another embodiment," "in yet another embodiment," or "in other embodiments," which may each refer to one or more of the same or different embodiments in accordance with the application.
The embodiment of the application provides an optimization method of an ORB feature matching algorithm, which is shown in figure 1 and comprises the following steps:
101. and acquiring an image frame to be matched and a target image frame adjacent to the image frame to be matched, and determining a plurality of feature point pairs by adopting an ORB algorithm for feature point extraction and description based on the image frame to be matched and the target image frame.
In the embodiment of the application, when images are matched, a plurality of input image frames need to be matched in actual operation. The specific implementation is to match every two adjacent image frames; since the matching manner is the same for each pair of adjacent frames, the embodiment of the application describes feature matching of two adjacent image frames. For distinction, one of the two adjacent image frames may be called the image frame to be matched and the other the target image frame. It should be noted that one image frame may also be called a first image frame to be matched and the other a second image frame to be matched; the specific naming can be chosen according to habit. After the adjacent image frames are determined, a plurality of feature point pairs can be determined from the two image frames by adopting the ORB algorithm.
It should be noted that the ORB algorithm is an image feature extraction algorithm whose full name is Oriented FAST and Rotated BRIEF. It extracts feature points based on the FAST algorithm, constructs descriptors of the feature points based on the BRIEF algorithm, and corrects the descriptors so that the feature points have scale invariance and rotation invariance, that is, feature points that have been scaled and rotated can still generate descriptors similar to the original ones. The FAST (Features from Accelerated Segment Test) algorithm is used in ORB to extract corner points as feature points. Its basic idea is: if enough pixels around a pixel differ greatly from its value, then the point is likely to be a corner point. The FAST algorithm is implemented as follows: first, a circle with radius r centered on a target pixel is determined; if the differences between k consecutive pixel values on the circle and the target pixel value are all larger than a certain threshold T, the target pixel is judged to be a corner point. Normally r = 3 is taken, so the circle falls on 16 pixels (see FIG. 2); the FAST-9 algorithm (k = 9) is used in the paper in which the ORB algorithm was proposed, and the FAST-12 algorithm is also commonly used. In addition, corner points can be judged through a decision tree: according to the differences between the 16 points and the target point value, the 16 points are divided into three categories (greater than, similar to, and less than), which are used as the input of the decision tree to distinguish whether the target point is a corner point. After the feature points are obtained, they need to be converted into a string of data that can represent the feature point information. ORB does this with an adjusted BRIEF (Binary Robust Independent Elementary Features) algorithm: a number of point pairs are selected in the neighborhood of the target point (generally 128, 256 or 512 pairs), and for each point pair (px, py) one binary digit is generated by comparing the gray values of px and py; the resulting binary string is the descriptor. Descriptors obtained through the BRIEF algorithm can be compared through the Hamming distance to judge whether two descriptors match. The Hamming distance is the number of positions at which the corresponding bits of two binary strings differ, i.e., the number of 1s after an exclusive-or.
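To make the BRIEF step concrete, the following sketch generates one descriptor from a fixed set of point pairs; the Gaussian sampling pattern, the 256-bit length and the absence of border checks are assumptions for illustration, and the keypoint is assumed to lie far enough from the image border for all offsets.

```python
import numpy as np

def brief_descriptor(img, kp_x, kp_y, pattern):
    # pattern: array of shape (n_bits, 4) holding (x1, y1, x2, y2) offsets around the keypoint.
    bits = []
    for x1, y1, x2, y2 in pattern:
        p = img[kp_y + y1, kp_x + x1]
        q = img[kp_y + y2, kp_x + x2]
        bits.append(1 if p < q else 0)                 # one binary test per point pair
    return np.packbits(np.array(bits, dtype=np.uint8))  # packed 256-bit descriptor

rng = np.random.default_rng(42)
pattern = rng.normal(0, 8, size=(256, 4)).astype(int)   # 256 point pairs in the keypoint neighbourhood
```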
102. And calculating the cosine similarity corresponding to each feature point pair based on the plurality of feature point pairs to obtain a plurality of cosine similarities, and selecting the optimal cosine similarity from the plurality of cosine similarities.
In the embodiment of the application, each of the plurality of feature point pairs comprises two feature points, each of the two feature points has a direction, and each direction corresponds to a vector. The similarity of the two feature points is determined by calculating the cosine of the angle between the two vectors: the cosine of 0 degrees is 1, the cosine of any other angle is not more than 1, and its minimum value is -1. The cosine of the angle between the two vectors thus determines whether the two vectors point in approximately the same direction. When the two vectors have the same direction, the cosine similarity is 1; when the angle between the two vectors is 90 degrees, the cosine similarity is 0; when the two vectors point in diametrically opposite directions, the cosine similarity is -1. The result therefore does not depend on the lengths of the vectors, only on their pointing directions. In general cosine similarity takes values between -1 and 1; it is usually used in positive space, where its values lie between 0 and 1. After the plurality of cosine similarities are obtained, the optimal cosine similarity is selected from the plurality of cosine similarities.
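A minimal sketch of the cosine similarity of two direction vectors as described above:

```python
import numpy as np

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    denom = np.linalg.norm(u) * np.linalg.norm(v)
    return float(np.dot(u, v) / denom) if denom > 0 else 0.0

# Example: two unit vectors 30 degrees apart -> cos(30 deg) ~ 0.866.
a = np.array([1.0, 0.0])
b = np.array([np.cos(np.pi / 6), np.sin(np.pi / 6)])
print(cosine_similarity(a, b))
```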
103. And matching the image frames to be matched with the target image frames by adopting a Hamming distance feature matching method based on the optimal cosine similarity to obtain a plurality of target feature point pairs.
In the embodiment of the application, after the optimal cosine similarity is obtained, the image frames to be matched are matched with the target image frames by adopting a Hamming distance feature matching method according to the optimal cosine similarity, so that a plurality of target feature point pairs can be obtained, and the step can be understood as accurate matching after coarse matching.
It should be noted that the Hamming distance feature matching method measures the distance between two binary descriptors as the number of positions at which their corresponding elements differ. For two equal-length strings s1 and s2, the Hamming distance is the minimum number of single-character substitutions required to change one into the other; for example, the Hamming distance between 1011101 and 1001001 is 2, the Hamming distance between 2143896 and 2233796 is 3, and the Hamming distance between "toned" and "roses" is 3.
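The Hamming distance can be computed directly, both for equal-length strings and for packed binary descriptors; a small sketch:

```python
import numpy as np

def hamming_str(s1: str, s2: str) -> int:
    assert len(s1) == len(s2)
    return sum(c1 != c2 for c1, c2 in zip(s1, s2))            # positions at which the strings differ

def hamming_desc(d1: np.ndarray, d2: np.ndarray) -> int:
    return int(np.unpackbits(np.bitwise_xor(d1, d2)).sum())   # number of 1s after exclusive-or

print(hamming_str("1011101", "1001001"))   # 2
print(hamming_str("2143896", "2233796"))   # 3
```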
104. And carrying out mismatching elimination on a plurality of target feature point pairs by adopting a random consistency sampling RANSAC algorithm, and determining a plurality of optimal feature points in the image frames to be matched.
In the embodiment of the application, a plurality of target feature point pairs are determined, a random consistency sampling RANSAC algorithm can be adopted to perform mismatching elimination on the plurality of target feature point pairs, and a plurality of optimal feature points are determined in an image frame to be matched.
It should be noted that the RANSAC (Random Sample Consensus, random sampling consistency) algorithm is a commonly used parameter estimation method: a part of the data is randomly selected, a model is fitted to it, the deviation between the model and the remaining data is calculated, and finally the data that meet a certain threshold are screened out and used for estimating the parameters. It is strongly robust to noisy data and outliers. It can also be understood as an outlier detection method: outliers can be detected and rejected by the RANSAC algorithm.
The method provided by the embodiment of the application comprises the steps of firstly determining a plurality of characteristic point pairs by adopting an ORB algorithm for characteristic point extraction and description according to an obtained image frame to be matched and a target image frame, then calculating cosine similarity corresponding to each characteristic point pair based on the plurality of characteristic point pairs to obtain a plurality of cosine similarities, selecting optimal cosine similarity from the plurality of cosine similarities, matching the image frame to be matched with the target image frame by adopting a Hamming distance characteristic matching method based on the optimal cosine similarity to obtain a plurality of target characteristic point pairs, and finally carrying out mismatching elimination on the plurality of target characteristic point pairs by adopting a random consistency sampling RANSAC algorithm to determine a plurality of optimal characteristic points in the image frame to be matched; the method comprises the steps of determining a plurality of feature point pairs through an ORB algorithm, calculating cosine similarity corresponding to each feature point pair, selecting optimal cosine similarity from the cosine similarities, and carrying out accurate matching on an image frame to be matched and a target image frame by utilizing a Hamming distance feature matching method based on the optimal cosine similarity, namely eliminating non-compliant feature points by utilizing the cosine similarity, further eliminating the non-compliant feature points in the plurality of target feature points by utilizing a RANSAC algorithm after the accurate matching, and determining the optimal feature points.
Further, as a refinement and extension of the foregoing embodiment, in order to fully describe a specific implementation manner of the embodiment, an embodiment of the present application provides an optimization method of another ORB feature matching algorithm, including:
201. and acquiring an image frame to be matched and a target image frame adjacent to the image frame to be matched, and determining a plurality of feature point pairs by adopting an ORB algorithm for feature point extraction and description based on the image frame to be matched and the target image frame.
In the embodiment of the application, after an image frame to be matched and a target image frame adjacent to the image frame to be matched are acquired, firstly, a plurality of first FAST corner points are extracted from the image frame to be matched by using a FAST algorithm included in an ORB algorithm, and a plurality of second FAST corner points are extracted from the target image frame by using the FAST algorithm, wherein the number of the plurality of first FAST corner points is the same as the number of the plurality of second FAST corner points, then, a first BRIEF descriptor corresponding to each first FAST corner point in the plurality of first FAST corner points is calculated by using a BRIEF algorithm included in the ORB algorithm, a second BRIEF descriptor corresponding to each second FAST corner point in the plurality of second FAST corner points is calculated by using the BRIEF algorithm, then, a plurality of first ORB feature points are determined based on the plurality of first BRIEF descriptors, a plurality of second ORB feature points are determined based on the plurality of second BRIEF descriptors, and finally, a hamming distance feature matching method is adopted to match the plurality of first ORB feature points with the plurality of second feature points.
Further, the specific way of extracting a plurality of first FAST corner points from the image frames to be matched by using a FAST algorithm included in the ORB algorithm and extracting a plurality of second FAST corner points from the target image frames by using the FAST algorithm is as follows:
for image frames to be matched: dividing an image frame to be matched into a plurality of first image blocks, wherein each image block in the plurality of first image blocks comprises a plurality of first pixel points; setting the brightness of each first pixel point in the plurality of first pixel points as first target brightness, determining a first target threshold value, calculating the sum value between the first target threshold value and the first target brightness to obtain first highest brightness, and calculating the difference value between the first target brightness and the first target threshold value to obtain first lowest brightness; the following is performed for each first pixel: selecting a plurality of first target pixel points on a circumference with a first preset value as a radius by taking the first pixel points as a center, determining first current brightness of each first target pixel point in the plurality of first target pixel points, and determining the first pixel points as first characteristic points if the first current brightness of the first target pixel points with continuous preset quantity on the circumference is determined to be larger than first highest brightness or smaller than first lowest brightness; and obtaining a plurality of first characteristic points corresponding to the plurality of first image blocks, balancing the plurality of first characteristic points by adopting a non-maximum value inhibition method, determining a plurality of first target characteristic points in the plurality of first characteristic points, and taking the plurality of first target characteristic points as a plurality of first FAST corner points.
Illustratively, as shown in FIG. 2, a pixel p is first selected in an image block, assuming its brightness is I_p. A threshold T is set, and 16 pixel points are selected on a circle of radius 3 centered on the pixel p. If the brightness of N consecutive points on the selected circle is greater than I_p + T or less than I_p - T, then the pixel p can be considered a first feature point. In order to prevent the corner points from "bunching up", a non-maximum suppression method is used, in which only the corner point with the maximum response is retained within a certain area.
For a target image frame: dividing the target image frame into a plurality of second image blocks, wherein each second image block in the plurality of second image blocks comprises a plurality of second pixel points; setting the brightness of each second pixel point in the plurality of second pixel points as second target brightness, determining a second target threshold value, calculating the sum value between the second target threshold value and the second target brightness to obtain second highest brightness, and calculating the difference value between the second target brightness and the second target threshold value to obtain second lowest brightness; the following is performed for each second pixel: selecting a plurality of second target pixel points on the circumference taking a second preset value as a radius by taking the second pixel points as the center, determining second current brightness of each second target pixel point in the plurality of second target pixel points, and determining the second pixel points as second characteristic points if the second current brightness of the second target pixel points with continuous preset number on the circumference is determined to be larger than second highest brightness or smaller than second lowest brightness; and obtaining a plurality of second characteristic points corresponding to the plurality of second image blocks, balancing the plurality of second characteristic points by adopting a non-maximum value inhibition method, determining a plurality of second target characteristic points in the plurality of second characteristic points, and taking the plurality of second target characteristic points as a plurality of second FAST corner points.
Further, the specific method for determining the plurality of first ORB feature points based on the plurality of first BRIEF descriptors and determining the plurality of second ORB feature points based on the plurality of second BRIEF descriptors is as follows:
the following is performed for each of a plurality of first tiles: determining the mass center and the geometric center of a first image block, connecting the geometric center with the mass center by taking the geometric center as an origin to obtain a direction vector pointing to the mass center from the geometric center, determining at least one coordinate corresponding to at least one first BRIEF descriptor included in the first image block, determining the direction corresponding to each first BRIEF descriptor according to the mass center and the coordinate corresponding to each first BRIEF descriptor by taking the direction vector as a reference, constructing an image pyramid, and forming a first ORB feature point corresponding to each first BRIEF descriptor based on the image pyramid and the direction corresponding to each first BRIEF descriptor; and obtaining a plurality of first ORB feature points based on the first ORB feature points corresponding to each first BRIEF descriptor in each first image block.
Illustratively, in one divided image block B, the moment of the image block B may be defined as formula 1:

m_{pq} = \sum_{(x,y) \in B} x^{p} y^{q} I(x,y)   (formula 1)

wherein x and y are the coordinates of a pixel point in the image block B, I(x, y) is the gray value of that pixel point, and m_{pq} is the (p+q)-order moment of the image block;
the centroid of image block B can be found by the moment of the image block, as shown in equation 2,
wherein m10 is a horizontal component of the first moment, m01 is a horizontal component of the first moment, and m00 is a zero moment; that is, the centroid of image block B can be calculated from the moment of the image block.
Connecting the geometric center O and the centroid C of the image block B yields a direction vector \vec{OC} pointing from O to C. The corresponding direction of this pixel p in the image block (which pixel has been converted into the first BRIEF descriptor) can be defined as formula 3:

\theta = \arctan\left( \frac{m_{01}}{m_{10}} \right)   (formula 3)

wherein θ is the direction corresponding to the first BRIEF descriptor;
based on the corresponding direction of each first BRIEF descriptor, the obtained ORB characteristic points have scale invariance by combining the constructed image pyramid.
Performing the following on each of the plurality of second image blocks: determining the mass center and the geometric center of the second image block, connecting the geometric center with the mass center by taking the geometric center as an origin, obtaining a direction vector pointing to the mass center from the geometric center, determining at least one coordinate corresponding to at least one second BRIEF descriptor included in the second image block, determining the direction corresponding to each second BRIEF descriptor according to the mass center and the coordinate corresponding to each second BRIEF descriptor by taking the direction vector as a reference, constructing an image pyramid, and forming a second ORB characteristic point corresponding to each second BRIEF descriptor based on the image pyramid and the direction corresponding to each second BRIEF descriptor; and obtaining a plurality of second ORB feature points based on the second ORB feature points corresponding to each second BRIEF descriptor in each second image block.
Further, the specific method for matching the plurality of first ORB feature points with the plurality of second ORB feature points by adopting the Hamming distance feature matching method to obtain a plurality of feature point pairs comprises the following steps:
The method comprises executing the following operation on each first ORB feature point among the plurality of first ORB feature points included in the image frame to be matched: calculating the Hamming distance between the first ORB feature point and each second ORB feature point in the plurality of second ORB feature points included in the target image frame, determining at least one target Hamming distance smaller than a Hamming distance threshold among the plurality of Hamming distances, determining at least one second target ORB feature point corresponding to the at least one target Hamming distance among the plurality of second ORB feature points, calculating the cosine similarity between each second target ORB feature point in the at least one second target ORB feature point and the first ORB feature point to obtain a plurality of cosine similarities, determining the maximum cosine similarity among the plurality of cosine similarities as the optimal cosine similarity, determining the similarity matching value corresponding to each second target ORB feature point based on the optimal cosine similarity and the cosine similarity corresponding to each second target ORB feature point to obtain at least one similarity matching value, querying the maximum similarity matching value among the at least one similarity matching value, and combining the second target ORB feature point corresponding to the maximum similarity matching value with the first ORB feature point; and obtaining a plurality of feature point pairs based on the feature point pair corresponding to each first ORB feature point.
It should be noted that a specific way of determining the maximum cosine similarity among the plurality of cosine similarities is to determine the optimal cosine similarity among them by a gradient calculation method; this is described in detail in the subsequent description of the specific process and is not repeated here.
202. And calculating the cosine similarity corresponding to each feature point pair based on the plurality of feature point pairs to obtain a plurality of cosine similarities, and selecting the optimal cosine similarity from the plurality of cosine similarities.
In the embodiment of the present application, the following operations are performed for each of a plurality of pairs of feature points: determining directions of two feature points included in a feature point pair, and calculating cosine similarity corresponding to the feature point pair based on the directions of the two feature points; obtaining a plurality of cosine similarities based on the cosine similarities corresponding to each feature point pair, calculating the difference value between every two cosine similarities in the plurality of cosine similarities to obtain a plurality of first difference values, then calculating the difference value between every two first difference values in the plurality of first difference values to obtain a plurality of second difference values, inquiring the difference value with the highest occurrence frequency in the plurality of second difference values to be used as a group of second target difference values, and further determining the cosine similarity corresponding to the group of second target difference values to be the optimal cosine similarity.
203. And matching the image frames to be matched with the target image frames by adopting a Hamming distance feature matching method based on the optimal cosine similarity to obtain a plurality of target feature point pairs.
In the embodiment of the application, this step can be regarded as accurate matching: the Hamming distance feature matching method is applied to the images again for feature point matching with m degrees of freedom. If the target threshold is P, feature point pairs in the two images whose Hamming distance falls within [0, P+m] can first be considered correctly matched, and the feature point pairs that can be matched are screened out. The cosine similarity of all these feature point pairs is then calculated; if the cosine similarity is within an empirical threshold range, the pair is provisionally considered a correct match, otherwise it is considered a false match and eliminated. After this filtering, a feature point in the image frame to be matched may still be matched with a plurality of feature points in the target image frame, so further filtering is required. The further filtering is performed with formula 4, that is, one pass of formula 4 is applied to each feature point in the image to be matched, wherein in formula 4, H_i is the Hamming distance between the i-th feature point and the feature point being screened; H_l is the length of the binary descriptor of the feature point; cos(θ_i) is the cosine similarity between the i-th feature vector and the feature vector being screened; θ is the optimal cosine similarity; S is the similarity matching value; and m and n are the distance weight and the cosine similarity weight respectively, representing the importance of the two terms in the matching process. In this way, each feature point in the image to be matched may correspond to a plurality of similarity matching values; for each feature point, the maximum similarity matching value is selected as the optimal similarity matching value, so that the optimal match can be determined through the similarity matching values. The result of the optimal match is that the target feature points are paired, and the other matches can be understood as eliminated. If the cosine angle values of these matches are not within the threshold range, they are considered mismatches and eliminated.
204. And carrying out mismatching elimination on a plurality of target feature point pairs by adopting a random consistency sampling RANSAC algorithm, and determining a plurality of optimal feature points in the image frames to be matched.
In the embodiment of the application, four target characteristic point pairs are randomly selected from a plurality of target characteristic point pairs, and the four target characteristic point pairs are calculated by adopting a four-point method to obtain an initialization homography matrix; randomly selecting four updated target feature point pairs for a plurality of times in a plurality of target feature point pairs, calculating the four updated target feature point pairs selected each time by adopting a four-point method to obtain a plurality of updated homography matrixes, calculating the error between each updated homography matrix in the plurality of updated homography matrixes and the initialized homography matrix to obtain a plurality of errors, determining at least one target error with the error smaller than an error threshold value in the plurality of errors, determining a plurality of target ORB feature point pairs corresponding to the at least one target error in the plurality of target feature point pairs, determining a plurality of inner points corresponding to the plurality of target ORB feature point pairs in an image frame to be matched, and counting the number of the plurality of inner points until the number of the plurality of inner points meets a preset condition; and finally, taking the plurality of interior points as a plurality of optimal characteristic points.
It should be noted that, the preset condition to be satisfied is a formula, and the specific form of the formula is:
where m is the minimum data size required for computing the homography matrix; the confidence can be increased by selecting the target feature point pairs multiple times; P_m is a confidence level representing the probability that at least one of the selected sets of m ORB feature point pairs consists entirely of interior points; and η represents the probability that a selected ORB feature point pair is an exterior point.
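The formula itself is not reproduced in this text; as an assumption, the following sketch uses the standard RANSAC sample-count relation, which is consistent with the symbols m, P_m and η defined above.

```python
import math

def required_samples(m: int, confidence: float, outlier_ratio: float) -> int:
    # Number of random m-point samples needed so that, with probability P_m (confidence),
    # at least one sample contains only interior points when the outlier ratio is eta.
    p_all_inliers = (1.0 - outlier_ratio) ** m
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_all_inliers))

print(required_samples(m=4, confidence=0.99, outlier_ratio=0.3))   # about 17 samples
```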
It should be noted that after the plurality of interior points are determined, they can be screened further, that is, the four-point method with m data points continues to be applied in the above process, so that the optimal feature points are screened out; the subsequent visual SLAM image matching algorithm then performs image matching through these optimal feature points, which can effectively improve the positioning accuracy.
The method provided by the embodiment of the application comprises the steps of firstly determining a plurality of characteristic point pairs by adopting an ORB algorithm for characteristic point extraction and description according to an obtained image frame to be matched and a target image frame, then calculating cosine similarity corresponding to each characteristic point pair based on the plurality of characteristic point pairs to obtain a plurality of cosine similarities, selecting optimal cosine similarity from the plurality of cosine similarities, matching the image frame to be matched with the target image frame by adopting a Hamming distance characteristic matching method based on the optimal cosine similarity to obtain a plurality of target characteristic point pairs, and finally carrying out mismatching elimination on the plurality of target characteristic point pairs by adopting a random consistency sampling RANSAC algorithm to determine a plurality of optimal characteristic points in the image frame to be matched; the method comprises the steps of determining a plurality of feature point pairs through an ORB algorithm, calculating cosine similarity corresponding to each feature point pair, selecting optimal cosine similarity from the cosine similarities, and carrying out accurate matching on an image frame to be matched and a target image frame by utilizing a Hamming distance feature matching method based on the optimal cosine similarity, namely eliminating non-compliant feature points by utilizing the cosine similarity, further eliminating the non-compliant feature points in the plurality of target feature points by utilizing a RANSAC algorithm after the accurate matching, and determining the optimal feature points.
Further, as shown in fig. 3, as a specific implementation of the method shown in fig. 1, an embodiment of the present invention provides an apparatus for optimizing an ORB feature matching algorithm, including: a determining module 301, a selecting module 302, a matching module 303 and an eliminating module 304.
The determining module 301 is configured to obtain an image frame to be matched and a target image frame adjacent to the image frame to be matched, determine a plurality of feature point pairs by using an ORB algorithm for feature point extraction and description based on the image frame to be matched and the target image frame;
the selecting module 302 is configured to calculate cosine similarity corresponding to each feature point pair based on the plurality of feature point pairs, obtain a plurality of cosine similarities, and select an optimal cosine similarity from the plurality of cosine similarities;
the matching module 303 is configured to match the image frame to be matched with the target image frame by using a hamming distance feature matching method based on the optimal cosine similarity, so as to obtain a plurality of target feature point pairs;
the eliminating module 304 is configured to perform mismatching elimination on the plurality of target feature point pairs by using a random sample consensus (RANSAC) algorithm, and determine a plurality of optimal feature points in the image frame to be matched.
In a specific application scenario, the determining module 301 is further configured to: extracting a plurality of first FAST corner points from the image frame to be matched by using a FAST algorithm included in the ORB algorithm, and extracting a plurality of second FAST corner points from the target image frame by using the FAST algorithm, wherein the number of the plurality of first FAST corner points is the same as the number of the plurality of second FAST corner points; calculating a first BRIEF descriptor corresponding to each first FAST corner point in the plurality of first FAST corner points by using a BRIEF algorithm included in the ORB algorithm, and calculating a second BRIEF descriptor corresponding to each second FAST corner point in the plurality of second FAST corner points by using the BRIEF algorithm; determining a plurality of first ORB feature points based on the plurality of first BRIEF descriptors, and determining a plurality of second ORB feature points based on the plurality of second BRIEF descriptors; and matching the plurality of first ORB characteristic points with the plurality of second ORB characteristic points by adopting a Hamming distance characteristic matching method to obtain a plurality of characteristic point pairs.
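As a hedged illustration of the pipeline just described (FAST corner extraction, BRIEF description and Hamming-distance matching), the following Python sketch uses OpenCV's ORB implementation; the feature budget nfeatures and the cross-check setting are assumptions, and the cosine-similarity refinement of the embodiment is not included here.

```python
import cv2

def orb_feature_pairs(img_to_match, img_target, nfeatures=500):
    """Detect FAST corners, compute rotation-aware BRIEF descriptors, and match
    them by Hamming distance with cross-checking; the same feature budget is
    used for the frame to be matched and the target frame."""
    orb = cv2.ORB_create(nfeatures=nfeatures)
    kp1, des1 = orb.detectAndCompute(img_to_match, None)
    kp2, des2 = orb.detectAndCompute(img_target, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    return kp1, kp2, matches
```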
In a specific application scenario, the determining module 301 is further configured to: dividing the image frame to be matched into a plurality of first image blocks, wherein each image block in the plurality of first image blocks comprises a plurality of first pixel points; setting the brightness of each first pixel point in the plurality of first pixel points as first target brightness, determining a first target threshold value, calculating the sum value between the first target threshold value and the first target brightness to obtain first highest brightness, and calculating the difference value between the first target brightness and the first target threshold value to obtain first lowest brightness; the following is performed for each first pixel: selecting a plurality of first target pixel points on a circumference taking a first preset value as a radius by taking the first pixel points as a center, determining first current brightness of each first target pixel point in the plurality of first target pixel points, and determining the first pixel points as first characteristic points if the first current brightness of the first target pixel points with continuous preset number on the circumference is determined to be larger than first highest brightness or smaller than first lowest brightness; obtaining a plurality of first characteristic points corresponding to the plurality of first image blocks, balancing the plurality of first characteristic points by adopting a non-maximum suppression method, determining a plurality of first target characteristic points in the plurality of first characteristic points, and taking the plurality of first target characteristic points as the plurality of first FAST corner points; and dividing the target image frame into a plurality of second image blocks, each of the plurality of second image blocks comprising a plurality of second pixel points; setting the brightness of each second pixel point in the plurality of second pixel points as second target brightness, determining a second target threshold value, calculating the sum value between the second target threshold value and the second target brightness to obtain second highest brightness, and calculating the difference value between the second target brightness and the second target threshold value to obtain second lowest brightness; the following is performed for each second pixel: selecting a plurality of second target pixel points on a circumference taking a second preset value as a radius by taking the second pixel points as a center, determining second current brightness of each second target pixel point in the plurality of second target pixel points, and determining the second pixel points as second characteristic points if the second current brightness of the second target pixel points with continuous preset number on the circumference is determined to be larger than second highest brightness or smaller than second lowest brightness; and obtaining a plurality of second characteristic points corresponding to the plurality of second image blocks, balancing the plurality of second characteristic points by adopting the non-maximum suppression method, determining a plurality of second target characteristic points in the plurality of second characteristic points, and taking the plurality of second target characteristic points as the plurality of second FAST corner points.
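A minimal sketch of the brightness test described above for a single candidate pixel is given below; the 16-pixel circle of radius 3, the threshold of 20 and the requirement of 12 contiguous pixels are the usual FAST choices and are assumptions rather than values taken from the embodiment.

```python
import numpy as np

# 16 pixel offsets (dx, dy) on a circle of radius 3 around the candidate pixel
CIRCLE = [(0, 3), (1, 3), (2, 2), (3, 1), (3, 0), (3, -1), (2, -2), (1, -3),
          (0, -3), (-1, -3), (-2, -2), (-3, -1), (-3, 0), (-3, 1), (-2, 2), (-1, 3)]

def is_fast_corner(gray, x, y, threshold=20, contiguous=12):
    """gray: 2-D intensity array; (x, y) must lie at least 3 pixels from the border.
    Returns True if `contiguous` consecutive circle pixels are all brighter than
    I(x, y) + threshold or all darker than I(x, y) - threshold."""
    center = int(gray[y, x])
    hi, lo = center + threshold, center - threshold
    ring = [int(gray[y + dy, x + dx]) for dx, dy in CIRCLE]
    ring = ring + ring                      # duplicate so segments can wrap around
    for start in range(16):
        seg = ring[start:start + contiguous]
        if all(v > hi for v in seg) or all(v < lo for v in seg):
            return True
    return False
```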
In a specific application scenario, the determining module is further configured to: performing the following operation on each first image block of the plurality of first image blocks: determining the mass center and the geometric center of a first image block, connecting the geometric center with the mass center by taking the geometric center as an origin, obtaining a direction vector pointing to the mass center from the geometric center, determining at least one coordinate corresponding to at least one first BRIEF descriptor included in the first image block, determining the direction corresponding to each first BRIEF descriptor according to the mass center and the coordinate corresponding to each first BRIEF descriptor by taking the direction vector as a reference, constructing an image pyramid, and forming a first ORB characteristic point corresponding to each first BRIEF descriptor based on the image pyramid and the direction corresponding to each first BRIEF descriptor; obtaining a plurality of first ORB feature points based on the first ORB feature points corresponding to each first BRIEF descriptor in each first image block; and performing the following on each of the plurality of second image blocks: determining the mass center and the geometric center of a second image block, connecting the geometric center with the mass center by taking the geometric center as an origin, obtaining a direction vector pointing to the mass center from the geometric center, determining at least one coordinate corresponding to at least one second BRIEF descriptor included in the second image block, determining the direction corresponding to each second BRIEF descriptor according to the coordinate corresponding to the mass center and each second BRIEF descriptor by taking the direction vector as a reference, constructing an image pyramid, and forming a second ORB characteristic point corresponding to each second BRIEF descriptor based on the direction corresponding to the image pyramid and each second BRIEF descriptor; and obtaining the plurality of second ORB feature points based on the second ORB feature points corresponding to each second BRIEF descriptor in each second image block.
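The direction determination described above can be illustrated by the following sketch, which computes the angle of the vector from the geometric center of an image block to its intensity centroid; the patch-based formulation and variable names are assumptions for illustration.

```python
import numpy as np

def patch_orientation(patch):
    """Angle (radians) of the vector from the geometric center of the image block
    to its intensity centroid; this serves as the reference direction for the
    descriptors inside the block.  Assumes the patch is not entirely zero."""
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    weights = patch.astype(np.float64)
    m00 = weights.sum()
    cx = (xs * weights).sum() / m00        # intensity centroid, x
    cy = (ys * weights).sum() / m00        # intensity centroid, y
    gx, gy = (w - 1) / 2.0, (h - 1) / 2.0  # geometric center
    return np.arctan2(cy - gy, cx - gx)
```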
In a specific application scenario, the determining module is further configured to: perform the following operations on each first ORB feature point of the plurality of first ORB feature points included in the image frame to be matched: calculating the Hamming distance between a first ORB feature point and each of the plurality of second ORB feature points included in the target image frame, determining at least one target Hamming distance smaller than a Hamming distance threshold value among the plurality of Hamming distances, determining at least one second target ORB feature point corresponding to the at least one target Hamming distance among the plurality of second ORB feature points, calculating the cosine similarity between each second target ORB feature point in the at least one second target ORB feature point and the first ORB feature point to obtain a plurality of cosine similarities, determining the optimal cosine similarity among the plurality of cosine similarities, determining a similarity matching value corresponding to each second target ORB feature point based on the optimal cosine similarity and the cosine similarity corresponding to each second target ORB feature point to obtain at least one similarity matching value, querying the maximum similarity matching value in the at least one similarity matching value, and combining the second target ORB feature point corresponding to the maximum similarity matching value with the first ORB feature point to form a feature point pair; and obtaining the plurality of feature point pairs based on the feature point pair corresponding to each first ORB feature point.
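The following sketch illustrates the combined Hamming-distance and direction-based screening described above; since the embodiment does not give an explicit formula for the similarity matching value, the re-ranking score used here (cosine of the direction difference) is an assumption, as are the descriptor shapes and the threshold value.

```python
import numpy as np

def match_with_cosine(des1, dirs1, des2, dirs2, hamming_thresh=40):
    """des1, des2: (N, 32) / (M, 32) uint8 ORB descriptors; dirs1, dirs2: numpy
    arrays of keypoint directions in radians.  For each frame-1 point, keep the
    frame-2 candidates whose Hamming distance is below the threshold, then prefer
    the candidate whose direction agrees best with the first point."""
    pairs = []
    for i in range(len(des1)):
        # per-candidate Hamming distance: count set bits of the XOR of descriptors
        hamming = np.count_nonzero(np.unpackbits(des1[i] ^ des2, axis=1), axis=1)
        candidates = np.flatnonzero(hamming < hamming_thresh)
        if candidates.size == 0:
            continue                        # no second target ORB feature point
        cos_sim = np.cos(dirs1[i] - dirs2[candidates])   # direction agreement score
        j = candidates[int(np.argmax(cos_sim))]
        pairs.append((i, int(j)))
    return pairs
```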
In a specific application scenario, the selecting module is further configured to: performing the following operation for each of the plurality of feature point pairs: determining directions of two feature points included in a feature point pair, and calculating cosine similarity corresponding to the feature point pair based on the directions of the two feature points; obtaining a plurality of cosine similarities based on the cosine similarities corresponding to each feature point pair, calculating the difference value between every two cosine similarities in the plurality of cosine similarities to obtain a plurality of first difference values, calculating the difference value between every two first difference values in the plurality of first difference values to obtain a plurality of second difference values, inquiring the difference value with the highest occurrence frequency in the plurality of second difference values to be used as a group of second target difference values, and determining the cosine similarity corresponding to the group of second target difference values to be the optimal cosine similarity.
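A tentative sketch of the optimal cosine similarity selection is given below; the embodiment leaves several details open, so treating "every two" values as consecutive differences, finding the most frequent second difference by rounding, and mapping it back to a cosine similarity by index are all assumptions.

```python
import numpy as np
from collections import Counter

def optimal_cosine_similarity(dirs1, dirs2, decimals=3):
    """dirs1, dirs2: directions (radians) of the two points of each feature point
    pair; at least three pairs are assumed.  The per-pair cosine similarity is the
    cosine of the angle between the two directions."""
    cos_sim = np.cos(np.asarray(dirs1) - np.asarray(dirs2))   # one value per pair
    first_diff = np.diff(cos_sim)                             # first differences
    second_diff = np.round(np.diff(first_diff), decimals)     # second differences
    modal = Counter(second_diff.tolist()).most_common(1)[0][0]
    idx = int(np.flatnonzero(second_diff == modal)[0])        # map back by index
    return float(cos_sim[idx])
```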
In a specific application scenario, the eliminating module 304 is further configured to: randomly select four target feature point pairs from the plurality of target feature point pairs, and calculate the four target feature point pairs by the four-point method to obtain an initialization homography matrix; randomly select four updated target feature point pairs a plurality of times from the plurality of target feature point pairs, calculate the four updated target feature point pairs selected each time by the four-point method to obtain a plurality of updated homography matrices, calculate the error between each updated homography matrix and the initialization homography matrix to obtain a plurality of errors, determine at least one target error smaller than an error threshold among the plurality of errors, determine a plurality of target ORB feature point pairs corresponding to the at least one target error among the plurality of target feature point pairs, determine a plurality of interior points corresponding to the plurality of target ORB feature point pairs in the image frame to be matched, and count the number of the interior points until it meets the preset condition; and take the plurality of interior points as the plurality of optimal feature points.
In the device provided by the embodiment of the application, the determining module first determines a plurality of feature point pairs by adopting, for feature point extraction and description, an ORB algorithm on the acquired image frame to be matched and the target image frame; the selecting module then calculates, based on the plurality of feature point pairs, the cosine similarity corresponding to each feature point pair to obtain a plurality of cosine similarities, and selects an optimal cosine similarity from the plurality of cosine similarities; the matching module then matches the image frame to be matched with the target image frame by adopting a Hamming distance feature matching method based on the optimal cosine similarity to obtain a plurality of target feature point pairs; and the eliminating module finally carries out mismatching elimination on the plurality of target feature point pairs by adopting a random sample consensus (RANSAC) algorithm, and determines a plurality of optimal feature points in the image frame to be matched. In other words, a plurality of feature point pairs are determined by the ORB algorithm, the cosine similarity corresponding to each feature point pair is calculated, the optimal cosine similarity is selected from the cosine similarities, and the image frame to be matched is accurately matched with the target image frame by the Hamming distance feature matching method based on the optimal cosine similarity; that is, non-compliant feature points are first eliminated by means of the cosine similarity, and after the accurate matching the non-compliant feature points among the plurality of target feature points are further eliminated by the RANSAC algorithm, whereby the optimal feature points are determined.
It should be noted that, for other corresponding descriptions of the functional units involved in the ORB feature matching algorithm optimization device provided by the embodiment of the present application, reference may be made to the corresponding descriptions of fig. 1 and fig. 3, which are not repeated herein.
In an exemplary embodiment, referring to fig. 4, there is also provided a computer device, which includes a bus, a processor, a memory and a communication interface, and may further include an input-output interface and a display device, where the functional units communicate with each other through the bus. The memory stores a computer program, and the processor is configured to execute the program stored in the memory so as to perform the optimization method of the ORB feature matching algorithm in the above embodiment.
There is also provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the above method for optimizing an ORB feature matching algorithm.
From the above description of the embodiments, it will be clear to those skilled in the art that the present application may be implemented in hardware, or by means of software plus a necessary general-purpose hardware platform. Based on such understanding, the technical solution of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (a CD-ROM, a USB flash drive, a removable hard disk, etc.) and includes several instructions for causing a computer device (a personal computer, a server, a network device, etc.) to execute the method described in the respective implementation scenarios of the present application.
Those skilled in the art will appreciate that the drawing is merely a schematic illustration of a preferred implementation scenario and that the modules or flows in the drawing are not necessarily required to practice the application.
Those skilled in the art will appreciate that the modules of an apparatus in an implementation scenario may be distributed within the apparatus as described for that scenario, or may, with corresponding changes, be located in one or more apparatuses different from that of the implementation scenario. The modules of the implementation scenario may be combined into one module, or may be further split into a plurality of sub-modules.
The above sequence numbers are merely for description and do not represent the relative merits of the implementation scenarios.
The above embodiments are only exemplary embodiments of the present application and are not intended to limit the present application, the scope of which is defined by the claims. Various modifications and equivalent arrangements of this application will occur to those skilled in the art, and are intended to be within the spirit and scope of the application.

Claims (10)

1. A method for optimizing an ORB feature matching algorithm, comprising:
acquiring an image frame to be matched and a target image frame adjacent to the image frame to be matched, and determining a plurality of feature point pairs by adopting an ORB algorithm for feature point extraction and description based on the image frame to be matched and the target image frame;
Calculating the cosine similarity corresponding to each feature point pair based on the feature point pairs to obtain a plurality of cosine similarities, and selecting the optimal cosine similarity from the cosine similarities;
matching the image frames to be matched with the target image frames by adopting a Hamming distance feature matching method based on the optimal cosine similarity to obtain a plurality of target feature point pairs;
and carrying out mismatching elimination on the target feature point pairs by adopting a random sample consensus (RANSAC) algorithm, and determining a plurality of optimal feature points in the image frame to be matched.
2. The method of optimizing an ORB feature matching algorithm according to claim 1, wherein said determining a plurality of feature point pairs using an ORB algorithm for feature point extraction and description based on said image frames to be matched and said target image frame comprises:
extracting a plurality of first FAST corner points from the image frame to be matched by using a FAST algorithm included in the ORB algorithm, and extracting a plurality of second FAST corner points from the target image frame by using the FAST algorithm, wherein the number of the plurality of first FAST corner points is the same as the number of the plurality of second FAST corner points;
calculating a first BRIEF descriptor corresponding to each first FAST corner point in the plurality of first FAST corner points by using a BRIEF algorithm included in the ORB algorithm, and calculating a second BRIEF descriptor corresponding to each second FAST corner point in the plurality of second FAST corner points by using the BRIEF algorithm;
Determining a plurality of first ORB feature points based on the plurality of first BRIEF descriptors, and determining a plurality of second ORB feature points based on the plurality of second BRIEF descriptors;
and matching the plurality of first ORB characteristic points with the plurality of second ORB characteristic points by adopting a Hamming distance characteristic matching method to obtain a plurality of characteristic point pairs.
3. The optimization method of an ORB feature matching algorithm according to claim 2, wherein said extracting a plurality of first FAST corner points from said image frame to be matched using a FAST algorithm included in said ORB algorithm and extracting a plurality of second FAST corner points from said target image frame using said FAST algorithm comprises:
dividing the image frame to be matched into a plurality of first image blocks, wherein each image block in the plurality of first image blocks comprises a plurality of first pixel points;
setting the brightness of each first pixel point in the plurality of first pixel points as first target brightness, determining a first target threshold value, calculating the sum value between the first target threshold value and the first target brightness to obtain first highest brightness, and calculating the difference value between the first target brightness and the first target threshold value to obtain first lowest brightness;
The following is performed for each first pixel: selecting a plurality of first target pixel points on a circumference taking a first preset value as a radius by taking the first pixel points as a center, determining first current brightness of each first target pixel point in the plurality of first target pixel points, and determining the first pixel points as first characteristic points if the first current brightness of the first target pixel points with continuous preset number on the circumference is determined to be larger than first highest brightness or smaller than first lowest brightness;
obtaining a plurality of first characteristic points corresponding to the plurality of first image blocks, balancing the plurality of first characteristic points by adopting a non-maximum suppression method, determining a plurality of first target characteristic points in the plurality of first characteristic points, and taking the plurality of first target characteristic points as the plurality of first FAST corner points; and
dividing the target image frame into a plurality of second image blocks, wherein each second image block in the plurality of second image blocks comprises a plurality of second pixel points;
setting the brightness of each second pixel point in the plurality of second pixel points as second target brightness, determining a second target threshold value, calculating the sum value between the second target threshold value and the second target brightness to obtain second highest brightness, and calculating the difference value between the second target brightness and the second target threshold value to obtain second lowest brightness;
The following is performed for each second pixel: selecting a plurality of second target pixel points on a circumference taking a second preset value as a radius by taking the second pixel points as a center, determining second current brightness of each second target pixel point in the plurality of second target pixel points, and determining the second pixel points as second characteristic points if the second current brightness of the second target pixel points with continuous preset number on the circumference is determined to be larger than second highest brightness or smaller than second lowest brightness;
and obtaining a plurality of second characteristic points corresponding to the plurality of second image blocks, balancing the plurality of second characteristic points by adopting the non-maximum suppression method, determining a plurality of second target characteristic points in the plurality of second characteristic points, and taking the plurality of second target characteristic points as the plurality of second FAST corner points.
4. The method of claim 3, wherein determining a plurality of first ORB feature points based on a plurality of first BRIEF descriptors and determining a plurality of second ORB feature points based on a plurality of second BRIEF descriptors comprises:
performing the following operation on each first image block of the plurality of first image blocks: determining the mass center and the geometric center of a first image block, connecting the geometric center with the mass center by taking the geometric center as an origin, obtaining a direction vector pointing to the mass center from the geometric center, determining at least one coordinate corresponding to at least one first BRIEF descriptor included in the first image block, determining the direction corresponding to each first BRIEF descriptor according to the mass center and the coordinate corresponding to each first BRIEF descriptor by taking the direction vector as a reference, constructing an image pyramid, and forming a first ORB characteristic point corresponding to each first BRIEF descriptor based on the image pyramid and the direction corresponding to each first BRIEF descriptor;
Obtaining a plurality of first ORB feature points based on the first ORB feature points corresponding to each first BRIEF descriptor in each first image block; and
performing the following on each of the plurality of second image blocks: determining the mass center and the geometric center of a second image block, connecting the geometric center with the mass center by taking the geometric center as an origin, obtaining a direction vector pointing to the mass center from the geometric center, determining at least one coordinate corresponding to at least one second BRIEF descriptor included in the second image block, determining the direction corresponding to each second BRIEF descriptor according to the coordinate corresponding to the mass center and each second BRIEF descriptor by taking the direction vector as a reference, constructing an image pyramid, and forming a second ORB characteristic point corresponding to each second BRIEF descriptor based on the direction corresponding to the image pyramid and each second BRIEF descriptor;
and obtaining the plurality of second ORB feature points based on the second ORB feature points corresponding to each second BRIEF descriptor in each second image block.
5. The method for optimizing an ORB feature matching algorithm according to claim 2, wherein said matching the plurality of first ORB feature points with the plurality of second ORB feature points using a hamming distance feature matching method to obtain the plurality of feature point pairs comprises:
The following operations are performed on each first ORB feature point of the plurality of first ORB feature points included in the image frame to be matched: calculating the Hamming distance between a first ORB feature point and each of the plurality of second ORB feature points included in the target image frame, determining at least one target Hamming distance smaller than a Hamming distance threshold value among the plurality of Hamming distances, determining at least one second target ORB feature point corresponding to the at least one target Hamming distance among the plurality of second ORB feature points, calculating the cosine similarity between each second target ORB feature point in the at least one second target ORB feature point and the first ORB feature point to obtain a plurality of cosine similarities, determining the optimal cosine similarity among the plurality of cosine similarities, determining a similarity matching value corresponding to each second target ORB feature point based on the optimal cosine similarity and the cosine similarity corresponding to each second target ORB feature point to obtain at least one similarity matching value, querying the maximum similarity matching value in the at least one similarity matching value, and combining the second target ORB feature point corresponding to the maximum similarity matching value with the first ORB feature point to form a feature point pair;
And obtaining the feature point pairs based on a feature point pair corresponding to each first ORB feature point.
6. The method for optimizing an ORB feature matching algorithm according to claim 1, wherein calculating cosine similarities corresponding to each feature point pair based on the feature point pairs to obtain a plurality of cosine similarities, and selecting an optimal cosine similarity from the cosine similarities comprises:
performing the following operation for each of the plurality of feature point pairs: determining directions of two feature points included in a feature point pair, and calculating cosine similarity corresponding to the feature point pair based on the directions of the two feature points;
obtaining a plurality of cosine similarities based on the cosine similarities corresponding to each feature point pair, calculating the difference value between every two cosine similarities in the plurality of cosine similarities to obtain a plurality of first difference values, calculating the difference value between every two first difference values in the plurality of first difference values to obtain a plurality of second difference values, inquiring the difference value with the highest occurrence frequency in the plurality of second difference values to be used as a group of second target difference values, and determining the cosine similarity corresponding to the group of second target difference values to be the optimal cosine similarity.
7. The method for optimizing an ORB feature matching algorithm according to claim 1, wherein said performing mismatching elimination on said plurality of target feature point pairs using a random sample consensus (RANSAC) algorithm and determining a plurality of optimal feature points in said image frame to be matched comprises:
randomly selecting four target characteristic point pairs from the plurality of target characteristic point pairs, and calculating the four target characteristic point pairs by adopting a four-point method to obtain an initialization homography matrix;
randomly selecting four updated target feature point pairs for a plurality of times in the plurality of target feature point pairs, calculating the four updated target feature point pairs selected each time by adopting the four-point method to obtain a plurality of updated homography matrixes, calculating the error between each updated homography matrix in the plurality of updated homography matrixes and the initialized homography matrix to obtain a plurality of errors, determining at least one target error with the error smaller than an error threshold value in the plurality of errors, determining a plurality of target ORB feature point pairs corresponding to the at least one target error in the plurality of target feature point pairs, determining a plurality of inner points corresponding to the plurality of target ORB feature point pairs in the image frame to be matched, and counting the number of the plurality of inner points until the number of the plurality of inner points meets a preset condition;
And taking the plurality of interior points as the plurality of optimal characteristic points.
8. An apparatus for optimizing an ORB feature matching algorithm, comprising:
the determining module is used for acquiring an image frame to be matched and a target image frame adjacent to the image frame to be matched, and determining a plurality of feature point pairs by adopting an ORB algorithm for feature point extraction and description based on the image frame to be matched and the target image frame;
the selecting module is used for calculating the cosine similarity corresponding to each characteristic point pair based on the characteristic point pairs to obtain a plurality of cosine similarities, and selecting the optimal cosine similarity from the cosine similarities;
the matching module is used for matching the image frames to be matched with the target image frames by adopting a Hamming distance feature matching method based on the optimal cosine similarity to obtain a plurality of target feature point pairs;
and the elimination module is used for carrying out mismatching elimination on the target feature point pairs by adopting a random sample consensus (RANSAC) algorithm, and determining a plurality of optimal feature points in the image frame to be matched.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1 to 8 when the computer program is executed.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 8.
CN202311135217.9A — ORB feature matching algorithm optimization method and device and computer equipment (Pending)

Priority application: CN202311135217.9A, filed 2023-09-04 (priority date 2023-09-04)
Publication: CN117197712A, published 2023-12-08
Family ID: 88993575
Country: CN

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination