CN116863170A - Image matching method, device and storage medium - Google Patents

Image matching method, device and storage medium

Info

Publication number
CN116863170A
CN116863170A (application number CN202310716636.5A)
Authority
CN
China
Prior art keywords
matching
image
point
feature
algorithm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310716636.5A
Other languages
Chinese (zh)
Inventor
杨琼楠
吴力涛
孙帮东
陈剑钧
仇晨光
薛海英
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
No 214 Institute of China North Industries Group Corp
Original Assignee
No 214 Institute of China North Industries Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by No 214 Institute of China North Industries Group Corp filed Critical No 214 Institute of China North Industries Group Corp
Priority to CN202310716636.5A priority Critical patent/CN116863170A/en
Publication of CN116863170A publication Critical patent/CN116863170A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V 10/462 Salient features, e.g. scale invariant feature transforms [SIFT]
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V 10/757 Matching configurations of points or features

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses an image matching method, device and storage medium in the field of image processing. The method comprises the following steps: processing a plurality of images acquired in the same scene to obtain a corresponding image sequence; performing corner detection on the image sequence with the FAST algorithm to obtain image corners; determining a feature point detection area according to the image corners; detecting feature points in the feature point detection area with the SIFT algorithm; generating descriptors of the feature points; matching the feature points of temporally adjacent images with a bidirectional FLANN algorithm according to the descriptors to obtain matching point pairs; and eliminating mismatched point pairs among the matching point pairs to complete image matching. The application improves the accuracy and stability of image matching.

Description

Image matching method, device and storage medium
Technical Field
The application relates to an image matching method, image matching equipment and a storage medium, and belongs to the technical field of image processing.
Background
Image matching refers to searching for the overlapping parts of a plurality of images of the same scene through a matching algorithm and then forming a scene picture through transformation and fusion. Image matching technology is widely applied in visual SLAM, target identification, target tracking, three-dimensional reconstruction, video monitoring, machine vision and other fields. Current image matching methods are mainly divided into three categories: matching methods based on gray-level information, matching methods based on the transform domain, and matching methods based on features. Feature-based matching methods carry a large amount of information and have strong anti-interference capability, so they have long been a research hotspot.
The traditional Harris algorithm has a small computational load and good stability under image rotation, illumination change and the like. However, when the image is scaled, the repeatability of Harris corners decreases. The scale-invariant feature transform (SIFT) algorithm was proposed by Lowe in 1999 and improved in 2004. It remains stable under image translation, scaling, rotation, brightness change, affine transformation and the like, but its computational load is large, which seriously affects real-time performance. The FAST algorithm selects corners only by comparing brightness between pixels, so its computational complexity is low and its real-time performance is good; however, it is easily affected by the external environment, detects a large number of corners with low accuracy, and suffers from low repeatability and uneven distribution. Furthermore, FAST corners carry no direction information, and since the algorithm uses a circle with a fixed radius of 3, it has no scale invariance.
Disclosure of Invention
The application aims to overcome the defects in the prior art and provides an image matching method, device and storage medium, which can improve the accuracy and stability of image matching.
In order to achieve the above purpose, the application is realized by adopting the following technical scheme:
in a first aspect, the present application provides an image matching method, including:
processing a plurality of images acquired in the same scene to acquire a corresponding image sequence;
performing corner detection on the image sequence by adopting a FAST algorithm to obtain image corners;
determining a feature point detection area according to the image corner points;
detecting feature points in the feature point detection area by adopting the SIFT algorithm;
generating a descriptor of the feature point;
according to the descriptors of the feature points, a bidirectional FLANN algorithm is adopted to match the feature points of the adjacent time images, and matching point pairs are obtained;
and eliminating the error matching point pair in the matching point pair to complete image matching.
With reference to the first aspect, further, the detecting the feature points in the feature point detection area by using the SIFT algorithm includes:
establishing a Gaussian pyramid scale space according to the feature point detection region, and establishing a DoG scale space based on the Gaussian pyramid scale space;
comparing the image corner with the 26 pixels in the DoG scale space; if the pixel value of the image corner is the maximum or the minimum among these 26 pixel values, the image corner is regarded as a feature point of the image in the DoG scale space, wherein the 26 pixels comprise the 8 adjacent pixels in the current DoG layer and the 9 pixels in each of the DoG layers directly above and below.
With reference to the first aspect, further, if feature points exist in the DoG scale space of the corner neighborhood, reserving the DoG scale space; otherwise, deleting the DoG scale space.
With reference to the first aspect, further, the generating a descriptor of the feature point includes:
determining the main direction of the feature point according to the gradient magnitudes of the 16×16 neighborhood pixels of the feature point;
selecting a 16×16 region centered on the feature point, representing the gradient direction of each pixel by the direction of an arrow and the gradient magnitude by its length, and weighting the gradient magnitudes of the neighborhood pixels with a Gaussian weight function whose scale factor is half the width of the Gaussian window;
and calculating gradient information in 8 directions within each 4×4 sub-region to obtain a 128-dimensional vector; this vector, together with the main direction of the feature point, constitutes the descriptor of the feature point.
With reference to the first aspect, further, the performing feature point matching on the adjacent time images by using a bidirectional FLANN algorithm to obtain a matching point pair includes:
matching descriptors of feature points of the adjacent time images by using a FLANN algorithm to obtain a first calculation sequence candidate matching point pair;
exchanging the calculation sequence of the descriptors of the adjacent time image feature points, and matching the descriptors of the adjacent time image feature points by using a FLANN algorithm to obtain a second calculation sequence candidate matching point pair;
and acquiring an intersection of the first calculation sequence candidate matching point pair and the second calculation sequence candidate matching point pair as the matching point pair.
With reference to the first aspect, further, the performing feature point matching on the adjacent time images by using a bidirectional FLANN algorithm to obtain a matching point pair includes:
matching descriptors of feature points of the adjacent time images by using a FLANN algorithm to obtain a first calculation sequence candidate matching point pair;
exchanging the calculation sequence of the descriptors of the adjacent time image feature points in the first calculation sequence candidate matching point pairs, matching those descriptors again by using the FLANN algorithm, and screening the matching point pairs from the first calculation sequence candidate matching point pairs.
With reference to the first aspect, further, a RANSAC algorithm is adopted to eliminate mismatched point pairs among the matching point pairs.
With reference to the first aspect, further, in the process of eliminating mismatched point pairs by using the RANSAC algorithm, a homography matrix is calculated from 4 randomly selected matching point pairs.
In a second aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method of any of the first aspects.
In a third aspect, the present application provides an electronic device comprising a processor and a memory connected to the processor, a computer program being stored in the memory, which when executed by the processor performs the steps of the method of any of the first aspects.
Compared with the prior art, the application has the beneficial effects that:
according to the application, the FAST is utilized to detect the corner points, and the SIFT corner point detection area is determined according to the corner point detection area of the FAST, so that the stability of the feature points is improved, and the feature points are more reliable; and the feature point matching is completed by adopting a bidirectional FLANN algorithm, so that the precision of the feature point matching is improved. The application has strong robustness.
Drawings
Fig. 1 is a flowchart of an image matching method according to an embodiment of the present application.
FIG. 2 is a schematic diagram of a Gaussian differential pyramid according to an embodiment of the application;
fig. 3 is a schematic diagram of a FAST algorithm provided according to an embodiment of the present application;
fig. 4 is a flow chart of a RANSAC algorithm employed in accordance with an embodiment of the present application.
Description of the embodiments
The following detailed description of the technical solutions of the present application will be given by way of the accompanying drawings and specific embodiments, and it should be understood that the specific features of the embodiments and embodiments of the present application are detailed descriptions of the technical solutions of the present application, and not limiting the technical solutions of the present application, and that the embodiments and technical features of the embodiments of the present application may be combined with each other without conflict.
Embodiment one
Fig. 1 is a flowchart of an image matching method in a first embodiment of the present application. The flow chart merely shows the logical sequence of the method according to the present embodiment, and the steps shown or described may be performed in a different order than shown in fig. 1 in other possible embodiments of the application without mutual conflict.
Referring to fig. 1, the method of the present embodiment specifically includes the following steps:
step one: processing a plurality of images acquired in the same scene to acquire a corresponding image sequence;
multiple images of the same scene may be acquired using a vision sensor, which may optionally be a camera.
Step two: performing corner detection on the image sequence by adopting a FAST algorithm to obtain image corners;
As shown in FIG. 3, a Bresenham circle with a radius of 3 pixels is constructed centered on point P. Let the brightness of P be I_p and set a threshold t. The brightness of each of the 16 points on the circle is compared with I_p; if there exist N consecutive points whose absolute brightness difference from I_p is greater than t, P is regarded as a candidate corner (N is usually taken as 9 or 12). Non-maximum suppression is then performed, and the points with the maximum response are the corners. The detection process is as follows:
step1: firstly, acquiring a picture;
Step2: selecting a point p in the image, assuming its gray value is I_p;
Step3: setting a threshold T;
Step4: taking the point p as the center, obtaining the 16 points on a circle with a radius of 3 pixels;
Step5: comparing the brightness of each of the 16 points with I_p + T and I_p - T;
Step6: if M consecutive points among the 16 are all brighter than I_p + T or all darker than I_p - T, then p is a corner (M is usually 9, 11 or 12);
Step7: repeating Step2-Step6, and executing the same operation on each point;
step8: and suppressing the non-maximum value of the key point.
Step three: determining a feature point detection area according to the image corner points;
step four: detecting the characteristic points in the characteristic point detection area by adopting an SFIT algorithm; the method specifically comprises the following steps:
establishing a Gaussian pyramid scale space according to the feature point detection region, and establishing a DoG scale space based on the Gaussian pyramid scale space; referring to fig. 2, the left side of the figure is a gaussian pyramid scale space and the right side is a DoG scale space.
Comparing the image corner with the 26 pixels in the DoG scale space: if the pixel value of the image corner is an extremum (maximum or minimum) among the pixel values of these 26 pixels, the image corner is regarded as a feature point of the image in the DoG scale space. The 26 pixels comprise the 8 adjacent pixels in the current DoG layer and the 9 pixels in each of the DoG layers directly above and below. As shown in FIG. 2, each layer is a DoG scale image; taking a pixel as the center of a 3×3 grid gives 8 surrounding pixels in its own layer plus 9 pixels in each of the upper and lower layers.
It should be noted that, if feature points exist in the DoG scale space of the corner neighborhood, the DoG scale space is reserved; otherwise, deleting the DoG scale space. The corner neighborhood is the area around the corner.
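The 26-neighbour extremum test can be illustrated with a small sketch. `is_dog_extremum` is a hypothetical helper (not from the patent) operating on three consecutive DoG layers stored as NumPy arrays:

```python
import numpy as np

def is_dog_extremum(lower, current, upper, y, x):
    """Return True if current[y, x] is the strict maximum or minimum among
    its 26 neighbours: the 8 surrounding pixels in its own DoG layer plus
    the 3x3 patches in the layers directly above and below."""
    v = current[y, x]
    patch_cur = current[y - 1:y + 2, x - 1:x + 2]
    patch_low = lower[y - 1:y + 2, x - 1:x + 2]
    patch_up = upper[y - 1:y + 2, x - 1:x + 2]
    neighbours = np.concatenate([
        np.delete(patch_cur.ravel(), 4),  # drop the centre pixel itself
        patch_low.ravel(),
        patch_up.ravel(),
    ])
    return bool(v > neighbours.max() or v < neighbours.min())
```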
Step five: generating a descriptor of the feature point; the method specifically comprises the following steps:
determining the main direction of the feature point according to the gradient magnitudes of the 16×16 neighborhood pixels of the feature point;
selecting a 16×16 region centered on the feature point, representing the gradient direction of each pixel by the direction of an arrow and the gradient magnitude by its length, and weighting the gradient magnitudes of the neighborhood pixels with a Gaussian weight function whose scale factor is half the width of the Gaussian window;
and calculating gradient information in 8 directions within each 4×4 sub-region to obtain a 4×4×8 = 128-dimensional vector; this vector, together with the main direction of the feature point, constitutes the descriptor of the feature point.
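The 4×4×8 histogram layout of the descriptor can be sketched as follows. This is a simplified illustration, not a full SIFT implementation: the rotation to the main direction and the Gaussian weighting are omitted for brevity, and only the magnitude-weighted 8-bin orientation histograms over the 4×4 grid of cells are shown:

```python
import numpy as np

def sift_like_descriptor(patch):
    """Build a 4x4x8 = 128-dimensional descriptor from a 16x16 grayscale
    patch by accumulating magnitude-weighted 8-bin orientation histograms
    in each 4x4 cell, then L2-normalising the result."""
    assert patch.shape == (16, 16)
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), 2 * np.pi)                # direction in [0, 2*pi)
    bins = np.minimum((ang / (2 * np.pi) * 8).astype(int), 7)  # 8 orientation bins
    desc = np.zeros((4, 4, 8))
    for cy in range(4):                       # 4x4 grid of 4x4-pixel cells
        for cx in range(4):
            cell = slice(cy * 4, cy * 4 + 4), slice(cx * 4, cx * 4 + 4)
            for b, m in zip(bins[cell].ravel(), mag[cell].ravel()):
                desc[cy, cx, b] += m          # magnitude-weighted histogram
    v = desc.ravel()
    n = np.linalg.norm(v)
    return v / n if n > 0 else v              # normalised 128-d vector
```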
Step six: according to the descriptors of the feature points, a bidirectional FLANN algorithm is adopted to match the feature points of the adjacent time images, and matching point pairs are obtained;
as an embodiment of the present application, the feature point matching is performed on the adjacent time images by using a bidirectional FLANN algorithm, and the obtaining of the matching point pair includes:
matching descriptors of feature points of the adjacent time images by using a FLANN algorithm to obtain a first calculation sequence candidate matching point pair;
exchanging the calculation sequence of the descriptors of the adjacent time image feature points, and matching the descriptors of the adjacent time image feature points by using a FLANN algorithm to obtain a second calculation sequence candidate matching point pair;
and acquiring an intersection of the first calculation sequence candidate matching point pair and the second calculation sequence candidate matching point pair as the matching point pair.
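The intersection of the two one-way match sets can be sketched as follows. For clarity the approximate FLANN search is replaced here by an exact brute-force nearest-neighbour search, so this illustrates the bidirectional (mutual) check itself rather than the FLANN library:

```python
import numpy as np

def bidirectional_match(desc_a, desc_b):
    """Mutual nearest-neighbour matching: keep pair (i, j) only when j is
    the nearest neighbour of descriptor i in desc_b AND i is the nearest
    neighbour of j in desc_a, i.e. the intersection of the forward and
    backward match sets."""
    # Pairwise squared Euclidean distances, shape (len(a), len(b)).
    d2 = ((desc_a[:, None, :] - desc_b[None, :, :]) ** 2).sum(-1)
    fwd = d2.argmin(axis=1)   # best match in B for each A descriptor
    bwd = d2.argmin(axis=0)   # best match in A for each B descriptor
    return [(i, int(j)) for i, j in enumerate(fwd) if bwd[j] == i]
```

A feature of one image with no true counterpart in the other tends to fail the backward check, which is why the bidirectional scheme removes many one-way mismatches.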
As another embodiment of the present application, the feature point matching is performed on the adjacent time images by adopting a bidirectional FLANN algorithm, and a matching point pair is obtained, which further comprises the following steps:
matching descriptors of feature points of the adjacent time images by using a FLANN algorithm to obtain a first calculation sequence candidate matching point pair;
exchanging the calculation sequence of the descriptors of the adjacent time image feature points in the first calculation sequence candidate matching point pairs, matching those descriptors again by using the FLANN algorithm, and screening the matching point pairs from the first calculation sequence candidate matching point pairs.
Step seven: and eliminating the error matching point pair in the matching point pair to complete image matching.
As an embodiment of the present application, a RANSAC algorithm may be used to reject mismatching point pairs among the matching point pairs. In the process of eliminating the mismatching point pairs by using the RANSAC algorithm, a homography matrix is calculated according to the randomly selected 4 pairs of matching point pairs. Referring to fig. 4, a flow chart of a RANSAC algorithm provided by an embodiment of the present application includes the following steps:
Step A: inputting the data: the maximum number of iterations L, the error threshold δ for judging inliers, and the threshold E on the number of inliers;
Step B: judging whether the iteration count is greater than the maximum number of iterations L: if yes, no conforming model can be found and an error prompt is given; if not, randomly selecting 4 matching point pairs to calculate a homography matrix H;
Step C: calculating the number of inliers e according to the homography matrix H;
Step D: comparing the number of inliers e with the set threshold E: if e is greater than E, recalculating the homography matrix H and the inlier set from the e inliers and outputting the result; otherwise, incrementing the iteration count by 1 and returning to Step B.
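The sample/score/refit loop of Steps A to D can be sketched generically. To keep the example self-contained it fits a 2D line model (minimal sample of 2 points) instead of a homography (minimal sample of 4 point pairs and a 3×3 matrix H), but the loop structure is the same:

```python
import numpy as np

def ransac(points, max_iter=200, delta=0.5, inlier_thresh=8, seed=0):
    """Generic RANSAC loop mirroring Steps A-D, demonstrated on a 2D line
    model a*x + b*y + c = 0: repeatedly sample a minimal set, count points
    within error delta, and refit on all inliers once enough are found."""
    rng = np.random.default_rng(seed)

    def fit(pts):
        # Total-least-squares line: the normal is the direction of least variance.
        centred = pts - pts.mean(0)
        _, _, vt = np.linalg.svd(centred)
        a, b = vt[-1]
        c = -(a * pts[:, 0] + b * pts[:, 1]).mean()
        return a, b, c

    def residuals(model, pts):
        a, b, c = model
        return np.abs(a * pts[:, 0] + b * pts[:, 1] + c) / np.hypot(a, b)

    for _ in range(max_iter):                        # Step B: iteration budget L
        sample = points[rng.choice(len(points), 2, replace=False)]
        model = fit(sample)
        inliers = residuals(model, points) < delta   # Step C: count inliers
        if inliers.sum() >= inlier_thresh:           # Step D: enough inliers?
            return fit(points[inliers]), int(inliers.sum())
    raise RuntimeError("no model found within the iteration budget")
```

For the image matching case, `fit` would compute H from 4 point pairs (e.g. by direct linear transformation) and `residuals` would measure the reprojection error of each matching point pair under H.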
In the image matching method provided by the embodiment of the application, a FAST-SIFT fusion strategy processes the image information in real time: the FAST algorithm first detects corners in the image, and SIFT feature points are then extracted in the vicinity of the FAST corners, so that the resulting feature points share the properties of both FAST corners and SIFT feature points. The extracted feature points are described with SIFT descriptors, coarse matching is completed with the bidirectional FLANN algorithm to obtain matching point pairs, and mismatched pairs are finally eliminated.
The image matching method provided in this embodiment may be applied to a terminal, and may be performed by an image matching apparatus, where the apparatus may be implemented by software and/or hardware, and the apparatus may be integrated in the terminal, for example: any smart phone, tablet computer or computer device with communication function.
Embodiment two
The embodiment of the application also provides electronic equipment, which comprises a processor and a storage medium;
the storage medium is used for storing instructions;
the processor is configured to operate in accordance with the instructions to perform the steps of the method of embodiment one.
The electronic device provided by the embodiment of the application can execute the method provided by the embodiment of the application, and has the corresponding beneficial effects of executing the method.
Embodiment three
The embodiment of the present application also provides a computer-readable storage medium, on which a computer program is stored, which when being executed by a processor, implements the steps of the method of the embodiment one.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The foregoing is merely a preferred embodiment of the present application, and it should be noted that modifications and variations could be made by those skilled in the art without departing from the technical principles of the present application, and such modifications and variations should also be regarded as being within the scope of the application.

Claims (10)

1. An image matching method, comprising:
processing a plurality of images acquired in the same scene to acquire a corresponding image sequence;
performing corner detection on the image sequence by adopting a FAST algorithm to obtain image corners;
determining a feature point detection area according to the image corner points;
detecting feature points in the feature point detection area by adopting the SIFT algorithm;
generating a descriptor of the feature point;
according to the descriptors of the feature points, a bidirectional FLANN algorithm is adopted to match the feature points of the adjacent time images, and matching point pairs are obtained;
and eliminating the error matching point pair in the matching point pair to complete image matching.
2. The image matching method according to claim 1, wherein the detecting the feature points in the feature point detection area using the SIFT algorithm comprises:
establishing a Gaussian pyramid scale space according to the feature point detection region, and establishing a DoG scale space based on the Gaussian pyramid scale space;
comparing the image corner with the 26 pixels in the DoG scale space; if the pixel value of the image corner is the maximum or the minimum among these 26 pixel values, the image corner is regarded as a feature point of the image in the DoG scale space, wherein the 26 pixels comprise the 8 adjacent pixels in the current DoG layer and the 9 pixels in each of the DoG layers directly above and below.
3. The image matching method according to claim 2, wherein if there are feature points in the DoG scale space of the corner neighborhood, the DoG scale space is preserved; otherwise, deleting the DoG scale space.
4. The image matching method according to claim 1, wherein the generating the descriptor of the feature point includes:
determining the main direction of the feature point according to the gradient magnitudes of the 16×16 neighborhood pixels of the feature point;
selecting a 16×16 region centered on the feature point, representing the gradient direction of each pixel by the direction of an arrow and the gradient magnitude by its length, and weighting the gradient magnitudes of the neighborhood pixels with a Gaussian weight function whose scale factor is half the width of the Gaussian window;
and calculating gradient information in 8 directions within each 4×4 sub-region to obtain a 128-dimensional vector; this vector, together with the main direction of the feature point, constitutes the descriptor of the feature point.
5. The image matching method according to claim 1, wherein the feature point matching is performed on the adjacent time images by using a bi-directional FLANN algorithm, and the obtaining of the matching point pair includes:
matching descriptors of feature points of the adjacent time images by using a FLANN algorithm to obtain a first calculation sequence candidate matching point pair;
exchanging the calculation sequence of the descriptors of the adjacent time image feature points, and matching the descriptors of the adjacent time image feature points by using a FLANN algorithm to obtain a second calculation sequence candidate matching point pair;
and acquiring an intersection of the first calculation sequence candidate matching point pair and the second calculation sequence candidate matching point pair as the matching point pair.
6. The image matching method according to claim 1, wherein the feature point matching is performed on the adjacent time images by using a bi-directional FLANN algorithm, and the obtaining of the matching point pair includes:
matching descriptors of feature points of the adjacent time images by using a FLANN algorithm to obtain a first calculation sequence candidate matching point pair;
exchanging the calculation sequence of the descriptors of the adjacent time image feature points in the first calculation sequence candidate matching point pairs, matching those descriptors again by using the FLANN algorithm, and screening the matching point pairs from the first calculation sequence candidate matching point pairs.
7. The image matching method according to claim 1, wherein a RANSAC algorithm is used to reject mismatching point pairs among the matching point pairs.
8. The image matching method according to claim 7, wherein, in the process of eliminating mismatched point pairs using the RANSAC algorithm, a homography matrix is calculated from 4 randomly selected matching point pairs.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the steps of the method according to any one of claims 1-8.
10. An electronic device comprising a processor and a memory coupled to the processor, wherein a computer program is stored in the memory, which, when executed by the processor, performs the steps of the method according to any of claims 1-8.
CN202310716636.5A 2023-06-15 2023-06-15 Image matching method, device and storage medium Pending CN116863170A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310716636.5A CN116863170A (en) 2023-06-15 2023-06-15 Image matching method, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310716636.5A CN116863170A (en) 2023-06-15 2023-06-15 Image matching method, device and storage medium

Publications (1)

Publication Number Publication Date
CN116863170A 2023-10-10

Family

ID=88222477

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310716636.5A Pending CN116863170A (en) 2023-06-15 2023-06-15 Image matching method, device and storage medium

Country Status (1)

Country Link
CN (1) CN116863170A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117690016A (en) * 2023-11-07 2024-03-12 国网四川省电力公司信息通信公司 Service scene image matching method and system for transformer substation


Similar Documents

Publication Publication Date Title
Han et al. Visible and infrared image registration in man-made environments employing hybrid visual features
CN102640185B (en) The method and apparatus of the combined tracking that object represents in real time in image sequence
US20140270362A1 (en) Fast edge-based object relocalization and detection using contextual filtering
Chen et al. An improved edge detection algorithm for depth map inpainting
CN103279952A (en) Target tracking method and device
Valiente García et al. Visual Odometry through Appearance‐and Feature‐Based Method with Omnidirectional Images
Qu et al. Image seamless stitching and straightening based on the image block
Son et al. A multi-vision sensor-based fast localization system with image matching for challenging outdoor environments
CN111105452A (en) High-low resolution fusion stereo matching method based on binocular vision
CN116863170A (en) Image matching method, device and storage medium
Qu et al. The algorithm of seamless image mosaic based on A‐KAZE features extraction and reducing the inclination of image
Stentoumis et al. A local adaptive approach for dense stereo matching in architectural scene reconstruction
Majdik et al. New approach in solving the kidnapped robot problem
Żak et al. Local image features matching for real-time seabed tracking applications
Ekekrantz et al. Adaptive iterative closest keypoint
CN108447084B (en) Stereo matching compensation method based on ORB characteristics
Zhao Rapid multimodal image registration based on the local edge histogram
CN113704276A (en) Map updating method and device, electronic equipment and computer readable storage medium
Wang et al. Target recognition and localization of mobile robot with monocular PTZ camera
Dong et al. Superpixel-based local features for image matching
CN116128919A (en) Multi-temporal image abnormal target detection method and system based on polar constraint
CN113674340A (en) Binocular vision navigation method and device based on landmark points
CN112614166A (en) Point cloud matching method and device based on CNN-KNN
Ziomek et al. Evaluation of interest point detectors in presence of noise
Cai et al. Unfeatured weld positioning technology based on neural network and machine vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination