CN107330436B - Scale criterion-based panoramic image SIFT optimization method


Info

Publication number
CN107330436B
CN107330436B (application CN201710443220.5A / CN201710443220A)
Authority
CN
China
Prior art keywords
sift
criterion
panoramic
scale
matching
Prior art date
Legal status
Active
Application number
CN201710443220.5A
Other languages
Chinese (zh)
Other versions
CN107330436A (en)
Inventor
朱齐丹
纪勋
王靖淇
Current Assignee
Harbin Engineering University
Original Assignee
Harbin Engineering University
Priority date
Filing date
Publication date
Application filed by Harbin Engineering University
Priority to CN201710443220.5A
Publication of CN107330436A
Publication of CN107330436B (application granted)
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/40: Extraction of image or video features
    • G06V 10/46: Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V 10/462: Salient features, e.g. scale invariant feature transforms [SIFT]

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a scale criterion-based panoramic image SIFT optimization method. Aiming at the mismatching problem of the SIFT matching algorithm, the invention provides a mechanism that enables a computer to identify and eliminate mismatched feature pairs in panoramic visual images. For each SIFT feature matching pair, the algorithm applies a panoramic imaging system criterion and a scale criterion. If the conclusions of the two criteria conflict, the matching pair is regarded as a mismatched pair and removed; if they do not conflict, the matching pair is regarded as a correct match and retained. Compared with the traditional SIFT algorithm, the proposed algorithm can automatically detect mismatched pairs and improves the matching precision of SIFT.

Description

Scale criterion-based panoramic image SIFT optimization method
Technical Field
The invention belongs to the field of image matching in computer vision, and particularly relates to a scale criterion-based panoramic image SIFT optimization method.
Background
Image matching is currently one of the most difficult problems in computer vision. When a three-dimensional scene is projected onto a two-dimensional image, images of the same scene taken from different viewpoints may differ greatly, and factors such as lighting conditions, scene geometry and physical characteristics, noise and distortion, and camera characteristics all change the gray values of the image to some extent.
With the development of computers, image matching is applied more and more widely. In the military field, for example, an infrared image acquired in a battlefield environment must be matched and fused with a visible-light image, and combining the two yields a more accurate result. Image matching is also applied in weather forecasting and aviation, where correcting the differences between multi-source remote sensing images enables image fusion and thus more comprehensive ground-feature information.
According to the matching process, image matching methods are divided into gray-scale-based matching and feature-based matching. Feature matching refers to algorithms that first extract features from two or more images, describe those features with parameters, and then match the described parameters under a chosen similarity measure. The scale invariant feature transform (SIFT) belongs to feature-based image matching.
The main idea of SIFT is to turn matching between images into matching between feature vectors: stable features are first extracted and described, and the resulting feature vectors are then matched. A stable feature is one that remains largely invariant to changes in the image and still matches well under object motion, occlusion, noise and similar conditions. In general, SIFT extracts local features by finding extreme points in a scale space and extracting their position, scale and rotation invariants. It is a robust, scale-invariant feature description method that is widely used in image registration, image stitching, household-object classification, face recognition and other fields. However, SIFT also has shortcomings, such as high time complexity, long running time, a large amount of computation, and the need for manual intervention.
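As a concrete point of reference for the later sections, the extract-describe-match pipeline just described can be sketched with OpenCV; the image file names and the 0.75 ratio-test threshold below are illustrative assumptions and are not specified by the patent.

```python
import cv2

# Minimal sketch of the classical SIFT pipeline described above, using OpenCV.
img1 = cv2.imread("panorama_P.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("panorama_Q.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)   # each keypoint carries position, scale, orientation
kp2, des2 = sift.detectAndCompute(img2, None)

# Match descriptors and keep only matches that pass Lowe's ratio test.
matcher = cv2.BFMatcher(cv2.NORM_L2)
knn_pairs = matcher.knnMatch(des1, des2, k=2)
matches = [m for m, n in knn_pairs if m.distance < 0.75 * n.distance]
print(f"{len(matches)} tentative SIFT matching pairs")
```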
In recent years, scholars at home and abroad have worked on optimizing the algorithm to address these shortcomings. For example, the patent with publication number CN104834931A proposes an improved scale-invariant feature matching algorithm based on wavelet transform: a two-dimensional fast wavelet transform is introduced into the classical algorithm to reconstruct the low-frequency component of the image, the number of Gaussian pyramids is then adjusted to reduce the number of down-sampling operations, and finally mismatched points are eliminated by the optimized algorithm. The improved algorithm reduces matching time and raises the matching rate. Nevertheless, image matching technology still falls short of fully automatic matching, in which a computer completes matching between multi-source images by itself, according to a preset program and without any manual intervention.
Disclosure of Invention
The invention aims at the mismatching problem of the SIFT matching algorithm and provides a scale criterion-based panoramic image SIFT optimization method that enables a computer to identify and eliminate mismatched feature pairs in panoramic visual images. Compared with the traditional SIFT algorithm, the proposed algorithm can automatically detect mismatched pairs and improves the matching precision of SIFT.
The invention is realized by the following steps:
A scale criterion-based panoramic image SIFT optimization method is specifically realized by the following steps:
step 1, shooting a plurality of panoramic images at different positions on the same horizontal plane by using a panoramic vision imaging system;
step 2, increasing the number of layers and groups in the original SIFT algorithm, and extracting and matching SIFT features of a plurality of images to obtain SIFT feature matching pairs;
step 3, applying the panoramic imaging system criterion and the scale criterion to each SIFT feature matching pair;
step 4, comparing, for each SIFT matching pair, the conclusion of the panoramic imaging system criterion with that of the scale criterion: if the two conclusions conflict, the matching pair is regarded as a mismatched pair and removed; if they do not conflict, the matching pair is regarded as a correct match and retained;
step 5, checking whether any matching pair remains to be judged; if so, repeating steps 3 to 5, and once all matching pairs have been judged, ending the procedure.
The number of panoramic images in step 1 is at least two.
The panoramic imaging system criterion in step 3 comprises panoramic imaging criterion 1 and panoramic imaging criterion 2: for each SIFT feature matching pair, the positions of the two feature points relative to the panoramic image ring are judged; if both feature points lie inside the panoramic image ring, panoramic imaging criterion 1 is executed, and if both lie outside the panoramic image ring, panoramic imaging criterion 2 is executed.
The scale criterion in step 3 specifies that, for each SIFT feature matching pair, the relationship between the scale values of its two feature points is judged and the scale criterion is executed: the larger the scale value of a SIFT feature point, the smaller the actual distance between the represented landmark and the shooting position; the smaller the scale value, the larger that actual distance.
In the scale criterion of step 3, the number of image layers in the scale space is increased on the premise of ensuring the matching precision.
For the panoramic image ring: when a natural landmark lying on the horizontal plane of the optical axis is mapped into the panoramic image, its imaging point lies on a single ring in the panoramic image, and no matter how the imaging system moves horizontally, the imaging point of such a landmark never leaves this ring.
The invention has the beneficial effects that:
aiming at the feature extraction of the panoramic image, the invention designs a scale criterion-based panoramic image SIFT optimization algorithm. By setting the panoramic image criterion and the scale criterion, the computer can automatically detect and remove the mismatching points, and the matching precision of the characteristic points is improved. The method performs algorithm optimization in the traditional SIFT extraction algorithm, effectively improves the algorithm accuracy, and can be widely applied to the fields of image processing, pattern recognition, robot navigation and the like.
Drawings
FIG. 1 is a flow chart of the system of the present invention.
Fig. 2 is a front view of the panoramic imaging system criterion of the present invention.
FIG. 3 is a top view of the panoramic imaging system criteria of the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
Fig. 1 shows a flow chart of the system of the present invention. The detailed process is as follows:
step 1, respectively shooting two panoramic images at different positions on the same horizontal plane by using a panoramic vision imaging system;
step 2, adjusting the number of layers in the original SIFT algorithm from 3 to 6, and extracting and matching SIFT features of the two images (see the sketch after these steps);
step 3, for each SIFT feature matching pair, judging the positions of its two feature points relative to the panoramic image ring: if both feature points lie inside the ring, executing panoramic imaging criterion 1, and if both lie outside the ring, executing panoramic imaging criterion 2; also judging, for each SIFT feature matching pair, the relationship between the scale values of its two feature points and executing the scale criterion;
step 4, checking whether the two criteria of step 3 conflict: if they conflict, the matching pair is regarded as a mismatched pair and removed; if they do not conflict, the matching pair is regarded as a correct match and retained;
step 5, checking whether any matching pair remains to be judged; if so, repeating steps 3 to 5, and once all matching pairs have been judged, ending the procedure.
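One way to realize the layer adjustment of step 2 with an off-the-shelf SIFT implementation is sketched below. Treating OpenCV's nOctaveLayers parameter (default 3) as the "number of layers" adjusted in step 2 is an assumption made purely for illustration, as are the file names.

```python
import cv2

# Sketch of step 2: raise the number of intra-octave layers from the default 3
# to 6 before extracting SIFT features from the two panoramas of step 1.
# The mapping of nOctaveLayers onto the patent's "layers" is assumed.
img1 = cv2.imread("panorama_P.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("panorama_Q.jpg", cv2.IMREAD_GRAYSCALE)

sift6 = cv2.SIFT_create(nOctaveLayers=6)
kp1, des1 = sift6.detectAndCompute(img1, None)
kp2, des2 = sift6.detectAndCompute(img2, None)
```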
Fig. 2 and Fig. 3 show the panoramic imaging system criteria of the present invention, where Fig. 2 is a front view and Fig. 3 is a top view. According to the imaging principle of the panoramic vision imaging system, almost all environmental information in the space can be mapped onto one panoramic image. In Fig. 2, A', B', C', D', H1' and H2' are six landmarks in the real scene: H1' and H2' lie on the horizontal plane containing the optical axis of the panoramic vision imaging system, A' and B' lie above the optical axis at the same vertical height, and C' and D' lie below the optical axis at the same vertical height; F1 and F2 are also marked in Fig. 2. To distinguish the connections between each landmark and its corresponding imaging point in Fig. 2, the line connecting A' and A is drawn in black, the line connecting B' and B in dark gray, and the line connecting D' and D in light gray. In Fig. 3, A, B, C, D, H1 and H2 are the imaging points of the landmarks A', B', C', D', H1' and H2' in the panoramic image, respectively.
As can be seen from Fig. 2 and Fig. 3, when a natural landmark on the horizontal plane of the optical axis is mapped into the panoramic image, its imaging point lies on a single ring in the panoramic image, and no matter how the imaging system moves horizontally, this imaging point never leaves the ring. A natural landmark above the optical axis has its imaging point outside the ring; with the vertical height unchanged, the farther the landmark is from the shooting position, the closer the imaging point is to the ring and to the image center. A natural landmark below the optical axis has its imaging point inside the ring; with the vertical height unchanged, the farther the landmark is from the shooting position, the closer the imaging point is to the ring and the farther it is from the image center.
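For illustration, the geometric test just described can be expressed as two small helpers; the image center and the ring radius are assumed to come from a calibration of the panoramic imaging system, which the patent does not spell out.

```python
import math

def distance_to_center(kp, center):
    """Image distance d(f, O_I) between a SIFT keypoint and the image center."""
    cx, cy = center
    x, y = kp.pt
    return math.hypot(x - cx, y - cy)

def ring_position(kp, center, ring_radius, tol=2.0):
    """Classify a keypoint as 'inside', 'on' or 'outside' the panoramic image
    ring. ring_radius and the pixel tolerance tol are assumed calibration
    values, not quantities defined by the patent."""
    d = distance_to_center(kp, center)
    if abs(d - ring_radius) <= tol:
        return "on"
    return "inside" if d < ring_radius else "outside"
```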
Based on this phenomenon, the relative distance between an actual landmark and the two shooting positions can be judged from the image distances between the two SIFT feature points of a matching pair and the centers of their respective images. Let {fP, fQ} be a feature matching pair in two panoramic images taken at position P and position Q, let L be the natural landmark in the real scene represented by fP and fQ, and let OIP and OIQ be the centers of the two images. By computing the image distances d(fP, OIP) and d(fQ, OIQ), the relationship between the actual distances d(L, P) and d(L, Q) can be determined. The panoramic imaging system criteria used by the invention are therefore as follows:
When fP and fQ are both located inside the panoramic image ring, panoramic imaging criterion 1 is applied; its expression is:
if d(fP, OIP) > d(fQ, OIQ), then d(L, P) > d(L, Q); if d(fP, OIP) < d(fQ, OIQ), then d(L, P) < d(L, Q).
When fP and fQ are both located outside the panoramic image ring, panoramic imaging criterion 2 is applied; its expression is:
if d(fP, OIP) > d(fQ, OIQ), then d(L, P) < d(L, Q); if d(fP, OIP) < d(fQ, OIQ), then d(L, P) > d(L, Q).
The scale information is obtained from the scale-space extreme-value detection and key-point localization in the SIFT algorithm. SIFT builds a scale space by repeatedly applying Gaussian blur and down-sampling to the initial image, producing several octaves (groups) and layers of transformed images; every SIFT point is generated from this scale space and therefore carries a specific scale value. The scale value provides an important piece of information: if a natural landmark in the real scene lies closer to the shooting position, Gaussian smoothing with a higher degree of blur is usually required, i.e., the corresponding SIFT feature point usually has a larger scale value. For this reason, the number of image layers in the scale space is increased, so that the total number of matched SIFT feature points grows and the scale information is enriched while the matching precision is preserved; and by comparing the scale values σP and σQ of the two feature points in the matching pair {fP, fQ}, the relative distance between the feature pair and the two shooting positions is judged. This is the scale criterion, given as follows:
if σP > σQ, then d(L, P) < d(L, Q); if σP < σQ, then d(L, P) > d(L, Q).
as the judgment standards of two different angles, the panoramic imaging system criterion and the scale criterion can judge the relative distance relationship between the natural road sign and the actual shooting position. Therefore, for a certain SIFT feature matching pair, if the conclusions drawn by the two criteria conflict with each other, the basic information of the matching pair is necessarily violated by one of the criteria, and thus the matching pair can be proved to be a mismatching pair.

Claims (3)

1. A scale criterion-based panoramic image SIFT optimization method is characterized by comprising the following specific implementation steps:
step 1, shooting a plurality of panoramic images at different positions on the same horizontal plane by using a panoramic vision imaging system;
step 2, increasing the number of layers and groups in the original SIFT algorithm, and extracting and matching SIFT features of a plurality of images to obtain SIFT feature matching pairs;
step 3, applying the panoramic imaging system criterion and the scale criterion to each SIFT feature matching pair;
step 4, comparing, for each SIFT matching pair, the conclusion of the panoramic imaging system criterion with that of the scale criterion: if the two conclusions conflict, the matching pair is regarded as a mismatched pair and removed; if they do not conflict, the matching pair is regarded as a correct match and retained;
step 5, checking whether any matching pair remains to be judged; if so, repeating steps 3 to 5, and once all matching pairs have been judged, ending the procedure;
the panoramic imaging criterion in step 3 comprises panoramic imaging criterion 1 and panoramic imaging criterion 2: for each SIFT feature matching pair, the positions of its two feature points relative to the panoramic image ring are judged; if both feature points lie inside the panoramic image ring, panoramic imaging criterion 1 is executed, and if both lie outside the panoramic image ring, panoramic imaging criterion 2 is executed;
the scale criterion in step 3 specifies that, for each SIFT feature matching pair, the relationship between the scale values of its two feature points is judged and the scale criterion is executed: the larger the scale value of a SIFT feature point, the smaller the actual distance between the represented landmark and the shooting position; the smaller the scale value, the larger that actual distance.
2. The scale criterion-based panoramic image SIFT optimization method of claim 1, characterized in that: the number of panoramic images in step 1 is at least two.
3. The scale criterion-based panoramic image SIFT optimization method of claim 1, characterized in that: when a natural landmark lying on the horizontal plane of the optical axis is mapped into the panoramic image, its imaging point lies on a single ring in the panoramic image, and no matter how the imaging system moves horizontally, the imaging point of such a landmark never leaves the panoramic image ring.
CN201710443220.5A 2017-06-13 2017-06-13 Scale criterion-based panoramic image SIFT optimization method Active CN107330436B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710443220.5A CN107330436B (en) 2017-06-13 2017-06-13 Scale criterion-based panoramic image SIFT optimization method

Publications (2)

Publication Number Publication Date
CN107330436A CN107330436A (en) 2017-11-07
CN107330436B (en) 2020-07-28

Family

ID=60195554

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710443220.5A Active CN107330436B (en) 2017-06-13 2017-06-13 Scale criterion-based panoramic image SIFT optimization method

Country Status (1)

Country Link
CN (1) CN107330436B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108805799B (en) * 2018-04-20 2021-04-23 Ping An Technology (Shenzhen) Co., Ltd. Panoramic image synthesis apparatus, panoramic image synthesis method, and computer-readable storage medium
CN110245566B (en) * 2019-05-16 2021-07-13 西安交通大学 Infrared target remote tracking method based on background features

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101833765A (en) * 2010-04-30 2010-09-15 天津大学 Characteristic matching method based on bilateral matching and trilateral restraining
EP2237227A1 (en) * 2009-04-01 2010-10-06 Nederlandse Organisatie voor toegepast -natuurwetenschappelijk onderzoek TNO Video sequence processing method and system
CN106355555A (en) * 2016-10-24 2017-01-25 北京林业大学 Image stitching method and device
CN106530407A (en) * 2016-12-14 2017-03-22 深圳市金大象文化发展有限公司 Three-dimensional panoramic splicing method, device and system for virtual reality

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9361943B2 (en) * 2006-11-07 2016-06-07 The Board Of Trustees Of The Leland Stanford Jr. University System and method for tagging objects in a panoramic video and associating functions and indexing panoramic images with same

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2237227A1 (en) * 2009-04-01 2010-10-06 Nederlandse Organisatie voor toegepast -natuurwetenschappelijk onderzoek TNO Video sequence processing method and system
CN101833765A (en) * 2010-04-30 2010-09-15 天津大学 Characteristic matching method based on bilateral matching and trilateral restraining
CN106355555A (en) * 2016-10-24 2017-01-25 北京林业大学 Image stitching method and device
CN106530407A (en) * 2016-12-14 2017-03-22 深圳市金大象文化发展有限公司 Three-dimensional panoramic splicing method, device and system for virtual reality

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Qidan Zhu et al.; "A Novel Robot Visual Homing Method Based on SIFT Features"; Sensors; 2015-10-14; pp. 26063-26084 *

Also Published As

Publication number Publication date
CN107330436A (en) 2017-11-07

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant