CN103136760B - Multi-sensor image matching method based on FAST and DAISY - Google Patents

Multi-sensor image matching method based on FAST and DAISY

Info

Publication number
CN103136760B
Authority
CN
China
Prior art keywords
point
image
feature point
matched
reference image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310099478.XA
Other languages
Chinese (zh)
Other versions
CN103136760A (en)
Inventor
赵振兵
闫亚静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
North China Electric Power University
Original Assignee
North China Electric Power University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by North China Electric Power University filed Critical North China Electric Power University
Priority to CN201310099478.XA priority Critical patent/CN103136760B/en
Publication of CN103136760A publication Critical patent/CN103136760A/en
Application granted granted Critical
Publication of CN103136760B publication Critical patent/CN103136760B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a multi-sensor image matching method based on FAST and DAISY, belonging to the field of image analysis technology. The technical scheme is as follows: first, FAST corner detection is performed on the reference image and the image to be matched, respectively, to extract feature points; second, DAISY processing is applied to the feature points extracted from the reference image and the image to be matched, respectively, to generate descriptor vectors; finally, the Euclidean distance is used to obtain initial matching point pairs from the feature points, and the RANSAC algorithm is then used to screen accurate matching point pairs from the initial matching point pairs, thereby realizing the matching of the reference image and the image to be matched. The invention substantially reduces the time of the whole matching process, achieves a good matching effect, and provides a useful reference for the design of solutions to related problems.

Description

Multi-sensor image matching method based on FAST and DAISY
Technical field
The invention belongs to the field of image analysis technology and in particular relates to a multi-sensor image matching method based on FAST and DAISY.
Background technology
Image matching, as a key technology of pattern recognition and image processing, has been widely applied in remote sensing data analysis, medical image processing, and computer vision.
Put simply, image matching is the technique of aligning different images of the same scene, which involves analysis across multiple viewpoints, multiple time scales, multiple modalities, and so on. However, because the imaging modes of different image sensors differ, there are obvious limitations and differences in the image data acquired by different sensors, so multi-sensor image matching is a particularly important problem.
Feature point detection and description are particularly important steps in image matching, so the quality of the feature point detection and description algorithms directly influences the matching results.
FAST (Features from Accelerated Segment Test) is a simple and fast feature point detection algorithm that can determine corner points quickly. The DAISY feature description algorithm replaces the commonly used histograms with convolved histograms, which reduces the computational cost and achieves dense wide-baseline correspondence. At present, FAST and DAISY have each been applied separately to single-sensor image matching.
Summary of the invention
In order to improve the accuracy and speed of multi-sensor image matching, the present invention proposes a multi-sensor image matching method based on FAST and DAISY.
A multi-sensor image matching method based on FAST and DAISY, characterized in that the method specifically includes the following steps:
Step 1: perform FAST corner detection on the reference image and the image to be matched, respectively, to extract feature points;
Step 2: apply DAISY processing to the feature points extracted from the reference image and the image to be matched, respectively, to generate descriptor vectors;
Step 3: use the Euclidean distance to obtain initial matching point pairs from the feature points, then use the RANSAC algorithm to screen accurate matching point pairs from the initial matching point pairs, thereby realizing the matching of the reference image and the image to be matched.
The feature point extraction process is as follows:
Step 101: choose a candidate point C to be examined;
Step 102: examine the points on the circle centered at point C with radius R; consider the 16 pixels on the circumference, and if there are 9 or more contiguous points whose gray values are all greater than, or all less than, the gray value of point C by more than a set threshold T, then point C is regarded as a feature point.
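A minimal sketch of this detection step, for illustration only and not the patented implementation: OpenCV's FAST detector configured for the 9-of-16 segment test. The file name is a placeholder; the threshold of 30 matches the value used in the embodiments below.

```python
import cv2

# Illustrative sketch of steps 101-102: FAST corners via the 9-of-16 segment test.
# "reference.png" is a placeholder file name.
img = cv2.imread("reference.png", cv2.IMREAD_GRAYSCALE)

fast = cv2.FastFeatureDetector_create(
    threshold=30,                               # intensity-difference threshold T
    nonmaxSuppression=True,
    type=cv2.FastFeatureDetector_TYPE_9_16,     # 16-pixel circle, at least 9 contiguous pixels
)
keypoints = fast.detect(img, None)              # list of cv2.KeyPoint objects
print(len(keypoints), "FAST corners detected")
```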
The descriptor vector generation process is as follows:
Step 201: for each feature point, compute its convolutions in 8 directions;
Step 202: concatenate the convolution values computed for the 8 different directions into a feature vector, and normalize this vector to obtain a unit feature vector;
Step 203: the DAISY feature description algorithm uses a circular grid to obtain the support point set in the neighborhood of the feature point, and uses the unit feature vectors to generate the descriptor vector of this point.
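One way to realize steps 201 to 203, sketched here with scikit-image's daisy() rather than the patent's own implementation: the function builds the circular grid, computes the orientation convolutions, and normalizes the descriptor; sampling its dense output at the FAST keypoint locations (an assumption made for illustration) yields one descriptor vector per feature point. The file name is a placeholder, and the parameters mirror those given in the embodiments below.

```python
import numpy as np
from skimage import io
from skimage.feature import daisy

# Sketch of steps 201-203: dense DAISY descriptors, sampled at feature-point locations.
# "reference.png" is a placeholder; parameters follow the embodiments (radius 15, 3 rings,
# 8 histograms per ring, 8 orientations).
img = io.imread("reference.png", as_gray=True)
RADIUS = 15                                     # centre pixel to outermost grid point

descs = daisy(img, step=1, radius=RADIUS, rings=3,
              histograms=8, orientations=8, normalization="l2")

def descriptor_at(x, y):
    """Return the DAISY descriptor at keypoint (x, y), or None if too close to the border."""
    r, c = int(round(y)) - RADIUS, int(round(x)) - RADIUS   # dense grid is offset by RADIUS
    if 0 <= r < descs.shape[0] and 0 <= c < descs.shape[1]:
        return descs[r, c]                      # (3*8 + 1)*8 = 200-dimensional vector
    return None
```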
The process of obtaining initial matching point pairs from the feature points using the Euclidean distance is as follows:
Step 301: compute the Euclidean distance between the descriptor vector of a given feature point in the reference image and the descriptor vector of each feature point in the image to be matched;
Step 302: compare the ratio of the minimum Euclidean distance obtained for this feature point to the second-minimum Euclidean distance with a set threshold; if the ratio of the minimum to the second-minimum Euclidean distance is less than the set threshold, then this feature point and the feature point in the image to be matched corresponding to the minimum Euclidean distance constitute an initial matching point pair; otherwise, this feature point in the reference image is regarded as having no match;
Step 303: repeat steps 301 and 302 to perform initial matching for the remaining feature points in the reference image, until all feature points in the reference image have completed initial matching.
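A compact sketch of steps 301 to 303, assuming the descriptors are stacked row-wise in NumPy arrays; the ratio threshold of 0.8 is a placeholder, since the patent does not fix its value here.

```python
import numpy as np

def initial_matches(ref_descs, tgt_descs, ratio_thresh=0.8):
    """Sketch of steps 301-303: nearest-neighbour matching with a min/second-min ratio test.
    ref_descs: (m, p) descriptors from the reference image;
    tgt_descs: (n, p) descriptors from the image to be matched."""
    pairs = []
    for i, d in enumerate(ref_descs):
        dists = np.linalg.norm(tgt_descs - d, axis=1)       # Euclidean distances to all n descriptors
        nearest, second = np.argsort(dists)[:2]
        if dists[nearest] / dists[second] < ratio_thresh:   # distinctive enough -> initial match
            pairs.append((i, int(nearest)))
    return pairs
```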
The Euclidean distance is computed as:
d(x, y) = √((x₁ − y₁)² + … + (xₚ − yₚ)²)
where p is the dimension of the descriptor space, x = (x₁, …, xₚ) is the descriptor vector of a feature point in the reference image, and y = (y₁, …, yₚ) is the descriptor vector of a feature point in the image to be matched.
The present invention first detects corner points with the FAST algorithm, then describes them with the DAISY algorithm, and then performs matching. The invention is practical, greatly reduces the time of the whole matching process, achieves a good matching effect, and provides a useful reference for the design of solutions to related problems.
Brief description of the drawings
Fig. 1 is the flow chart of the multi-sensor image matching method based on FAST and DAISY provided by the present invention;
Fig. 2 shows the experimental results of matching remote sensing images with the method of the invention, where (a) and (b) are two remote sensing images of different spectral bands captured by a Daedalus scanner, (c) is the image obtained by scaling down and rotating (a), (d) and (e) are the images after corner detection on (c) and (b), respectively, and (f) shows the accurate matching point pairs connected by lines;
Fig. 3 shows the experimental results of matching an infrared image and a visible-light image of a transformer, where (a) and (b) are the infrared image and the visible-light image, respectively, (c) and (d) are images (a) and (b) after FAST corner detection, respectively, and (e) shows the accurate matching point pairs connected by lines.
Detailed description of the invention
The preferred embodiments are described in detail below with reference to the accompanying drawings. It should be emphasized that the following description is merely exemplary and is not intended to limit the scope or application of the present invention.
Fig. 1 is the flow chart of the multi-sensor image matching method based on FAST and DAISY provided by the present invention. As shown in Fig. 1, the specific steps of the multi-sensor image matching method based on FAST and DAISY are:
Step 1: perform FAST corner detection on the reference image and the image to be matched, respectively, to extract feature points;
Step 2: apply DAISY processing to the feature points extracted from the reference image and the image to be matched, respectively, to generate descriptor vectors;
Step 3: use the Euclidean distance to obtain initial matching point pairs from the feature points, then use the RANSAC algorithm to screen accurate matching point pairs from the initial matching point pairs, thereby realizing the matching of the reference image and the image to be matched.
Suppose the reference image has m feature points (descriptor vectors) and the image to be matched has n feature points (descriptor vectors); then the detailed process of initial matching is as follows:
Step a: compute the Euclidean distances between the 1st feature point in the reference image and all feature points in the image to be matched, obtaining n Euclidean distances;
Step b: compare the ratio of the minimum of the n Euclidean distances to the second-minimum with a preset threshold; if the ratio of the minimum to the second-minimum of these n Euclidean distances is less than the preset threshold, then the 1st feature point and the feature point in the image to be matched corresponding to the minimum Euclidean distance constitute an initial matching point pair; otherwise, the 1st feature point in the reference image is regarded as having no match;
Step c: repeat steps a and b to perform initial matching for the remaining m − 1 feature points in the reference image.
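The RANSAC screening of step 3 might be sketched as follows; the homography model and the reprojection threshold are illustrative assumptions, since the patent only states that RANSAC removes the mismatched point pairs.

```python
import cv2
import numpy as np

def ransac_filter(ref_pts, tgt_pts, reproj_thresh=3.0):
    """Sketch of the RANSAC screening: keep only initial pairs consistent with a fitted homography.
    ref_pts / tgt_pts: corresponding (x, y) coordinates from the initial matching step
    (at least 4 pairs required). The homography model and reproj_thresh are assumptions."""
    src = np.float32(ref_pts).reshape(-1, 1, 2)
    dst = np.float32(tgt_pts).reshape(-1, 1, 2)
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, reproj_thresh)
    keep = mask.ravel().astype(bool)            # True marks the accurate matching point pairs
    return [pair for pair, ok in zip(zip(ref_pts, tgt_pts), keep) if ok]
```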
Embodiment 1:
A simulation experiment is performed on two remote sensing images of different spectral bands using the multi-sensor image matching method based on FAST and DAISY. The original images are shown in Fig. 2(a) and Fig. 2(b); Fig. 2(c) is the image obtained by scaling Fig. 2(a) to 0.8 times its original size and rotating it 10 degrees counterclockwise. In this experiment Fig. 2(c) and Fig. 2(b) are matched. First, FAST corner extraction is performed on the two images using the FAST9 detector with a threshold of 30; the results are shown in Fig. 2(d) and Fig. 2(e). Then the corners of Fig. 2(d) and Fig. 2(e) are described with DAISY to generate descriptor vectors, with the DAISY parameters set as follows: the distance from the central pixel to the outermost grid point is 15, the number of rings around the central point is 3, the number of histograms in one ring is 8, and the number of gradient orientations per histogram is 8. The Euclidean distance is then used to obtain initial matching point pairs, and mismatched point pairs are removed with the RANSAC algorithm; the accurate matching point pairs are shown in Fig. 2(f). The simulation results show that this method can successfully match two remote sensing images of different spectral bands.
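For reference, the DAISY parameters stated in this embodiment map directly onto the daisy() call used in the sketch above; a minimal check of the resulting descriptor dimensionality (with a placeholder file name) is:

```python
from skimage import io
from skimage.feature import daisy

# Embodiment parameters: radius 15 (centre to outermost grid point), 3 rings,
# 8 histograms per ring, 8 gradient orientations; "fig2c.png" is a placeholder name.
img = io.imread("fig2c.png", as_gray=True)
descs = daisy(img, step=1, radius=15, rings=3, histograms=8, orientations=8)
print(descs.shape[-1])   # (3*8 + 1)*8 = 200-dimensional descriptor per grid point
```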
Embodiment 2:
The multi-sensor image matching method based on FAST and DAISY is applied to an infrared image and a visible-light image captured in the field. Fig. 3(a) and Fig. 3(b) are the infrared image and the visible-light image of a transformer, respectively. FAST corner detection is first performed on the two images using the FAST9 detector with a threshold of 30; the results are shown in Fig. 3(c) and Fig. 3(d). The corners of Fig. 3(c) and Fig. 3(d) are then described with DAISY to generate descriptor vectors, with the DAISY parameters set as follows: the distance from the central pixel to the outermost grid point is 15, the number of rings around the central point is 3, the number of histograms in one ring is 8, and the number of gradient orientations per histogram is 8. Finally, the Euclidean distance is used to obtain initial matching point pairs, and mismatched point pairs are removed with the RANSAC algorithm; the matched point pairs connected by lines are shown in Fig. 3(e). The experimental results show that this method achieves a good matching effect for images from different sensors captured in the field.
The above are only preferred embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or substitution that can be readily conceived by those familiar with the art within the technical scope disclosed by the invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be defined by the scope of the claims.

Claims (2)

1. A multi-sensor image matching method based on FAST and DAISY, characterized in that the method specifically includes the following steps:
Step 1: if the reference image and the image to be matched are respectively two remote sensing images of different spectral bands captured by a Daedalus scanner, one of the remote sensing images is scaled to 0.8 times its size and rotated by 10 degrees relative to the other remote sensing image, and FAST corner detection is then performed on the two images after scaling and rotation to extract feature points;
if the reference image and the image to be matched are respectively an infrared image and a visible-light image of a transformer, FAST corner detection is performed on the two images using the FAST9 detector with a threshold of 30 to extract feature points;
Step 2: DAISY processing is applied to the feature points extracted from the reference image and the image to be matched, respectively, to generate descriptor vectors;
Step 3: the Euclidean distance is used to obtain initial matching point pairs from the feature points, and the RANSAC algorithm is then used to screen accurate matching point pairs from the initial matching point pairs, thereby realizing the matching of the reference image and the image to be matched;
The feature point extraction process is as follows:
Step 101: choose a candidate point C to be examined;
Step 102: examine the points on the circle centered at point C with radius R; consider the 16 pixels on the circumference, and if there are 9 or more contiguous points whose gray values are all greater than, or all less than, the gray value of point C by more than a set threshold T, then point C is regarded as a feature point;
The descriptor vector generation process is as follows:
Step 201: for each feature point, compute its convolutions in 8 directions;
Step 202: concatenate the convolution values computed for the 8 different directions into a feature vector, and normalize this vector to obtain a unit feature vector;
Step 203: the DAISY feature description algorithm uses a circular grid to obtain the support point set in the neighborhood of the feature point, and uses the unit feature vectors to generate the descriptor vector of this point; the DAISY descriptor parameters are set as follows: the distance from the central pixel to the outermost grid point is 15, the number of rings around the central point is 3, the number of histograms in one ring is 8, and the number of gradient orientations per histogram is 8;
The process of obtaining initial matching point pairs from the feature points using the Euclidean distance is as follows:
Step 301: compute the Euclidean distance between the descriptor vector of a given feature point in the reference image and the descriptor vector of each feature point in the image to be matched;
Step 302: compare the ratio of the minimum Euclidean distance obtained for this feature point to the second-minimum Euclidean distance with a set threshold; if the ratio of the minimum to the second-minimum Euclidean distance is less than the set threshold, then this feature point and the feature point in the image to be matched corresponding to the minimum Euclidean distance constitute an initial matching point pair; otherwise, this feature point in the reference image is regarded as having no match;
Step 303: repeat steps 301 and 302 to perform initial matching for the remaining feature points in the reference image, until all feature points in the reference image have completed initial matching.
2. The multi-sensor image matching method based on FAST and DAISY according to claim 1, characterized in that the Euclidean distance is computed as:
d(x, y) = √((x₁ − y₁)² + … + (xₚ − yₚ)²)
where p is the dimension of the descriptor space, x = (x₁, …, xₚ) is the descriptor vector of a feature point in the reference image, and y = (y₁, …, yₚ) is the descriptor vector of a feature point in the image to be matched.
CN201310099478.XA 2013-03-26 2013-03-26 Multi-sensor image matching method based on FAST and DAISY Active CN103136760B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310099478.XA CN103136760B (en) 2013-03-26 2013-03-26 Multi-sensor image matching method based on FAST and DAISY

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310099478.XA CN103136760B (en) 2013-03-26 2013-03-26 Multi-sensor image matching method based on FAST and DAISY

Publications (2)

Publication Number Publication Date
CN103136760A CN103136760A (en) 2013-06-05
CN103136760B true CN103136760B (en) 2016-08-31

Family

ID=48496550

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310099478.XA Active CN103136760B (en) 2013-03-26 2013-03-26 Multi-sensor image matching method based on FAST and DAISY

Country Status (1)

Country Link
CN (1) CN103136760B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6149829B2 (en) * 2014-09-03 2017-06-21 コニカミノルタ株式会社 Image processing apparatus and image processing method
CN107507226B (en) * 2017-09-26 2021-04-06 中国科学院长春光学精密机械与物理研究所 Image matching method and device
CN110008964A (en) * 2019-03-28 2019-07-12 上海交通大学 The corner feature of heterologous image extracts and description method
CN110139212B (en) * 2019-06-21 2021-07-06 Oppo广东移动通信有限公司 Positioning processing method and related product
CN113674174B (en) * 2021-08-23 2023-10-20 宁波棱镜空间智能科技有限公司 Line scanning cylinder geometric correction method and device based on significant line matching
CN113658080B (en) * 2021-08-23 2023-12-22 宁波棱镜空间智能科技有限公司 Linear scanning cylinder geometric correction method and device based on characteristic point matching
CN113450357B (en) * 2021-09-01 2021-12-17 南昌市建筑科学研究所(南昌市建筑工程质量检测中心) Segment image online analysis subsystem and subway shield detection system

Also Published As

Publication number Publication date
CN103136760A (en) 2013-06-05

Similar Documents

Publication Publication Date Title
CN103136760B (en) Multi-sensor image matching method based on FAST and DAISY
CN110135455B (en) Image matching method, device and computer readable storage medium
Li et al. Automatic crack detection and measurement of concrete structure using convolutional encoder-decoder network
CN105427298B (en) Remote sensing image registration method based on anisotropic gradient metric space
CN105814607B (en) Image processing apparatus and image processing method
CN111210477B (en) Method and system for positioning moving object
CN100430690C (en) Method for making three-dimensional measurement of objects utilizing single digital camera to freely shoot
JP2019514123A (en) Remote determination of the quantity stored in containers in geographical areas
CN107424160A (en) The system and method that image center line is searched by vision system
CN104751465A (en) ORB (oriented brief) image feature registration method based on LK (Lucas-Kanade) optical flow constraint
CN106981077A (en) Infrared image and visible light image registration method based on DCE and LSS
CN101826157B (en) Ground static target real-time identifying and tracking method
CN107085728B (en) Method and system for effectively scoring probe in image by using vision system
CN104899888B (en) A kind of image sub-pixel edge detection method based on Legendre squares
CN108154066B (en) Three-dimensional target identification method based on curvature characteristic recurrent neural network
CN107341824B (en) Comprehensive evaluation index generation method for image registration
Chalom et al. Measuring image similarity: an overview of some useful applications
CN113393439A (en) Forging defect detection method based on deep learning
JP4003465B2 (en) Specific pattern recognition method, specific pattern recognition program, specific pattern recognition program recording medium, and specific pattern recognition apparatus
CN115631210A (en) Edge detection method and device
CN114612412A (en) Processing method of three-dimensional point cloud data, application of processing method, electronic device and storage medium
Kim et al. Learning Structure for Concrete Crack Detection Using Robust Super‐Resolution with Generative Adversarial Network
CN109902695B (en) Line feature correction and purification method for image pair linear feature matching
JP2011107878A (en) Position detection apparatus and position detection method
CN103606146B (en) A kind of angular-point detection method based on circular target

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant