CN1959707A - Image matching method based on pixel jump - Google Patents

Image matching method based on pixel jump

Info

Publication number
CN1959707A
CN1959707A CN200610161164A
Authority
CN
China
Prior art keywords
template
search graph
pixel
jump
numerical value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 200610161164
Other languages
Chinese (zh)
Other versions
CN100463002C (en)
Inventor
张广军
雷鸣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Beijing University of Aeronautics and Astronautics
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CNB2006101611648A priority Critical patent/CN100463002C/en
Publication of CN1959707A publication Critical patent/CN1959707A/en
Application granted granted Critical
Publication of CN100463002C publication Critical patent/CN100463002C/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

An image matching method based on pixel jumps determines, by analyzing the state of the correlation peak on the correlation surface, how many pixels the template should jump as it moves over the search graph, and thereby finds the position in the search graph corresponding to the template as quickly as possible.

Description

Image matching method based on pixel jump
Technical field
The invention belongs to the field of digital image processing, and in particular relates to an image matching method based on pixel jumps.
Background technology
In machine perception, two or more images of the same scene acquired by different sensors, or by the same sensor at different times or under different imaging conditions, often have to be spatially aligned; the process of finding, according to a known pattern, the corresponding pattern in another image is called matching. Image matching is thus the process of spatially aligning two gray-level images of the same scene recorded by two different sensors in order to determine the relative offset between them. With the development of digital signal processing, image matching has become an important technique of modern digital image processing and is widely used in fields such as sports-video compression, automatic target recognition, medical image analysis, cruise guidance, and the terminal guidance of missile systems.
Current image matching techniques can be divided into three levels: gray-level correlation matching, feature-based matching, and interpretation-based matching. Interpretation-based matching must be built on an expert system for automatic image interpretation and has not yet achieved a breakthrough, so current image matching work mainly concentrates on the first two approaches.
In aerospace image matching applications, differences in imaging system, imaging mode and acquisition time inevitably cause deviations between the search graph and the real-time image of the same target position; the real-time image may even be subject to deliberate electronic interference, which makes the matching environment still harsher. Under such conditions gray-level correlation matching, being the most robust, is the most widely used method in aerospace image matching. It traverses the template over the search graph and computes the cross-correlation between the template and the corresponding part of the search graph at every position, thereby determining the position in the search graph corresponding to the template. Because this matching method is insensitive to linear changes of gray level it is highly adaptable, but its drawback is that the amount of computation grows geometrically with the size of the template and the search graph, which severely limits the real-time performance of gray-level correlation matching.
Feature-based matching is the current focus of image matching research. It generally involves a large amount of geometric and morphological image computation, has no general model to follow, and requires suitable features and models (for example edge feature points, region features, graph models, or syntactic structures) to be chosen for each application scenario. As a result, feature-based matching is not very adaptable to different scene types and image deformations, its reliability is limited, and it cannot satisfy the demands of practical engineering applications well.
Summary of the invention
In view of this, the main purpose of the present invention is to provide an image matching method based on pixel jumps that can find the position in the search graph corresponding to the template as quickly as possible, thereby greatly reducing the time consumed by image matching and satisfying the demands of practical engineering applications well.
To achieve the above purpose, the technical scheme of the present invention is as follows: the state of the correlation peak on the correlation surface is analyzed to determine how many pixels the template should jump as it moves over the search graph, so that the position in the search graph corresponding to the template is obtained as quickly as possible.
The image matching method based on pixel jumps of the present invention specifically comprises the following steps:
Step 1: determine the direction in which the template moves over the search graph, move the template three pixel positions in succession along that direction, and record the first, second and third pixel positions the template occupies on the search graph;
Step 2: compute the correlation value at each pixel position the template has reached on the search graph in step 1, and determine the state of the correlation peak on the correlation surface from the relative magnitudes of the three most recent correlation values, including the current one;
Step 3: according to the state of the correlation peak determined in step 2, compute the number of pixels N the template must jump to reach its next position on the search graph;
Step 4: move the template along the moving direction to its next pixel position on the search graph;
Step 5: judge whether the template has finished searching the search graph along the moving direction; if so, execute step 6, otherwise repeat steps 2 to 4;
Step 6: compare the correlation values at all pixel positions the template has reached on the search graph, take the maximum of all correlation values, and obtain the position in the search graph corresponding to the template.
In step 1, the direction in which the template moves over the search graph is: row by row from left to right, or row by row from right to left, or column by column from bottom to top, or column by column from top to bottom, until the whole image has been searched.
The state of the correlation peak in step 2 comprises the sunny (rising) side of the correlation peak and the back (falling) side of the correlation peak.
The two cases, sunny side and back side of the correlation peak, are distinguished by the sign SIGN of K in the following formula:
K = (ρ_pre-previous − ρ_previous) × (ρ_previous − ρ_current)
where ρ_current denotes the correlation value at the pixel position the template currently occupies on the search graph, ρ_previous denotes the correlation value at the pixel position one step before the current one, and ρ_pre-previous denotes the correlation value at the pixel position two steps before the current one.
In step 3, for the sunny-side case of the correlation peak, the number of pixels N that the template must jump to reach its next position on the search graph is:
N = INT[(|1/ρ_current| − 1.1) × 10]
For the back-side case of the correlation peak, the number of pixels N that the template must jump to reach its next position on the search graph is taken as an integer within half of the template size; when the template is smaller than 30 × 30 or larger than 90 × 90, N is taken as an integer near [20, 30].
The process of obtaining the maximum of all correlation values in step 6 comprises:
First give an initial value ρ_0 = 0, then compare ρ_current with ρ_0; if ρ_current is greater than ρ_0, assign the value of ρ_current to ρ_0, otherwise do not perform the assignment.
The position in the search graph corresponding to the template is the pixel coordinate at which ρ_0 takes its maximum value.
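As a minimal illustration of this running-maximum bookkeeping (the function and variable names and the Python form are assumptions, not part of the patent):

def best_match(computed_correlations):
    """computed_correlations: iterable of (pixel coordinate, correlation value) pairs."""
    rho_0, best_position = 0.0, None             # initial value rho_0 = 0
    for position, rho_current in computed_correlations:
        if rho_current > rho_0:                  # assign only when the new value is larger
            rho_0, best_position = rho_current, position
    return best_position, rho_0                  # matching position in the search graph and its correlation value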
The image matching method based on pixel jumps of the present invention has the following advantages:
(1) The template does not have to visit every pixel coordinate while moving over the search graph; it only needs to analyze the state of the correlation peak on the correlation surface to determine how many pixels to jump, and searches only at the pixel positions that are not skipped, which greatly reduces the amount of computation and therefore the time consumed by image matching;
(2) When analyzing the correlation peak on the correlation surface, the sunny side and the back side of the peak are distinguished and a different pixel-jump search is performed for each, so the characteristic information of the correlation surface is taken into account and the image matching method of the present invention satisfies the demands of practical engineering applications well;
(3) Experiments show that, under the precondition of a guaranteed matching probability, the advantage of the image matching method of the present invention becomes more and more obvious as the template size increases.
Description of drawings
Fig. 1 illustrates the search graph and the template used in image matching;
Fig. 2 is a schematic diagram of the correlation surface formed when the template moves over the search graph;
Fig. 3 is a schematic diagram of a section through one correlation peak of the correlation surface shown in Fig. 2;
Fig. 4 is a schematic flow diagram of the method of the present invention;
Fig. 5 is a schematic diagram of the process of the template moving three pixel positions in succession on the search graph;
Fig. 6 to Fig. 8 are schematic diagrams of the different positions on a correlation peak of the correlation values corresponding to the three pixel positions successively reached by the template on the search graph;
Fig. 9 shows the relationship between the jump pixel count and the matching time in simulation experiments with the method of the invention; the upper plot is the simulation of a 30 × 30 template on a 400 × 400 search graph, and the lower plot is the simulation of a 90 × 90 template on a 400 × 400 search graph;
Fig. 10 shows the relationship between the jump pixel count and the matching probability in simulation experiments with the method of the invention; the upper plot is the simulation of a 30 × 30 template on a 400 × 400 search graph, and the lower plot is the simulation of a 90 × 90 template on a 400 × 400 search graph;
Fig. 11 shows the relationship between the jump pixel count and the template size in simulation experiments with the method of the invention; the upper plot shows templates of different sizes searching a 400 × 400 search graph, and the lower plot shows templates of different sizes searching a 768 × 576 search graph.
Embodiment
The present invention is described in further detail below with reference to the accompanying drawings.
First, the principle of normalized cross-correlation is introduced. In image matching, the degree of similarity between two images of the same size is characterized by the correlation coefficient ρ:
Case one: denote two gray-level images, both of size N × N, as
{X | x_ij ∈ X, i, j = 0, …, N−1} and {Y | y_ij ∈ Y, i, j = 0, …, N−1},
where x_ij and y_ij are the gray values of the corresponding points of the two images; the correlation coefficient of the two images is then:
ρ(X, Y) = [E(XY) − E(X)·E(Y)] / √(D(X)·D(Y))
where E(X) and E(Y) are the gray means of the two images, D(X) and D(Y) are their variances, and E(XY) is the mean of the pointwise product of the two images. The correlation coefficient ρ(X, Y) expresses the degree of linear similarity between images X and Y; the closer it is to 1 or −1, the more pronounced the linear similarity between the images.
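As a quick illustration of this definition, the following minimal Python sketch computes ρ(X, Y) for two equal-size gray-level images; the function name and the use of NumPy are illustrative assumptions and are not part of the patent.

import numpy as np

def correlation_coefficient(X: np.ndarray, Y: np.ndarray) -> float:
    """rho(X, Y) = (E(XY) - E(X)E(Y)) / sqrt(D(X) * D(Y)) for two equal-size images."""
    X = X.astype(np.float64)
    Y = Y.astype(np.float64)
    num = (X * Y).mean() - X.mean() * Y.mean()   # E(XY) - E(X)E(Y)
    den = np.sqrt(X.var() * Y.var())             # sqrt(D(X) * D(Y))
    return float(num / den) if den != 0 else 0.0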
Case two: denote two gray-level images of sizes M × M and N × N (M ≤ N) as
{X | x_ij ∈ X, i, j = 0, …, M−1} and {Y | y_ij ∈ Y, i, j = 0, …, N−1},
the correlation coefficient of the two images is then: ρ_{u,v} = |ρ(X, Y_{u,v})|
where Y_{u,v} is the M × M sub-block of the search graph whose origin is at coordinates (u, v), with u, v = 0, 1, …, N − M. Let template T be of size M × M and search graph S be of size N × N (M ≤ N), as shown in Fig. 1. As template T is translated over search graph S, the subimage covered by the template is denoted S_{i,j}, where (i, j) is the coordinate of the upper-left corner of the subimage in S and the point at this coordinate serves as the reference point. Comparing the content of T and S_{i,j}, their difference is 0 if the two are identical, so the sum of squared errors is used as a measure of their similarity:
D(i, j) = Σ_{m=1..M} Σ_{n=1..M} [S_{i,j}(m, n) − T(m, n)]²    (1)
Expanding the above formula gives:
D(i, j) = Σ_{m=1..M} Σ_{n=1..M} [S_{i,j}(m, n)]² − 2·Σ_{m=1..M} Σ_{n=1..M} [S_{i,j}(m, n) × T(m, n)] + Σ_{m=1..M} Σ_{n=1..M} [T(m, n)]²    (2)
The third term on the right-hand side of formula (2) is the total energy of the template, which is a constant. The first term is the energy of the subimage covered by the template, which changes slowly with the position of the reference point. The second term is the cross-correlation between the subimage and the template; it changes with the position of the reference point and is maximal when the template matches the subimage. The correlation coefficient is expressed as:
ρ(i, j) = Σ_{m=1..M} Σ_{n=1..M} S_{i,j}(m, n) × T(m, n) / √( Σ_{m=1..M} Σ_{n=1..M} [S_{i,j}(m, n)]² · Σ_{m=1..M} Σ_{n=1..M} [T(m, n)]² )    (3)
By the Cauchy-Schwarz inequality, 0 < ρ(i, j) ≤ 1 in formula (3), and ρ(i, j) attains its maximum value (equal to 1) only when S_{i,j}(m, n) / T(m, n) is constant.
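A minimal Python sketch of formula (3) follows: it evaluates the normalized cross-correlation between the template T and the subimage of S covered when the template's upper-left corner is at (i, j). The function name and the use of NumPy are assumptions made for illustration.

import numpy as np

def ncc_at(S: np.ndarray, T: np.ndarray, i: int, j: int) -> float:
    """Normalized cross-correlation rho(i, j) of template T over search graph S, per formula (3)."""
    M = T.shape[0]
    sub = S[i:i + M, j:j + M].astype(np.float64)   # subimage S_{i,j} covered by the template
    Tf = T.astype(np.float64)
    num = np.sum(sub * Tf)
    den = np.sqrt(np.sum(sub ** 2) * np.sum(Tf ** 2))
    return float(num / den) if den != 0 else 0.0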
Computing the full correlation matrix between the two images requires more than M²(N − M + 1)² multiplications and (M² − 1)(N − M + 1)² additions, which greatly increases the difficulty of designing a real-time image matching system.
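As a rough, illustrative count (not a figure taken from the patent): with a 400 × 400 search graph and a 60 × 60 template, the sizes used in the simulations below, an exhaustive correlation needs about 60² × (400 − 60 + 1)² ≈ 4.2 × 10⁸ multiplications for a single match, which is why reducing the number of positions visited matters.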
A bitmap can be converted into a corresponding image matrix, and for positioning applications the coordinate corresponding to the maximum correlation coefficient in the correlation matrix is the positioning coordinate of the small image within the large image. The correlation matrix C of image matrices A and B is defined as follows: if the A matrix is an M × N matrix and is not smaller than the B matrix, the correlation matrix is expressed as:
C(i, j) = Σ_{m=0..M} Σ_{n=0..N} A(m, n) × B(m − i, n − j)
The elements of the correlation matrix C have the following property: if A and B are identical, there is a peak at the position of the geometric center of the image and all other positions take small values; if A and B are different, the C matrix has no peak and the whole matrix consists of small, similar values. The correlation operation is equivalent to placing one image on the other and moving it, making a comparison at each shift: if the two images are wholly or partly identical, they overlap wholly or partly at some position and a peak appears there; if the two images are not identical, they never overlap and no peak appears. Therefore, the matching position of the given image in the search graph must be a place where a peak appears.
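The following minimal Python sketch illustrates this property: it builds the (unnormalized) correlation matrix C and returns the position of its peak. The function name is an assumption; in the method described here the normalized coefficient of formula (3) would be used instead of the raw products.

import numpy as np

def correlation_peak(A: np.ndarray, B: np.ndarray) -> tuple:
    """A: large image (search graph), B: small image (template); returns the peak position of C."""
    H, W = A.shape
    h, w = B.shape
    Bf = B.astype(np.float64)
    C = np.zeros((H - h + 1, W - w + 1))
    for i in range(H - h + 1):
        for j in range(W - w + 1):
            # C(i, j) = sum over the overlap of A(m, n) * B(m - i, n - j)
            C[i, j] = np.sum(A[i:i + h, j:j + w].astype(np.float64) * Bf)
    return np.unravel_index(int(np.argmax(C)), C.shape)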
Fig. 2 is a schematic diagram of the correlation surface formed as the template moves over the search graph. The correlation surface is obtained by computing, with formula (3), a correlation value at each pixel position reached by the template on the search graph and arranging the computed values according to the direction in which the template moves; in Fig. 2 the pixel coordinates and the correlation values together form the three-dimensional coordinate system of the correlation surface. As shown in Fig. 2, the correlation surface rises and falls, and a region of local maxima of the correlation surface is generally called a correlation peak. According to correlation theory, the top of a correlation peak corresponds to the matching position of the images, whereas the gentle faces and valleys of the surface are certainly not matching positions. In addition, the sharpness of the correlation peak affects the probability and precision of image matching: a steep peak means high positioning precision and a match that does not drift easily, while a peak with a gentle slope degrades the matching precision and may even cause a mismatch.
Based on the continuity of neighboring pixel information in an image, the next pixel position to which the template moves on the search graph can be predicted. As shown in Fig. 3, the correlation values a, b, c, d, e, f, g are distributed over a correlation peak; in the present invention, the sunny side and the back side of the correlation peak are analyzed from the relative magnitudes of the three most recent correlation values, including the current one.
Fig. 4 is a flow diagram of the image matching method based on pixel jumps of the present invention. As shown in Fig. 4, the method specifically comprises the following steps:
Step 1: determine the direction in which the template moves over the search graph, move the template three pixel positions in succession along that direction, and record the first, second and third pixel positions the template occupies on the search graph. As shown in Fig. 5, the template moves over the search graph row by row from left to right, and each cell in Fig. 5 represents one pixel position. Fig. 5(a) is the initial position of the template on the search graph; Fig. 5(b) shows the template moved one pixel position to the right from Fig. 5(a), the pixel position now occupied being denoted I_1; Fig. 5(c) shows the template moved one further pixel position to the right from Fig. 5(b), the pixel position now occupied being denoted I_2; Fig. 5(d) shows the template moved one further pixel position to the right from Fig. 5(c), the pixel position now occupied being denoted I_3.
Step 2: compute the correlation value at each pixel position the template has reached on the search graph in step 1, and determine the state of the correlation peak on the correlation surface from the relative magnitudes of the three most recent correlation values, including the current one. Formula (3) is used to compute: the correlation value ρ_1 at pixel position I_1, the correlation value ρ_2 at pixel position I_2, and the correlation value ρ_3 at pixel position I_3.
With reference to Fig. 6 to Fig. 8, the relative magnitudes of the three correlation values ρ_1, ρ_2 and ρ_3 essentially reflect the following possible arrangements on a correlation peak:
As shown in Fig. 6(a), ρ_1 − ρ_2 > 0, ρ_2 − ρ_3 < 0;
As shown in Fig. 6(b), ρ_1 − ρ_2 > 0, ρ_2 − ρ_3 < 0;
As shown in Fig. 6(c), ρ_1 − ρ_2 < 0, ρ_2 − ρ_3 > 0;
As shown in Fig. 6(d), ρ_1 − ρ_2 < 0, ρ_2 − ρ_3 > 0;
As shown in Fig. 7(a), ρ_1 − ρ_2 > 0, ρ_2 − ρ_3 = 0;
As shown in Fig. 7(b), ρ_1 − ρ_2 = 0, ρ_2 − ρ_3 > 0;
As shown in Fig. 7(c), ρ_1 − ρ_2 = 0, ρ_2 − ρ_3 = 0;
As shown in Fig. 7(d), ρ_1 − ρ_2 = 0, ρ_2 − ρ_3 < 0;
As shown in Fig. 8(a), ρ_1 − ρ_2 < 0, ρ_2 − ρ_3 < 0;
As shown in Fig. 8(b), ρ_1 − ρ_2 > 0, ρ_2 − ρ_3 > 0;
As shown in Fig. 8(c), ρ_1 − ρ_2 < 0, ρ_2 − ρ_3 < 0;
As shown in Fig. 8(d), ρ_1 − ρ_2 > 0, ρ_2 − ρ_3 > 0.
The state of the correlation peak is determined by the sign SIGN (positive or negative) of K in the following formula:
K = (ρ_pre-previous − ρ_previous) × (ρ_previous − ρ_current)    (4)
where ρ_current denotes the correlation value at the pixel position the template currently occupies on the search graph, ρ_previous denotes the correlation value at the pixel position one step before the current one, and ρ_pre-previous denotes the correlation value at the pixel position two steps before the current one. In the present embodiment, ρ_pre-previous = ρ_1, ρ_previous = ρ_2 and ρ_current = ρ_3. The sunny-side and back-side cases of the correlation peak corresponding to the different signs of K are analyzed below:
(1) When ρ_1 − ρ_2 and ρ_2 − ρ_3 have opposite signs,
(ρ_pre-previous − ρ_previous) × (ρ_previous − ρ_current) takes a negative sign, K < 0;
the situations of Fig. 6(a) and Fig. 6(b) are called the sunny side of the correlation peak;
the situations of Fig. 6(c) and Fig. 6(d) are called the back side of the correlation peak;
(2) The case K = 0 is divided into:
when ρ_1 ≠ ρ_2 = ρ_3, corresponding to Fig. 7(a), it is called the back side of the correlation peak;
when ρ_1 = ρ_2 and ρ_3 < ρ_2, corresponding to Fig. 7(b), it is called the back side of the correlation peak;
when ρ_1 = ρ_2 = ρ_3, corresponding to Fig. 7(c), no correlation peak appears; based on the general characteristics of images, the probability of this K = 0 sub-case occurring is very small, and the present invention does not consider it;
when ρ_1 = ρ_2 and ρ_3 > ρ_2, corresponding to Fig. 7(d), it is called the sunny side of the correlation peak;
(3) When ρ_1 − ρ_2 and ρ_2 − ρ_3 have the same sign,
(ρ_pre-previous − ρ_previous) × (ρ_previous − ρ_current) takes a positive sign, K > 0;
the situations of Fig. 8(a) and Fig. 8(c) are called the sunny side of the correlation peak;
the situations of Fig. 8(b) and Fig. 8(d) are called the back side of the correlation peak.
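A minimal Python sketch of this classification is given below. K is formed from the three most recent correlation values; across Figs. 6 to 8 the sub-cases reduce to whether the latest correlation value is still rising (sunny side) or not (back side), and the handling of the all-equal case of Fig. 7(c) follows the text above. The function name and this reduction are assumptions drawn from one consistent reading of the figures.

def peak_side(rho_pre2: float, rho_pre1: float, rho_cur: float) -> str:
    """Classify the current position as 'sunny', 'back', or 'flat' from the three latest correlation values."""
    K = (rho_pre2 - rho_pre1) * (rho_pre1 - rho_cur)   # formula (4); its sign groups the cases
    if K == 0 and rho_pre2 == rho_pre1 == rho_cur:
        return "flat"      # Fig. 7(c): no peak information; ignored by the method
    if rho_cur > rho_pre1:
        return "sunny"     # correlation still climbing: front (sunny) side of the peak
    return "back"          # correlation level or falling: back side of the peak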
Step 3: according to the state of the correlation peak determined in step 2, compute the number of pixels N the template must jump to reach its next position on the search graph. For the sunny-side case of the correlation peak, the number of pixels N that the template must jump to reach its next position on the search graph is:
N = INT[(|1/ρ_current| − 1.1) × 10]
Step 4: move the template along the moving direction to its next pixel position on the search graph. If the pixel position to which the template moves next is I_7, then:
I_7 = I_3 + N,
Then judge whether the template has finished searching the search graph along the moving direction. If not, mark the current pixel position I_7 of the template on the search graph and compute the correlation value ρ_7 with formula (3); at this point, in formula (4), ρ_pre-previous = ρ_2, ρ_previous = ρ_3 and ρ_current = ρ_7. The above steps are then repeated to compute the number of pixels the template must jump to reach its next position on the search graph.
For the back-side case of the correlation peak, the number of pixels N that the template must jump to reach its next position on the search graph is taken as an integer within half of the template size; when the template is smaller than 30 × 30 or larger than 90 × 90, N is taken as an integer near [20, 30].
If the pixel position to which the template moves next along the moving direction is I_8, then:
I_8 = I_3 + N,
Then judge whether the template has finished searching the search graph along the moving direction. If not, mark the current pixel position I_8 of the template on the search graph and compute the correlation value ρ_8 with formula (3); at this point, in formula (4), ρ_pre-previous = ρ_2, ρ_previous = ρ_3 and ρ_current = ρ_8. The above steps are then repeated to compute the number of pixels the template must jump to reach its next position on the search graph.
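A minimal sketch of the jump-length rule in Python follows. The sunny-side branch is the formula N = INT[(|1/ρ_current| − 1.1) × 10]; the back-side branch takes an integer within half the template size, with a value near [20, 30] outside the 30 × 30 to 90 × 90 range. The guard against ρ_current = 0, the floor of one pixel, and the specific choice of 25 for the out-of-range case are assumptions added for illustration.

def jump_pixels(side: str, rho_cur: float, template_size: int) -> int:
    """Number of pixels the template jumps to its next position on the search graph."""
    if side == "sunny":
        rho = abs(rho_cur) if rho_cur != 0 else 1e-6       # guard: avoid division by zero (assumption)
        n = int((1.0 / rho - 1.1) * 10)                    # N = INT[(|1/rho_current| - 1.1) * 10]
        return max(n, 1)                                   # always advance by at least one pixel (assumption)
    # back side of the correlation peak
    if template_size < 30 or template_size > 90:
        return 25                                          # an integer near [20, 30]
    return template_size // 2                              # within half of the template size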
Finally, the correlation values at all pixel positions reached by the template on the search graph are compared, the maximum of all correlation values is taken, and the position in the search graph corresponding to the template is obtained.
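Putting the steps together, the following minimal Python sketch runs the pixel-jump search row by row from left to right, assuming the ncc_at, peak_side and jump_pixels helpers sketched above are in scope. Scanning every row, checking the first two positions of each row against the running maximum, and the return convention are illustrative choices, not requirements of the patent.

import numpy as np

def pixel_jump_match(S: np.ndarray, T: np.ndarray):
    """Return (best position, best correlation value) of template T in search graph S."""
    M = T.shape[0]
    best_rho, best_pos = 0.0, (0, 0)
    for i in range(S.shape[0] - M + 1):
        # step 1: move the template through three consecutive positions on this row
        rho_pre2, rho_pre1 = ncc_at(S, T, i, 0), ncc_at(S, T, i, 1)
        for j0, r in ((0, rho_pre2), (1, rho_pre1)):       # keep the running maximum (step 6)
            if r > best_rho:
                best_rho, best_pos = r, (i, j0)
        j = 2
        rho_cur = ncc_at(S, T, i, j)
        while True:
            if rho_cur > best_rho:                         # step 6: running maximum
                best_rho, best_pos = rho_cur, (i, j)
            side = peak_side(rho_pre2, rho_pre1, rho_cur)  # step 2: sunny or back side
            j += jump_pixels(side, rho_cur, M)             # steps 3-4: jump to the next position
            if j > S.shape[1] - M:                         # step 5: this row is finished
                break
            rho_pre2, rho_pre1 = rho_pre1, rho_cur
            rho_cur = ncc_at(S, T, i, j)
    return best_pos, best_rho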
Simulation experiments were then carried out with the image matching method based on pixel jumps of the present invention. In the experiments the allowed matching error was set to ±5 pixels, the angle correction range was −5° to +5°, and the threshold of the correlation-peak characteristic index was 1.5. In each experiment, the sizes of the search graph and the template were specified in advance, search graphs were cut one by one with a fixed step from one image of an image pair, and a number of templates were cut at random positions (following a given distribution) from the same region of the other image; matching and positioning experiments were then carried out and the results compared. Table 1 gives, for the same experimental environment, the matching probability and matching time (averaged) of the original normalized cross-correlation method and of the method of the present invention.
Simulation                                    Original method               Method of the invention
Search graph size / real-time image size      400/90   400/60   400/30     400/90   400/60   400/30
Jump pixel count (N)                          -        -        -          45       30       20
Matching probability (%)                      91.6     85.1     80.7       90.1     85.6     79.2
Matching time (s)                             24.657   13.125   4.016      1.734    1.453    0.560

Table 1
Table 2 gives the positioning deviations and matching probabilities of the gray-level correlation matching method and of the method of the present invention; here the template size is 60 × 60, the search graph size is 400 × 400, and the jump pixel count N is 30.
Cut-out coordinate    Deviation, gray-level correlation method    Deviation, method of the invention
(310,261)             (0,0)                                       (0,0)
(190,116)             (1,0)                                       (0,0)
(100,118)             (0,0)                                       (0,1)
(84,107)              (0,0)                                       (3,1)
(30,26)               (0,1)                                       (1,1)
(10,200)              (0,0)                                       (1,2)
(211,24)              (1,1)                                       (0,3)
(69,317)              (0,0)                                       (2,0)
(188,272)             (0,0)                                       (0,0)
(190,273)             (0,2)                                       (0,5)
(314,259)             (0,0)                                       (0,0)
(112,83)              (0,0)                                       (0,7)
(257,46)              (1,1)                                       (0,0)
(210,101)             (0,0)                                       (0,0)
(198,110)             (1,0)                                       (2,3)
(162,133)             (0,0)                                       (0,0)
Matching probability (%), deviation within 5 pixels      85.1     85.6
Matching probability (%), deviation within 10 pixels     86.8     88.2

Table 2
From Table 1 and Table 2 it can be seen that the matching probability of the image matching method based on pixel jumps of the present invention is comparable to that of the gray-level correlation matching method, while the matching time is markedly reduced; moreover, as the template size increases, the reduction in matching time grows from 7.2 times to 14.2 times. The simulation results show that, for a fixed search graph size, the larger the template, the more obvious the advantage of matching with the present invention.
Fig. 9 shows the relationship between the jump pixel count and the matching time in simulation experiments with the method of the invention. Two groups of experiments were carried out: in the first group the template size is 30 × 30 and the search graph size is 400 × 400; in the second group the template size is 90 × 90 and the search graph size is 400 × 400. The hollow dots in the figure are the coordinate positions of jump pixel counts and their corresponding matching times, and the curves are fitted curves. It can be seen that the jump pixel count and the matching time essentially follow an exponential-type distribution: the larger the jump pixel count, the shorter the required matching time, and once the jump pixel count exceeds 30 the reduction in matching time is no longer obvious.
Fig. 10 shows the relationship between the jump pixel count and the matching probability in simulation experiments with the method of the invention. Two groups of experiments were carried out: in the first group the template size is 30 × 30 and the search graph size is 400 × 400; in the second group the template size is 90 × 90 and the search graph size is 400 × 400. The allowed error range in the figure is ±5 pixels; a matching probability of 1 indicates a successful match and 0 indicates a failed match. It can be seen that a jump pixel count within 30 has no influence on the matching probability. (Owing to space limits, the relationship between all jump pixel counts and matching probability cannot be given for every group; taking all such curves together, a jump pixel count within half of the template size generally has no influence on the matching probability; larger values can also be used, but they are not generally applicable.) In addition, it can also be seen from the figure that the matching probability is strongly related to the image size: when the template is small, the "oscillation" between match and mismatch is relatively fierce, whereas when the template is large this "oscillation" is relatively gentle. This is reasonable, because as the template size increases the amount of information available for matching increases, which improves the matching probability of the image.
Fig. 11 shows the relationship between the jump pixel count and the template image size in simulation experiments with the method of the invention. Two groups of experiments were carried out: in the first group templates are moved over a 400 × 400 search graph, and in the second group templates are moved over a 768 × 576 search graph. The figure gives, under the precondition of a correct match with an allowed error range of ±5 pixels, the maximum value the jump pixel count can reach; the curves are fitted curves. It can be seen that the relationship between the jump pixel count and the template size resembles an "∽"-shaped curve, and that this relationship is independent of the search graph size and depends only on the template size. It can also be seen that template sizes below 30 × 30 or above 90 × 90 have a strong influence on the jump pixel count: when the template is smaller than 30 × 30 the jump pixel count must be taken smaller, even less than half of the template size, whereas when the template is larger than 90 × 90 the jump pixel count can be taken larger, even far larger than half of the template size; between 30 and 90 the influence on the jump pixel count is relatively gentle.
The above simulation results show that the image matching method based on pixel jumps of the present invention is very effective and simple to implement; any image matching task with a correlation-peak characteristic can adopt this method and use pixel jumps to accelerate the search. Under the precondition of a guaranteed matching probability, the present invention can greatly reduce the time consumed by matching, and as the template size increases (with the search graph size fixed) the advantage of the method becomes more and more obvious.
The above is only a preferred embodiment of the present invention and is not intended to limit the protection scope of the present invention.

Claims (8)

1. An image matching method based on pixel jumps, characterized in that it comprises the following steps:
Step 1: determine the direction in which the template moves over the search graph, move the template three pixel positions in succession along that direction, and record the first, second and third pixel positions the template occupies on the search graph;
Step 2: compute the correlation value at each pixel position the template has reached on the search graph in step 1, and determine the state of the correlation peak on the correlation surface from the relative magnitudes of the three most recent correlation values, including the current one;
Step 3: according to the state of the correlation peak determined in step 2, compute the number of pixels N the template must jump to reach its next position on the search graph;
Step 4: move the template along the moving direction to its next pixel position on the search graph;
Step 5: judge whether the template has finished searching the search graph along the moving direction; if so, execute step 6, otherwise repeat steps 2 to 4;
Step 6: compare the correlation values at all pixel positions the template has reached on the search graph, take the maximum of all correlation values, and obtain the position in the search graph corresponding to the template.
2. The image matching method based on pixel jumps according to claim 1, characterized in that: in step 1, the direction in which the template moves over the search graph is: row by row from left to right, or row by row from right to left, or column by column from bottom to top, or column by column from top to bottom, until the whole image has been searched.
3. The image matching method based on pixel jumps according to claim 2, characterized in that: the state of the correlation peak in step 2 comprises the sunny side of the correlation peak and the back side of the correlation peak.
4. The image matching method based on pixel jumps according to claim 3, characterized in that: the sunny-side and back-side cases of the correlation peak are determined by the sign SIGN of K in the following formula:
K = (ρ_pre-previous − ρ_previous) × (ρ_previous − ρ_current)
where ρ_current denotes the correlation value at the pixel position the template currently occupies on the search graph, ρ_previous denotes the correlation value at the pixel position one step before the current one, and ρ_pre-previous denotes the correlation value at the pixel position two steps before the current one.
5. The image matching method based on pixel jumps according to claim 4, characterized in that: in step 3, for the sunny-side case of the correlation peak, the number of pixels N that the template must jump to reach its next position on the search graph is:
N = INT[(|1/ρ_current| − 1.1) × 10]
6. The image matching method based on pixel jumps according to claim 4, characterized in that: in step 3, for the back-side case of the correlation peak, the number of pixels N that the template must jump to reach its next position on the search graph is taken as an integer within half of the template size; when the template is smaller than 30 × 30 or larger than 90 × 90, N is taken as an integer near [20, 30].
7. The image matching method based on pixel jumps according to claim 5 or 6, characterized in that: the process of obtaining the maximum of all correlation values in step 6 comprises:
First give an initial value ρ_0 = 0, then compare ρ_current with ρ_0; if ρ_current is greater than ρ_0, assign the value of ρ_current to ρ_0, otherwise do not perform the assignment.
8. The image matching method based on pixel jumps according to claim 7, characterized in that: the position in the search graph corresponding to the template is the pixel coordinate at which ρ_0 takes its maximum value.
CNB2006101611648A 2006-12-07 2006-12-07 Image matching method based on pixel jump Expired - Fee Related CN100463002C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2006101611648A CN100463002C (en) 2006-12-07 2006-12-07 Image matching method based on pixel jump

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNB2006101611648A CN100463002C (en) 2006-12-07 2006-12-07 Image matching method based on pixel jump

Publications (2)

Publication Number Publication Date
CN1959707A true CN1959707A (en) 2007-05-09
CN100463002C CN100463002C (en) 2009-02-18

Family

ID=38071393

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2006101611648A Expired - Fee Related CN100463002C (en) 2006-12-07 2006-12-07 Image matching method based on pixel jump

Country Status (1)

Country Link
CN (1) CN100463002C (en)


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1080444A4 (en) * 1998-05-18 2002-02-13 Datacube Inc Image recognition and correlation system
WO2002021438A2 (en) * 2000-09-07 2002-03-14 Koninklijke Philips Electronics N.V. Image matching
JP2004240909A (en) * 2003-02-10 2004-08-26 Hitachi High-Technologies Corp Image processor and image processing method
US7463773B2 (en) * 2003-11-26 2008-12-09 Drvision Technologies Llc Fast high precision matching method
CN100363943C (en) * 2004-06-21 2008-01-23 南开大学 Color image matching analytical method based on color content and distribution

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105590114A (en) * 2015-12-22 2016-05-18 马洪明 Image characteristic quantity generation method
CN108665477A (en) * 2018-04-17 2018-10-16 华中科技大学 A kind of adaptive area adaptive choosing method in real-time target matching positioning
CN109410175A (en) * 2018-09-26 2019-03-01 北京航天自动控制研究所 SAR radar imagery quality quick automatic evaluation method based on multiple subarea images match
CN109410175B (en) * 2018-09-26 2020-07-14 北京航天自动控制研究所 SAR radar imaging quality rapid automatic evaluation method based on multi-subregion image matching
CN109815869A (en) * 2019-01-16 2019-05-28 浙江理工大学 A kind of finger vein identification method based on the full convolutional network of FCN
CN112200864A (en) * 2019-07-08 2021-01-08 深圳中科飞测科技有限公司 Image processing method, positioning method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN100463002C (en) 2009-02-18


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20090218

Termination date: 20141207

EXPY Termination of patent right or utility model