CN106156699A - Image processing apparatus and image matching method - Google Patents
Image processing apparatus and image matching method
- Publication number
- CN106156699A CN106156699A CN201510147484.7A CN201510147484A CN106156699A CN 106156699 A CN106156699 A CN 106156699A CN 201510147484 A CN201510147484 A CN 201510147484A CN 106156699 A CN106156699 A CN 106156699A
- Authority
- CN
- China
- Prior art keywords
- search
- correlation coefficient
- parameter
- feature
- search parameter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/1365—Matching; Classification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/751—Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
Abstract
The present invention provides an image processing apparatus and an image matching method that can reduce the amounts of computation and storage required by image matching while preserving matching accuracy. The image processing apparatus includes a conversion unit, a first search unit, a second search unit and a determination unit. The first search unit sets a first search parameter to each of a plurality of predetermined parameters in turn, generates a first intermediate feature map from the set first search parameter and a first feature map, and calculates a first correlation coefficient between the generated first intermediate feature map and a template image. The second search unit determines the value range of a second search parameter from the plurality of first correlation coefficients respectively corresponding to the plurality of predetermined parameters, sets the second search parameter within that range, generates a second intermediate feature map from the set second search parameter and a second feature map, and calculates a second correlation coefficient between the generated second intermediate feature map and the template image.
Description
Technical field
The present invention relates to an image processing apparatus and an image matching method.
Background technology
In prior-art image matching methods such as fingerprint matching and face matching, a correlation coefficient between an input image and a template image is calculated, and the input image is judged to match the template image when, for example, the correlation coefficient exceeds a threshold. To cope with matching errors caused by position offsets and direction rotations, the correlation coefficient must be calculated for every candidate position offset and direction rotation, which further increases the amounts of computation and storage.
To reduce those amounts when calculating the correlation coefficient between the input image and the template image, the input image is converted (scaled) into an intermediate image with a comparatively small data volume, and the correlation coefficient between the intermediate image and the template image is calculated. However, as the data volume of the intermediate image decreases, matching precision also declines.
Summary of the invention
The present invention was made in view of the above problems, and its object is to provide an image processing apparatus and an image matching method that reduce the amounts of computation and storage required by image matching while preserving matching accuracy.
According to one aspect of the present invention, an image processing apparatus is provided. The image processing apparatus includes: a conversion unit that converts an input image to generate a first feature map and a second feature map whose data volume is larger than that of the first feature map; a first search unit that sets a first search parameter to each of a plurality of predetermined parameters in turn, generates a first intermediate feature map from the set first search parameter and the first feature map, and calculates a first correlation coefficient between the generated first intermediate feature map and a template image, each first correlation coefficient corresponding to one predetermined parameter; a second search unit that determines the value range of a second search parameter from the plurality of first correlation coefficients respectively corresponding to the plurality of predetermined parameters, sets the second search parameter within that range, generates a second intermediate feature map from the set second search parameter and the second feature map, and calculates a second correlation coefficient between the generated second intermediate feature map and the template image; and a determination unit that determines that the input image matches the template image when a calculated second correlation coefficient satisfies a predetermined condition, and determines that the input image does not match the template image when none of the second correlation coefficients, calculated between the template image and all second intermediate feature maps generated from the second feature map and all second search parameters set within the value range, satisfies the predetermined condition.
According to another aspect of the present invention, an image matching method is provided. The image matching method includes: converting an input image to generate a first feature map and a second feature map whose data volume is larger than that of the first feature map; setting a first search parameter to each of a plurality of predetermined parameters in turn, generating a first intermediate feature map from the set first search parameter and the first feature map, and calculating a first correlation coefficient between the generated first intermediate feature map and a template image, each first correlation coefficient corresponding to one predetermined parameter; determining the value range of a second search parameter from the plurality of first correlation coefficients respectively corresponding to the plurality of predetermined parameters; setting the second search parameter within that range, generating a second intermediate feature map from the set second search parameter and the second feature map, and calculating a second correlation coefficient between the generated second intermediate feature map and the template image; determining that the input image matches the template image when a calculated second correlation coefficient satisfies a predetermined condition; and determining that the input image does not match the template image when none of the second correlation coefficients, calculated between the template image and all second intermediate feature maps generated from the second feature map and all second search parameters set within the value range, satisfies the predetermined condition.
With the image processing apparatus and image matching method according to the present invention, the correlation coefficient between the template image and the first feature map, whose data volume is comparatively small, is calculated during the first search, and the calculated correlation coefficients are used to determine the parameter value range for the second search, so that during the second search the correlation coefficient between the template image and the second feature map, whose data volume is comparatively large, need only be calculated within that value range. Therefore, the image processing apparatus and image matching method of the present invention can keep the matching accuracy at the level obtained when matching with the second feature map, while reducing the amount of computation.
Accompanying drawing explanation
Fig. 1 is a functional block diagram showing an image processing apparatus according to an embodiment of the present invention.
Fig. 2 is a flow chart showing an image matching method according to an embodiment of the present invention.
Detailed description of the invention
Embodiments of the present invention are described below with reference to the drawings. The description is provided to help the understanding of the example embodiments of the present invention defined by the appended claims and their equivalents. It includes various details that aid understanding, but these are to be regarded as merely exemplary. Those skilled in the art will therefore recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present invention. Moreover, to keep the description clear and concise, detailed descriptions of functions and structures well known in the art are omitted.
An image processing apparatus according to an embodiment of the present invention is described below with reference to Fig. 1, which is a functional block diagram of the apparatus.
As shown in Fig. 1, the image processing apparatus 1 includes a conversion unit 11, a first search unit 12, a second search unit 13 and a determination unit 14. The image processing apparatus 1 may be, for example, a smart phone, a tablet computer, a notebook computer, a fingerprint identification device, a face identification device or the like, as long as it is capable of processing image data.
The conversion unit 11 converts the input image to generate a first feature map and a second feature map whose data volume is larger than that of the first feature map.
The input image may be an image collected by an acquisition module of the image processing apparatus 1 itself, or an image received from another device. The content of the input image depends on the field in which the image processing apparatus 1 is applied: for example, if the apparatus is used for fingerprint recognition, the input image is a fingerprint image, and if it is used for face recognition, the input image is a face image.
Specifically, the conversion unit 11 generates the second feature map by, for example, applying a wavelet transform and a reduction conversion to the input image, and likewise generates the first feature map by applying, for example, a wavelet transform and a reduction conversion to the input image. The reduction conversion used when generating the first feature map differs from the one used when generating the second feature map, so that the data volume of the second feature map is larger than that of the first feature map. For example, the conversion unit 11 applies a wavelet transform and a reduction conversion to a 1024*1024-pixel input image to generate a 64*64-pixel second feature map, and applies a wavelet transform and a reduction conversion to the same 1024*1024-pixel input image to generate a 32*32-pixel first feature map. When generating the first feature map, the conversion unit 11 may also apply, for example, a wavelet transform and a reduction conversion to the already generated second feature map, thereby generating the first feature map.
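As an illustrative sketch only, the two-scale feature-map generation can be approximated as follows, using simple block-averaging in place of the wavelet transform and reduction conversion described above (the function name and scale factors are hypothetical, chosen to match the 1024*1024 → 64*64 → 32*32 example):

```python
import numpy as np

def downscale(img: np.ndarray, factor: int) -> np.ndarray:
    """Reduce an image by block-averaging: an illustrative stand-in for the
    wavelet transform + reduction conversion described in the text."""
    h, w = img.shape
    img = img[:h - h % factor, :w - w % factor]  # crop to a multiple of factor
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Hypothetical 1024*1024 input image
rng = np.random.default_rng(0)
input_image = rng.random((1024, 1024))

# Second feature map (larger data volume), then the first feature map
# derived from it, as the text allows.
second_feature_map = downscale(input_image, 16)       # 64*64
first_feature_map = downscale(second_feature_map, 2)  # 32*32
```

This also shows the variant in which the first feature map is derived from the already generated second feature map rather than directly from the input image.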
Preferably, the conversion unit 11 converts the input image or the second feature map with different transformation parameters, thereby generating at least two first feature maps corresponding to the different transformation parameters. Specifically, the conversion unit applies a wavelet transform and a reduction conversion to the input image or the second feature map to generate a first feature map. For example, when performing the wavelet transform, window functions of different angles are applied to the input image or the second feature map, followed by the reduction conversion, thereby generating a plurality of first feature maps respectively corresponding to the different angles. In a concrete example, the conversion unit 11 applies window functions of 0, 30, 60, 90, 120 and 150 degrees to the input image or the second feature map, performs the wavelet transform and then the reduction conversion, and thereby generates six first feature maps respectively corresponding to those angles. When the subsequent processing uses the first feature maps in the first search, this improves the accuracy of the first search, which in turn allows the value range of the second search parameter to be set more precisely.
Further preferably, the conversion unit 11 filters the input image to generate one filtered input image, and converts the filtered input image to generate the first feature map and the second feature map whose data volume is larger than that of the first feature map. Specifically, before the processing that generates the first and second feature maps, the conversion unit 11 applies a pretreatment such as wavelet filtering to the input image, thereby removing noise from the original input image. This eliminates noise interference from the matching process in advance and improves the matching accuracy. The conversion unit 11 then generates the first feature map and the second feature map from the filtered input image.
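As a sketch of such a pretreatment, a simple mean filter could look like this (the text mentions wavelet filtering; a box filter is used here purely as an illustrative stand-in, and the function name is hypothetical):

```python
import numpy as np

def mean_filter(img: np.ndarray, k: int = 3) -> np.ndarray:
    """Box (mean) filtering as one possible noise-removal pretreatment.

    Each output pixel is the average of the k*k neighborhood around the
    corresponding input pixel; borders are replicated."""
    pad = k // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.zeros(img.shape, dtype=float)
    for dr in range(k):
        for dc in range(k):
            out += padded[dr:dr + img.shape[0], dc:dc + img.shape[1]]
    return out / (k * k)
```

A constant image passes through unchanged, while isolated noise spikes are spread out and attenuated before the feature maps are generated.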
In the conversion processing and filtering processing described above, the wavelet transform and wavelet filtering were used as concrete examples, but the present invention is not limited to these; other processing such as a DCT transform or mean filtering may also be used, as long as the generated first and second feature maps represent the features of the input image well.
The first search unit 12 sets the first search parameter to each of the plurality of predetermined parameters in turn, generates a first intermediate feature map from the set first search parameter and the first feature map, and calculates a first correlation coefficient between the generated first intermediate feature map and the template image. Each first correlation coefficient corresponds to one predetermined parameter.
A predetermined parameter represents, for example, a position offset. In the first search performed by the first search unit 12, the first search parameter is set, for example, to each of a plurality of different position offsets. Specifically, a position offset is expressed as the row and column numbers, in the first feature map, that correspond to the center point of the first intermediate feature map. The plurality of position offsets may be set in advance to (row 1, column 1), (row 1, column 3), (row 1, column 5) ... (row 15, column 15) and so on. These preset position offsets are only an example; other position offsets may be set as needed. A predetermined parameter may also represent, for example, a direction rotation amount. In that case, in the first search performed by the first search unit 12, the first search parameter is set to each of a plurality of different direction rotation amounts, which may be set in advance to -40 degrees, -30 degrees, -20 degrees, ..., 40 degrees and so on. Again, these preset rotation amounts are only an example, and other rotation amounts may be set as needed. Furthermore, a predetermined parameter may represent a position offset and a direction rotation amount simultaneously, or other parameters as needed.
In the following description, a predetermined parameter is assumed to represent a position offset and a direction rotation amount simultaneously. The first search parameter is then set to each combination of the above position offsets and direction rotation amounts.
Specifically, after the first search parameter is set to a certain position offset and a certain direction rotation amount, a first intermediate feature map is generated from the set first search parameter and the first feature map, and the first correlation coefficient between the generated first intermediate feature map and the template image is calculated. This first correlation coefficient is associated with that particular position offset and direction rotation amount. The first search parameter is then set to another combination of position offset and direction rotation amount, and the above processing is repeated to calculate the first correlation coefficient associated with that combination. The processing is repeated until every combination of position offset and direction rotation amount has been set as the first search parameter once, yielding the first correlation coefficients corresponding to all combinations of position offsets and direction rotation amounts.
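The exhaustive loop over all combinations can be sketched as follows. The Pearson form of the correlation coefficient, the grid values, and all names here are illustrative assumptions (the text gives the grids as examples and leaves the exact correlation method to the prior art); `make_intermediate` stands in for the intermediate-map generation step described below:

```python
import numpy as np

def correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson correlation coefficient between two equal-sized maps."""
    av, bv = a.ravel() - a.mean(), b.ravel() - b.mean()
    return float(av @ bv / (np.linalg.norm(av) * np.linalg.norm(bv) + 1e-12))

# Hypothetical grids following the examples in the text:
# offsets (1,1)..(15,15) and rotations -40..40 degrees.
OFFSETS = [(r, c) for r in range(1, 16, 2) for c in range(1, 16, 2)]
ROTATIONS = list(range(-40, 41, 10))

def first_search(first_map, template, make_intermediate):
    """Compute the first correlation coefficient for every combination of
    position offset and direction rotation amount."""
    results = {}
    for (a, b) in OFFSETS:
        for z in ROTATIONS:
            inter = make_intermediate(first_map, a, b, z)
            results[(a, b, z)] = correlation(inter, template)
    return results
```

The returned dictionary maps each predetermined parameter (a, b, z) to its first correlation coefficient, which is what the second search unit later inspects.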
After the first search parameter is set to position offset (row a, column b) and direction rotation amount z degrees, the first search unit 12 generates the first intermediate feature map from the first feature map based on the set first search parameter (that is, position offset (row a, column b) and direction rotation amount z degrees). In a concrete example, a first feature map of, for example, 16*16 pixels is represented by a matrix of 16*16 elements. The relation between the elements of matrix B, representing the generated first intermediate feature map, and the elements of matrix A, representing the first feature map, is as follows. The element in row 1, column 1 of matrix B is the element in row x, column y of matrix A, where, when z is less than or equal to 0, x = a - (cos z° - sin z°)*d/2 and y = b - (cos z° - sin z°)*d/2, and when z is greater than 0, x = a + (cos z° + sin z°)*d/2 and y = b + (cos z° - sin z°)*d/2. When the first feature map is 16*16 pixels, d = 16. The element in row 1, column 2 of matrix B is the element in row (x - sin z°), column (y + cos z°) of matrix A, and the element in row 1, column 3 of matrix B is the element in row (x - 2*sin z°), column (y + 2*cos z°) of matrix A. Similarly, the element in row 2, column 1 of matrix B is the element in row (x + cos z°), column (y + sin z°) of matrix A, the element in row 3, column 1 of matrix B is the element in row (x + cos z° + cos z°), column (y + sin z° + sin z°) of matrix A, and the element in row 2, column 2 of matrix B is the element in row (x + cos z° - sin z°), column (y + sin z° + cos z°) of matrix A. In general, the element in row n, column m of matrix B is the element in row (x - (m-1)*sin z° + (n-1)*cos z°), column (y + (m-1)*cos z° + (n-1)*sin z°) of matrix A. In this way, the values of all elements of matrix B can be calculated.
In the above calculation of the elements of matrix B, when a calculated row or column number (x - (m-1)*sin z° + (n-1)*cos z°) or (y + (m-1)*cos z° + (n-1)*sin z°) is not an integer, it is, for example, rounded to obtain integer row and column numbers.
Preferably, the first search unit 12 determines the pixels in the first feature map based on the set first search parameter, and generates the first intermediate feature map using only the values of the determined pixels. Specifically, when the first search unit 12 generates the first intermediate feature map from the first feature map based on the set first search parameter (that is, position offset (row a, column b) and direction rotation amount z degrees), the elements of matrix B representing the generated first intermediate feature map are calculated from the elements of matrix A representing the first feature map, as described above. A calculated row or column number may fall outside matrix A. For example, when matrix A is a 16*16 matrix and a calculated row or column number is greater than 16 or less than 1, the value of the corresponding element in matrix B is simply set to zero. Accordingly, when the first search parameter is set to a concrete predetermined parameter and matrix B is generated from matrix A, there are elements in the matrix of the first feature map that will never be used to generate the first intermediate feature map. In the concrete processing, these elements are neither read when generating the first intermediate feature map nor used when calculating the correlation coefficient, which reduces the amount of computation. Since the elements of the first feature map's matrix correspond to pixels, the first search unit 12 thus determines the pixels in the first feature map based on the set first search parameter and generates the first intermediate feature map using only the values of the determined pixels.
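Taking the formulas above literally (1-based indices, rounding of fractional indices, zero for out-of-range samples), the mapping from matrix A to matrix B can be sketched as follows; this is an illustrative reading of the text, and the function name is hypothetical:

```python
import numpy as np

def intermediate_map(A: np.ndarray, a: int, b: int, z: float) -> np.ndarray:
    """Generate matrix B (first intermediate feature map) from matrix A
    (first feature map) for position offset (row a, column b) and rotation
    z degrees, per the formulas in the text. Fractional indices are
    rounded; out-of-range samples are set to zero."""
    d = A.shape[0]
    cz, sz = float(np.cos(np.radians(z))), float(np.sin(np.radians(z)))
    if z <= 0:
        x = a - (cz - sz) * d / 2
        y = b - (cz - sz) * d / 2
    else:
        x = a + (cz + sz) * d / 2
        y = b + (cz - sz) * d / 2
    B = np.zeros_like(A, dtype=float)
    for n in range(1, A.shape[0] + 1):
        for m in range(1, A.shape[1] + 1):
            r = int(round(x - (m - 1) * sz + (n - 1) * cz))  # 1-based row in A
            c = int(round(y + (m - 1) * cz + (n - 1) * sz))  # 1-based column in A
            if 1 <= r <= A.shape[0] and 1 <= c <= A.shape[1]:
                B[n - 1, m - 1] = A[r - 1, c - 1]
            # otherwise the element stays zero, as described above
    return B
```

With z = 0 and the offset at the center of a 16*16 map (a = b = 9), the formulas reduce to the identity mapping B = A, which is a useful sanity check on the index conventions.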
The above process of generating the first intermediate feature map from the first feature map based on the set first search parameter (that is, position offset (row a, column b) and direction rotation amount z degrees) is only an example; other methods may also be used to generate the first intermediate feature map, and appropriate conversion processing may be added, for example, to improve the computational accuracy of the first correlation coefficient with respect to the template image.
As described above, when the conversion unit 11 converts the input image or the second feature map with different transformation parameters and thereby generates at least two first feature maps corresponding to the different transformation parameters, the first search unit 12 uses those at least two first feature maps when generating the first intermediate feature map. Specifically, suppose six first feature maps are generated and are represented by matrices A1 to A6. When generating the first intermediate feature map based on the set first search parameter (that is, position offset (row a, column b) and direction rotation amount z degrees), the row and column numbers in the matrix representing the first feature map are determined as described above, the element value at the corresponding row and column is read from each of the six matrices A1 to A6, and the six read values are, for example, weighted and averaged to obtain the value of the element of matrix B representing the first intermediate feature map. For example, the element in row 1, column 2 of matrix B is the weighted average of the elements in row (x - sin z°), column (y + cos z°) of matrices A1 through A6. By calculating the first correlation coefficient between the first intermediate feature map generated in this way and the template image, the accuracy of the first correlation coefficient can be improved, which in turn allows the value range of the second search parameter to be set more precisely.
After generating the first intermediate feature map, the first search unit 12 calculates the first correlation coefficient between the generated first intermediate feature map and the template image. This calculation can be carried out with methods known in the prior art and is not described in detail here.
The template image used with the first intermediate feature map to calculate the first correlation coefficient preferably has the same pixel count as the first intermediate feature map, which reduces the amount of computation when calculating the first correlation coefficient. Further preferably, this template image is generated by applying to a registered image the same conversion processing that is used to generate the first feature map, which improves the reliability of the calculated first correlation coefficient.
The generation of the first intermediate feature map and the calculation of the first correlation coefficient described above may also be executed in parallel. Specifically, after an element of the matrix representing the first intermediate feature map is generated, that element is immediately used in the calculation of the first correlation coefficient. This saves time in calculating the first correlation coefficient and improves the efficiency of image matching.
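One way to interleave map generation with correlation calculation is to keep running sums and feed in each element of matrix B as soon as it is generated. The sketch below uses the standard accumulation for a Pearson correlation coefficient; the class and method names are hypothetical:

```python
import math

class StreamingCorrelation:
    """Accumulate the sums needed for a Pearson correlation coefficient as
    element pairs (intermediate-map value, template value) arrive, so the
    coefficient is available as soon as the last element is generated."""

    def __init__(self):
        self.n = self.sx = self.sy = self.sxx = self.syy = self.sxy = 0.0

    def add(self, x: float, y: float) -> None:
        self.n += 1
        self.sx += x; self.sy += y
        self.sxx += x * x; self.syy += y * y
        self.sxy += x * y

    def value(self) -> float:
        cov = self.sxy - self.sx * self.sy / self.n
        vx = self.sxx - self.sx ** 2 / self.n
        vy = self.syy - self.sy ** 2 / self.n
        return cov / math.sqrt(vx * vy) if vx > 0 and vy > 0 else 0.0
```

Because only six scalar sums are kept, no second pass over the intermediate feature map is needed once generation finishes.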
Through the above processing, the first search unit 12 calculates the plurality of first correlation coefficients respectively corresponding to the plurality of predetermined parameters; specifically, it calculates the first correlation coefficients respectively corresponding to all combinations of position offsets and direction rotation amounts.
The second search unit 13 determines the value range of the second search parameter based on the plurality of first correlation coefficients calculated by the first search unit.
Specifically, the second search unit 13 compares the magnitudes of the first correlation coefficients and, according to those magnitudes, determines the value range of the second search parameter used in the second search. The quantities represented by the second search parameter are the same as those represented by the first search parameter: for example, when the predetermined parameters set as the first search parameter represent a position offset and a direction rotation amount, the parameters set as the second search parameter also represent a position offset and a direction rotation amount.
Preferably, among the first correlation coefficients respectively corresponding to the plurality of predetermined parameters, the predetermined parameter corresponding to the first correlation coefficient with the largest value is identified, and the value range of the second search parameter is determined based on that predetermined parameter. For example, if, among the first correlation coefficients calculated by the first search unit 12, the one corresponding to position offset (row 3, column 4) and direction rotation amount 30 degrees has the largest value, the value range of the second search parameter is determined from this predetermined parameter (row 3, column 4, rotation amount 30 degrees) and may be set, for example, to rows 2-4, columns 3-5 and rotation amounts 21-39 degrees.
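The range-narrowing step can be sketched directly from the first-search results. The margins below are hypothetical, chosen so that the example in the text (rows 2-4, columns 3-5, 21-39 degrees around the best parameter (3, 4, 30)) falls out:

```python
def second_search_range(first_results, offset_margin=1, angle_margin=9):
    """Pick the predetermined parameter with the largest first correlation
    coefficient and build the second-search value range around it.

    `first_results` maps (row, column, rotation) tuples to first
    correlation coefficients. The margins are illustrative only."""
    a, b, z = max(first_results, key=first_results.get)
    return (range(a - offset_margin, a + offset_margin + 1),
            range(b - offset_margin, b + offset_margin + 1),
            range(z - angle_margin, z + angle_margin + 1))
```

For the text's example, `second_search_range({(3, 4, 30): 0.9, ...})` yields rows 2-4, columns 3-5 and rotation amounts 21-39 degrees.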
The value range of the second search parameter may also be determined by other methods. For example, the predetermined parameters corresponding to the largest and the second-largest first correlation coefficients may both be identified, and the value range of the second search parameter determined based on both of them. Although this increases the amount of computation in the second search, it correspondingly improves the accuracy of image matching.
After the value range of the second search parameter is determined, the second search unit 13 sets the second search parameter within that range, generates a second intermediate feature map from the set second search parameter and the second feature map, and calculates a second correlation coefficient between the generated second intermediate feature map and the template image.
Specifically, after setting the second search parameter in turn to a certain position offset and a certain direction rotation amount within the value range, the second search unit 13 generates the second intermediate feature map from the set second search parameter and the second feature map, and calculates the second correlation coefficient between the generated second intermediate feature map and the template image. If the processing of the determination unit 14, described later, judges that the calculated second correlation coefficient does not satisfy the predetermined condition, the second search parameter is set to another combination of position offset and direction rotation amount (again within the value range), and the above processing is repeated to calculate another second correlation coefficient.
In the second search process performed by the second search unit 13, the processing of generating the second intermediate feature map based on the set second search parameter and the second feature map, and the processing of calculating the second correlation coefficient between the generated second intermediate feature map and the template image, are the same as in the first search process described above, so a repeated explanation is omitted. As in the first search process, the second search unit 13 determines pixels in the second feature map based on the set second search parameter, and generates the second intermediate feature map using only the values of the determined pixels. Also as in the first search process, other methods may be used to generate the second intermediate feature map; for example, a suitable transformation process may be applied in order to improve the calculation accuracy of the second correlation coefficient with the template image.
The template image used to generate the second correlation coefficient with the second intermediate feature map is preferably a template image having the same number of pixels as the second intermediate feature map. This reduces the amount of calculation when computing the second correlation coefficient. Furthermore, this template image is preferably generated by applying to the registered image the same transformation process as is used to generate the second feature map. This improves the reliability of the calculated second correlation coefficient.
In addition, the above-described processing of generating the second intermediate feature map and the processing of calculating the second correlation coefficient may be executed in parallel. Specifically, after a given element of the matrix representing the second intermediate feature map has been generated, that element is immediately used in the calculation of the second correlation coefficient. This saves time in calculating the second correlation coefficient and improves the efficiency of image matching.
The identifying unit 14 determines that the input image matches the template image when a calculated second correlation coefficient satisfies the predetermined condition, and determines that the input image does not match the template image when the second correlation coefficients between the template image and all of the second intermediate feature maps, generated from the second feature map and all of the second search parameters set within the value range of the second search parameter, fail to satisfy the predetermined condition.

Specifically, after the second search unit 13 has calculated the second correlation coefficient under a certain second search parameter, the identifying unit 14 judges whether this second correlation coefficient satisfies the predetermined condition (for example, whether it exceeds a threshold value). If it does, the identifying unit 14 determines that the input image matches the template image, and the image matching process ends. If the identifying unit 14 judges that this second correlation coefficient does not satisfy the predetermined condition, the second search unit 13 resets the second search parameter within its value range, as described above, and the processing is repeated. When every value within the value range of the second search parameter has been set as the second search parameter at least once and none of the second correlation coefficients calculated by the second search unit 13 satisfies the predetermined condition, the identifying unit 14 can determine that the input image does not match the template image.
According to the image processing apparatus 1 of the embodiment of the present invention, the first search process, which uses the first feature map with a relatively small data volume, determines the range over which the second search process operates; the second search process, which uses the second feature map with a relatively large data volume, then performs calculations only within that determined range. The amount of calculation in the overall image matching process can therefore be reduced, while the accuracy of image matching is kept at the level that would be obtained by computing with the relatively large second feature map over the full range.
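By way of illustration only, this coarse-to-fine flow can be sketched in Python. Everything concrete here is an assumption made for the sketch and not part of the claimed apparatus: pure translation stands in for the full offset-plus-rotation parameter, the offset grids, the resolution ratio of 2, and the threshold are invented, and the helper names are hypothetical.

```python
import numpy as np

def correlate(a, b):
    """Correlation coefficient between two equal-sized feature maps."""
    return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

def shift(img, dr, dc):
    """Translate img by (dr, dc) rows/columns, zero-filling uncovered pixels."""
    out = np.zeros_like(img)
    h, w = img.shape
    out[max(dr, 0):h + min(dr, 0), max(dc, 0):w + min(dc, 0)] = \
        img[max(-dr, 0):h + min(-dr, 0), max(-dc, 0):w + min(-dc, 0)]
    return out

def coarse_to_fine_match(feat_small, feat_large, tmpl_small, tmpl_large,
                         threshold=0.99):
    """First search on the small map, then a second search restricted to a
    value range around the best coarse offset, with early termination."""
    # First search (first search unit 12): try every predetermined offset.
    coarse = [(r, c) for r in range(-2, 3) for c in range(-2, 3)]
    scores = {p: correlate(shift(feat_small, *p), tmpl_small) for p in coarse}
    br, bc = max(scores, key=scores.get)
    # Value range of the second search parameter: scale the best coarse hit
    # up by the resolution ratio of 2 and search +/-1 around it.
    fine = [(r, c) for r in range(2 * br - 1, 2 * br + 2)
                   for c in range(2 * bc - 1, 2 * bc + 2)]
    # Second search (second search unit 13) + decision (identifying unit 14).
    for p in fine:
        if correlate(shift(feat_large, *p), tmpl_large) >= threshold:
            return True, p          # predetermined condition satisfied
    return False, None              # range exhausted: no match
```

The fine search touches only 9 of the 225 full-resolution offsets the coarse grid spans, which is the calculation saving the paragraph above describes.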
An image matching method according to an embodiment of the present invention is described below with reference to Fig. 2, which is a flowchart of that method.

The image matching method shown in Fig. 2 can be applied to the image processing apparatus shown in Fig. 1. As shown in Fig. 1, the image processing apparatus 1 includes the converter unit 11, the first search unit 12, the second search unit 13, and the identifying unit 14.
In step S1, the input image is transformed to generate a first feature map and a second feature map whose data volume is larger than that of the first feature map.
Here, the input image may be an image acquired by the image processing apparatus 1 itself through an acquisition module, or an image received from another device. The content of the input image depends on the field in which the image processing apparatus 1 is applied; for example, if the image processing apparatus 1 is used for fingerprint recognition, the input image is a fingerprint image, and if it is used for face recognition, the input image is a face image.
Specifically, the converter unit 11 applies, for example, a wavelet transform and a reduction transform to the input image to generate the second feature map, and then applies, for example, a wavelet transform and a reduction transform again to generate the first feature map. The reduction transform used when generating the first feature map differs from that used when generating the second feature map, so that the data volume of the second feature map is larger than that of the first feature map. For example, the converter unit 11 applies a wavelet transform and a reduction transform to an input image of 1024*1024 pixels to generate a second feature map of 64*64 pixels, and applies a wavelet transform and a reduction transform to the input image of 1024*1024 pixels to generate a first feature map of 32*32 pixels. When generating the first feature map, the converter unit 11 may also, for example, apply a wavelet transform and a reduction transform to the already generated second feature map, thereby generating the first feature map.
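As an informal illustration of step S1, the sketch below derives two feature maps of different data volumes by repeatedly keeping the low-pass band of a Haar-style decomposition. The `haar_ll` averaging is a simplified stand-in for the wavelet transform and reduction transform named in the text (a real implementation might use a wavelet library), and the 64*64 and 32*32 sizes follow the example above.

```python
import numpy as np

def haar_ll(img):
    """One level of a Haar-style decomposition, keeping only the LL
    (low-pass) band: each output pixel is the mean of a 2x2 block."""
    return (img[0::2, 0::2] + img[0::2, 1::2] +
            img[1::2, 0::2] + img[1::2, 1::2]) / 4.0

def make_feature_maps(input_image):
    """Halve the input repeatedly; the larger result is the second feature
    map, and one more halving yields the first feature map."""
    f = input_image
    while f.shape[0] > 64:      # e.g. 1024x1024 -> 512 -> 256 -> 128 -> 64
        f = haar_ll(f)
    second = f                  # 64x64: larger data volume
    first = haar_ll(second)     # 32x32: smaller data volume
    return first, second
```

Generating the first feature map from the already computed second feature map, as the last sentence above allows, avoids re-decomposing the full-resolution input.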
Preferably, in step S1, the input image is filtered to generate a filtered input image, and the filtered input image is then transformed to generate the first feature map and the second feature map whose data volume is larger than that of the first feature map. Specifically, before performing the processing of generating the first and second feature maps, the converter unit 11 applies preprocessing such as wavelet filtering to the input image, thereby removing noise from the original input image. Removing the noise in advance eliminates its interference with the image matching process and improves the matching accuracy. The converter unit 11 then generates the first feature map and the second feature map from the filtered input image.
Further preferably, in step S1, the input image or the second feature map is transformed with different transformation parameters, thereby generating at least two first feature maps corresponding to the different transformation parameters. Specifically, the converter unit 11 applies a wavelet transform and a reduction transform to the input image or the second feature map to generate first feature maps. For example, when performing the wavelet transform, window functions of different angles are applied to the input image or the second feature map, followed by the reduction transform, so that multiple first feature maps are generated, one for each angle. In a concrete example, the converter unit 11 applies window functions of 0, 30, 60, 90, 120, and 150 degrees to the input image or the second feature map in the wavelet transform, followed by the reduction transform, thereby generating six first feature maps corresponding to these angles. When the first feature maps are later used in the first search, this improves the accuracy of the first search, which in turn allows the value range of the second search parameter to be set more accurately.
The above transformation and filtering processing has been described using the wavelet transform and wavelet filtering as examples, but the present invention is not limited to these; other processing, such as a DCT transform or mean filtering, may also be used, as long as the generated first and second feature maps represent the features of the input image well.
In step S2, the first search parameter is set to each of multiple predetermined parameters in turn; a first intermediate feature map is generated based on the set first search parameter and the first feature map, and a first correlation coefficient between the generated first intermediate feature map and the template image is calculated, the first correlation coefficient thus corresponding to the predetermined parameter.
Here, a predetermined parameter represents, for example, a position offset and/or a direction rotation amount. In the first search process performed by the first search unit 12, the first search parameter is set, for example, to combinations of multiple different position offsets and/or multiple different direction rotation amounts. Specifically, a position offset is expressed as the row number and column number, within the first feature map, that correspond to the center point of the first intermediate feature map. The multiple position offsets may be predefined as (row 1, column 1), (row 1, column 3), (row 1, column 5), ..., (row 15, column 15), and so on, and the multiple direction rotation amounts may be predefined as -40 degrees, -30 degrees, -20 degrees, ..., 40 degrees, and so on. These predefined position offsets and direction rotation amounts are merely examples; other values may be set as required. A predetermined parameter may also represent a position offset and a direction rotation amount simultaneously, and may further represent other parameters as required.
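The grid of predetermined parameters just described can be enumerated compactly, as in the sketch below; the specific grids merely reproduce the examples in the text and would be configurable in practice.

```python
from itertools import product

# Predetermined position offsets (row, column) and direction rotation
# amounts, following the examples given in the text.
offsets = [(r, c) for r in range(1, 16, 2) for c in range(1, 16, 2)]
rotations = list(range(-40, 41, 10))        # -40, -30, ..., 40 degrees

# Each combination is set as the first search parameter exactly once,
# yielding one first correlation coefficient per combination.
first_search_params = list(product(offsets, rotations))
```

With these example grids the first search evaluates 64 offsets times 9 rotations, i.e. 576 predetermined parameters in total.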
Specifically, when the first search parameter is set in turn to each combination of the above multiple position offsets and multiple direction rotation amounts, after the first search parameter has been set to a certain position offset and a certain direction rotation amount, a first intermediate feature map is generated based on the set first search parameter and the first feature map, and the first correlation coefficient between the generated first intermediate feature map and the template image is calculated. The calculated first correlation coefficient is thus associated with this position offset and this direction rotation amount. The first search parameter is then set to another combination of position offset and direction rotation amount, and the above processing is repeated to calculate the first correlation coefficient associated with that combination. This is repeated until every combination of position offset and direction rotation amount has been set as the first search parameter at least once, so that a first correlation coefficient is calculated for each combination of position offset and direction rotation amount.
After the first search parameter has been set to a position offset (row a, column b) and a direction rotation amount of z degrees, the first search unit 12 generates the first intermediate feature map from the first feature map based on the set first search parameter (that is, position offset (row a, column b) and direction rotation amount z degrees). In a concrete process, a first feature map of, for example, 16*16 pixels is represented by a matrix of 16*16 elements. The relationship between the elements of the matrix B representing the generated first intermediate feature map and the elements of the matrix A representing the first feature map is as follows. The element in the first row and first column of B is the element in row x, column y of A, where, when z is less than or equal to 0, x = a - (cos z° - sin z°) * d/2 and y = b - (cos z° - sin z°) * d/2, and when z is greater than 0, x = a + (cos z° + sin z°) * d/2 and y = b + (cos z° - sin z°) * d/2; d = 16 when the first feature map is 16*16 pixels. The element in the first row and second column of B is the element in row (x - sin z°), column (y + cos z°) of A, and the element in the first row and third column of B is the element in row (x - 2*sin z°), column (y + 2*cos z°) of A. Likewise, the element in the second row and first column of B is the element in row (x + cos z°), column (y + sin z°) of A; the element in the third row and first column of B is the element in row (x + cos z° + cos z°), column (y + sin z° + sin z°) of A; and the element in the second row and second column of B is the element in row (x + cos z° - sin z°), column (y + sin z° + cos z°) of A. In general, the element in row n, column m of B is the element in row (x - (m-1)*sin z° + (n-1)*cos z°), column (y + (m-1)*cos z° + (n-1)*sin z°) of A. In this way, the values of all elements of matrix B can be calculated.
In the above calculation of the element values of matrix B, when a calculated row or column number, such as (x - (m-1)*sin z° + (n-1)*cos z°) or (y + (m-1)*cos z° + (n-1)*sin z°), is not an integer, it is, for example, rounded to the nearest integer, so that integer-valued row and column numbers are obtained.
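The element mapping just described, including the rounding of non-integer row and column numbers, can be sketched as follows. Indices are 1-based as in the text, samples falling outside matrix A are taken as zero (per the out-of-range handling described below for step S2), and Python's built-in `round` stands in for the rounding rule, which the text leaves unspecified.

```python
import math
import numpy as np

def intermediate_map(A, a, b, z_deg):
    """Build matrix B from feature-map matrix A for position offset
    (row a, column b) and direction rotation z_deg degrees, following the
    element mapping in the text. Out-of-range samples are left as 0."""
    d = A.shape[0]
    cz, sz = math.cos(math.radians(z_deg)), math.sin(math.radians(z_deg))
    if z_deg <= 0:
        x = a - (cz - sz) * d / 2
        y = b - (cz - sz) * d / 2
    else:
        x = a + (cz + sz) * d / 2
        y = b + (cz - sz) * d / 2
    B = np.zeros_like(A, dtype=float)
    for n in range(1, d + 1):          # row of B (1-based, as in the text)
        for m in range(1, d + 1):      # column of B
            row = round(x - (m - 1) * sz + (n - 1) * cz)
            col = round(y + (m - 1) * cz + (n - 1) * sz)
            if 1 <= row <= d and 1 <= col <= d:
                B[n - 1, m - 1] = A[row - 1, col - 1]
    return B
```

With z = 0 the mapping reduces to a pure translation of A by the position offset, which is a convenient sanity check on the formulas.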
As described above, when in step S1 the input image or the second feature map is transformed with different transformation parameters so as to generate at least two first feature maps corresponding to the different transformation parameters, the at least two first feature maps are used together when generating the first intermediate feature map in step S2. Specifically, suppose six first feature maps are generated, represented by matrices A1 to A6. When generating the first intermediate feature map based on the set first search parameter (that is, position offset (row a, column b) and direction rotation amount z degrees), the row and column numbers in the matrix of the first feature map are determined as described above, the element values at the corresponding row and column are read from each of the six matrices representing the six first feature maps, and the six read values are, for example, weighted and averaged to obtain the value of the element of the matrix B representing the first intermediate feature map. For example, the element in the first row and second column of B is the weighted average of the elements in row (x - sin z°), column (y + cos z°) of each of the matrices A1, A2, A3, A4, A5, and A6. By calculating the first correlation coefficient between the template image and the first intermediate feature map generated in this way, the accuracy of the first correlation coefficient can be improved, and the value range of the second search parameter can in turn be set more accurately.
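The weighted averaging across the oriented first feature maps A1..A6 can be sketched as below. Equal weights are an assumption, since the text leaves the weighting unspecified, and 0-based indices are used here for convenience.

```python
import numpy as np

def fused_element(maps, row, col, weights=None):
    """Value of one element of the first intermediate feature map: a
    weighted average of the element at (row, col) (0-based) across the
    oriented first feature maps A1..A6; equal weights by default."""
    vals = np.array([A[row, col] for A in maps], dtype=float)
    w = np.ones(len(maps)) if weights is None else np.asarray(weights, dtype=float)
    return float(np.dot(vals, w) / w.sum())
```

Non-uniform weights could, for instance, emphasize the orientation closest to the direction rotation amount currently being searched.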
The above process of generating the first intermediate feature map from the first feature map based on the set first search parameter (that is, position offset (row a, column b) and direction rotation amount z degrees) is only one example; other methods may also be used. For instance, a suitable transformation process may be applied in order to improve the calculation accuracy of the first correlation coefficient with the template image.
Preferably, in step S2, pixels in the first feature map are determined based on the set first search parameter, and the first intermediate feature map is generated using only the values of the determined pixels.

Specifically, when in step S2 the first intermediate feature map is generated from the first feature map based on the set first search parameter (that is, position offset (row a, column b) and direction rotation amount z degrees), the elements of the matrix B representing the generated first intermediate feature map are calculated from the elements of the matrix A representing the first feature map, as described above. The row or column numbers calculated for matrix A may fall outside the bounds of A; for example, when A is a 16*16 matrix and a calculated row or column number is greater than 16 or less than 1, the corresponding element of matrix B is simply set to zero. Accordingly, when the first search parameter is set to a concrete predetermined parameter, some elements of the matrix of the first feature map are never used when generating matrix B from matrix A. In the concrete processing, these unused elements are not read when generating the first intermediate feature map, nor are they used in the calculation of the correlation coefficient, which reduces the amount of calculation. Since the elements of the matrix of the first feature map correspond to pixels, this means that in step S2 pixels in the first feature map are determined based on the set first search parameter, and the first intermediate feature map is generated using only the values of the determined pixels.
In step S2, after the first intermediate feature map has been generated, the first correlation coefficient between the generated first intermediate feature map and the template image is calculated. This calculation of the first correlation coefficient between the first intermediate feature map and the template image can be carried out using prior-art methods and is not elaborated here. The template image used to generate the first correlation coefficient with the first intermediate feature map is preferably a template image having the same number of pixels as the first intermediate feature map, which reduces the amount of calculation when computing the first correlation coefficient. Furthermore, this template image is preferably generated by applying to the registered image the same transformation process as is used to generate the first feature map, which improves the reliability of the calculated first correlation coefficient.
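The text leaves the correlation measure to prior-art methods. One common prior-art choice, shown purely as an illustrative possibility and not as a formula mandated by this disclosure, is the normalized (Pearson) correlation coefficient between the intermediate feature map B and an equally sized template image T:

```latex
r(B, T) = \frac{\sum_{i,j}\left(B_{ij}-\bar{B}\right)\left(T_{ij}-\bar{T}\right)}
               {\sqrt{\sum_{i,j}\left(B_{ij}-\bar{B}\right)^{2}}\,
                \sqrt{\sum_{i,j}\left(T_{ij}-\bar{T}\right)^{2}}}
```

where \(\bar{B}\) and \(\bar{T}\) denote the mean element values. Since \(r\) lies in \([-1, 1]\), the predetermined condition can be implemented as a simple threshold comparison.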
In addition, the above-described processing of generating the first intermediate feature map and the processing of calculating the first correlation coefficient may be executed in parallel. Specifically, after a given element of the matrix representing the first intermediate feature map has been generated, that element is immediately used in the calculation of the first correlation coefficient. This saves time in calculating the first correlation coefficient and improves the efficiency of image matching.
Through the above processing, multiple first correlation coefficients, each corresponding to one of the multiple predetermined parameters (specifically, one for each combination of position offset and direction rotation amount), are calculated in step S2.
In step S3, the value range of the second search parameter is determined based on the multiple first correlation coefficients corresponding to the multiple predetermined parameters.

Specifically, the magnitudes of the first correlation coefficients calculated in step S2 are compared, and the value range of the second search parameter used in the second search process is determined according to these magnitudes. The quantities represented by the values to which the second search parameter is set are the same as for the first search parameter; for example, when the predetermined parameters to which the first search parameter is set represent a position offset and a direction rotation amount, the values to which the second search parameter is set also represent a position offset and a direction rotation amount.
Preferably, in step S3, among the first correlation coefficients corresponding to the multiple predetermined parameters, the predetermined parameter corresponding to the first correlation coefficient with the largest value is identified, and the value range of the second search parameter is determined based on that predetermined parameter.
For example, when, among the multiple first correlation coefficients calculated in step S2, the first correlation coefficient corresponding to the position offset (row 3, column 4) and the direction rotation amount of 30 degrees has the largest value, the value range of the second search parameter is determined according to this predetermined parameter (row 3, column 4, direction rotation amount 30 degrees); specifically, the value range of the second search parameter is defined, for example, as rows 2-4, columns 3-5, and direction rotation amounts of 21-39 degrees.
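The construction of the value range around the best predetermined parameter can be sketched as follows; the margin widths are assumptions chosen only to reproduce the example above.

```python
def second_search_range(best, row_margin=1, col_margin=1, angle_margin=9):
    """Value range of the second search parameter, centred on the
    predetermined parameter whose first correlation coefficient is
    largest. With the default margins, (row 3, column 4, 30 degrees)
    maps to rows 2-4, columns 3-5, and rotations 21-39 degrees."""
    row, col, angle = best
    return (range(row - row_margin, row + row_margin + 1),
            range(col - col_margin, col + col_margin + 1),
            range(angle - angle_margin, angle + angle_margin + 1))
```

Widening the margins (or centring the range between the largest and second-largest hits, as the alternative below describes) trades additional calculation in the second search for matching accuracy.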
Alternatively, the value range of the second search parameter may be determined by other methods. For example, the predetermined parameter corresponding to the first correlation coefficient with the largest value and the predetermined parameter corresponding to the first correlation coefficient with the second largest value are determined, and the value range of the second search parameter is determined based on both of these predetermined parameters. Although this increases the amount of calculation in the second search process, it correspondingly improves the accuracy of image matching.
In step S4, the second search parameter is set in turn to values within the value range of the second search parameter; a second intermediate feature map is generated based on the set second search parameter and the second feature map, and a second correlation coefficient between the generated second intermediate feature map and the template image is calculated.
Specifically, in step S4, after the second search parameter has been set to a certain position offset and a certain direction rotation amount within the value range, a second intermediate feature map is generated based on the set second search parameter and the second feature map, and the second correlation coefficient between the generated second intermediate feature map and the template image is calculated. If the processing of step S5, described later, judges that the calculated second correlation coefficient does not satisfy the predetermined condition, the second search parameter is set to another combination of position offset and direction rotation amount (again within the value range), and the above processing is repeated to calculate a further second correlation coefficient.
In the second search process of step S4, the processing of generating the second intermediate feature map based on the set second search parameter and the second feature map, and the processing of calculating the second correlation coefficient between the generated second intermediate feature map and the template image, are the same as in the first search process of step S2 described above, so a repeated explanation is omitted. As in the first search process, in step S4 pixels in the second feature map are determined based on the set second search parameter, and the second intermediate feature map is generated using only the values of the determined pixels. Also as in the first search process, other methods may be used to generate the second intermediate feature map; for example, a suitable transformation process may be applied in order to improve the calculation accuracy of the second correlation coefficient with the template image.
The template image used to generate the second correlation coefficient with the second intermediate feature map is preferably a template image having the same number of pixels as the second intermediate feature map. This reduces the amount of calculation when computing the second correlation coefficient. Furthermore, this template image is preferably generated by applying to the registered image the same transformation process as is used to generate the second feature map. This improves the reliability of the calculated second correlation coefficient.
In addition, the above-described processing of generating the second intermediate feature map and the processing of calculating the second correlation coefficient may be executed in parallel. Specifically, after a given element of the matrix representing the second intermediate feature map has been generated, that element is immediately used in the calculation of the second correlation coefficient. This saves time in calculating the second correlation coefficient and improves the efficiency of image matching.
In step S5, when a calculated second correlation coefficient satisfies the predetermined condition, it is determined that the input image matches the template image. In step S6, when the second correlation coefficients between the template image and all of the second intermediate feature maps, generated from the second feature map and all of the second search parameters set within the value range of the second search parameter, fail to satisfy the predetermined condition, it is determined that the input image does not match the template image.
Specifically, after the second correlation coefficient has been calculated under a certain second search parameter in step S4, the identifying unit 14 judges whether this second correlation coefficient satisfies the predetermined condition (for example, whether it exceeds a threshold value). If it does, the identifying unit 14 determines that the input image matches the template image, and the image matching process ends. If the identifying unit 14 judges that the second correlation coefficient calculated in step S4 does not satisfy the predetermined condition, then, as described above, the second search parameter is reset within its value range in step S4 and the processing is repeated. When, through this repetition, every value within the value range of the second search parameter has been set as the second search parameter at least once in step S4 and none of the calculated second correlation coefficients satisfies the predetermined condition, the identifying unit 14 can determine that the input image does not match the template image.
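The decision logic of steps S5 and S6 amounts to an early-terminating loop, as sketched below; treating the predetermined condition as exceeding a threshold is an assumption consistent with the example given above.

```python
def decide(second_coefficients, threshold):
    """Steps S5/S6: declare a match as soon as one second correlation
    coefficient satisfies the predetermined condition (assumed here to
    mean exceeding a threshold); declare a non-match only after every
    value in the range has been tried without success."""
    for coeff in second_coefficients:
        if coeff > threshold:
            return "match"          # step S5: early termination
    return "no match"               # step S6: value range exhausted
```

Because the coefficients are consumed lazily, the second correlation coefficients after the first satisfying one need never be computed, which is the source of the time saving noted above.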
According to the image matching method of the embodiment of the present invention, the first search process, which uses the first feature map with a relatively small data volume, determines the range over which the second search process operates; the second search process, which uses the second feature map with a relatively large data volume, then performs calculations only within that determined range. The amount of calculation in the overall image matching process can therefore be reduced, while the accuracy of image matching is kept at the level that would be obtained by computing with the relatively large second feature map over the full range.
Those of ordinary skill in the art will appreciate that the units and steps described in connection with the embodiments of the present invention can be implemented in electronic hardware, in computer software, or in a combination of the two, and that a software module may be placed in any form of computer storage medium. To clearly illustrate this interchangeability of hardware and software, the composition and steps of each example have been described above generally in terms of their functions. Whether these functions are performed in hardware or in software depends on the particular application of the technical solution and on design constraints. Skilled artisans may implement the described functions in different ways for each particular application, but such implementations should not be considered as going beyond the scope of the present invention.
The embodiments of the present invention have been described in detail above. However, those skilled in the art should understand that various modifications, combinations, and sub-combinations can be made to these embodiments without departing from the principles and spirit of the present invention, and that such modifications fall within the scope of the present invention.
Claims (10)
1. An image matching method, comprising:
transforming an input image to generate a first feature map and a second feature map whose data volume is larger than that of the first feature map;
setting a first search parameter to each of a plurality of predetermined parameters, generating a first intermediate feature map based on the set first search parameter and the first feature map, and calculating a first correlation coefficient between the generated first intermediate feature map and a template image, wherein the first correlation coefficient corresponds to the predetermined parameter;
determining a value range of a second search parameter based on the plurality of first correlation coefficients respectively corresponding to the plurality of predetermined parameters;
setting the second search parameter within the value range of the second search parameter, generating a second intermediate feature map based on the set second search parameter and the second feature map, and calculating a second correlation coefficient between the generated second intermediate feature map and the template image;
determining that the input image matches the template image in a case where the calculated second correlation coefficient satisfies a predetermined condition; and
determining that the input image does not match the template image in a case where the second correlation coefficients between the template image and all the second intermediate feature maps, generated based on the second feature map and all the second search parameters set within the value range of the second search parameter, fail to satisfy the predetermined condition.
2. The image matching method according to claim 1, wherein the step of transforming the input image to generate the first feature map and the second feature map whose data volume is larger than that of the first feature map comprises:
filtering the input image to generate a filtered input image; and
transforming the filtered input image to generate the first feature map and the second feature map whose data volume is larger than that of the first feature map.
3. The image matching method according to claim 2, wherein, in the step of transforming the filtered input image to generate the first feature map and the second feature map whose data volume is larger than that of the first feature map,
the filtered input image or the second feature map is transformed with different transformation parameters, so as to generate at least two first feature maps respectively corresponding to the different transformation parameters.
4. The image matching method according to claim 2, wherein the step of determining the value range of the second search parameter based on the plurality of first correlation coefficients respectively corresponding to the plurality of predetermined parameters comprises:
determining, among the plurality of first correlation coefficients respectively corresponding to the plurality of predetermined parameters, the predetermined parameter corresponding to the first correlation coefficient with the largest value; and
determining the value range of the second search parameter based on the predetermined parameter corresponding to the first correlation coefficient with the largest value.
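This range-narrowing step can be written in a few lines. A minimal sketch; the symmetric ±delta window around the best coarse parameter is an assumption for illustration, since the claim does not fix how the range is derived:

```python
import numpy as np

def second_search_range(predetermined_params, first_coeffs, delta=2.0):
    """Return the value range of the second search parameter, centred on the
    predetermined parameter whose first correlation coefficient is largest."""
    best = predetermined_params[int(np.argmax(first_coeffs))]
    return best - delta, best + delta
```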
5. The image matching method according to claim 1, wherein, in the step of generating the first intermediate feature map based on the set first search parameter and the first feature map and calculating the first correlation coefficient between the generated first intermediate feature map and the template image,
pixels in the first feature map are determined based on the set first search parameter, and the first intermediate feature map is generated using only the values of the determined pixels.
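The pixel-selection idea in claim 5 might look like the following. The reading of the first search parameter as a 2-D window offset is an assumption made here for illustration; the claim only requires that the parameter determines which pixels contribute:

```python
import numpy as np

def first_intermediate_map(first_map, search_param, out_shape):
    """Build the first intermediate feature map from only those pixels of the
    first feature map that the search parameter selects (here: a window offset)."""
    dy, dx = search_param
    h, w = out_shape
    # Only pixels inside the shifted window are read; the rest of
    # first_map never contributes to the intermediate map.
    return first_map[dy:dy + h, dx:dx + w]
```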
6. An image processing apparatus, comprising:
a transformation unit configured to transform an input image to generate a first feature map and a second feature map whose data volume is larger than that of the first feature map;
a first search unit configured to set a first search parameter to each of a plurality of predetermined parameters, generate a first intermediate feature map based on the set first search parameter and the first feature map, and calculate a first correlation coefficient between the generated first intermediate feature map and a template image, wherein the first correlation coefficient corresponds to the predetermined parameter;
a second search unit configured to determine a value range of a second search parameter based on the plurality of first correlation coefficients respectively corresponding to the plurality of predetermined parameters, set the second search parameter within the value range of the second search parameter, generate a second intermediate feature map based on the set second search parameter and the second feature map, and calculate a second correlation coefficient between the generated second intermediate feature map and the template image; and
a determination unit configured to determine that the input image matches the template image in a case where the calculated second correlation coefficient satisfies a predetermined condition, and to determine that the input image does not match the template image in a case where the second correlation coefficients between the template image and all the second intermediate feature maps, generated based on the second feature map and all the second search parameters set within the value range of the second search parameter, fail to satisfy the predetermined condition.
7. The image processing apparatus according to claim 6, wherein the transformation unit filters the input image to generate a filtered input image, and transforms the filtered input image to generate the first feature map and the second feature map whose data volume is larger than that of the first feature map.
8. The image processing apparatus according to claim 7, wherein the transformation unit transforms the filtered input image or the second feature map with different transformation parameters, so as to generate at least two first feature maps respectively corresponding to the different transformation parameters.
9. The image processing apparatus according to claim 7, wherein the second search unit determines, among the plurality of first correlation coefficients respectively corresponding to the plurality of predetermined parameters, the predetermined parameter corresponding to the first correlation coefficient with the largest value, and determines the value range of the second search parameter based on the predetermined parameter corresponding to the first correlation coefficient with the largest value.
10. The image processing apparatus according to claim 6, wherein the first search unit determines pixels in the first feature map based on the set first search parameter, and generates the first intermediate feature map using only the values of the determined pixels.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510147484.7A CN106156699B (en) | 2015-03-31 | 2015-03-31 | Image processing apparatus and image matching method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106156699A true CN106156699A (en) | 2016-11-23 |
CN106156699B CN106156699B (en) | 2019-06-25 |
Family
ID=57337240
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510147484.7A Active CN106156699B (en) | 2015-03-31 | 2015-03-31 | Image processing apparatus and image matching method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106156699B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101276411A (en) * | 2008-05-12 | 2008-10-01 | 北京理工大学 | Fingerprint identification method |
CN101515286A (en) * | 2009-04-03 | 2009-08-26 | 东南大学 | Image matching method based on image feature multi-level filtration |
CN102087710A (en) * | 2009-12-03 | 2011-06-08 | 索尼公司 | Learning device and method, recognition device and method, and program |
CN102292745A (en) * | 2009-01-23 | 2011-12-21 | 日本电气株式会社 | image signature extraction device |
CN103714159A (en) * | 2013-12-27 | 2014-04-09 | 中国人民公安大学 | Coarse-to-fine fingerprint identification method fusing second-level and third-level features |
CN104268880A (en) * | 2014-09-29 | 2015-01-07 | 沈阳工业大学 | Depth information obtaining method based on combination of features and region matching |
Non-Patent Citations (1)
Title |
---|
Cao Guo et al.: "快速的多级指纹混合匹配方法" [Fast multi-level hybrid fingerprint matching method], 《模式识别与人工智能》 [Pattern Recognition and Artificial Intelligence] * |
Also Published As
Publication number | Publication date |
---|---|
CN106156699B (en) | 2019-06-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109993296B (en) | Quantitative implementation method and related product | |
Hyvärinen et al. | Independent component analysis | |
Lee et al. | A robust algorithm for the fractal dimension of images and its applications to the classification of natural images and ultrasonic liver images | |
US20080144941A1 (en) | Face recognition apparatus, face recognition method, gabor filter application apparatus, and computer program | |
WO2015195300A1 (en) | Obtaining structural information from images | |
CN105518717B (en) | A kind of face identification method and device | |
CN111091075B (en) | Face recognition method and device, electronic equipment and storage medium | |
CN105718848B (en) | Quality evaluation method and device for fingerprint image | |
WO2013134932A1 (en) | A method and apparatus for improved facial recognition | |
CN111985414B (en) | Joint position determining method and device | |
CN110651273B (en) | Data processing method and equipment | |
CN111025914A (en) | Neural network system remote state estimation method and device based on communication limitation | |
CN113298870B (en) | Object posture tracking method and device, terminal equipment and storage medium | |
CN112862095B (en) | Self-distillation learning method and device based on feature analysis and readable storage medium | |
CN110765843A (en) | Face verification method and device, computer equipment and storage medium | |
CN110288026A (en) | A kind of image partition method and device practised based on metric relation graphics | |
Bi et al. | A robust color edge detection algorithm based on the quaternion Hardy filter | |
CN109448037A (en) | A kind of image quality evaluating method and device | |
CN102289679B (en) | Method for identifying super-resolution of face in fixed visual angle based on related characteristics and nonlinear mapping | |
CN106156699A (en) | Image processing apparatus and image matching method | |
CN110458754B (en) | Image generation method and terminal equipment | |
CN108596959A (en) | A kind of extracting method of video image space-time characteristic point | |
CN113838104B (en) | Registration method based on multispectral and multimodal image consistency enhancement network | |
CN115410249A (en) | Face recognition model training method, recognition method, device, equipment and medium | |
CN112241740B (en) | Feature extraction method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||