CN116468604A - Image stitching method and terminal equipment - Google Patents
- Publication number
- CN116468604A CN116468604A CN202310350683.2A CN202310350683A CN116468604A CN 116468604 A CN116468604 A CN 116468604A CN 202310350683 A CN202310350683 A CN 202310350683A CN 116468604 A CN116468604 A CN 116468604A
- Authority
- CN
- China
- Prior art keywords
- feature
- vector
- matching
- point
- points
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/7715—Feature extraction, e.g. by transforming the feature space, e.g. multi-dimensional scaling [MDS]; Mappings, e.g. subspace methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
Abstract
The application is applicable to the technical field of image stitching, and provides an image stitching method and terminal equipment. The image stitching method includes: acquiring a first vector of a first feature point in a first fingerprint image, wherein the first vector represents the principal component of the feature vector of the first feature point; acquiring a second vector of a second feature point in a second fingerprint image, wherein the second vector represents the principal component of the feature vector of the second feature point; calculating a first matching value between the first feature point and the second feature point according to the first vector and the second vector; determining matching feature points between the first fingerprint image and the second fingerprint image according to the first matching value; and performing image stitching on the first fingerprint image and the second fingerprint image according to the matching feature points to obtain a target fingerprint image. By this method, both the speed and the accuracy of fingerprint matching can be improved.
Description
Technical Field
The application belongs to the technical field of image stitching, and particularly relates to an image stitching method and terminal equipment.
Background
Fingerprint recognition is the most important and most widely applied of the biometric recognition technologies. It authenticates personal identity by using the uniqueness and lifelong invariance of fingerprint features, and has extremely high security and usability. With the rapid development of computer hardware performance, fingerprint identification technology has been applied in various fields.
However, with the development of acquisition technology for portable electronic devices, the area of fingerprint acquisition instruments is becoming smaller and smaller. The overlapping area between different fingerprint images of the same finger correspondingly becomes smaller, which seriously affects the performance of the fingerprint recognition system. In the related art, fingerprint stitching uses only the independent information of minutiae as the matching condition for reference points and ignores the information relating minutiae to their surrounding areas, so the errors of the feature stitching template are large and the fingerprint stitching effect is poor.
Disclosure of Invention
The embodiment of the application provides an image stitching method and terminal equipment, which can improve fingerprint stitching speed and accuracy of fingerprint matching.
In a first aspect, an embodiment of the present application provides an image stitching method, including:
an image stitching method, comprising:
Acquiring a first vector of a first feature point in a first fingerprint image, wherein the first vector represents a main component in a feature vector of the first feature point;
obtaining a second vector of a second feature point in a second fingerprint graph, wherein the second vector represents a main component in a feature vector of the second feature point;
calculating a first matching value between the first feature point and the second feature point according to the first vector and the second vector;
determining matching feature points between the first fingerprint image and the second fingerprint image according to the first matching value;
and performing image stitching processing on the first fingerprint image and the second fingerprint image according to the matching feature points to obtain a target fingerprint image.
In the embodiment of the application, the feature points of the two small-area fingerprint images are respectively obtained; the principal component vector of the first fingerprint image and the principal component vector of the second fingerprint image are respectively obtained by a principal component analysis method; the matching feature points in the two fingerprint images are obtained by a score-matching method; and finally the images are stitched according to the obtained matching feature points. In other words, the feature points in the two fingerprint images are subjected to dimension reduction by the principal component analysis method to obtain low-dimensional feature vectors, and scores are calculated on the obtained low-dimensional feature vectors to obtain the best-matching feature points in the two fingerprint images. With this method, the matching of feature points is faster and more accurate.
In a possible implementation manner of the first aspect, the acquiring a first vector of a first feature point in the first fingerprint image includes:
acquiring a pixel group corresponding to the first feature point, wherein the pixel group comprises a plurality of first pixel points in the first fingerprint image;
extracting characteristic data of each first pixel point in the pixel group;
generating a feature vector of the first feature point according to the feature data of the first pixel point;
and performing dimension reduction processing on the feature vector of the first feature point to obtain the first vector.
In a possible implementation manner of the first aspect, the extracting feature data of each of the first pixel points in the pixel group includes:
for each first pixel point, rotating the first pixel point to a preset direction to obtain the rotated first pixel point, wherein the preset direction represents a direction consistent with the fingerprint ridge line direction;
calculating a first feature and a second feature of the rotated first pixel point, wherein the first feature represents a distance feature along the horizontal direction of the first fingerprint image, and the second feature represents a distance feature along the vertical direction of the first fingerprint image;
And generating feature data of the first pixel point according to the first feature and the second feature of the first pixel point.
In a possible implementation manner of the first aspect, the performing a dimension reduction process on the feature vector of the first feature point to obtain the first vector includes:
obtaining a third vector corresponding to the feature vector of the first feature point, wherein the third vector represents the main component of the feature vector of the first feature point;
the first vector is obtained from the third vector.
In a possible implementation manner of the first aspect, the determining, according to the first matching value, a matching feature point between the first fingerprint map and the second fingerprint map includes:
for the first feature points, determining a plurality of matching groups according to the first matching values, wherein each matching group comprises one first feature point and one second feature point;
respectively calculating a second matching value corresponding to each matching group;
and determining the matching characteristic points from a plurality of matching groups according to the second matching values.
In a possible implementation manner of the first aspect, the determining a plurality of matching groups according to the first matching value includes:
Calculating a ratio between a first value and a second value, wherein the first value and the second value are the two smallest first matching values;
and if the ratio is within a preset range, determining the first feature point and the second feature point corresponding to the smaller of the first value and the second value as one matching group.
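By way of example and not limitation, the ratio check above can be sketched as follows, assuming the preset range is an upper bound on the ratio of the two smallest first matching values (the threshold 0.8 is illustrative; the application does not specify the range):

```python
import numpy as np

def ratio_match(distances, max_ratio=0.8):
    """For one first feature point, take its two smallest first matching
    values d1 <= d2; if d1 / d2 falls within the assumed preset range,
    pair the point with its closest second feature point, otherwise
    reject it. max_ratio is an assumption, not given by the application."""
    order = np.argsort(distances)
    d1, d2 = distances[order[0]], distances[order[1]]
    if d2 == 0 or d1 / d2 <= max_ratio:
        return int(order[0])  # index of the matching second feature point
    return None               # ambiguous match, no matching group formed

print(ratio_match(np.array([0.2, 0.9, 1.4])))  # 0: clear best match
print(ratio_match(np.array([0.8, 0.9, 1.4])))  # None: two near-equal candidates
```

Rejecting near-equal candidates in this way keeps only unambiguous matching groups, which suits small-area fingerprints where many ridge patches look alike.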
In a possible implementation manner of the first aspect, the calculating, respectively, a second matching value corresponding to each matching group includes:
for each of the matching groups, mapping the second feature point into the first fingerprint image to obtain a mapping feature point;
calculating an attitude difference value between a fourth vector and a fifth vector, wherein the fourth vector is generated according to the second feature point and a target feature point, the fifth vector is generated according to the mapping feature point and the target feature point, and the target feature point is a first feature point in the matching group;
and calculating a second matching value corresponding to the matching group according to the attitude difference value.
In a possible implementation manner of the first aspect, the method further includes:
and setting an update strategy, wherein the update strategy preferentially retains feature points with higher components when updating.
In a possible implementation manner of the first aspect, the method further includes:
acquiring a first feature point and a second feature point, wherein the first feature point comprises at least one of the following: end points, fork points and inflection points, the second feature points comprising at least one of: end points, fork points, and inflection points.
In a second aspect, an embodiment of the present application provides a terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the image stitching method according to any one of the first aspects when executing the computer program.
In a third aspect, embodiments of the present application provide a computer program product, which, when run on a terminal device, causes the terminal device to perform the image stitching method according to any one of the first aspects above.
It will be appreciated that the advantages of the second and third aspects may be found in the relevant description of the first aspect, and are not described here again.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required for the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic system flow diagram of an image stitching method according to an embodiment of the present application;
FIG. 2 is a flow chart of determining a first vector according to an embodiment of the present application;
FIG. 3 is a flow chart of determining feature data provided by an embodiment of the present application;
FIG. 4 is a schematic flow chart of determining matching feature points according to an embodiment of the present application;
FIG. 5 is a schematic diagram of image stitching provided by one embodiment;
fig. 6 is a schematic structural diagram of a terminal device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [ a described condition or event ] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [ the described condition or event ]" or "in response to detecting [ the described condition or event ]".
In addition, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise.
With the increasing importance of information security, the identity tokens used in traditional identification methods (such as identity cards, keys or passports) and identification knowledge (such as user names and passwords, personal identification numbers or prompt questions) are easy for others to steal or substitute, and may even be lost or forgotten by the users themselves; such methods carry large potential safety hazards and can hardly meet people's needs. Since biological features such as fingerprints are unique and invariant, fingerprint recognition, as the most mature biometric technology, has been widely used in daily life.
With the development of portable electronic devices and acquisition technology, the sensing area of fingerprint acquisition instruments is becoming smaller and smaller, and the overlapping area between different fingerprint images of the same finger is correspondingly smaller, which seriously affects the performance of fingerprint identification systems. Small-area fingerprint stitching merges two or more fingerprint images, or feature templates, of the same finger. Therefore, small-area fingerprint stitching has become a new research hotspot in fingerprint identification technology.
The traditional fingerprint stitching method uses only the independent information of minutiae as the matching condition for reference points and ignores the information relating minutiae to their surrounding areas, which may cause large errors in the feature stitching template. Moreover, stitching algorithms based on the improved iterative closest point algorithm, when selecting the reference points of the initial transformation, use several related minutiae to solve for corresponding matching minutiae pairs and generate several matching minutiae pairs as reference points, so the amount of calculation is large; and for a fingerprint image with a small area, the number of minutiae is very small, or there are even none, so the stitching effect is poor.
In order to solve the above problems, an embodiment of the present application provides an image stitching method. In this method, the feature points of the two small-area fingerprint images are obtained; the principal component vector of the first fingerprint image and the principal component vector of the second fingerprint image are respectively obtained by a principal component analysis method; the matching feature points in the two fingerprint images are obtained by a score-matching method; and finally the images are stitched according to the obtained matching feature points. In other words, the feature points in the two fingerprint images are subjected to dimension reduction by the principal component analysis method to obtain low-dimensional feature vectors, and scores are calculated on the obtained low-dimensional feature vectors to obtain the best-matching feature points in the two fingerprint images. With this method, the matching of feature points is faster and more accurate.
Referring to fig. 1, a system flow diagram of an image stitching method according to an embodiment of the present application is shown. By way of example, and not limitation, the method includes the steps of:
step S101, a first vector of a first feature point in a first fingerprint image is obtained, where the first vector represents a principal component in a feature vector of the first feature point.
In the embodiment of the application, the method adopted for stitching two small-area fingerprint images is the principal component analysis method, also called the PCA (Principal Component Analysis) descriptor. Principal component analysis is one of the most widely used data dimension-reduction algorithms. The main idea of PCA is to map n-dimensional features onto k dimensions (k smaller than n); the k-dimensional feature, reconstructed on the basis of the original n-dimensional features, is called the principal component, i.e. the first vector mentioned above.
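By way of example and not limitation, the PCA idea described above, mapping n-dimensional features onto k principal components with k < n, can be sketched in a few lines; the function name and data are illustrative and not part of the application:

```python
import numpy as np

def pca_project(X, k):
    """Map the n-dimensional feature rows of X onto their first k
    principal components. A minimal sketch of the dimension-reduction
    idea described above; the application gives no concrete code."""
    X = np.asarray(X, dtype=float)
    Xc = X - X.mean(axis=0)            # center the samples
    cov = np.cov(Xc, rowvar=False)     # n x n covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]  # eigenvalues in descending order
    W = eigvecs[:, order[:k]]          # n x k projection matrix
    return Xc @ W                      # N x k low-dimensional features

# Illustrative data: 100 samples of 6-dimensional features reduced to k = 2
rng = np.random.default_rng(0)
Y = pca_project(rng.normal(size=(100, 6)), 2)
print(Y.shape)  # (100, 2)
```

The eigenvectors belonging to the largest eigenvalues of the covariance matrix capture the directions of greatest variance, which is why projecting onto them preserves the most informative part of the features.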
In this embodiment of the present application, for the two small-area fingerprint images to be stitched, the principal component vectors of the feature points in the two fingerprint images are first obtained by principal component analysis. For convenience of analysis, the two fingerprint images may be defined as a first fingerprint image and a second fingerprint image respectively; all feature points in the first fingerprint image are called first feature points, and the principal component vector of each first feature point, namely the first vector, is obtained by the principal component analysis method. The second feature points and second vectors of the second fingerprint image are defined analogously.
In an application embodiment, referring to fig. 2, a flowchart of determining a first vector according to an embodiment of the present application is shown in fig. 2, and one implementation of step S101 includes:
step S201, obtaining a pixel group corresponding to the first feature point, where the pixel group includes a plurality of first pixel points in the first fingerprint.
In the embodiment of the present application, the PCA descriptor (first vector) of each feature point needs to be obtained by using the principal component analysis method. The generation of the PCA descriptor mainly comprises the steps of training a feature point sample, generating a PCA projection matrix, reducing the dimension of the PCA, and generating a low-dimension PCA descriptor.
In the embodiment of the application, feature sample training takes a feature point as the center and selects a matrix of suitable dimension to form a matrix group; the matrix group comprises a plurality of pixel points, and analyzing these pixel points yields the low-dimensional PCA descriptor. Therefore, before obtaining the PCA descriptor of each feature point, the corresponding pixel group is first obtained for each feature point.
Taking the first feature points in the first fingerprint image as an example, a matrix group of size 17×17 is taken centred on each first feature point (the size can be set empirically according to the size of the fingerprint image). The matrix group includes a plurality of first pixel points, so the matrix group containing the first pixel points is called the pixel group corresponding to the first feature point.
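By way of example and not limitation, obtaining the 17×17 pixel group around a feature point might look as follows; `pixel_group` is a hypothetical helper name, and skipping border points is an assumed way to handle patches that would leave the image:

```python
import numpy as np

def pixel_group(image, point, size=17):
    """Return the size x size block of pixels centred on a feature point,
    or None when the block would leave the image. The helper name and the
    border-skipping policy are assumptions, not from the application."""
    half = size // 2
    r, c = point
    if (r - half < 0 or c - half < 0
            or r + half >= image.shape[0] or c + half >= image.shape[1]):
        return None
    return image[r - half:r + half + 1, c - half:c + half + 1]

img = np.arange(100 * 100).reshape(100, 100)
print(pixel_group(img, (50, 50)).shape)  # (17, 17)
print(pixel_group(img, (2, 2)))          # None: too close to the border
```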
Step S202, extracting feature data of each first pixel point in the pixel group.
In this embodiment of the present application, after the pixel group of each first feature point is obtained, feature analysis needs to be performed on each pixel in the pixel group to obtain feature data. An image is made up of pixels, each of which has a definite position and an assigned color value. Thus, various information about the image, such as resolution and pixel values, can be obtained through feature extraction on the pixels.
In an application embodiment, referring to fig. 3, a flowchart of determining feature data according to an embodiment of the present application is shown in fig. 3, and one implementation of step S202 includes:
step S301, for each first pixel point, rotating the first pixel point to a preset direction, to obtain the rotated first pixel point, where the preset direction indicates a direction consistent with a fingerprint ridge line direction.
In the embodiment of the application, since the pixel points carry directional information, the first pixel points in the pixel group need to be calibrated in direction before feature analysis, so that they can be placed in the same coordinate system for tasks such as feature extraction. When the pixel-group matrix is obtained, the matrix group needs to contain a ridge line of the fingerprint, the ridge line being a raised line of the finger. When the first pixel points are aligned, all of them are rotationally aligned along the ridge line. Because the ridge line direction is generally the main direction, the pixels are generally aligned to the main direction.
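By way of example and not limitation, rotational alignment to the main (ridge) direction can be sketched with a pure-NumPy nearest-neighbour rotation; the function and the way the ridge angle `theta` is supplied are assumptions, since the application does not detail its rotation procedure:

```python
import numpy as np

def align_to_ridge(patch, theta):
    """Rotate a square patch by theta (radians) using nearest-neighbour
    sampling, so that a ridge at angle theta becomes horizontal. A pure
    NumPy sketch; how theta (the ridge direction) is estimated is an
    assumption outside this snippet."""
    n = patch.shape[0]
    c = (n - 1) / 2.0
    ys, xs = np.mgrid[0:n, 0:n]
    cos_t, sin_t = np.cos(theta), np.sin(theta)
    # inverse-rotate each output coordinate back into the source patch
    xr = cos_t * (xs - c) - sin_t * (ys - c) + c
    yr = sin_t * (xs - c) + cos_t * (ys - c) + c
    xi = np.clip(np.round(xr).astype(int), 0, n - 1)
    yi = np.clip(np.round(yr).astype(int), 0, n - 1)
    return patch[yi, xi]

patch = np.zeros((17, 17))
patch[:, 8] = 1.0                       # a vertical "ridge" through the centre
rot = align_to_ridge(patch, np.pi / 2)  # quarter turn
print(rot[8].sum())  # 17.0: the ridge now lies along a horizontal row
```

Once all patches are rotated so their ridge lines share one direction, descriptors computed from them become comparable across feature points.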
Step S302, calculating a first feature and a second feature of the rotated first pixel point, where the first feature represents a distance feature along a horizontal direction of the first fingerprint image, and the second feature represents a distance feature along a vertical direction of the first fingerprint image.
In the embodiment of the present application, the features of the calibrated pixel points need to be acquired after all the first pixel points have been rotated to the main direction. Illustratively, a 15×15 matrix is selected from the 17×17 pixel matrix group obtained above, removing as far as possible the potentially unreliable pixel points or feature points at the edges of the matrix group. The distance features of all pixel points in the selected 15×15 matrix are then calculated. Because a pixel point is essentially a position in the image, in a two-dimensional image it represents a position in the horizontal and vertical directions. It is therefore necessary to acquire the distance feature of each calibrated first pixel point in the horizontal direction, i.e. the first feature, and in the vertical direction, i.e. the second feature.
Step S303, generating feature data of the first pixel according to the first feature and the second feature of the first pixel.
In this embodiment, the foregoing distance features may be expressed as the partial derivatives of the first pixel points in the horizontal and vertical directions; illustratively, taking the partial derivatives of each pixel in the 15×15 matrix in the horizontal and vertical directions yields a 15×15×2 = 450-dimensional vector.
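By way of example and not limitation, the 450-dimensional vector described above can be formed by stacking the horizontal and vertical partial derivatives of the inner 15×15 patch; `np.gradient` is an assumed choice of derivative operator, which the application does not specify:

```python
import numpy as np

def patch_descriptor(patch15):
    """Stack the horizontal and vertical partial derivatives of a 15 x 15
    patch into one 15*15*2 = 450-dimensional vector, as described above.
    np.gradient is an assumed derivative operator."""
    dy, dx = np.gradient(np.asarray(patch15, dtype=float))  # vertical, horizontal
    return np.concatenate([dx.ravel(), dy.ravel()])         # 450-dim vector

rng = np.random.default_rng(1)
patch17 = rng.random((17, 17))
inner = patch17[1:16, 1:16]       # keep the central 15 x 15, drop the border
vec = patch_descriptor(inner)
print(vec.shape)  # (450,)
```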
Step S203, generating a feature vector of the first feature point according to the feature data of the first pixel point.
In the embodiment of the present application, the 15×15×2 = 450-dimensional vector obtained above is normalized. Normalization processes the data into the required range; it is done first for the convenience of later data processing, and its specific role is to unify the statistical distribution of the samples. Assuming that the number of feature points in the first fingerprint image is N, each 450-dimensional vector is normalized, and the 450-dimensional vectors of the N feature points are stored to obtain an N×450 matrix, namely the feature vectors of the first feature points.
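By way of example and not limitation, one common normalization choice (unit length per 450-dimensional descriptor; the application does not state which normalization is used) can be sketched as:

```python
import numpy as np

def normalize_rows(V):
    """Scale each descriptor row of V to unit length; an assumed
    normalization, since the application does not specify one."""
    norms = np.linalg.norm(V, axis=1, keepdims=True)
    norms[norms == 0] = 1.0           # leave all-zero rows unchanged
    return V / norms

# Illustrative N = 8 feature points, each a 450-dimensional vector
V = np.random.default_rng(2).normal(size=(8, 450))
Vn = normalize_rows(V)
print(Vn.shape)                                            # (8, 450)
print(bool(np.allclose(np.linalg.norm(Vn, axis=1), 1.0)))  # True
```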
And step S204, performing dimension reduction processing on the feature vector of the first feature point to obtain the first vector.
In the embodiment of the application, dimension reduction refers to reducing the dimensionality of data features by retaining some important features and removing some redundant features, representing the original high-dimensional features by a relatively low-dimensional vector. There are many methods for dimension reduction, for example linear discriminant analysis, principal component analysis and nonlinear dimension reduction. Here, the feature vector of the first feature point is reduced in dimension by the principal component analysis method.
In one embodiment, one implementation of step S204 includes:
obtaining a third vector corresponding to the feature vector of the first feature point, wherein the third vector represents the main component of the feature vector of the first feature point;
the first vector is obtained from the third vector.
In the embodiment of the present application, the third vector is generated by taking the eigenvectors corresponding to the first k largest eigenvalues of the N×450 matrix. For an n-order square matrix A, if there exist a number m and a non-zero n-dimensional column vector x such that Ax = mx, then m is called an eigenvalue of A, and x is called an eigenvector of A corresponding to the eigenvalue m. The matrix formed by the eigenvectors corresponding to the first k largest eigenvalues of the N×450 matrix reduces the dimensionality of the data features while retaining the important features of the N×450 matrix.
The eigenvectors corresponding to the first k largest eigenvalues form a 450×k projection matrix, and the product [N×450]×[450×k] = N×k reduces the feature vector of each first feature point to k dimensions; the resulting k-dimensional vector is the PCA descriptor of the first feature point.
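By way of example and not limitation, the projection step above, [N×450]×[450×k] = N×k, can be sketched as follows, with illustrative values of N and k:

```python
import numpy as np

# Illustrative values; the application fixes the descriptor length at 450
# but leaves N (number of feature points) and k (reduced dimension) open.
rng = np.random.default_rng(3)
N, d, k = 40, 450, 20
F = rng.normal(size=(N, d))                    # N x 450 feature vectors
Fc = F - F.mean(axis=0)
cov = Fc.T @ Fc / (N - 1)                      # 450 x 450 covariance
eigvals, eigvecs = np.linalg.eigh(cov)         # eigenvalues in ascending order
W = eigvecs[:, np.argsort(eigvals)[::-1][:k]]  # 450 x k projection matrix
descriptors = Fc @ W                           # [N x 450] x [450 x k] = N x k
print(descriptors.shape)  # (40, 20)
```

Each row of `descriptors` is then the k-dimensional PCA descriptor of one feature point, ready for the matching-value computation of step S103.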
Step S102, a second vector of a second feature point in the second fingerprint graph is obtained, wherein the second vector represents a principal component in a feature vector of the second feature point.
The method for acquiring the second vector corresponding to the second feature point in the second fingerprint image is the same as the method for acquiring the first vector of the first feature point in the first fingerprint image, and will not be described herein.
Through the method, each first feature point of the first fingerprint image is subjected to dimension reduction to obtain a low-dimensional first vector, and the second vectors of the second feature points are obtained in the same way. The descriptors of the first feature points and the second feature points are obtained through this PCA dimension reduction, and matching the corresponding feature points with these descriptors is faster and more accurate.
Step S103, calculating a first matching value between the first feature point and the second feature point according to the first vector and the second vector.
In the embodiment of the application, when the PCA descriptors are used for determining the matching reference points, only the Euclidean distance between the first feature point and the second feature point, namely the first matching value, needs to be calculated, and the Euclidean distance is used for determining the matching reference points. The Euclidean distance is the most common distance measure; it measures the absolute distance between two points in a multidimensional space.
The Euclidean distance is a commonly used distance definition: the true distance between two points in m-dimensional space. For example, assuming that the coordinates of two points are t1(x1, y1) and t2(x2, y2), the Euclidean distance between them is d = √((x1 − x2)² + (y1 − y2)²).
It can be understood that other distance metrics between the first feature point and the second feature point, such as the Mahalanobis distance or cosine similarity, may also be used as the first matching value.
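A minimal sketch of computing the first matching values, assuming the descriptors are stored as NumPy arrays; the Euclidean distance is used here, as in the text, though the function and variable names are illustrative:

```python
import numpy as np

def first_matching_values(desc1, desc2):
    """Pairwise Euclidean distances between PCA descriptors.

    desc1: N1 x k descriptors of the first fingerprint's feature points,
    desc2: N2 x k descriptors of the second fingerprint's feature points.
    Returns an N1 x N2 matrix whose entry (i, j) is the first matching
    value between first feature point i and second feature point j.
    """
    diff = desc1[:, None, :] - desc2[None, :, :]   # broadcast to N1 x N2 x k
    return np.sqrt((diff ** 2).sum(axis=-1))       # N1 x N2 distance matrix
```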
In an application embodiment, referring to fig. 4, a flowchart of determining matching feature points according to an embodiment of the present application is shown in fig. 4, and an implementation manner of step S103 includes:
s401, for the first feature points, determining a plurality of matching groups according to the first matching values, wherein each matching group comprises one first feature point and one second feature point.
In the embodiment of the application, the Euclidean distance between each first feature point in the first fingerprint image and each second feature point in the second fingerprint image is calculated, and the matching groups are determined according to these distances. For example, let the first fingerprint be F with first feature points p1, p2 and p3, and the second fingerprint be T with second feature points q1, q2 and q3. The distances between p1 and q1, q2, q3 are p1q1, p1q2 and p1q3, respectively; the distances between p2 and q1, q2, q3 are p2q1, p2q2 and p2q3, respectively; and so on. The distance between two feature points is a first matching value, but each first feature point has multiple candidate matching point pairs. In this example, p1 has the three candidate pairs p1q1, p1q2 and p1q3, and the pair with the highest matching degree among them must be determined as the matching group.
In one embodiment, one implementation of step S401 includes:
and calculating the ratio between a first value and a second value, wherein the first value and the second value are the two smallest first matching values.
In this embodiment of the present application, the Euclidean distances between each first feature point and the second feature points are sorted by size, the two smallest distances are selected, and the ratio between them is calculated. For example, continuing the example above, the distances between the first feature point p1 and the second feature points are p1q1, p1q2 and p1q3; the three distances are sorted, and assuming p1q1 < p1q2 < p1q3, the smallest two, p1q1 and p1q2, are selected and the ratio t = p1q1/p1q2 is calculated.
And if the ratio is within the preset range, determining the first feature point and the second feature point corresponding to the minimum of the first value and the second value as one matching group.
In the embodiments of the present application, a preset range is set for the ratio. If the ratio between the minimum distance and the second minimum distance among the Euclidean distances is taken, the preset range may be 0 < t < 0.8 (obtained from practical experience): if t = p1q1/p1q2 falls within this range, the matching point pair p1q1 corresponding to the smallest of p1q1 and p1q2 is determined as the reference matching point pair, i.e., the matching group. In this case the minimum distance is the first value and the second minimum distance is the second value. In another example, if the second minimum distance is compared with the minimum distance, the preset range can be set to t > 1, i.e., if t = p1q2/p1q1 falls within this range, the matching point pair p1q1 is likewise determined as the reference matching point pair, i.e., the matching group. In this case the second minimum distance is the first value and the minimum distance is the second value.
The logic of determining the matching group according to the ratio of the two smallest Euclidean distances is as follows: if two feature points are matched correctly, their feature descriptors are similar and their Euclidean distance is far smaller than that of incorrectly matched feature points; therefore the reliability of a matching point can be judged from the ratio of the minimum descriptor distance to the second minimum distance.
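The ratio test above can be sketched as follows; the 0.8 bound comes from the text, while everything else (the names and the distance-matrix input) is an assumption:

```python
import numpy as np

def select_matching_groups(dist, t_max=0.8):
    """Ratio test on a distance matrix, as described above.

    For each first feature point (row of `dist`), take its two smallest
    distances d1 <= d2; if d1 / d2 < t_max, the closest second feature
    point is accepted as that row's matching group. The 0.8 bound is the
    empirically chosen preset range from the text.
    """
    groups = []
    for i, row in enumerate(dist):
        order = np.argsort(row)
        d1, d2 = row[order[0]], row[order[1]]
        if d2 > 0 and d1 / d2 < t_max:
            groups.append((i, int(order[0])))  # (first point index, second point index)
    return groups
```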
By the method, the second feature points matched with each feature point in the first fingerprint image can be quickly obtained according to the PCA descriptor.
S402, respectively calculating a second matching value corresponding to each matching group.
In the embodiment of the application, a plurality of matching groups are determined through the first matching values; the final feature matching points must then be determined from these matching groups, and image stitching is performed according to the final feature matching point pairs. The final feature matching points are determined from the matching groups by score matching, that is, by calculating a second matching value. The first matching value is calculated with the Euclidean distance, while the second matching value is scored according to the coordinate differences and other pose features. Finally, the scores of the matching groups are compared, and the matching group with the highest score is determined as the final feature matching point pair.
In one embodiment, one implementation of step S402 includes:
for each of the matching groups, mapping the second feature points into the first fingerprint map to obtain mapping feature points.
In this embodiment of the present application, to calculate the second matching score, an affine matrix is established for each matching group according to the first value and the second value, and is used to map all second feature points in the second fingerprint map into the first fingerprint map. Affine transformation, also called affine mapping, is a linear transformation from one vector space to another in geometry; it realizes a linear transformation from two-dimensional coordinates to two-dimensional coordinates while maintaining the straightness and parallelism of the two-dimensional graph. All the feature points in the second fingerprint are mapped into the first fingerprint according to the affine matrix to obtain the mapped second feature points, and the second score is calculated according to the difference between the feature points before and after the affine transformation.
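A sketch of the mapping step, assuming the affine matrix is given as a 2×3 array [A | t] so each point p maps to A·p + t; the names are illustrative:

```python
import numpy as np

def map_points(points, affine):
    """Map second-fingerprint feature points with a 2x3 affine matrix.

    `points` is an M x 2 array of (x, y) coordinates; `affine` is the
    2 x 3 matrix [A | t]. Each point is converted to homogeneous
    coordinates and multiplied through, yielding M x 2 mapped points.
    """
    pts = np.asarray(points, dtype=float)
    ones = np.ones((pts.shape[0], 1))
    homo = np.hstack([pts, ones])          # M x 3 homogeneous coordinates
    return homo @ np.asarray(affine).T     # M x 2 mapped points
```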
And calculating an attitude difference value between a fourth vector and a fifth vector, wherein the fourth vector is generated according to the second characteristic point and a target characteristic point, the fifth vector is generated according to the mapping characteristic point and the target characteristic point, and the target characteristic point is a first characteristic point in the matching group.
In the embodiment of the present application, the attitude difference refers to the translation distance of the coordinates before and after the affine transformation, the rotation angle, the coordinate distance of the mapped point, and the like. Before the affine transformation, a vector is first formed for each matching group from its first feature point and its second feature point. Exemplarily, the matching group p1q1 corresponding to the first feature point p1 forms the vector from p1 to q1, the matching group p2q2 corresponding to the first feature point p2 forms the vector from p2 to q2, and so on; these vectors before mapping are the fourth vectors. After the affine transformation maps the second feature points q1 and q2 to q1' and q2', the vectors from p1 to q1' and from p2 to q2' are the corresponding fifth vectors.

The offset distance, coordinate distance and rotation angle between the matching pair p1q1 and the mapped pair p1q1' are then calculated respectively. The coordinate offset distance is calculated from the difference between the coordinates of the matching points. For example, if the coordinates of the first feature point p1 are (x2, y2), the coordinates of the second feature point q1 before mapping are (x3, y3), and the coordinates of the mapped point q1' are (x3', y3'), then the translation distances between the two coordinates are d1 = x3' − x3 and d2 = y3' − y3. The rotation angle is obtained from the fourth and fifth vectors: the vector from p1 to q1 is v1 = (x3 − x2, y3 − y2), the vector from p1 to q1' is v2 = (x3' − x2, y3' − y2), and the angle θ between the two vectors satisfies cos θ = (v1 · v2)/(|v1||v2|). The coordinate distance of the matching pair before and after mapping is calculated as √((x3' − x3)² + (y3' − y3)²). The affine matrix takes the standard rotation-plus-translation form [[cos θ, −sin θ, tx], [sin θ, cos θ, ty]].
and calculating a second matching value corresponding to the matching group according to the attitude difference value.
According to the above, the translation distance, rotation angle and coordinate distance before and after mapping can be calculated for each matching pair of first and second feature points, and a score can be given to each matching group accordingly. The scoring basis is as follows: the smaller the three factor values, the higher the score of the matching point pair.
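The attitude differences and the resulting score can be sketched as below; the way the three factors are combined into a single score (the weights and the reciprocal form) is an assumption, since the text only states that smaller differences should score higher:

```python
import numpy as np

def second_matching_value(p, q, q_mapped, w=(1.0, 1.0, 1.0)):
    """Score one matching group from the three attitude differences.

    p: first feature point, q: second feature point before mapping,
    q_mapped: the same point after affine mapping. The translation
    distance, the rotation angle between the vectors p->q and
    p->q_mapped, and the coordinate distance are combined so that
    smaller differences give a higher score. The weights `w` and the
    reciprocal combination are illustrative assumptions.
    """
    p, q, qm = (np.asarray(v, dtype=float) for v in (p, q, q_mapped))
    translation = np.abs(qm - q).sum()                       # |d1| + |d2|
    v1, v2 = q - p, qm - p
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    angle = np.arccos(np.clip(cos, -1.0, 1.0))               # rotation angle
    coord_dist = np.linalg.norm(qm - q)                      # Euclidean offset
    return 1.0 / (1.0 + w[0] * translation + w[1] * angle + w[2] * coord_dist)
```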
By the method, the second matching value corresponding to each matching group can be calculated.
And step S104, determining matching characteristic points between the first fingerprint image and the second fingerprint image according to the first matching value.
In the embodiment of the application, the score of each matching group is obtained through the above method, and the matching point pair with the highest score is determined as the matching feature points. Illustratively, if the scores of the matching groups p1q1, p2q2 and p3q2 are 90, 60 and 40, respectively, then p1q1 is determined as the final feature matching point pair.
And step S105, performing image stitching processing on the first fingerprint image and the second fingerprint image according to the matching feature points to obtain a target fingerprint image.
In this embodiment of the present application, referring to fig. 5, a schematic diagram of image stitching provided in an embodiment of the present application, graph a represents the first fingerprint graph, graph b represents the second fingerprint graph, and graph c is the target fingerprint graph, i.e., the stitched fingerprint image obtained by rotating and translating according to the obtained matching feature points.
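A naive composition sketch of the final stitching step, under the assumption of single-channel images and a known 2×3 rotation-plus-translation matrix; a real implementation would use inverse warping with interpolation rather than this per-pixel forward loop:

```python
import numpy as np

def stitch(img1, img2, affine):
    """Warp img2 into img1's frame and overlay it onto a copy of img1.

    `affine` is the 2x3 matrix mapping img2 coordinates into img1
    coordinates (derived from the matching feature points). Pixels are
    mapped forward one by one for clarity, not efficiency; points that
    fall outside img1 are discarded, and only empty (zero) canvas cells
    receive img2's values.
    """
    h, w = img1.shape
    canvas = img1.astype(float).copy()
    A = np.asarray(affine)
    for y in range(img2.shape[0]):
        for x in range(img2.shape[1]):
            xm, ym = A @ np.array([x, y, 1.0])   # forward-map the pixel
            xi, yi = int(round(xm)), int(round(ym))
            if 0 <= yi < h and 0 <= xi < w and canvas[yi, xi] == 0:
                canvas[yi, xi] = img2[y, x]
    return canvas
```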
In one embodiment, the method further comprises:
and setting an updating strategy, wherein the updating strategy retains the feature points of higher quality when updating.
In the embodiment of the application, in the process of fingerprint stitching, an upper limit may be set on the number of feature points, which cannot grow indefinitely; therefore an updating strategy is needed during stitching, keeping good feature points and discarding poor ones. A 3-dimensional flag array flag[3] is set for each feature point, where flag[0] indicates that the feature point is not in the overlapping region of the two fingerprints, flag[1] indicates that the feature point is in the overlapping region but is not a matching point, and flag[2] indicates that the feature point is in the overlapping region and is a matching point. Each time a feature point meets one of these conditions, the corresponding flag is increased by one, but no flag exceeds seven. If flag[1] and flag[2] together total less than 4, the feature point is removed; the other feature points are retained.
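The update strategy can be sketched as below; the flag semantics follow the text, while the removal criterion (reading "flag 1-flag 2 less than 4" as flag[1] + flag[2] < 4), the cap handling, and all names are assumptions:

```python
def increment_flag(point, idx, cap=7):
    """Increase flag[idx] of one feature point by one, capped at seven."""
    point["flag"][idx] = min(point["flag"][idx] + 1, cap)

def prune_feature_points(points, max_points, min_quality=4):
    """Enforce the upper limit on the number of feature points.

    When the limit is exceeded, feature points whose overlap-related
    flags flag[1] + flag[2] total below `min_quality` are removed; the
    remaining points are kept up to `max_points`.
    """
    if len(points) <= max_points:
        return points
    kept = [p for p in points if p["flag"][1] + p["flag"][2] >= min_quality]
    return kept[:max_points]
```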
In one embodiment, the method further comprises:
acquiring a first feature point and a second feature point, wherein the first feature point comprises at least one of the following: end points, fork points and inflection points, the second feature points comprising at least one of: end points, fork points, and inflection points.
In the embodiment of the application, when feature points of a fingerprint image are extracted, minutiae (end points and cross points) are generally extracted. However, since the collected fingerprint image has a small area, the number of minutiae is also small, and some fingerprint images may have no minutiae at all; therefore inflection points are additionally extracted as feature points. An inflection point is the point of greatest curvature on a fingerprint ridge.
The application provides an image stitching method, which obtains the PCA descriptor of each feature point in two fingerprint images and then matches the corresponding feature points using these descriptors. Obtaining the matching feature point pairs of the two fingerprint images by PCA descriptor comparison reduces the amount of calculation, and adding inflection points allows the method to adapt to the stitching of smaller fingerprint images, making the stitching more accurate. Feature point updating is added in the stitching process, so that the whole fingerprint identification system is more robust.
For example, assume that there are three first feature points p1, p2 and p3 in the first fingerprint to be stitched, and three second feature points q1, q2 and q3 in the second fingerprint. According to the method, the Euclidean distance, i.e. the first matching value, between each first feature point and each second feature point is calculated, giving p1q1, p1q2, p1q3, p2q1, p2q2, p2q3, p3q1, p3q2 and p3q3, and the two smallest distances for each first feature point are selected and their ratio computed. If the ratio is within the preset range, the second feature point corresponding to the minimum distance is taken as the matching point of that first feature point. Assuming p1q1, p1q2 and p1q3 correspond to the first feature point p1, if p1q1 and p1q2 are the smallest and second smallest distances and the ratio between them is within the preset range, p1q1 is taken as a matching group. The matching point corresponding to each first feature point is thus determined through the first matching values; assuming the resulting matching groups are p1q1, p2q2 and p3q1, the final feature matching point pair is then determined from these three matching groups by calculating the second matching values. If p1q1 has the highest second matching value, p1q1 is the matching feature point pair, and the image stitching of the first fingerprint image and the second fingerprint image is performed according to p1q1.
Therefore, the method obtains the matching feature point pairs of the two fingerprint images by PCA descriptor comparison, which reduces the amount of calculation; the added inflection points allow the method to adapt to the stitching of smaller fingerprint images, making the stitching more accurate; and the feature point updating added in the stitching process makes the whole fingerprint identification system more robust.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not limit the implementation process of the embodiments of the present application in any way.
Fig. 6 is a schematic structural diagram of a terminal device provided in an embodiment of the present application. As shown in fig. 6, the terminal device 6 of this embodiment includes: at least one processor 60 (only one shown in fig. 6), a memory 61 and a computer program 62 stored in the memory 61 and executable on the at least one processor 60, the processor 60 implementing the steps in any of the various image stitching method embodiments described above when executing the computer program 62.
The terminal device can be a computing device such as a desktop computer, a notebook computer, a palm computer, or a cloud server. The terminal device may include, but is not limited to, a processor and a memory. It will be appreciated by those skilled in the art that fig. 6 is merely an example of the terminal device 6 and does not constitute a limitation of the terminal device 6, which may include more or fewer components than shown, combine certain components, or have different components; for example, it may also include input-output devices, network access devices, etc.
The processor 60 may be a central processing unit (Central Processing Unit, CPU); it may also be another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 61 may in some embodiments be an internal storage unit of the terminal device 6, such as a hard disk or memory of the terminal device 6. In other embodiments the memory 61 may also be an external storage device of the terminal device 6, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the terminal device 6. Further, the memory 61 may include both an internal storage unit and an external storage device of the terminal device 6. The memory 61 is used for storing an operating system, application programs, a boot loader, data, and other programs, such as the program code of the computer program; it may also be used to temporarily store data that has been output or is to be output.
Embodiments of the present application also provide a computer readable storage medium storing a computer program, which when executed by a processor, may implement the steps in the above-described method embodiments.
The present embodiments also provide a computer program product which, when run on a terminal device, causes the terminal device to perform the steps of the above method embodiments.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application implements all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program, which may be stored in a computer readable storage medium; when executed by a processor, the computer program may implement the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, or some intermediate form. The computer readable medium may include at least: any entity or device capable of carrying the computer program code to an apparatus/terminal device, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), electrical carrier signals, telecommunications signals, and software distribution media, such as a U-disk, removable hard disk, magnetic disk, or optical disk. In some jurisdictions, computer readable media may not include electrical carrier signals and telecommunications signals, in accordance with legislation and patent practice.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts not described or detailed in a particular embodiment, reference may be made to the related descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.
Claims (10)
1. An image stitching method, comprising:
acquiring a first vector of a first feature point in a first fingerprint image, wherein the first vector represents a main component in a feature vector of the first feature point;
Obtaining a second vector of a second feature point in a second fingerprint graph, wherein the second vector represents a principal component in a feature vector of the second feature point;
calculating a first matching value between the first feature point and the second feature point according to the first vector and the second vector;
determining matching feature points between the first fingerprint image and the second fingerprint image according to the first matching value;
and performing image stitching processing on the first fingerprint image and the second fingerprint image according to the matching feature points to obtain a target fingerprint image.
2. The image stitching method of claim 1 wherein the obtaining a first vector of first feature points in a first fingerprint comprises:
acquiring a pixel group corresponding to the first feature point, wherein the pixel group comprises a plurality of first pixel points in the first fingerprint image;
extracting characteristic data of each first pixel point in the pixel group;
generating a feature vector of the first feature point according to the feature data of the first pixel point;
And performing dimension reduction processing on the feature vector of the first feature point to obtain the first vector.
3. The image stitching method according to claim 2, wherein said extracting feature data of each of the first pixel points in the pixel group includes:
For each first pixel point, rotating the first pixel point to a preset direction to obtain the rotated first pixel point, wherein the preset direction represents a direction consistent with the fingerprint ridge line direction;
calculating a first feature and a second feature of the rotated first pixel point, wherein the first feature represents a distance feature along the horizontal direction of the first fingerprint image, and the second feature represents a distance feature along the vertical direction of the first fingerprint image;
and generating feature data of the first pixel point according to the first feature and the second feature of the first pixel point.
4. The image stitching method according to claim 2, wherein the performing a dimension reduction process on the feature vector of the first feature point to obtain the first vector includes:
obtaining a third vector corresponding to the feature vector of the first feature point, wherein the third vector represents the main component of the feature vector of the first feature point;
the first vector is obtained from the third vector.
5. The image stitching method according to claim 1, wherein the determining the matching feature points between the first fingerprint image and the second fingerprint image according to the first matching value includes:
For the first feature points, determining a plurality of matching groups according to the first matching values, wherein each matching group comprises one first feature point and one second feature point;
respectively calculating a second matching value corresponding to each matching group;
and determining the matching characteristic points from a plurality of matching groups according to the second matching values.
6. The image stitching method according to claim 5, wherein said determining a plurality of matching groups based on said first matching value comprises:
calculating a ratio between a first value and a second value, wherein the first value and the second value are the two smallest first matching values;
and if the ratio is within the preset range, determining the first feature point and the second feature point corresponding to the minimum of the first value and the second value as one matching group.
7. The image stitching method according to claim 5, wherein said calculating a second match value for each of said matching groups, respectively, comprises:
for each of the matching groups, mapping the second feature points into the first fingerprint map to obtain mapping feature points;
Calculating an attitude difference value between a fourth vector and a fifth vector, wherein the fourth vector is generated according to the second feature point and a target feature point, the fifth vector is generated according to the mapping feature point and the target feature point, and the target feature point is a first feature point in the matching group;
and calculating a second matching value corresponding to the matching group according to the attitude difference value.
8. The image stitching method of claim 1, the method further comprising:
and setting an updating strategy, wherein the updating strategy retains the feature points of higher quality when updating.
9. The image stitching method of claim 1, the method further comprising:
acquiring a first feature point and a second feature point, wherein the first feature point comprises at least one of the following: end points, fork points and inflection points, the second feature points comprising at least one of: end points, fork points, and inflection points.
10. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 9 when executing the computer program.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310350683.2A CN116468604A (en) | 2023-03-28 | 2023-03-28 | Image stitching method and terminal equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310350683.2A CN116468604A (en) | 2023-03-28 | 2023-03-28 | Image stitching method and terminal equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116468604A true CN116468604A (en) | 2023-07-21 |
Family
ID=87178214
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310350683.2A Pending CN116468604A (en) | 2023-03-28 | 2023-03-28 | Image stitching method and terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116468604A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||