CN116363401A - Image matching method and device, electronic equipment and chip - Google Patents

Image matching method and device, electronic equipment and chip

Info

Publication number
CN116363401A
CN116363401A
Authority
CN
China
Prior art keywords
matched
image
templates
matching
template
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310357208.8A
Other languages
Chinese (zh)
Inventor
陈世林 (Chen Shilin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chipone Technology Beijing Co Ltd
Original Assignee
Chipone Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chipone Technology Beijing Co Ltd filed Critical Chipone Technology Beijing Co Ltd
Priority to CN202310357208.8A
Publication of CN116363401A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774 Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The disclosure relates to an image matching method and device, electronic equipment and a chip. The image matching method comprises: obtaining an image to be matched; matching the image to be matched against a plurality of templates; when the image to be matched matches exactly one of the templates and fails to match the others, verifying the match based on the overlap rate between the image to be matched and the other templates; and when the verification passes, determining that the image to be matched is successfully matched. On the basis of a successful single-template match, the embodiments of the disclosure perform match verification based on the overlap rate between the image to be matched and the other templates, which improves matching accuracy, reduces the mismatch rate, and achieves high-precision image matching.

Description

Image matching method and device, electronic equipment and chip
Technical Field
The disclosure relates to the field of image processing, and in particular relates to an image matching method, an image matching device, electronic equipment and a chip.
Background
The most basic matching techniques in the field of image matching are template matching and feature matching. Template matching sets up one or more target templates and slides each template across the image to be matched, evaluating the similarity at every position to decide whether the template and the image can be matched. Feature matching treats certain specific structures in an image as features and matches them using information such as their structure or position; it involves two aspects: feature extraction and feature matching.
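To make the sliding-window idea concrete, here is a minimal sketch of template matching (not the disclosure's implementation): it traverses every candidate offset and scores each patch with normalized cross-correlation; the similarity measure and the 0.9 threshold are illustrative assumptions.

```python
import numpy as np

def template_match(image, template, threshold=0.9):
    """Slide `template` over `image` and return (matched, position, score).

    The similarity score is normalized cross-correlation; a match is
    declared when the best score reaches `threshold` (an assumed value).
    """
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    best_score, best_pos = -1.0, None
    for y in range(ih - th + 1):            # traverse every candidate offset
        for x in range(iw - tw + 1):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.linalg.norm(p) * t_norm
            score = float((p * t).sum() / denom) if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_score >= threshold, best_pos, best_score
```

In practice a library routine would replace the double loop, but the decision structure, comparing the best score against a threshold, is the same.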
Small-size image matching mainly refers to the situation where multiple template images are enrolled in advance but only one image is acquired at matching time (as in fingerprint matching), and where the accuracy requirement is high. Current small-size image matching systems generally enroll a plurality of templates in advance and match a newly acquired sample against them one by one. If any single sub-template matches successfully, the match is judged successful; otherwise, if no sub-template matches the new sample, the match fails.
Limited by the size and shape of the acquisition device, this matching method has the following problem: if a new sample is collected with a small or unsuitably shaped device (e.g., a narrow strip-shaped sensor), templates collected from different sources may be highly similar to the sample in some regions, producing a false match result.
How to adjust and optimize the image matching process, improve matching accuracy, and achieve high-precision image matching is a problem that currently needs to be solved.
Disclosure of Invention
In view of this, the disclosure proposes an image matching method, an apparatus, an electronic device, and a chip.
According to an aspect of the present disclosure, there is provided an image matching method including:
acquiring an image to be matched;
matching the image to be matched with a plurality of templates;
when the image to be matched matches one of the plurality of templates successfully and fails to match the other templates, verifying the match based on the overlap rate between the image to be matched and the other templates;
and when the verification passes, determining that the image to be matched is successfully matched.
In one possible implementation, verifying the match based on the overlap rate between the image to be matched and the other templates includes:
determining the positional relationship between the image to be matched and the plurality of templates;
determining the overlap rate between the image to be matched and the other templates according to the positional relationship;
and verifying the match based on the overlap rate between the image to be matched and the other templates.
In one possible implementation, verifying the match based on the overlap rate between the image to be matched and the other templates further includes:
if the overlap rates between the image to be matched and all of the other templates are smaller than a given threshold, the verification passes and the image to be matched is determined to be successfully matched.
In one possible implementation, verifying the match based on the overlap rate between the image to be matched and the other templates further includes:
if, among the other templates, there are templates whose overlap rate with the image to be matched is not smaller than a given threshold, re-matching those templates with the image to be matched;
and if, among the templates whose overlap rate is not smaller than the given threshold, a given proportion or a given number or more are successfully re-matched with the image to be matched, the verification passes and the image to be matched is determined to be successfully matched.
In one possible implementation, verifying the match based on the overlap rate between the image to be matched and the other templates further includes:
if none of the templates whose overlap rate is not smaller than the given threshold is successfully re-matched with the image to be matched, the verification fails and the image to be matched is determined to have failed to match.
In one possible implementation, determining the positional relationship between the image to be matched and the plurality of templates includes:
obtaining, during the matching process, the positional relationship between the successfully matched template and the image to be matched;
and combining that positional relationship with the given positional relationships among the plurality of templates to determine the positional relationship between the image to be matched and the templates.
In one possible implementation, the method further includes:
if the image to be matched is successfully matched with two or more of the plurality of templates, determining that the image to be matched is successfully matched;
and if the image to be matched is not successfully matched with any of the templates, determining that the image to be matched has failed to match.
According to another aspect of the present disclosure, there is provided an image matching apparatus including:
an acquisition module configured to acquire the image to be matched;
a matching module configured to match the image to be matched with a plurality of templates;
a verification module configured to, when the image to be matched matches one of the plurality of templates successfully and fails to match the other templates, verify the match based on the overlap rate between the image to be matched and the other templates;
and a determining module configured to determine, when the verification passes, that the image to be matched is successfully matched.
According to another aspect of the present disclosure, there is provided an electronic device including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to implement the above-described method when executing the instructions stored by the memory.
According to another aspect of the present disclosure, there is provided a chip comprising the above device.
On this basis, the present disclosure provides an image matching method. An image to be matched is first acquired and then matched against a plurality of templates. If the image to be matched matches one of the templates successfully and fails to match the others, the match is verified based on the overlap rate between the image to be matched and the other templates; when the verification passes, the image to be matched is determined to be successfully matched. On the basis of a successful single-template match, the method performs one further round of multi-template verification, which improves matching accuracy and reduces the mismatch rate, thereby achieving high-precision image matching.
Other features and aspects of the present disclosure will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments, features and aspects of the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 shows a schematic diagram of a sample that does not match a template but has high similarity in partial regions.
Fig. 2 shows a flowchart of an image matching method according to an embodiment of the present disclosure.
Fig. 3a shows a flow chart of a match check according to an embodiment of the present disclosure.
Fig. 3b shows a schematic diagram of determining a positional relationship between an image to be matched and a plurality of templates according to an embodiment of the present disclosure.
Fig. 4 illustrates a block diagram of an electronic device 1900 according to an embodiment of the disclosure.
Detailed Description
Various exemplary embodiments, features and aspects of the disclosure will be described in detail below with reference to the drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Although various aspects of the embodiments are illustrated in the accompanying drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
In the description of the present disclosure, it should be understood that the terms "length," "width," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate orientations or positional relationships based on the orientation or positional relationships shown in the drawings, merely to facilitate description of the present disclosure and simplify the description, and do not indicate or imply that the devices or elements referred to must have a particular orientation, be configured and operated in a particular orientation, and thus should not be construed as limiting the present disclosure.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present disclosure, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
In the present disclosure, unless explicitly specified and limited otherwise, the terms "mounted," "connected," "secured," and the like are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communicated with the inside of two elements or the interaction relationship of the two elements. The specific meaning of the terms in this disclosure will be understood by those of ordinary skill in the art as the case may be.
The term "and/or" herein merely describes an association between objects and indicates that three relationships are possible; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, the term "at least one" herein means any one of a plurality, or any combination of at least two of a plurality; for example, "including at least one of A, B and C" may mean including any one or more elements selected from the set consisting of A, B and C.
In some application scenarios, matching between small-sized images may be involved. For example, in a fingerprint identification scenario, a smaller-sized collected fingerprint image needs to be matched with a pre-stored fingerprint template, so as to realize fingerprint identification.
The current small-size image matching process has the following problem: if a new sample is collected with a small or unsuitably shaped device (e.g., a narrow strip-shaped sensor), the template and the sample image may be highly similar in some regions. FIG. 1 is a schematic diagram of a sample that does not match a template but has high similarity in partial regions. The two images were actually acquired from different image sources, yet they have a large overlapping area with high similarity. Due to the limited performance of matching algorithms, a sample and a template in the situation shown in FIG. 1 are easily mismatched during matching.
On this basis, an embodiment of the disclosure provides an image matching method. An image to be matched is first obtained and then matched against a plurality of templates. If the image to be matched matches one of the templates successfully and fails to match the others, the match is verified based on the overlap rate between the image to be matched and the other templates; when the verification passes, the image to be matched is determined to be successfully matched. On the basis of a successful single-template match, match verification based on the overlap rate between the image to be matched and the other templates improves matching accuracy, reduces the mismatch rate, and achieves high-precision image matching.
Embodiments of the disclosure can be widely applied to most image- and texture-matching terminal devices, including but not limited to devices that match bar codes, fingerprints, characters, and moment-invariant images. With this image matching method, on the premise that a single fingerprint template has matched successfully, match verification is performed again according to the overlap rate between the image to be matched (e.g., a fingerprint sample) and the other fingerprint templates, thereby improving the accuracy of fingerprint matching.
Fig. 2 shows a flowchart of an image matching method according to an embodiment of the present disclosure, which may include, as shown in fig. 2:
step S100, obtaining an image to be matched;
the embodiment of the disclosure can acquire the image to be matched by acquiring the image of the target to be matched through the terminal equipment with the image acquisition function. The specific kind of the target to be matched and the mode of acquiring the image to be matched are not limited in the embodiment of the disclosure. In a specific embodiment, this step may be to record, by a terminal device with a fingerprint collection function (for example, a smart phone, a fingerprint attendance machine, etc.), a fingerprint sample to be matched, denoted as S.
Step S200, matching the image to be matched with a plurality of templates;
the templates refer to data which is recorded in advance and is used for carrying out multiple collection on the same target, and the data can be the results of carrying out collection on the same target under different conditions such as different environments, different angles, different times and the like. The pre-entered templates should be sufficient and there should be no duplication between templates. The embodiment of the disclosure can calculate the position relation among a plurality of templates in the process of template registration, and take the result as the position relation among the given plurality of templates. The calculation method of the positional relationship between the templates in the embodiment of the present disclosure is not limited.
The positional relationship may be represented by a projective transformation between the spaces in which the templates lie, for example as a transformation matrix. In a specific embodiment, the plurality of templates may be fingerprint templates collected from the same finger, denoted T0, T1, T2, T3, and so on. Taking any fingerprint template T0 as an example, the positional relationships between T0 and the other fingerprint templates are denoted A1, A2, A3, and so on, where A1 is the transformation matrix that projects the space of T0 into the space of T1; A2 is the transformation matrix that projects the space of T0 into the space of T2; A3 is the transformation matrix that projects the space of T0 into the space of T3; and so on.
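As a sketch of this representation (the angles and offsets below are hypothetical values, not taken from the disclosure), the positional relationship between two template spaces can be written as a 3x3 homogeneous transformation matrix acting on column vectors:

```python
import numpy as np

def rigid_transform(angle_rad, tx, ty):
    """3x3 homogeneous matrix for a 2D rotation followed by a translation,
    acting on column vectors (x' = A x)."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0.0, 0.0, 1.0]])

# Hypothetical positional relations of template T0 to templates T1 and T2:
A1 = rigid_transform(0.0, 12.0, -3.0)        # space of T0 -> space of T1
A2 = rigid_transform(np.pi / 6, -5.0, 8.0)   # space of T0 -> space of T2

# A point at (10, 10) in T0's frame, expressed in T1's frame:
p_t0 = np.array([10.0, 10.0, 1.0])
p_t1 = A1 @ p_t0
```

A rigid (rotation plus translation) model is the simplest choice; an affine or full projective matrix fits the same 3x3 homogeneous form.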
The image to be matched can be matched with the plurality of templates using matching approaches from the related art, such as a feature matching algorithm. Further, the positional relationship between a successfully matched template and the image to be matched can be obtained from the matching information returned during matching, for example the transformation matrix As that projects the space of the image to be matched into the space of the successfully matched template. Feature matching is an important technology in computer vision: it finds corresponding feature points in two or more images to accomplish tasks such as image matching, stitching, and tracking. In the general case, a feature matching pipeline comprises three steps: feature extraction, feature description, and feature matching. In the feature extraction step, the algorithm extracts keypoint information from the image, including keypoint positions, so the algorithm can determine the positional relationship between images while performing the matching itself. The embodiments of the disclosure do not limit the feature extraction, feature description, and feature matching methods, the decision model, or the way matching information is extracted in this step.
In a specific embodiment, this step performs feature matching between the fingerprint sample S to be matched and a plurality of fingerprint templates, and during matching obtains the positional relationship between the successfully matched fingerprint template and the sample from the returned matching information. The feature matching pipeline may use the SURF, SIFT, FAST, or ORB algorithm, among others.
Optionally, in step S300, if the image to be matched matches two or more of the plurality of templates successfully, it is determined that the image to be matched is successfully matched;
If the image to be matched matches two or more of the templates successfully, the embodiment of the disclosure can directly determine that the match is successful.
Optionally, in step S400, if the image to be matched does not match any of the templates successfully, it is determined that the image to be matched has failed to match.
If the image to be matched does not match any of the templates successfully, the embodiment of the disclosure can directly determine that the match has failed.
Step S500, when the image to be matched matches one of the templates successfully and fails to match the other templates, verifying the match based on the overlap rate between the image to be matched and the other templates; when the verification passes, determining that the image to be matched is successfully matched.
When only one of the templates matches successfully, the embodiment of the disclosure does not directly decide whether the match is successful, but instead verifies the matching result according to the situation. Performing this overlap-rate-based verification on top of a single successful template match improves matching accuracy and achieves high-precision image matching.
Fig. 3a shows a flow chart of a match check according to an embodiment of the present disclosure. As shown in fig. 3a, the matching verification process described in step S500 may include:
step S501, determining the positional relationship between the image to be matched and the templates
In one possible manner, this step may include: obtaining, during the matching process, the positional relationship between the successfully matched template and the image to be matched; and combining that relationship with the given positional relationships among the plurality of templates to determine the positional relationship between the image to be matched and the templates.
As described in step S200, the positional relationships among the templates can be obtained during template enrollment, and the positional relationship between the successfully matched template and the image to be matched can be obtained from the matching information returned during matching. Combining the two yields the positional relationship between the image to be matched and the remaining templates. In actual computation, this relationship can be obtained by composing transformation matrices; the embodiments of the disclosure do not limit how the transformation is represented (a transformation matrix, rotation-translation parameters, or other representations may be used).
Fig. 3b shows a schematic diagram of determining the positional relationship between an image to be matched and a plurality of templates according to an embodiment of the present disclosure. In one embodiment, taking the determination of the positional relationship between a fingerprint sample and a plurality of fingerprint templates as an example, the positional relationships among the fingerprint templates are obtained at enrollment time. Denote the only fingerprint template that matched the sample successfully as T0. From the positional relationship between the fingerprint sample S and T0, and the given positional relationships between T0 and the other templates T1, T2, T3, and so on, the positional relationships between S and the other fingerprint templates are computed and denoted As1, As2, As3, and so on, where As1 is the transformation matrix that projects the space of S into the space of T1; As2 projects the space of S into the space of T2; As3 projects the space of S into the space of T3; and so on.
In actual computation, As and A1, A2, A3, and so on are known, and As1, As2, As3, and so on can be obtained by composing the transformation matrices: Asx = Ax·As, where x = 1, 2, 3, and so on. This uses the left-multiplication convention (x' = Ax); with the right-multiplication convention (x' = xA), the composition becomes Asx = As·Ax.
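Numerically, the composition reads as follows (the matrices are hypothetical examples, not values from the disclosure; the left-multiplication convention x' = Ax is used, so the sample-to-T1 transform is As1 = A1·As):

```python
import numpy as np

As = np.array([[1.0, 0.0,  4.0],    # sample S -> matched template T0
               [0.0, 1.0, -2.0],
               [0.0, 0.0,  1.0]])
A1 = np.array([[0.0, -1.0, 10.0],   # T0 -> T1 (90-degree rotation plus shift)
               [1.0,  0.0,  3.0],
               [0.0,  0.0,  1.0]])

# Composing S -> T0 and then T0 -> T1 gives the S -> T1 transform:
As1 = A1 @ As

# Mapping a point in one step or in two steps must agree:
p_s = np.array([5.0, 7.0, 1.0])
assert np.allclose(As1 @ p_s, A1 @ (As @ p_s))
```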
Step S502, determining the overlap rate between the image to be matched and the other templates according to the positional relationship.
Step S503, verifying the match based on how the overlap rate between the image to be matched and each of the other templates compares with a given threshold.
The overlap rate between the image to be matched and each of the other templates can be determined from the positional relationship obtained in the previous step together with the sizes of the two images. The overlap rate indicates that the image to be matched and a template cover overlapping pixel positions in a partial region, and such overlap can be regarded as similarity. A concrete method: using the pixel coordinates of the images and the positional relationship between them (e.g., a transformation matrix), project every pixel of one image into the coordinate system of the other, count how many projected pixels fall inside the other image's coverage area to obtain the size of the overlapping region, and then compute the overlap rate from the sizes of the two images (overlap rate = overlap-area size / image size, where size can be expressed as a number of pixels). An overlap-rate decision threshold can then be set, and the match is verified according to how the overlap rate between the image to be matched and each of the other templates compares with that threshold. Further, on the premise of ensuring efficiency and controlling overhead, the number of times this verification process is performed is not limited by the embodiments of the disclosure.
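The pixel-projection method described above can be sketched as follows (the function name and interface are illustrative assumptions): every pixel of image A is projected into image B's coordinate system, the pixels landing inside B are counted, and the count is divided by A's size in pixels.

```python
import numpy as np

def overlap_ratio(shape_a, shape_b, A):
    """Overlap rate of image A with image B under the 3x3 homogeneous
    transform A mapping A-space coordinates into B-space coordinates.

    overlap rate = overlap-area size / image size (sizes in pixels).
    """
    h_a, w_a = shape_a
    h_b, w_b = shape_b
    ys, xs = np.mgrid[0:h_a, 0:w_a]
    pts = np.stack([xs.ravel(), ys.ravel(),
                    np.ones(h_a * w_a)]).astype(float)
    proj = A @ pts                       # project every pixel into B's frame
    px, py = proj[0] / proj[2], proj[1] / proj[2]
    inside = (px >= 0) & (px < w_b) & (py >= 0) & (py < h_b)
    return inside.sum() / pts.shape[1]
```

For example, an identity transform between two same-sized images gives an overlap rate of 1.0, while shifting a 10-pixel-wide image sideways by 5 pixels gives 0.5.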
In this way, the embodiment of the disclosure computes the overlap rate between the image to be matched and the other templates and verifies the match based on it, which improves matching accuracy and reduces the mismatch rate caused by samples and templates that are highly similar in partial regions due to factors such as an undersized or narrow acquisition area, thereby achieving high-precision image matching.
Step S504, if the overlap rates between the image to be matched and all of the other templates are smaller than the given threshold, the verification passes and the image to be matched is determined to be successfully matched.
When the overlap rates between the image to be matched and all of the other templates are smaller than the given threshold, the embodiment of the disclosure confirms that the verification passes and determines that the match is successful. Overlap rates below the threshold indicate that the image to be matched has no overlapping region, or only a very small one, with the other templates. This can happen when, during sample acquisition, the target is too close to the sensor edge or is pressed too hard, among other causes. In this case, apart from the one template that matched successfully, no other template can match the image to be matched, and the other templates either have no overlap with it or an overlap rate below the threshold. That is, the templates that previously failed to match overlap the image to be matched at very few pixel positions, so the two conclusions are consistent rather than contradictory. The matching result is therefore considered correct: the verification passes and the image to be matched is successfully matched.
In step S505, if among the other templates there is a template whose overlap rate with the image to be matched is not smaller than the given threshold, that template is re-matched with the image to be matched.
A single feature-matching algorithm has inherent performance limits, so a match that should succeed may nevertheless fail. After the overlap rates between the image to be matched and the other templates are determined, suppose that among the templates that failed the first matching there are one or more templates whose overlap rate with the image is not smaller than the given threshold. Such a template overlaps the sample substantially over part of its area, so it may be a template that should have matched the image but did not. In this case, the embodiment of the present disclosure treats the earlier result, in which the image matched only one template, as doubtful, and re-matches each such template with the image to be matched as a verification step. Provided efficiency is maintained and overhead is controlled, the number of re-matching passes is not limited in the embodiments of the present disclosure.
Step S506: if, among the templates whose overlap rate is not smaller than the given threshold, templates reaching at least a given proportion or a given number are successfully re-matched with the image to be matched, the verification passes and it is determined that the image to be matched is successfully matched.
The embodiment of the present disclosure may preset a proportion criterion or a count criterion for the successfully re-matched templates, used to judge whether the verification passes; the given proportion and the given number are not specifically limited here, and reaching the boundary value itself counts as satisfying the criterion. For example, suppose the given number is 2 and three of the initially failed templates have an overlap rate with the image to be matched that is greater than the given threshold. If two or three of them are successfully re-matched, reaching the given number of 2, this shows that templates other than the first successful one can also match the image, and that those successfully matched templates share large overlapping areas with it. In this case, the embodiment of the present disclosure regards the matching result as correct, deems the verification passed, and determines that the image to be matched is successfully matched. In this way, matching verification based on the overlap rate improves matching accuracy and reduces the false-match rate caused by a sample and part of a template being highly similar when the acquired image is undersized or overly narrow, thereby achieving high-precision image matching.
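The "given proportion or given number" criterion of step S506 admits either test. A minimal sketch follows; it is an illustration only, and the function name, parameter names, and default values are hypothetical, not taken from the disclosure:

```python
def rematch_criterion_met(successes, suspects, given_ratio=0.5, given_count=2):
    """Return True when the successfully re-matched templates reach either
    the given proportion of the suspect templates or the given number.

    successes: number of suspect templates that re-matched successfully.
    suspects:  number of templates whose overlap rate reached the threshold.
    """
    if suspects == 0:
        return False
    # Either criterion suffices; the boundary value itself counts (">=").
    return successes / suspects >= given_ratio or successes >= given_count
```

With the defaults, two successes out of three suspects pass (proportion met), while one success out of four fails on both criteria.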
Step S507: if none of the templates whose overlap rate is not smaller than the given threshold can be successfully re-matched with the image to be matched, the verification fails and it is determined that the matching of the image to be matched has failed.
The overlap rate indicates that the image to be matched and a template coincide in position over part of their area and therefore share a certain similarity; an overlap rate not smaller than the given threshold indicates that the template overlaps the image substantially over part of the region. If such a template still cannot be successfully matched with the image, then the two overlap substantially in position yet do not match. This is most likely the situation described in the background art, where images to be matched from different image sources fail to match a template, despite large positional overlap in part of the area, because the acquired image is undersized or overly narrow. In this case, the embodiment of the present disclosure regards the single successful match from the first round as a false match: the verification fails, the image is determined not to have successfully matched any template, and the matching of the image to be matched is determined to have failed. In this way, matching verification based on the overlap rate improves matching accuracy, reduces the false-match rate caused by partial similarity between the sample and a template, and achieves high-precision image matching.
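The branching of steps S504 to S507 above can be sketched as follows. This is a minimal illustration, not part of the original disclosure: the function and parameter names are hypothetical, only the count criterion of step S506 is shown (the disclosure equally permits a proportion criterion), and the threshold and count defaults are arbitrary:

```python
def verify_match(overlap_rates, rematch, overlap_threshold=0.2, required_count=2):
    """Verify a single-template match using overlap rates with the other templates.

    overlap_rates: dict mapping each initially unmatched template to its
                   overlap rate with the image to be matched.
    rematch:       callable(template) -> bool, re-runs feature matching
                   between that template and the image.
    Returns True if the verification passes.
    """
    # Step S504: no other template overlaps the image appreciably, so the
    # earlier failures are consistent with the single success -- pass.
    suspects = [t for t, r in overlap_rates.items() if r >= overlap_threshold]
    if not suspects:
        return True

    # Step S505: templates that overlap the image substantially should also
    # have matched; re-match them to resolve the apparent contradiction.
    successes = sum(1 for t in suspects if rematch(t))

    # Step S506: enough suspects now match -> the original match is trusted.
    # Step S507: too few match -> treat the original result as a false match.
    return successes >= required_count
```

In use, `rematch` would wrap whatever feature-matching backend produced the first-round results.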
An application example of the present disclosure is given below, taking a fingerprint recognition scenario as an example. In this scenario, a plurality of fingerprint templates T0, T1, T2, T3, … of the same finger are enrolled in advance through a terminal device capable of fingerprint acquisition and matching, such as a smartphone or a fingerprint attendance machine. When a fingerprint image S to be matched is acquired:
Feature matching is performed between the fingerprint image S to be matched and the plurality of fingerprint templates T0, T1, T2, T3, ….
If the fingerprint image S to be matched matches one of the fingerprint templates and fails to match the others, and assuming the successfully matched template is T0, the positional relation between T0 and S is obtained from the returned matching information, and the match is verified based on the overlap rate between S and the other fingerprint templates. The verification process may include the following steps:
The positional relation between the fingerprint image S to be matched and each of the other fingerprint templates is obtained from the positional relations among the fingerprint templates and the positional relation between the successfully matched template T0 and S. The overlap rate between S and each other fingerprint template is then determined from this positional relation and the sizes of the two images, the overlap rate is compared against the given threshold, and matching verification is performed according to the result.
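Assuming the image and each template are modeled as axis-aligned rectangles of known size, and the positional relation is expressed as a pixel translation offset, the two computations in this step can be sketched as follows. The rectangle model, the offset representation, and the normalization by the image area are assumptions made for illustration; the disclosure does not fix a particular formula:

```python
def compose_offsets(offset_s_t0, offset_t0_ti):
    """Offset of template Ti relative to image S, obtained by composing the
    S->T0 offset returned by the match with the enrolled T0->Ti offset."""
    return (offset_s_t0[0] + offset_t0_ti[0], offset_s_t0[1] + offset_t0_ti[1])

def overlap_rate(offset, image_size, template_size):
    """Overlap rate of the image to be matched with one template.

    offset:        (dx, dy) of the template's top-left corner relative to
                   the image's top-left corner, in pixels.
    image_size:    (width, height) of the image to be matched.
    template_size: (width, height) of the template.
    Returns the intersection area divided by the image area.
    """
    dx, dy = offset
    iw, ih = image_size
    tw, th = template_size
    # Width and height of the intersection rectangle (zero if disjoint).
    ow = max(0, min(iw, dx + tw) - max(0, dx))
    oh = max(0, min(ih, dy + th) - max(0, dy))
    return (ow * oh) / (iw * ih)
```

For two 10×10 images shifted by half their width, this yields an overlap rate of 0.5; fully disjoint placements yield 0.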
If the overlap rate between the fingerprint image S to be matched and every other fingerprint template is smaller than the given threshold, the templates that previously failed to match S overlap it very little at the pixel level. In this case, the matching result is trusted, i.e. the verification passes and the fingerprint image S to be matched is successfully matched.
If among the other fingerprint templates there are templates whose overlap rate with the fingerprint image S to be matched is not smaller than the given threshold, those templates are re-matched with S.
If, among the templates whose overlap rate is not smaller than the given threshold, templates reaching at least a given proportion or a given number are successfully re-matched with the fingerprint image S to be matched, then templates other than the first successful one can also match S, and those successfully matched fingerprint templates share large overlapping areas with it; that is, two or more fingerprint templates among all the templates successfully match S. In this case, the verification passes and it is determined that the fingerprint image S to be matched is successfully matched.
If none of the fingerprint templates whose overlap rate is not smaller than the given threshold can be successfully re-matched with the fingerprint image S to be matched, the verification fails. This is most likely the background-art situation in which fingerprint images from different image sources fail to match a template, despite large positional overlap in part of the area, because the acquired image is undersized or overly narrow. In this case, the single successful match from the first round is regarded as a false match, the verification does not pass, and it is determined that the matching of the fingerprint image S to be matched has failed.
If the fingerprint image S to be matched successfully matches two or more of the plurality of fingerprint templates, it is determined that S is successfully matched.
If the fingerprint image S to be matched does not successfully match any of the fingerprint templates, it is determined that the matching of S has failed.
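The top-level flow of this application example can be sketched as follows. The function and parameter names are illustrative assumptions; `feature_match` stands for any first-round matcher and `verify` for the overlap-rate verification described above:

```python
def match_image(image, templates, feature_match, verify):
    """Match the image against all templates, then branch on the number of
    first-round successes, as in the fingerprint example.

    feature_match: callable(image, template) -> bool
    verify:        callable(image, matched_template, failed_templates) -> bool
    """
    matched = [t for t in templates if feature_match(image, t)]
    if len(matched) >= 2:
        return True            # two or more templates matched: success
    if not matched:
        return False           # no template matched: failure
    # Exactly one template matched: the ambiguous case that requires
    # overlap-rate verification against the remaining templates.
    failed = [t for t in templates if t not in matched]
    return verify(image, matched[0], failed)
```

Only the single-success branch invokes the verification, so the common multi-match and no-match cases incur no extra cost.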
According to an embodiment of the present disclosure, there is provided an image matching apparatus including:
the acquisition module is used for acquiring the images to be matched;
the matching module is used for matching the image to be matched with a plurality of templates;
the verification module is used for verifying the matching based on the overlapping rate of the image to be matched and other templates under the condition that the image to be matched is successfully matched with one template in the plurality of templates and the matching of the image to be matched with the other templates is failed;
and the determining module is used for determining that the images to be matched are successfully matched under the condition that the verification is passed.
In one possible implementation manner, the verifying the matching based on the overlapping rate of the image to be matched and other templates includes:
determining the position relation between the image to be matched and the templates;
determining the overlapping rate of the image to be matched and other templates according to the position relation;
and checking the matching based on the overlapping rate of the image to be matched and other templates.
In one possible implementation manner, the verifying the matching based on the overlapping rate of the image to be matched and other templates includes:
and if the overlap rate between the image to be matched and every other template is smaller than a given threshold, the verification passes and it is determined that the image to be matched is successfully matched.
In one possible implementation manner, the verifying the matching based on the overlapping ratio of the image to be matched and other templates further includes:
if among the other templates there are templates whose overlap rate with the image to be matched is not smaller than a given threshold, re-matching those templates with the image to be matched;
and if, among the templates whose overlap rate is not smaller than the given threshold, templates reaching at least a given proportion or a given number are successfully matched with the image to be matched, the verification passes and it is determined that the image to be matched is successfully matched.
In one possible implementation manner, the verifying the matching based on the overlapping ratio of the image to be matched and other templates further includes:
if none of the templates whose overlap rate is not smaller than the given threshold is successfully matched with the image to be matched, the verification fails and it is determined that the matching of the image to be matched has failed.
In one possible implementation manner, the determining the positional relationship between the image to be matched and the templates includes:
acquiring the position relation between the template successfully matched and the image to be matched in the matching process;
and combining the position relation between the successfully matched template and the image to be matched with the given position relations among the templates, so as to determine the position relation between the image to be matched and the plurality of templates.
In one possible implementation, the apparatus further includes:
the matching success module is used for determining that the image to be matched is successfully matched if the image to be matched is successfully matched with two or more templates in the templates;
and the matching failure module is used for determining that the matching of the image to be matched fails if the image to be matched is not successfully matched with the templates.
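The modular apparatus above could be organized as in this sketch. The class name, method names, and constructor parameters are illustrative assumptions, not part of the disclosure; the comments indicate which module each part corresponds to:

```python
class ImageMatchingDevice:
    """Apparatus combining the acquisition, matching, verification,
    determination, matching-success and matching-failure modules."""

    def __init__(self, templates, feature_match, verify):
        self.templates = templates          # pre-enrolled templates
        self.feature_match = feature_match  # matching-module backend
        self.verify = verify                # overlap-rate verification module

    def acquire(self, sensor):
        # Acquisition module: obtain the image to be matched from a sensor
        # (assumed to expose a read() method).
        return sensor.read()

    def match(self, image):
        # Matching module: first-round matching against all templates.
        matched = [t for t in self.templates if self.feature_match(image, t)]
        if len(matched) >= 2:
            return True        # matching-success module: two or more matches
        if not matched:
            return False       # matching-failure module: no match at all
        # Verification module: exactly one match, check overlap rates.
        failed = [t for t in self.templates if t not in matched]
        # Determination module: success only if the verification passes.
        return self.verify(image, matched[0], failed)
```

A chip embodiment would realize the same module boundaries in hardware or firmware rather than as Python methods.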
There is provided, in accordance with an embodiment of the present disclosure, an electronic device including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to implement the above-described method when executing the instructions stored by the memory.
According to an embodiment of the present disclosure, there is provided a chip including the above-described apparatus.
For specific embodiments of the above apparatus, electronic device and chip, reference may be made to the description of the method embodiments above.
The electronic device may also be a user equipment (UE), a mobile device, a user terminal, a handheld device, a computing device, or a vehicle-mounted device. Examples of such terminals include: a display, a smartphone or portable device, a mobile phone, a tablet, a notebook, a palmtop, a mobile Internet device (MID), a wearable device, a virtual reality (VR) device, an augmented reality (AR) device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, a wireless terminal in the Internet of Vehicles, and the like.
Fig. 4 illustrates a block diagram of an electronic device 1900 according to an embodiment of the disclosure. For example, electronic device 1900 may be provided as a server or terminal device. Referring to FIG. 4, electronic device 1900 includes a processing component 1922 that further includes one or more processors and memory resources represented by memory 1932 for storing instructions, such as application programs, that can be executed by processing component 1922. The application programs stored in memory 1932 may include one or more modules each corresponding to a set of instructions. Further, processing component 1922 is configured to execute instructions to perform the methods described above.
The electronic device 1900 may also include a power component 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input/output (I/O) interface 1958. The electronic device 1900 may operate based on an operating system stored in memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 1932, including computer program instructions executable by processing component 1922 of electronic device 1900 to perform the methods described above.
The foregoing is merely exemplary embodiments of the present invention and is not intended to limit the scope of the invention, which is defined by the appended claims.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration. Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of the embodiments of the present disclosure has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (10)

1. A method of image matching, the method comprising:
acquiring an image to be matched;
matching the image to be matched with a plurality of templates;
under the condition that the matching of the image to be matched with one template in the templates is successful and the matching of the image to be matched with other templates is failed, checking the matching based on the overlapping rate of the image to be matched and the other templates;
and under the condition that the verification is passed, determining that the image to be matched is successfully matched.
2. The image matching method according to claim 1, wherein the verifying the matching based on the overlapping ratio of the image to be matched and other templates comprises:
determining the position relation between the image to be matched and the templates;
determining the overlapping rate of the image to be matched and other templates according to the position relation;
and checking the matching based on the overlapping rate of the image to be matched and other templates.
3. The image matching method according to claim 1 or 2, wherein the verifying the matching based on the overlapping ratio of the image to be matched and other templates comprises:
and if the overlap rate between the image to be matched and every other template is smaller than a given threshold, the verification passes and it is determined that the image to be matched is successfully matched.
4. The image matching method according to claim 1 or 2, wherein the verifying the matching based on the overlapping ratio of the image to be matched and other templates further comprises:
if among the other templates there are templates whose overlap rate with the image to be matched is not smaller than a given threshold, re-matching those templates with the image to be matched;
and if, among the templates whose overlap rate is not smaller than the given threshold, templates reaching at least a given proportion or a given number are successfully matched with the image to be matched, the verification passes and it is determined that the image to be matched is successfully matched.
5. The image matching method according to claim 4, wherein the verifying the matching based on the overlapping ratio of the image to be matched and other templates further comprises:
if none of the templates whose overlap rate is not smaller than the given threshold is successfully matched with the image to be matched, the verification fails and it is determined that the matching of the image to be matched has failed.
6. The image matching method according to claim 2, wherein the determining the positional relationship between the image to be matched and the plurality of templates includes:
acquiring the position relation between the template successfully matched and the image to be matched in the matching process;
and combining the position relation between the successfully matched template and the image to be matched and the position relation among the given templates to determine the position relation among the image to be matched and the templates.
7. The image matching method according to claim 1, characterized in that the method further comprises:
if the image to be matched is successfully matched with two or more templates in the templates, determining that the image to be matched is successfully matched;
and if the image to be matched is not successfully matched with the templates, determining that the image to be matched is failed to be matched.
8. An image matching apparatus, the apparatus comprising:
the acquisition module is used for acquiring the images to be matched;
the matching module is used for matching the image to be matched with a plurality of templates;
the verification module is used for verifying the matching based on the overlapping rate of the image to be matched and other templates under the condition that the image to be matched is successfully matched with one template in the plurality of templates and the matching of the image to be matched with the other templates is failed;
and the determining module is used for determining that the images to be matched are successfully matched under the condition that the verification is passed.
9. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method of any one of claims 1 to 7 when executing the instructions stored by the memory.
10. A chip comprising the apparatus of claim 8.
CN202310357208.8A 2023-04-04 2023-04-04 Image matching method and device, electronic equipment and chip Pending CN116363401A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310357208.8A CN116363401A (en) 2023-04-04 2023-04-04 Image matching method and device, electronic equipment and chip

Publications (1)

Publication Number Publication Date
CN116363401A true CN116363401A (en) 2023-06-30

Family

ID=86907197

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310357208.8A Pending CN116363401A (en) 2023-04-04 2023-04-04 Image matching method and device, electronic equipment and chip

Country Status (1)

Country Link
CN (1) CN116363401A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination