WO2021212874A1 - Method, apparatus, device, and storage medium for eliminating mismatched palmprint points - Google Patents

Method, apparatus, device, and storage medium for eliminating mismatched palmprint points

Info

Publication number
WO2021212874A1
WO2021212874A1 (PCT/CN2020/135853, CN2020135853W)
Authority
WO
WIPO (PCT)
Prior art keywords
point
verified
standard
matching
image
Prior art date
Application number
PCT/CN2020/135853
Other languages
English (en)
French (fr)
Inventor
侯丽
严明洋
Original Assignee
平安科技(深圳)有限公司
Priority date
Filing date
Publication date
Application filed by 平安科技(深圳)有限公司
Publication of WO2021212874A1 publication Critical patent/WO2021212874A1/zh

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/1347 - Preprocessing; Feature extraction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/1365 - Matching; Classification

Definitions

  • This application relates to the field of artificial intelligence technology, and in particular to a method, device, equipment, and computer-readable storage medium for removing palmprint mismatch points.
  • The palmprint is a relatively stable biometric feature that can be used to identify a person effectively. For example, palmprint recognition can be applied to various scenarios that require identity recognition, such as unmanned supermarkets and workplace attendance.
  • The inventor realized that existing palmprint-based person recognition produces many mismatched points, resulting in low recognition accuracy. How to improve the accuracy of existing palmprint recognition has therefore become an urgent technical problem.
  • This application provides a method for removing mismatched palmprint points.
  • The method for removing mismatched palmprint points includes the following steps:
  • acquiring a palm image to be verified, and extracting, based on the ORB rapid feature point extraction and description algorithm, the to-be-verified image feature points corresponding to the palm image to be verified;
  • determining, based on a brute-force matching algorithm, standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and generating an initial matching point set from the initial matching points;
  • acquiring an initial matching point from the initial matching point set as the current matching point, and determining, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image;
  • determining, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and removing from the initial matching point set the current matching point carrying a mismatched-point identifier.
  • The present application also provides a device for removing mismatched palmprint points, which includes:
  • a feature point extraction module, configured to acquire a palm image to be verified and to extract, based on the ORB rapid feature point extraction and description algorithm, the to-be-verified image feature points corresponding to the palm image to be verified;
  • a matching point determination module, configured to determine, based on a brute-force matching algorithm, standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and to generate an initial matching point set from the initial matching points;
  • a matching point judgment module, configured to acquire an initial matching point from the initial matching point set as the current matching point, and to determine, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image;
  • a mismatched point elimination module, configured to determine, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and to remove from the initial matching point set the current matching point carrying a mismatched-point identifier.
  • The present application also provides a device that includes a processor, a memory, and a program stored on the memory and executable by the processor, wherein the program, when executed by the processor, implements the following steps:
  • acquiring a palm image to be verified, and extracting, based on the ORB rapid feature point extraction and description algorithm, the to-be-verified image feature points corresponding to the palm image to be verified;
  • determining, based on a brute-force matching algorithm, standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and generating an initial matching point set from the initial matching points;
  • acquiring an initial matching point from the initial matching point set as the current matching point, and determining, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image;
  • determining, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and removing from the initial matching point set the current matching point carrying a mismatched-point identifier.
  • The present application also provides a computer-readable storage medium on which a program is stored, wherein the program, when executed by a processor, implements the following steps:
  • acquiring a palm image to be verified, and extracting, based on the ORB rapid feature point extraction and description algorithm, the to-be-verified image feature points corresponding to the palm image to be verified;
  • determining, based on a brute-force matching algorithm, standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and generating an initial matching point set from the initial matching points;
  • acquiring an initial matching point from the initial matching point set as the current matching point, and determining, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image;
  • determining, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and removing from the initial matching point set the current matching point carrying a mismatched-point identifier.
  • FIG. 1 is a schematic diagram of the hardware structure of the device involved in the solutions of the embodiments of this application;
  • FIG. 2 is a schematic flowchart of a first embodiment of the method of this application;
  • FIG. 3 is a schematic flowchart of a second embodiment of the method of this application;
  • FIG. 4 is a schematic flowchart of a third embodiment of the method of this application;
  • FIG. 5 is a schematic diagram of the functional modules of a first embodiment of the device of this application.
  • The method for removing mismatched palmprint points in the embodiments of this application is mainly applied to a palmprint mismatched point removal device.
  • The palmprint mismatched point removal device may be a PC, a portable computer, a mobile terminal, or another device with display and processing functions.
  • FIG. 1 is a schematic diagram of the hardware structure of a palmprint mismatch point elimination device involved in the solution of the embodiment of the application.
  • The device for removing mismatched palmprint points may include a processor 1001 (for example, a CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005.
  • The communication bus 1002 is used to realize connection and communication between these components.
  • The user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard).
  • The network interface 1004 may optionally include a standard wired interface and a wireless interface (such as a WI-FI interface).
  • The memory 1005 may be a high-speed RAM memory or a stable non-volatile memory, such as a disk memory.
  • The memory 1005 may optionally also be a storage device independent of the aforementioned processor 1001.
  • Those skilled in the art will understand that the hardware structure shown in FIG. 1 does not constitute a limitation on the palmprint mismatched point removal device, which may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
  • As a computer-readable storage medium, the memory 1005 in FIG. 1 may include an operating system, a network communication module, and a palmprint mismatched point removal program.
  • In FIG. 1, the network communication module is mainly used to connect to a server and exchange data with it, while the processor 1001 can call the palmprint mismatched point removal program stored in the memory 1005 and execute the method for removing mismatched palmprint points provided in the embodiments of this application.
  • The embodiments of the present application provide a method for removing mismatched palmprint points.
  • FIG. 2 is a schematic flowchart of a first embodiment of the method for removing mismatched palmprint points of the present application.
  • In this embodiment, the method for removing mismatched palmprint points includes the following steps:
  • Step S10: acquiring a palm image to be verified, and extracting, based on the ORB rapid feature point extraction and description algorithm, the to-be-verified image feature points corresponding to the palm image to be verified.
  • The palmprint is a relatively stable biometric feature that can be used to identify a person effectively.
  • For example, palmprint recognition can be applied to various scenarios that require identity recognition, such as unmanned supermarkets and workplace attendance.
  • However, existing palmprint-based person recognition produces many mismatched points, resulting in low recognition accuracy.
  • To solve this problem, this application extracts palmprint image features based on the ORB algorithm, obtains the matching points between the standard palm image and the palm image to be verified based on a brute-force matching algorithm, and then uses cross-validation to eliminate mismatched points that are not optimal matches. Palmprint recognition can then be performed on the final matching result, which improves the accuracy of palmprint recognition.
  • Specifically, a palmprint image has relatively rich features, so during palmprint recognition the features of two palmprint images can be extracted to determine whether they come from the same palm. Commonly used feature extraction methods include SIFT, SURF, and ORB. ORB (Oriented FAST and Rotated BRIEF) is a rapid feature point extraction and description algorithm that combines the FAST keypoint detection method with the BRIEF feature description algorithm and improves and optimizes both; the ORB algorithm is used here to extract the to-be-verified image feature points corresponding to the palm image to be verified.
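  • The following is a minimal sketch of this feature extraction step (illustrative only, assuming OpenCV; the patent publishes no code, and the function name and parameter values are assumptions):

```python
# Minimal sketch: ORB keypoint detection and 256-bit descriptor extraction
# for a palm image, using OpenCV (assumed dependency).
import cv2

def extract_orb_features(image_path, n_features=500):
    """Return the image, its ORB keypoints, and their descriptors."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(image_path)
    # ORB = FAST keypoint detection + rotated BRIEF binary descriptors
    orb = cv2.ORB_create(nfeatures=n_features)
    keypoints, descriptors = orb.detectAndCompute(img, None)
    # descriptors is an N x 32 uint8 array, i.e. 256 bits per feature point
    return img, keypoints, descriptors
```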
  • Step S20: determining, based on a brute-force matching algorithm, the standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and generating an initial matching point set from the initial matching points.
  • In this embodiment, ORB is used to extract the standard image feature points corresponding to the preset standard palm image and the to-be-verified image feature points corresponding to the palm image to be verified, and the brute-force matching algorithm is then used to obtain the matching points between the two images.
  • The matching principle is that point i in image A will always find the point j in image B that best matches it, thereby forming a pair of matching points.
  • For point j in image B, however, point i in image A is not necessarily its optimal match, in which case (i, j) is considered a pair of mismatched points. The initial matching points therefore need further matching verification.
  • The initial matching points are added to the initial matching point set, which contains one or more matching points.
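  • A minimal sketch of this brute-force matching step, assuming OpenCV and the extract_orb_features helper sketched above (all names are illustrative):

```python
# Minimal sketch: brute-force Hamming matching between the ORB descriptors of
# the image to be verified and those of the standard image, producing the
# initial matching point set.
import cv2

def initial_matching_point_set(desc_to_verify, desc_standard):
    """Pair each to-be-verified feature point with its best match in the standard image."""
    bf = cv2.BFMatcher(cv2.NORM_HAMMING)      # Hamming distance suits binary ORB descriptors
    matches = bf.match(desc_to_verify, desc_standard)
    # Each cv2.DMatch pairs point queryIdx (image to be verified) with its best
    # match trainIdx (standard image); a smaller distance means a better match.
    return sorted(matches, key=lambda m: m.distance)
```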
  • As one implementation, to ensure the privacy and security of the standard palm image, the standard palm image can be stored in a blockchain.
  • The blockchain referred to in this application is a new application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, and encryption algorithms.
  • A blockchain is essentially a decentralized database: a chain of data blocks generated in association with one another using cryptographic methods, where each data block contains a batch of network transaction information used to verify the validity of that information (anti-counterfeiting) and to generate the next block.
  • A blockchain system can include an underlying blockchain platform, a platform product service layer, and an application service layer.
  • Step S30: acquiring an initial matching point from the initial matching point set as the current matching point, and determining, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image.
  • The main problem with conventional feature matching methods is that methods with good matching quality are slow, while fast methods often produce unstable results.
  • To address this, the GMS (Grid-based Motion Statistics) method proposes a simple statistics-based solution: by exploiting the strong constraint of neighborhood consistency, it can quickly distinguish correct matches from incorrect ones, improving matching stability.
  • The core idea of the method is that the smoothness of motion causes the neighborhood of a matched feature point to contain many other matching points, and the feature points in the neighborhoods around a correct match also correspond one to one.
  • Accordingly, the points in the to-be-verified related neighborhood of an initial matching point in the image to be verified and the points in its standard related neighborhood in the standard image should also correspond one to one, or mostly one to one. An initial matching point is therefore taken from the initial matching point set in turn as the current matching point (so that every initial matching point in the set is processed as the current matching point), and, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image are determined. In a specific embodiment, whether a matching point is a mismatched point can be judged by counting the number of matching points in its neighborhood.
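  • A minimal sketch of this GMS filtering step, assuming opencv-contrib-python (which provides matchGMS in cv2.xfeatures2d); the image variables and parameter values are illustrative:

```python
# Minimal sketch: keep only the brute-force matches that pass the GMS
# neighbourhood-consistency test.
import cv2

def gms_filter(img_to_verify, img_standard, kp_to_verify, kp_standard, matches):
    """Return the matches whose neighbourhoods contain enough supporting matches."""
    size_ver = (img_to_verify.shape[1], img_to_verify.shape[0])   # (width, height)
    size_std = (img_standard.shape[1], img_standard.shape[0])
    return cv2.xfeatures2d.matchGMS(size_ver, size_std,
                                    kp_to_verify, kp_standard, matches,
                                    withRotation=False, withScale=False,
                                    thresholdFactor=6)
```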
  • Step S40: determining, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and removing from the initial matching point set the current matching point carrying a mismatched-point identifier.
  • In this embodiment, each point in the to-be-verified related neighborhood point set is matched against each point in the standard related neighborhood point set, and the matching result determines whether the current matching point is a mismatched point. If the to-be-verified related neighborhood point set is judged not to match the standard related neighborhood point set, the points adjacent to the initial matching point do not match, which means the initial matching point is only an isolated single-point correspondence; the initial matching point is then marked as a mismatched point, and the initial matching point carrying the mismatched-point identifier is removed.
  • This embodiment thus provides a method for removing mismatched palmprint points: a palm image to be verified is acquired, and the to-be-verified image feature points corresponding to the palm image to be verified are extracted based on the ORB rapid feature point extraction and description algorithm; based on a brute-force matching algorithm, the standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image are determined, and an initial matching point set is generated from the initial matching points; an initial matching point is acquired from the initial matching point set as the current matching point, and, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image are determined; according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, it is determined whether the current matching point is a mismatched point, and the current matching point carrying the mismatched-point identifier is removed from the initial matching point set.
  • In this way, this application extracts palmprint image features based on the ORB algorithm, obtains the matching points between the standard palm image and the palm image to be verified based on the brute-force matching algorithm, and then uses GMS (Grid-based Motion Statistics) to eliminate matching points that do not satisfy neighborhood consistency. Palmprint recognition can then be performed on the final matching result, which improves the accuracy of palmprint recognition and solves the existing technical problem of low palmprint recognition accuracy.
  • FIG. 3 is a schematic flowchart of a second embodiment of a method for removing mismatched points of palmprints according to the present application.
  • Based on the embodiment shown in FIG. 2, in this embodiment, step S20 specifically includes:
  • Step S21: based on the brute-force matching algorithm, acquiring a standard image feature point in the standard palm image as a first standard image feature point, and extracting a to-be-verified image feature point in the palm image to be verified as a first to-be-verified image feature point;
  • Step S22: judging whether the first standard image feature point is a first standard matching point, wherein the first standard matching point is the matching point in the standard palm image corresponding to the first to-be-verified image feature point;
  • wherein step S22 specifically includes:
  • calculating a first standard feature value corresponding to the first standard image feature point, and calculating a first to-be-verified feature value corresponding to the first to-be-verified image feature point; calculating, from the first standard feature value and the first to-be-verified feature value, the similarity between the first standard image feature point and the first to-be-verified image feature point; and judging, from that similarity, whether the first standard image feature point is the first standard matching point;
  • Step S23: when the first standard image feature point is the first standard matching point, judging whether the first to-be-verified image feature point is a first to-be-verified matching point, wherein the first to-be-verified matching point is the matching point in the palm image to be verified corresponding to the first standard image feature point;
  • Step S24: when the first to-be-verified image feature point is the first to-be-verified matching point, marking the first standard image feature point and the first to-be-verified image feature point as a pair of initial matching points, and generating an initial matching point set from the initial matching points.
  • Further, step S20 also includes:
  • when the first to-be-verified image feature point is not the first to-be-verified matching point, marking the first standard image feature point and the first to-be-verified image feature point as mismatched points.
  • In this embodiment, based on the brute-force matching algorithm, the first standard image feature point corresponding to the standard palm image is first acquired, and the first to-be-verified image feature point corresponding to the palm image to be verified is extracted; based on the first standard image feature point and the first to-be-verified image feature point, the first standard matching point corresponding to the first to-be-verified image feature point is determined among the standard image feature points. For example, for a palm image A and a palm image B, the ORB algorithm is used to extract the image features of each, and the brute-force matching algorithm is then used to obtain the matching points between the two images.
  • The principle of brute-force matching is that point i in image A will always find the point j in image B that best matches it, forming a pair of matching points.
  • For point j in image B, however, point i in image A is not necessarily its optimal match, in which case (i, j) is considered a pair of mismatched points. This application therefore adopts cross-validation to eliminate this type of mismatched point.
  • The idea of cross-validation is: if the best matching point of point i of image A in image B is j, and at the same time the best matching point of point j of image B in image A is i, then the point pair (i, j) is considered a pair of best matching points.
  • ORB extracts image features in which each feature point has a 256-dimensional feature value (a 256-bit descriptor). Computing the similarity between point i in image A and point j in image B means computing the similarity (or distance) between the corresponding 256 feature values of the two feature points i and j.
  • Specifically, determining the matching point of point i of image A in image B includes the following steps:
1) obtain images A and B, the feature points of the two images, and the feature values corresponding to the feature points;
2) traverse every feature point in image A, taking feature point i of image A and its feature values;
3) for point i of image A, traverse every feature point j in image B and compute the similarity (or distance) between the corresponding feature values of feature points i and j, obtaining a series of results;
4) sort the values obtained in the previous step and take the feature point j with the largest value as the matching point of point i of image A. The steps for determining the matching point of point j of image B in image A are the same.
  • The steps for determining best matching points and mismatched points are: obtain point i of image A and its matching point j in image B; for point j of image B obtained in step 1), obtain its matching point m in image A; and judge whether i and m are the same feature point (ORB extraction yields a fixed sequence of feature points, so it suffices to check whether i and m have the same index in that sequence). If i and m are the same point, the point in image A and the point in image B are best matching points; otherwise they are regarded as mismatched points.
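  • A minimal from-scratch sketch of this cross-validation (mutual best match) check, assuming NumPy and 32-byte ORB descriptors as above (all names are illustrative, not the patent's implementation):

```python
# Minimal sketch: point i of image A and point j of image B are kept only if
# each is the other's closest descriptor in Hamming distance.
import numpy as np

def hamming_distances(desc_a, desc_b):
    """Pairwise Hamming distances between two sets of 32-byte ORB descriptors."""
    bits_a = np.unpackbits(desc_a, axis=1).astype(np.int32)   # N x 256
    bits_b = np.unpackbits(desc_b, axis=1).astype(np.int32)   # M x 256
    return (bits_a[:, None, :] != bits_b[None, :, :]).sum(axis=2)   # N x M

def cross_validated_pairs(desc_a, desc_b):
    """Return index pairs (i, j) whose best matches agree in both directions."""
    dist = hamming_distances(desc_a, desc_b)
    best_in_b = dist.argmin(axis=1)   # best match in B for every point of A
    best_in_a = dist.argmin(axis=0)   # best match in A for every point of B
    return [(i, j) for i, j in enumerate(best_in_b) if best_in_a[j] == i]
```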
  • In this way, this application extracts palmprint image features based on the ORB algorithm, obtains the matching points between the standard palm image and the palm image to be verified based on the brute-force matching algorithm, uses cross-validation to eliminate mismatched points that are not optimal matches, and finally uses GMS (Grid-based Motion Statistics) to eliminate matching points that do not satisfy neighborhood consistency. Palmprint recognition can then be performed on the final matching result, which improves the accuracy of palmprint recognition and solves the technical problem of low recognition accuracy in existing palmprint recognition methods.
  • FIG. 4 is a schematic flowchart of a third embodiment of a method for removing mismatched points of palmprints according to the present application.
  • Based on the embodiment shown in FIG. 3, in this embodiment, step S40 specifically includes:
  • Step S41: when the to-be-verified related neighborhood point set does not match the standard related neighborhood point set, marking the current matching point as a mismatched point, and removing from the initial matching point set the current matching point carrying the mismatched-point identifier.
  • Before step S41, the method further includes: acquiring, from the to-be-verified related neighborhood point set, the to-be-verified related neighborhood points that match the standard related neighborhood point set as target points to generate a target point set; judging whether the number of target points in the target point set is greater than a first threshold; and, if the number of target points is not greater than the first threshold, determining that the to-be-verified related neighborhood point set does not match the standard related neighborhood point set.
  • Step S40 further specifically includes: when the to-be-verified related neighborhood point set matches the standard related neighborhood point set, determining that the palm to be verified is the same as the standard palm.
  • In this embodiment, the main problem with conventional feature matching methods is again that methods with good matching quality are slow, while fast methods often produce unstable results.
  • To address this, the GMS (Grid-based Motion Statistics) method proposes a simple statistics-based solution: by exploiting the strong constraint of neighborhood consistency, it can quickly distinguish correct matches from incorrect ones, improving matching stability.
  • The core idea of the method is that the smoothness of motion causes the neighborhood of a matched feature point to contain many other matching points, and the feature points in the neighborhoods around a correct match also correspond one to one. Whether the to-be-verified related neighborhood point set matches the standard related neighborhood point set can therefore be judged by counting the number of matching points in the neighborhood.
  • In addition, an embodiment of the present application also provides a device for removing mismatched palmprint points.
  • FIG. 5 is a schematic diagram of the functional modules of a first embodiment of the device for removing mismatched palmprint points of the present application.
  • In this embodiment, the device for removing mismatched palmprint points includes:
  • a feature point extraction module 10, configured to acquire a palm image to be verified and to extract, based on the ORB rapid feature point extraction and description algorithm, the to-be-verified image feature points corresponding to the palm image to be verified;
  • a matching point determination module 20, configured to determine, based on a brute-force matching algorithm, standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and to generate an initial matching point set from the initial matching points;
  • a matching point judgment module 30, configured to acquire an initial matching point from the initial matching point set as the current matching point, and to determine, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image;
  • a mismatched point elimination module 40, configured to determine, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and to remove from the initial matching point set the current matching point carrying the mismatched-point identifier.
  • Further, the matching point determination module 20 specifically includes:
  • an image feature point extraction unit, configured to acquire, based on the brute-force matching algorithm, a standard image feature point in the standard palm image as a first standard image feature point, and to extract a to-be-verified image feature point in the palm image to be verified as a first to-be-verified image feature point;
  • a first matching point judgment unit, configured to judge whether the first standard image feature point is a first standard matching point, wherein the first standard matching point is the matching point in the standard palm image corresponding to the first to-be-verified image feature point;
  • a second matching point judgment unit, configured to judge, when the first standard image feature point is the first standard matching point, whether the first to-be-verified image feature point is a first to-be-verified matching point, wherein the first to-be-verified matching point is the matching point in the palm image to be verified corresponding to the first standard image feature point;
  • a matching point marking unit, configured to mark, when the first to-be-verified image feature point is the first to-be-verified matching point, the first standard image feature point and the first to-be-verified image feature point as a pair of initial matching points, and to generate an initial matching point set from the initial matching points.
  • Further, the matching point determination module 20 also includes:
  • a first matching point elimination unit, configured to mark the first standard image feature point and the first to-be-verified image feature point as mismatched points when the first to-be-verified image feature point is not the first to-be-verified matching point.
  • Further, the first matching point judgment unit is also configured to: calculate a first standard feature value corresponding to the first standard image feature point and a first to-be-verified feature value corresponding to the first to-be-verified image feature point; calculate, from the first standard feature value and the first to-be-verified feature value, the similarity between the first standard image feature point and the first to-be-verified image feature point; and judge, from that similarity, whether the first standard image feature point is the first standard matching point.
  • Further, the mismatched point elimination module 40 specifically includes:
  • a second matching point elimination unit, configured to mark the current matching point as a mismatched point when the to-be-verified related neighborhood point set does not match the standard related neighborhood point set, and to remove from the initial matching point set the current matching point carrying the mismatched-point identifier.
  • Further, the mismatched point elimination module 40 also includes:
  • a target point acquisition unit, configured to acquire, from the to-be-verified related neighborhood point set, a to-be-verified related neighborhood point that matches the standard related neighborhood point set as a target point, and to generate a target point set;
  • a target point judgment unit, configured to judge whether the number of target points in the target point set is greater than a first threshold;
  • a result non-matching unit, configured to determine that the to-be-verified related neighborhood point set does not match the standard related neighborhood point set if the number of target points is not greater than the first threshold;
  • a result matching unit, configured to determine that the palm to be verified is the same as the standard palm when the to-be-verified related neighborhood point set matches the standard related neighborhood point set.
  • each module in the device for removing mismatched points of palmprint corresponds to each step in the embodiment of the method for removing mismatched points of palmprint, and its functions and implementation processes are not repeated here.
  • the embodiments of the present application also provide a computer-readable storage medium.
  • the computer-readable storage medium may be volatile or non-volatile.
  • the computer-readable storage medium of the present application stores a palmprint mismatch point removal program, wherein when the palmprint mismatch point removal program is executed by a processor, the steps of the palmprint mismatch point removal method as described above are implemented .
  • the method implemented when the palmprint mismatch point removal program is executed can refer to the various embodiments of the palmprint mismatch point removal method of the present application, which will not be repeated here.
  • The technical solution of this application, in essence or in the part that contributes to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium as described above (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions that cause a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Collating Specific Patterns (AREA)
  • Image Analysis (AREA)

Abstract

A method, apparatus, device, and storage medium for eliminating mismatched palmprint points, relating to the field of artificial intelligence. To-be-verified image feature points corresponding to a palm image to be verified are extracted on the basis of the ORB algorithm; initial matching points corresponding to the two palm images are determined on the basis of a brute-force matching algorithm, standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points; on the basis of the GMS feature matching algorithm, it is determined whether the related neighborhood points of an initial matching point in the two palm images match; if they do not match, the initial matching point is marked as a mismatched point and removed. The standard palm image may be stored in a blockchain. The method improves the accuracy of palmprint recognition.

Description

Method, apparatus, device, and storage medium for eliminating mismatched palmprint points
This application claims priority to the Chinese patent application filed with the Chinese Patent Office on April 24, 2020, with application number CN202010331794.5 and entitled "Method, apparatus, device, and storage medium for eliminating mismatched palmprint points", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of artificial intelligence technology, and in particular to a method, apparatus, device, and computer-readable storage medium for eliminating mismatched palmprint points.
Background
The palmprint is a relatively stable biometric feature that can be used to identify a person effectively; for example, palmprint recognition can be applied to various scenarios that require identity recognition, such as unmanned supermarkets and workplace attendance.
Technical Problem
The inventor realized that existing palmprint-based person recognition produces many mismatched points, resulting in low recognition accuracy. How to improve the accuracy of existing palmprint recognition has therefore become an urgent technical problem.
Technical Solution
This application provides a method for removing mismatched palmprint points, which includes the following steps:
acquiring a palm image to be verified, and extracting, based on the ORB rapid feature point extraction and description algorithm, the to-be-verified image feature points corresponding to the palm image to be verified;
determining, based on a brute-force matching algorithm, standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and generating an initial matching point set from the initial matching points;
acquiring an initial matching point from the initial matching point set as the current matching point, and determining, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image;
determining, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and removing from the initial matching point set the current matching point carrying a mismatched-point identifier.
This application also provides an apparatus for removing mismatched palmprint points, which includes:
a feature point extraction module, configured to acquire a palm image to be verified and to extract, based on the ORB rapid feature point extraction and description algorithm, the to-be-verified image feature points corresponding to the palm image to be verified;
a matching point determination module, configured to determine, based on a brute-force matching algorithm, standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and to generate an initial matching point set from the initial matching points;
a matching point judgment module, configured to acquire an initial matching point from the initial matching point set as the current matching point, and to determine, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image;
a mismatched point elimination module, configured to determine, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and to remove from the initial matching point set the current matching point carrying a mismatched-point identifier.
This application also provides a device that includes a processor, a memory, and a program stored on the memory and executable by the processor, wherein the program, when executed by the processor, implements the following steps:
acquiring a palm image to be verified, and extracting, based on the ORB rapid feature point extraction and description algorithm, the to-be-verified image feature points corresponding to the palm image to be verified;
determining, based on a brute-force matching algorithm, standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and generating an initial matching point set from the initial matching points;
acquiring an initial matching point from the initial matching point set as the current matching point, and determining, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image;
determining, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and removing from the initial matching point set the current matching point carrying a mismatched-point identifier.
This application also provides a computer-readable storage medium on which a program is stored, wherein the program, when executed by a processor, implements the following steps:
acquiring a palm image to be verified, and extracting, based on the ORB rapid feature point extraction and description algorithm, the to-be-verified image feature points corresponding to the palm image to be verified;
determining, based on a brute-force matching algorithm, standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and generating an initial matching point set from the initial matching points;
acquiring an initial matching point from the initial matching point set as the current matching point, and determining, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image;
determining, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and removing from the initial matching point set the current matching point carrying a mismatched-point identifier.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of the hardware structure of the device involved in the solutions of the embodiments of this application;
FIG. 2 is a schematic flowchart of a first embodiment of the method of this application;
FIG. 3 is a schematic flowchart of a second embodiment of the method of this application;
FIG. 4 is a schematic flowchart of a third embodiment of the method of this application;
FIG. 5 is a schematic diagram of the functional modules of a first embodiment of the apparatus of this application.
The realization of the objectives, functional features, and advantages of this application will be further described with reference to the embodiments and the accompanying drawings.
Embodiments of the Invention
It should be understood that the specific embodiments described here are only used to explain this application and are not intended to limit it.
The method for removing mismatched palmprint points in the embodiments of this application is mainly applied to a palmprint mismatched point removal device, which may be a PC, a portable computer, a mobile terminal, or another device with display and processing functions.
Referring to FIG. 1, FIG. 1 is a schematic diagram of the hardware structure of the palmprint mismatched point removal device involved in the solutions of the embodiments of this application. In the embodiments of this application, the palmprint mismatched point removal device may include a processor 1001 (for example, a CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used to realize connection and communication between these components; the user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard); the network interface 1004 may optionally include a standard wired interface and a wireless interface (such as a WI-FI interface); the memory 1005 may be a high-speed RAM memory or a stable non-volatile memory, such as a disk memory, and may optionally also be a storage device independent of the aforementioned processor 1001.
Those skilled in the art will understand that the hardware structure shown in FIG. 1 does not constitute a limitation on the palmprint mismatched point removal device, which may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
Continuing to refer to FIG. 1, the memory 1005 in FIG. 1, as a computer-readable storage medium, may include an operating system, a network communication module, and a palmprint mismatched point removal program.
In FIG. 1, the network communication module is mainly used to connect to a server and exchange data with it, while the processor 1001 can call the palmprint mismatched point removal program stored in the memory 1005 and execute the method for removing mismatched palmprint points provided in the embodiments of this application.
An embodiment of this application provides a method for removing mismatched palmprint points.
Referring to FIG. 2, FIG. 2 is a schematic flowchart of a first embodiment of the method for removing mismatched palmprint points of this application.
In this embodiment, the method for removing mismatched palmprint points includes the following steps:
Step S10: acquiring a palm image to be verified, and extracting, based on the ORB rapid feature point extraction and description algorithm, the to-be-verified image feature points corresponding to the palm image to be verified.
The palmprint is a relatively stable biometric feature that can be used to identify a person effectively; for example, palmprint recognition can be applied to various scenarios that require identity recognition, such as unmanned supermarkets and workplace attendance. However, existing palmprint-based person recognition produces many mismatched points, resulting in low recognition accuracy. To solve this problem, this application extracts palmprint image features based on the ORB algorithm, obtains the matching points between the standard palm image and the palm image to be verified based on a brute-force matching algorithm, and then uses cross-validation to eliminate mismatched points that are not optimal matches; palmprint recognition can then be performed on the final matching result, which improves the accuracy of palmprint recognition. Specifically, a palmprint image has relatively rich features, so during palmprint recognition the features of two palmprint images can be extracted to determine whether they come from the same palm. Commonly used feature extraction methods include SIFT, SURF, and ORB. ORB (Oriented FAST and Rotated BRIEF) is a rapid feature point extraction and description algorithm that combines the FAST keypoint detection method with the BRIEF feature description algorithm and improves and optimizes both; the ORB algorithm is used to extract the to-be-verified image feature points corresponding to the palm image to be verified.
Step S20: determining, based on a brute-force matching algorithm, the standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and generating an initial matching point set from the initial matching points.
In this embodiment, ORB is used to extract the standard image feature points corresponding to the preset standard palm image and the to-be-verified image feature points corresponding to the palm image to be verified, and the brute-force matching algorithm is then used to obtain the matching points between the two images. The brute-force matching principle is that point i in image A will always find the point j in image B that best matches it, thereby forming a pair of matching points. For point j in image B, however, point i in image A is not necessarily its optimal match, in which case (i, j) is considered a pair of mismatched points. The initial matching points therefore need further matching verification. The initial matching points are added to the initial matching point set, which contains one or more matching points.
As one implementation, to ensure the privacy and security of the standard palm image, the standard palm image can be stored in a blockchain. The blockchain referred to in this application is a new application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, and encryption algorithms. A blockchain is essentially a decentralized database: a chain of data blocks generated in association with one another using cryptographic methods, where each data block contains a batch of network transaction information used to verify the validity of that information (anti-counterfeiting) and to generate the next block. A blockchain system can include an underlying blockchain platform, a platform product service layer, and an application service layer.
Step S30: acquiring an initial matching point from the initial matching point set as the current matching point, and determining, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image.
In this embodiment, the main problem with conventional feature matching methods is that methods with good matching quality are slow, while fast methods often produce unstable results. To address this, the GMS (Grid-based Motion Statistics) method proposes a simple statistics-based solution: by exploiting the strong constraint of neighborhood consistency, it can quickly distinguish correct matches from incorrect ones, improving matching stability. The core idea of the method is that the smoothness of motion causes the neighborhood of a matched feature point to contain many other matching points, and the feature points in the neighborhoods around a correct match also correspond one to one. Accordingly, the points in the to-be-verified related neighborhood of an initial matching point in the image to be verified and the points in its standard related neighborhood in the standard image should also correspond one to one, or mostly one to one. An initial matching point is therefore taken from the initial matching point set in turn as the current matching point (so that every initial matching point in the set is processed as the current matching point), and, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image are determined. In a specific embodiment, whether a matching point is a mismatched point can be judged by counting the number of matching points in its neighborhood.
Step S40: determining, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and removing from the initial matching point set the current matching point carrying a mismatched-point identifier.
In this embodiment, each point in the to-be-verified related neighborhood point set is matched against each point in the standard related neighborhood point set, and the matching result determines whether the current matching point is a mismatched point. If the to-be-verified related neighborhood point set is judged not to match the standard related neighborhood point set, the points adjacent to the initial matching point do not match, which means the initial matching point is only an isolated single-point correspondence; the initial matching point is then marked as a mismatched point, and the initial matching point carrying the mismatched-point identifier is removed.
This embodiment provides a method for removing mismatched palmprint points: a palm image to be verified is acquired, and the to-be-verified image feature points corresponding to the palm image to be verified are extracted based on the ORB rapid feature point extraction and description algorithm; based on a brute-force matching algorithm, the standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image are determined, and an initial matching point set is generated from the initial matching points; an initial matching point is acquired from the initial matching point set as the current matching point, and, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image are determined; according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, it is determined whether the current matching point is a mismatched point, and the current matching point carrying the mismatched-point identifier is removed from the initial matching point set. In this way, this application extracts palmprint image features based on the ORB algorithm, obtains the matching points between the standard palm image and the palm image to be verified based on the brute-force matching algorithm, and then uses GMS (Grid-based Motion Statistics) to eliminate matching points that do not satisfy neighborhood consistency; palmprint recognition can then be performed on the final matching result, which improves the accuracy of palmprint recognition and solves the existing technical problem of low palmprint recognition accuracy.
Referring to FIG. 3, FIG. 3 is a schematic flowchart of a second embodiment of the method for removing mismatched palmprint points of this application.
Based on the embodiment shown in FIG. 2, in this embodiment, step S20 specifically includes:
Step S21: based on the brute-force matching algorithm, acquiring a standard image feature point in the standard palm image as a first standard image feature point, and extracting a to-be-verified image feature point in the palm image to be verified as a first to-be-verified image feature point;
Step S22: judging whether the first standard image feature point is a first standard matching point, wherein the first standard matching point is the matching point in the standard palm image corresponding to the first to-be-verified image feature point;
wherein step S22 specifically includes:
calculating a first standard feature value corresponding to the first standard image feature point, and calculating a first to-be-verified feature value corresponding to the first to-be-verified image feature point;
calculating, from the first standard feature value and the first to-be-verified feature value, the similarity between the first standard image feature point and the first to-be-verified image feature point;
judging, from the similarity between the first standard image feature point and the first to-be-verified image feature point, whether the first standard image feature point is the first standard matching point;
Step S23: when the first standard image feature point is the first standard matching point, judging whether the first to-be-verified image feature point is a first to-be-verified matching point, wherein the first to-be-verified matching point is the matching point in the palm image to be verified corresponding to the first standard image feature point;
Step S24: when the first to-be-verified image feature point is the first to-be-verified matching point, marking the first standard image feature point and the first to-be-verified image feature point as a pair of initial matching points, and generating an initial matching point set from the initial matching points.
Further, step S20 also includes:
when the first to-be-verified image feature point is not the first to-be-verified matching point, marking the first standard image feature point and the first to-be-verified image feature point as mismatched points.
In this embodiment, based on the brute-force matching algorithm, the first standard image feature point corresponding to the standard palm image is first acquired, and the first to-be-verified image feature point corresponding to the palm image to be verified is extracted; based on the first standard image feature point and the first to-be-verified image feature point, the first standard matching point corresponding to the first to-be-verified image feature point is determined among the standard image feature points. For example, for a palm image A and a palm image B, the ORB algorithm is used to extract the image features of each, and the brute-force matching algorithm is then used to obtain the matching points between the two images. The principle of brute-force matching is that point i in image A will always find the point j in image B that best matches it, forming a pair of matching points. For point j in image B, however, point i in image A is not necessarily its optimal match, in which case (i, j) is considered a pair of mismatched points. Cross-validation is therefore adopted here to eliminate this type of mismatched point. The idea of cross-validation is: if the best matching point of point i of image A in image B is j, and at the same time the best matching point of point j of image B in image A is i, then the point pair (i, j) is considered a pair of best matching points.
ORB extracts image features in which each feature point has a 256-dimensional feature value; computing the similarity between point i in image A and point j in image B means computing the similarity (or distance) between the corresponding 256 feature values of the two feature points i and j.
Specifically, determining the matching point of point i of image A in image B includes the following steps:
1) obtain images A and B, the feature points of the two images, and the feature values corresponding to the feature points;
2) traverse every feature point in image A, taking feature point i of image A and its feature values;
3) for point i of image A, traverse every feature point j in image B and compute the similarity (or distance) between the corresponding feature values of feature points i and j, obtaining a series of results;
4) sort the values obtained in the previous step and take the feature point j with the largest value as the matching point of point i of image A.
The steps for determining the matching point of point j of image B in image A are the same as the steps above.
The steps for determining best matching points and mismatched points are:
obtain point i of image A and its matching point j in image B;
for point j of image B obtained in step 1), obtain its matching point m in image A;
judge whether i and m are the same feature point (judging method: ORB extraction yields a fixed sequence of feature points, so it suffices to check whether i and m have the same index in that sequence). If i and m are the same point, the point in image A and the point in image B are best matching points; otherwise they are regarded as mismatched points.
In this embodiment, this application extracts palmprint image features based on the ORB algorithm, obtains the matching points between the standard palm image and the palm image to be verified based on the brute-force matching algorithm, uses cross-validation to eliminate mismatched points that are not optimal matches, and finally uses GMS (Grid-based Motion Statistics) to eliminate matching points that do not satisfy neighborhood consistency; palmprint recognition can then be performed on the final matching result, which improves the accuracy of palmprint recognition and solves the technical problem of low recognition accuracy in existing palmprint-recognition-based methods.
Referring to FIG. 4, FIG. 4 is a schematic flowchart of a third embodiment of the method for removing mismatched palmprint points of this application.
Based on the embodiment shown in FIG. 3, in this embodiment, step S40 specifically includes:
Step S41: when the to-be-verified related neighborhood point set does not match the standard related neighborhood point set, marking the current matching point as a mismatched point, and removing from the initial matching point set the current matching point carrying the mismatched-point identifier.
Before step S41, the method further includes:
acquiring, from the to-be-verified related neighborhood point set, a to-be-verified related neighborhood point that matches the standard related neighborhood point set as a target point, and generating a target point set;
judging whether the number of target points in the target point set is greater than a first threshold;
if the number of target points is not greater than the first threshold, determining that the to-be-verified related neighborhood point set does not match the standard related neighborhood point set.
Step S40 further specifically includes:
when the to-be-verified related neighborhood point set matches the standard related neighborhood point set, determining that the palm to be verified is the same as the standard palm.
In this embodiment, the main problem with conventional feature matching methods is that methods with good matching quality are slow, while fast methods often produce unstable results. To address this, the GMS (Grid-based Motion Statistics) method proposes a simple statistics-based solution: by exploiting the strong constraint of neighborhood consistency, it can quickly distinguish correct matches from incorrect ones, improving matching stability. The core idea of the method is that the smoothness of motion causes the neighborhood of a matched feature point to contain many other matching points, and the feature points in the neighborhoods around a correct match also correspond one to one. Whether the to-be-verified related neighborhood point set matches the standard related neighborhood point set can therefore be judged by counting the number of matching points in the neighborhood.
The specific steps for judging whether the feature neighborhood of each matching point contains enough matching points are as follows:
1) divide image A and image B into 20*20 grids, and number each cell of the grid from 0 to 400;
2) for a matching point pair (i, j) in images A and B, find the corresponding grid cells m and n in images A and B respectively, count whether there are matching points within the 8-neighborhood around m and n, and compute the number of matching points;
3) if the number of matching points in the 8-neighborhoods around the corresponding grid cells m and n of images A and B is greater than a given threshold, the matching point pair (i, j) in images A and B is considered a best matching pair; otherwise the matching point pair (i, j) is removed. For example, if there are other adjacent matching points around the matching points of A and B and the number of adjacent matching points is greater than a given threshold of 4, A and B are considered best matching points; if a matching pair C and D has no other matching points around it, it is considered a mismatched pair and the C and D matching points are deleted.
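The following is a minimal from-scratch sketch of this grid-based neighborhood counting, assuming OpenCV keypoints and DMatch objects as in the earlier sketches (the grid size, threshold, and all names are illustrative; production GMS implementations are more elaborate):

```python
# Minimal sketch: divide each image into a 20*20 grid and keep a match only if
# the 3x3 cell neighbourhoods around its two grid cells contain more than
# `threshold` other matches.
def grid_cell(pt, img_shape, grid=20):
    """Map an (x, y) point to its cell index in a grid*grid partition."""
    h, w = img_shape[:2]
    col = min(int(pt[0] * grid / w), grid - 1)
    row = min(int(pt[1] * grid / h), grid - 1)
    return row * grid + col

def neighbourhood_filter(matches, kp_a, kp_b, shape_a, shape_b, grid=20, threshold=4):
    """Keep matches whose surrounding grid cells contain more than `threshold` other matches."""
    cells = [(grid_cell(kp_a[m.queryIdx].pt, shape_a, grid),
              grid_cell(kp_b[m.trainIdx].pt, shape_b, grid)) for m in matches]

    def neighbours(cell):
        # The cell itself plus its 8-neighbourhood, clipped at the image border.
        r, c = divmod(cell, grid)
        return {nr * grid + nc
                for nr in range(max(r - 1, 0), min(r + 2, grid))
                for nc in range(max(c - 1, 0), min(c + 2, grid))}

    kept = []
    for idx, (ca, cb) in enumerate(cells):
        na, nb = neighbours(ca), neighbours(cb)
        support = sum(1 for k, (oa, ob) in enumerate(cells)
                      if k != idx and oa in na and ob in nb)
        if support > threshold:
            kept.append(matches[idx])
    return kept
```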
In addition, an embodiment of this application also provides an apparatus for removing mismatched palmprint points.
Referring to FIG. 5, FIG. 5 is a schematic diagram of the functional modules of a first embodiment of the apparatus for removing mismatched palmprint points of this application.
In this embodiment, the apparatus for removing mismatched palmprint points includes:
a feature point extraction module 10, configured to acquire a palm image to be verified and to extract, based on the ORB rapid feature point extraction and description algorithm, the to-be-verified image feature points corresponding to the palm image to be verified;
a matching point determination module 20, configured to determine, based on a brute-force matching algorithm, standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and to generate an initial matching point set from the initial matching points;
a matching point judgment module 30, configured to acquire an initial matching point from the initial matching point set as the current matching point, and to determine, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image;
a mismatched point elimination module 40, configured to determine, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and to remove from the initial matching point set the current matching point carrying the mismatched-point identifier.
Further, the matching point determination module 20 specifically includes:
an image feature point extraction unit, configured to acquire, based on the brute-force matching algorithm, a standard image feature point in the standard palm image as a first standard image feature point, and to extract a to-be-verified image feature point in the palm image to be verified as a first to-be-verified image feature point;
a first matching point judgment unit, configured to judge whether the first standard image feature point is a first standard matching point, wherein the first standard matching point is the matching point in the standard palm image corresponding to the first to-be-verified image feature point;
a second matching point judgment unit, configured to judge, when the first standard image feature point is the first standard matching point, whether the first to-be-verified image feature point is a first to-be-verified matching point, wherein the first to-be-verified matching point is the matching point in the palm image to be verified corresponding to the first standard image feature point;
a matching point marking unit, configured to mark, when the first to-be-verified image feature point is the first to-be-verified matching point, the first standard image feature point and the first to-be-verified image feature point as a pair of initial matching points, and to generate an initial matching point set from the initial matching points.
Further, the matching point determination module 20 also includes:
a first matching point elimination unit, configured to mark the first standard image feature point and the first to-be-verified image feature point as mismatched points when the first to-be-verified image feature point is not the first to-be-verified matching point.
Further, the first matching point judgment unit is also configured to:
calculate a first standard feature value corresponding to the first standard image feature point, and calculate a first to-be-verified feature value corresponding to the first to-be-verified image feature point;
calculate, from the first standard feature value and the first to-be-verified feature value, the similarity between the first standard image feature point and the first to-be-verified image feature point;
judge, from the similarity between the first standard image feature point and the first to-be-verified image feature point, whether the first standard image feature point is the first standard matching point.
Further, the mismatched point elimination module 40 specifically includes:
a second matching point elimination unit, configured to mark the current matching point as a mismatched point when the to-be-verified related neighborhood point set does not match the standard related neighborhood point set, and to remove from the initial matching point set the current matching point carrying the mismatched-point identifier.
Further, the mismatched point elimination module 40 also includes:
a target point acquisition unit, configured to acquire, from the to-be-verified related neighborhood point set, a to-be-verified related neighborhood point that matches the standard related neighborhood point set as a target point, and to generate a target point set;
a target point judgment unit, configured to judge whether the number of target points in the target point set is greater than a first threshold;
a result non-matching unit, configured to determine that the to-be-verified related neighborhood point set does not match the standard related neighborhood point set if the number of target points is not greater than the first threshold;
a result matching unit, configured to determine that the palm to be verified is the same as the standard palm when the to-be-verified related neighborhood point set matches the standard related neighborhood point set.
Each module in the above apparatus for removing mismatched palmprint points corresponds to a step in the embodiments of the method for removing mismatched palmprint points described above, and their functions and implementation processes are not repeated here.
In addition, an embodiment of this application also provides a computer-readable storage medium, which may be volatile or non-volatile.
The computer-readable storage medium of this application stores a palmprint mismatched point removal program, and when the palmprint mismatched point removal program is executed by a processor, the steps of the method for removing mismatched palmprint points described above are implemented.
For the method implemented when the palmprint mismatched point removal program is executed, reference may be made to the embodiments of the method for removing mismatched palmprint points of this application, which are not repeated here.
It should be noted that, in this document, the terms "comprise", "include", or any of their variants are intended to cover a non-exclusive inclusion, so that a process, method, article, or system that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or system. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or system that includes that element.
The above serial numbers of the embodiments of this application are for description only and do not represent the superiority or inferiority of the embodiments.
Through the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of this application, in essence or in the part that contributes to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium as described above (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions that cause a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of this application.
The above are only preferred embodiments of this application and do not limit the scope of its patent protection. Any equivalent structural or process transformation made using the contents of the specification and drawings of this application, or any direct or indirect application in other related technical fields, is likewise included within the scope of patent protection of this application.

Claims (20)

  1. A method for removing mismatched palmprint points, the method comprising the following steps:
    acquiring a palm image to be verified, and extracting, based on the ORB rapid feature point extraction and description algorithm, the to-be-verified image feature points corresponding to the palm image to be verified;
    determining, based on a brute-force matching algorithm, standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and generating an initial matching point set from the initial matching points;
    acquiring an initial matching point from the initial matching point set as the current matching point, and determining, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image;
    determining, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and removing from the initial matching point set the current matching point carrying a mismatched-point identifier.
  2. The method for removing mismatched palmprint points according to claim 1, wherein the step of determining, based on the brute-force matching algorithm, the standard image feature points corresponding to the preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and generating the initial matching point set from the initial matching points specifically comprises:
    acquiring, based on the brute-force matching algorithm, a standard image feature point in the standard palm image as a first standard image feature point, and extracting a to-be-verified image feature point in the palm image to be verified as a first to-be-verified image feature point;
    judging whether the first standard image feature point is a first standard matching point, wherein the first standard matching point is the matching point in the standard palm image corresponding to the first to-be-verified image feature point;
    when the first standard image feature point is the first standard matching point, judging whether the first to-be-verified image feature point is a first to-be-verified matching point, wherein the first to-be-verified matching point is the matching point in the palm image to be verified corresponding to the first standard image feature point;
    when the first to-be-verified image feature point is the first to-be-verified matching point, marking the first standard image feature point and the first to-be-verified image feature point as a pair of initial matching points, and generating an initial matching point set from the initial matching points.
  3. The method for removing mismatched palmprint points according to claim 2, wherein, after the step of marking the first standard image feature point and the first to-be-verified image feature point as a pair of initial matching points when the first to-be-verified image feature point is the first to-be-verified matching point, and generating an initial matching point set from the initial matching points, the method further comprises:
    when the first to-be-verified image feature point is not the first to-be-verified matching point, marking the first standard image feature point and the first to-be-verified image feature point as mismatched points.
  4. The method for removing mismatched palmprint points according to claim 2, wherein the step of judging whether the first standard image feature point is the first standard matching point, wherein the first standard matching point is the matching point in the standard palm image corresponding to the first to-be-verified image feature point, specifically comprises:
    calculating a first standard feature value corresponding to the first standard image feature point, and calculating a first to-be-verified feature value corresponding to the first to-be-verified image feature point;
    calculating, from the first standard feature value and the first to-be-verified feature value, the similarity between the first standard image feature point and the first to-be-verified image feature point;
    judging, from the similarity between the first standard image feature point and the first to-be-verified image feature point, whether the first standard image feature point is the first standard matching point.
  5. The method for removing mismatched palmprint points according to any one of claims 1-4, wherein the step of determining, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and removing from the initial matching point set the current matching point carrying the mismatched-point identifier specifically comprises:
    when the to-be-verified related neighborhood point set does not match the standard related neighborhood point set, marking the current matching point as a mismatched point, and removing from the initial matching point set the current matching point carrying the mismatched-point identifier.
  6. The method for removing mismatched palmprint points according to claim 5, wherein, before the step of marking the current matching point as a mismatched point when the to-be-verified related neighborhood point set does not match the standard related neighborhood point set, and removing from the initial matching point set the current matching point carrying the mismatched-point identifier, the method further comprises:
    acquiring, from the to-be-verified related neighborhood point set, a to-be-verified related neighborhood point that matches the standard related neighborhood point set as a target point, and generating a target point set;
    judging whether the number of target points in the target point set is greater than a first threshold;
    if the number of target points is not greater than the first threshold, determining that the to-be-verified related neighborhood point set does not match the standard related neighborhood point set.
  7. The method for removing mismatched palmprint points according to claim 5, wherein the step of determining, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and removing from the initial matching point set the current matching point carrying the mismatched-point identifier further comprises:
    when the to-be-verified related neighborhood point set matches the standard related neighborhood point set, determining that the palm to be verified is the same as the standard palm.
  8. An apparatus for removing mismatched palmprint points, the apparatus comprising:
    a feature point extraction module, configured to acquire a palm image to be verified and to extract, based on the ORB rapid feature point extraction and description algorithm, the to-be-verified image feature points corresponding to the palm image to be verified;
    a matching point determination module, configured to determine, based on a brute-force matching algorithm, standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and to generate an initial matching point set from the initial matching points;
    a matching point judgment module, configured to acquire an initial matching point from the initial matching point set as the current matching point, and to determine, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image;
    a mismatched point elimination module, configured to determine, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and to remove from the initial matching point set the current matching point carrying a mismatched-point identifier.
  9. A device for removing mismatched palmprint points, the device comprising a processor, a memory, and a palmprint mismatched point removal program stored on the memory and executable by the processor, wherein the palmprint mismatched point removal program, when executed by the processor, implements the following steps:
    acquiring a palm image to be verified, and extracting, based on the ORB rapid feature point extraction and description algorithm, the to-be-verified image feature points corresponding to the palm image to be verified;
    determining, based on a brute-force matching algorithm, standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and generating an initial matching point set from the initial matching points;
    acquiring an initial matching point from the initial matching point set as the current matching point, and determining, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image;
    determining, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and removing from the initial matching point set the current matching point carrying a mismatched-point identifier.
  10. The device for removing mismatched palmprint points according to claim 9, wherein the step of determining, based on the brute-force matching algorithm, the standard image feature points corresponding to the preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and generating the initial matching point set from the initial matching points specifically comprises:
    acquiring, based on the brute-force matching algorithm, a standard image feature point in the standard palm image as a first standard image feature point, and extracting a to-be-verified image feature point in the palm image to be verified as a first to-be-verified image feature point;
    judging whether the first standard image feature point is a first standard matching point, wherein the first standard matching point is the matching point in the standard palm image corresponding to the first to-be-verified image feature point;
    when the first standard image feature point is the first standard matching point, judging whether the first to-be-verified image feature point is a first to-be-verified matching point, wherein the first to-be-verified matching point is the matching point in the palm image to be verified corresponding to the first standard image feature point;
    when the first to-be-verified image feature point is the first to-be-verified matching point, marking the first standard image feature point and the first to-be-verified image feature point as a pair of initial matching points, and generating an initial matching point set from the initial matching points.
  11. The device for removing mismatched palmprint points according to claim 10, wherein, after the step of marking the first standard image feature point and the first to-be-verified image feature point as a pair of initial matching points when the first to-be-verified image feature point is the first to-be-verified matching point, and generating an initial matching point set from the initial matching points, the palmprint mismatched point removal program, when executed by the processor, further implements the following step:
    when the first to-be-verified image feature point is not the first to-be-verified matching point, marking the first standard image feature point and the first to-be-verified image feature point as mismatched points.
  12. The device for removing mismatched palmprint points according to claim 10, wherein the step of judging whether the first standard image feature point is the first standard matching point, wherein the first standard matching point is the matching point in the standard palm image corresponding to the first to-be-verified image feature point, specifically comprises:
    calculating a first standard feature value corresponding to the first standard image feature point, and calculating a first to-be-verified feature value corresponding to the first to-be-verified image feature point;
    calculating, from the first standard feature value and the first to-be-verified feature value, the similarity between the first standard image feature point and the first to-be-verified image feature point;
    judging, from the similarity between the first standard image feature point and the first to-be-verified image feature point, whether the first standard image feature point is the first standard matching point.
  13. The device for removing mismatched palmprint points according to any one of claims 9-12, wherein the step of determining, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and removing from the initial matching point set the current matching point carrying the mismatched-point identifier specifically comprises:
    when the to-be-verified related neighborhood point set does not match the standard related neighborhood point set, marking the current matching point as a mismatched point, and removing from the initial matching point set the current matching point carrying the mismatched-point identifier.
  14. The device for removing mismatched palmprint points according to claim 13, wherein, before the step of marking the current matching point as a mismatched point when the to-be-verified related neighborhood point set does not match the standard related neighborhood point set, and removing from the initial matching point set the current matching point carrying the mismatched-point identifier, the palmprint mismatched point removal program, when executed by the processor, further implements the following steps:
    acquiring, from the to-be-verified related neighborhood point set, a to-be-verified related neighborhood point that matches the standard related neighborhood point set as a target point, and generating a target point set;
    judging whether the number of target points in the target point set is greater than a first threshold;
    if the number of target points is not greater than the first threshold, determining that the to-be-verified related neighborhood point set does not match the standard related neighborhood point set.
  15. The device for removing mismatched palmprint points according to claim 13, wherein the step of determining, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and removing from the initial matching point set the current matching point carrying the mismatched-point identifier further comprises:
    when the to-be-verified related neighborhood point set matches the standard related neighborhood point set, determining that the palm to be verified is the same as the standard palm.
  16. A computer-readable storage medium on which a palmprint mismatched point removal program is stored, wherein the palmprint mismatched point removal program, when executed by a processor, implements the following steps:
    acquiring a palm image to be verified, and extracting, based on the ORB rapid feature point extraction and description algorithm, the to-be-verified image feature points corresponding to the palm image to be verified;
    determining, based on a brute-force matching algorithm, standard image feature points corresponding to a preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and generating an initial matching point set from the initial matching points;
    acquiring an initial matching point from the initial matching point set as the current matching point, and determining, based on the GMS feature matching algorithm, the to-be-verified related neighborhood point set of the current matching point in the image to be verified and its standard related neighborhood point set in the standard image;
    determining, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and removing from the initial matching point set the current matching point carrying a mismatched-point identifier.
  17. The computer-readable storage medium according to claim 16, wherein the step of determining, based on the brute-force matching algorithm, the standard image feature points corresponding to the preset standard palm image, and the to-be-verified image feature points, the initial matching points corresponding to the palm image to be verified and the standard palm image, and generating the initial matching point set from the initial matching points specifically comprises:
    acquiring, based on the brute-force matching algorithm, a standard image feature point in the standard palm image as a first standard image feature point, and extracting a to-be-verified image feature point in the palm image to be verified as a first to-be-verified image feature point;
    judging whether the first standard image feature point is a first standard matching point, wherein the first standard matching point is the matching point in the standard palm image corresponding to the first to-be-verified image feature point;
    when the first standard image feature point is the first standard matching point, judging whether the first to-be-verified image feature point is a first to-be-verified matching point, wherein the first to-be-verified matching point is the matching point in the palm image to be verified corresponding to the first standard image feature point;
    when the first to-be-verified image feature point is the first to-be-verified matching point, marking the first standard image feature point and the first to-be-verified image feature point as a pair of initial matching points, and generating an initial matching point set from the initial matching points.
  18. The computer-readable storage medium according to claim 17, wherein, after the step of marking the first standard image feature point and the first to-be-verified image feature point as a pair of initial matching points when the first to-be-verified image feature point is the first to-be-verified matching point, and generating an initial matching point set from the initial matching points, the palmprint mismatched point removal program, when executed by the processor, further implements the following step:
    when the first to-be-verified image feature point is not the first to-be-verified matching point, marking the first standard image feature point and the first to-be-verified image feature point as mismatched points.
  19. The computer-readable storage medium according to claim 17, wherein the step of judging whether the first standard image feature point is the first standard matching point, wherein the first standard matching point is the matching point in the standard palm image corresponding to the first to-be-verified image feature point, specifically comprises:
    calculating a first standard feature value corresponding to the first standard image feature point, and calculating a first to-be-verified feature value corresponding to the first to-be-verified image feature point;
    calculating, from the first standard feature value and the first to-be-verified feature value, the similarity between the first standard image feature point and the first to-be-verified image feature point;
    judging, from the similarity between the first standard image feature point and the first to-be-verified image feature point, whether the first standard image feature point is the first standard matching point.
  20. The computer-readable storage medium according to any one of claims 16-19, wherein the step of determining, according to the to-be-verified related neighborhood point set and the standard related neighborhood point set, whether the current matching point is a mismatched point, and removing from the initial matching point set the current matching point carrying the mismatched-point identifier specifically comprises:
    when the to-be-verified related neighborhood point set does not match the standard related neighborhood point set, marking the current matching point as a mismatched point, and removing from the initial matching point set the current matching point carrying the mismatched-point identifier.
PCT/CN2020/135853 2020-04-24 2020-12-11 Method, apparatus, device, and storage medium for eliminating mismatched palmprint points WO2021212874A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010331794.5 2020-04-24
CN202010331794.5A CN111553241B (zh) 2020-04-24 2020-04-24 掌纹的误匹配点剔除方法、装置、设备及存储介质

Publications (1)

Publication Number Publication Date
WO2021212874A1 true WO2021212874A1 (zh) 2021-10-28

Family

ID=72003947

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/135853 WO2021212874A1 (zh) 2020-04-24 2020-12-11 掌纹的误匹配点剔除方法、装置、设备及存储介质

Country Status (2)

Country Link
CN (1) CN111553241B (zh)
WO (1) WO2021212874A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114972629A (zh) * 2022-04-14 2022-08-30 广州极飞科技股份有限公司 一种特征点匹配方法、装置、设备以及存储介质
CN115049847A (zh) * 2022-06-21 2022-09-13 上海大学 一种基于orb描述子的特征点局部邻域特征匹配方法

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111553241B (zh) * 2020-04-24 2024-05-07 平安科技(深圳)有限公司 掌纹的误匹配点剔除方法、装置、设备及存储介质
CN112819095B (zh) * 2021-02-26 2023-04-18 吉林大学 特征点匹配方法、装置、智能终端及计算机可读存储介质


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11001804B2 (en) * 2017-08-30 2021-05-11 Wayne State University Methods for the production of therapeutic, diagnostic, or research antibodies
CN110147769B (zh) * 2019-05-22 2023-11-07 成都艾希维智能科技有限公司 一种手指静脉图像匹配方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150356736A1 (en) * 2014-06-05 2015-12-10 Drvision Technologies Llc Edit guided processing method for time-lapse image analysis
CN107980140A (zh) * 2017-10-16 2018-05-01 厦门中控智慧信息技术有限公司 一种掌静脉的识别方法及装置
CN110826355A (zh) * 2018-08-07 2020-02-21 腾讯数码(天津)有限公司 一种图像识别方法、装置和存储介质
CN109886089A (zh) * 2019-01-07 2019-06-14 平安科技(深圳)有限公司 掌纹识别方法、装置和计算机设备
CN111553241A (zh) * 2020-04-24 2020-08-18 平安科技(深圳)有限公司 掌纹的误匹配点剔除方法、装置、设备及存储介质

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114972629A (zh) * 2022-04-14 2022-08-30 广州极飞科技股份有限公司 一种特征点匹配方法、装置、设备以及存储介质
CN115049847A (zh) * 2022-06-21 2022-09-13 上海大学 一种基于orb描述子的特征点局部邻域特征匹配方法
CN115049847B (zh) * 2022-06-21 2024-04-16 上海大学 一种基于orb描述子的特征点局部邻域特征匹配方法

Also Published As

Publication number Publication date
CN111553241A (zh) 2020-08-18
CN111553241B (zh) 2024-05-07


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20931970

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20931970

Country of ref document: EP

Kind code of ref document: A1