CN112733645B - Handwritten signature verification method, handwritten signature verification device, computer equipment and storage medium - Google Patents
- Publication number
- CN112733645B (application CN202011609053.5A)
- Authority
- CN
- China
- Prior art keywords
- image
- signature
- face
- signature image
- user
- Prior art date
- Legal status: Active (assumed; not a legal conclusion)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/30—Writer recognition; Reading and verifying signatures
- G06V40/33—Writer recognition; Reading and verifying signatures based only on signature image, e.g. static signature recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2415—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/1347—Preprocessing; Feature extraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The embodiments of the present application belong to the field of artificial intelligence and relate to a handwritten signature verification method, apparatus, computer device and storage medium. The method comprises the following steps: acquiring a handwritten signature image to be verified and a face image to be verified; inputting the handwritten signature image into a deep learning neural network model for feature extraction to obtain handwritten signature image features; retrieving a preset user signature image database to obtain the closest user signature image features; calculating the signature feature similarity between the two; comparing the signature feature similarity with a first threshold and, when it is greater than the first threshold, retrieving the user signature image database to obtain the closest user face features; calculating the face feature similarity between the face features to be verified and the closest user face features; and comparing the face feature similarity with a second threshold and, when it is greater than the second threshold, determining that the handwritten signature image to be verified passes verification. By comparing both the handwritten signature image and the face image against preset data, the validity of the signature is determined and forgery of the signature can be prevented.
Description
Technical Field
The present application relates to the field of artificial intelligence technologies, and in particular, to a handwritten signature verification method, apparatus, computer device, and storage medium.
Background
The handwritten signature is an important means of verifying a user's genuine intent to sign. The rapid development of Internet applications such as Internet finance, B2B e-commerce, travel and education has driven demand for online electronic signatures, and in order to make Internet business conduct legal and compliant, more and more Internet platforms are actively seeking legally effective online handwritten electronic signature verification schemes.
At present, mainstream handwritten signature verification schemes mostly adopt handwriting recognition: based on the handwriting trajectory or the spatial relation of pixels, the recognition result is compared with the name. This approach cannot avoid the risk of the signature being forged by another person.
Disclosure of Invention
The embodiments of the present application aim to provide a handwritten signature verification method, apparatus, computer device and storage medium, so as to solve the problem of signatures being fraudulently used.
In order to solve the above technical problems, the embodiments of the present application provide a handwritten signature verification method, which adopts the following technical scheme:
acquiring a handwritten signature image to be verified and a face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified are derived from the same carrier;
Inputting the hand-written signature image to be verified into a pre-trained deep learning neural network model for feature extraction, and obtaining hand-written signature image features;
searching a preset user signature image database by adopting an annoy search algorithm according to the hand-written signature image characteristics to acquire user signature image characteristics closest to the hand-written signature image characteristics;
calculating signature feature similarity between the handwritten signature image features and the closest user signature image features;
comparing the signature feature similarity with a preset first threshold value, and searching a preset user signature image database according to the nearest user signature image feature when the signature feature similarity is larger than the preset first threshold value to obtain the nearest user face feature, wherein the user signature image features in the preset user signature image database are in one-to-one correspondence with the user face features;
inputting the face image to be checked into a pre-trained face feature extraction model to extract the face features, and obtaining the face features to be checked;
calculating the similarity of the face features between the face features to be checked and the nearest face features of the user;
Comparing the facial feature similarity with a preset second threshold value, and determining that the handwritten signature image to be verified passes verification when the facial feature similarity is larger than the preset second threshold value.
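The claimed flow can be sketched in Python as follows. This is a minimal illustration, not the patented implementation: the cosine similarity metric, the in-memory database, and the 0.8 thresholds are assumptions, and an exhaustive nearest-neighbour scan stands in for the Annoy search.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def verify_signature(sig_feat, face_feat, database, t_sig=0.8, t_face=0.8):
    """Two-stage verification: signature similarity first, then face similarity.
    `database` maps user_id -> (signature_features, face_features)."""
    # Stage 1: nearest user signature features (the Annoy index approximates
    # this lookup in the patent; here we scan exhaustively for clarity).
    best_id, best_sim = max(
        ((uid, cosine_similarity(sig_feat, feats[0]))
         for uid, feats in database.items()),
        key=lambda p: p[1])
    if best_sim <= t_sig:
        return False
    # Stage 2: face features of the SAME user (one-to-one correspondence
    # between stored signature features and stored face features).
    face_sim = cosine_similarity(face_feat, database[best_id][1])
    return face_sim > t_face
```

Note the ordering: the face comparison only runs when the signature comparison already passed the first threshold, so a forged signature is rejected before any face check.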
Further, before the step of inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model to perform feature extraction, the method further includes:
acquiring signature image training samples, wherein the signature image training samples are N handwritten signature images marked with user IDs;
inputting the signature image training sample into a deep learning neural network model to obtain N signature prediction results output by the deep learning neural network model in response to the signature image training sample;
comparing, through a softmax loss function, whether the N signature prediction results are consistent with the labels, wherein the softmax loss function is:
L = -(1/N) Σ_{i=1}^{N} log h_{i,yᵢ} ;
wherein N is the number of training samples, yᵢ is the labeled class of the i-th sample, hᵢ = (h_{i,1}, h_{i,2}, ..., h_{i,C}) is the predicted probability distribution for sample i, and C is the number of classes;
and adjusting parameters of each node of the deep learning neural network model until the loss function reaches the minimum, and obtaining the trained deep learning neural network model.
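As a hedged illustration of the loss computation above (the patent specifies only the loss, not the network architecture; the plain-Python sketch below is an assumption for clarity):

```python
import math

def softmax(logits):
    """Convert raw class scores into probabilities."""
    m = max(logits)                          # subtract max for numerical stability
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def softmax_loss(batch_logits, labels):
    """Mean cross-entropy over N samples: L = -(1/N) * sum_i log h_{i, y_i}."""
    total = 0.0
    for logits, y in zip(batch_logits, labels):
        h = softmax(logits)                  # predicted distribution for sample i
        total += -math.log(h[y])             # penalize low probability on the true label
    return total / len(labels)
```

Training then amounts to adjusting the model parameters (e.g., by gradient descent) until this loss reaches its minimum, as the step above describes.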
Further, the preset user signature image database includes user fingerprint features, the fingerprint features are in one-to-one correspondence with the user signature image features, and before the step of comparing the face feature similarity with a preset second threshold value, and when the face feature similarity is greater than the preset second threshold value, determining that the handwritten signature image to be verified passes the verification, the method further includes:
acquiring a fingerprint image to be verified, wherein the fingerprint image and the handwritten signature image to be verified originate from the same carrier;
inputting the fingerprint image into a pre-trained fingerprint feature extraction model for feature extraction to obtain fingerprint features to be checked;
searching a preset user signature image database according to the nearest user signature image characteristics to obtain user fingerprint characteristics corresponding to the nearest user signature image characteristics;
calculating the similarity of the fingerprint features between the fingerprint features to be checked and the user fingerprint features;
comparing the fingerprint feature similarity with a preset third threshold value, and determining that the handwritten signature image to be verified passes verification when the fingerprint feature similarity is larger than the preset third threshold value and the face feature similarity is larger than the preset second threshold value.
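The fingerprint branch above can be sketched as follows. This is an illustrative assumption: the function `fingerprint_check`, the cosine metric, and the 0.8 thresholds are stand-ins, not the patent's implementation.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(x * x for x in b)))

def fingerprint_check(fp_feat_to_verify, nearest_user_id, fingerprint_db,
                      face_sim, t_fp=0.8, t_face=0.8):
    """Look up the fingerprint features stored for the user whose signature
    matched (one-to-one correspondence), then require that BOTH the
    fingerprint similarity exceed the third threshold AND the face
    similarity exceed the second threshold.
    `fingerprint_db` maps user_id -> stored fingerprint feature vector."""
    fp_sim = cosine_similarity(fp_feat_to_verify,
                               fingerprint_db[nearest_user_id])
    return fp_sim > t_fp and face_sim > t_face
```

The conjunction mirrors the claim: the signature passes only when the fingerprint and face checks both succeed.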
Further, the fingerprint feature extraction model is based on a first convolutional neural network model, and before the step of inputting the fingerprint image into a pre-trained fingerprint feature extraction model to perform feature extraction, the method further comprises the following steps:
acquiring fingerprint image training samples, wherein the fingerprint image training samples are N fingerprint images marked with user IDs;
inputting the fingerprint image training sample into a first convolutional neural network model to obtain N fingerprint prediction results output by the first convolutional neural network model in response to the fingerprint image training sample;
comparing whether the N fingerprint prediction results are consistent with the labels or not through a softmax loss function;
and adjusting parameters of each node of the first convolutional neural network model until the loss function reaches the minimum, and obtaining a trained fingerprint feature extraction model.
Further, the face feature extraction model is based on a second convolutional neural network model, and before the step of inputting the face image to be verified into the pre-trained face feature extraction model for feature extraction, the method further comprises:
Acquiring face image training samples, wherein the face image training samples are N face images marked with user IDs;
inputting the face image training sample into a second convolutional neural network model to obtain N face prediction results output by the second convolutional neural network model in response to the face image training sample;
comparing whether the N face prediction results are consistent with the labels or not through a softmax loss function;
and adjusting parameters of each node of the second convolutional neural network model until the loss function reaches the minimum, and obtaining a trained face feature extraction model.
Further, after the step of acquiring the handwritten signature image to be verified and the face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified are derived from the same carrier, the method further includes:
and storing the handwritten signature image to be verified and the face image to be verified into a blockchain.
In order to solve the technical problems, the embodiment of the application also provides a handwritten signature verification device, which adopts the following technical scheme:
the acquisition module is used for acquiring the hand-written signature image to be checked and the face image to be checked, wherein the hand-written signature image to be checked and the face image to be checked are derived from the same carrier;
The first extraction module is used for inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model to perform feature extraction, so as to obtain the features of the handwritten signature image;
the first retrieval module is used for retrieving a preset user signature image database by adopting an annoy search algorithm according to the hand-written signature image characteristics to acquire user signature image characteristics closest to the hand-written signature image characteristics;
a first calculation module for calculating signature feature similarities between the handwritten signature image features and the closest user signature image features;
the second retrieval module is used for comparing the signature feature similarity with a preset first threshold value, and retrieving a preset user signature image database according to the nearest user signature image feature to obtain the nearest user face feature when the signature feature similarity is larger than the preset first threshold value, wherein the user signature image feature in the preset user signature image database corresponds to the user face feature one by one;
the second extraction module is used for inputting the face image to be verified into a pre-trained face feature extraction model to extract the face features, so as to obtain the face features to be verified;
The second calculation module is used for calculating the similarity of the face features between the face features to be checked and the nearest face features of the user;
and the determining module is used for comparing the facial feature similarity with a preset second threshold value, and determining that the handwritten signature image to be verified passes verification when the facial feature similarity is larger than the preset second threshold value.
Further, the handwritten signature verification apparatus further includes:
the first acquisition sub-module is used for acquiring signature image training samples, wherein the signature image training samples are N handwritten signature images marked with user IDs;
the first prediction submodule is used for inputting the signature image training sample into a deep learning neural network model to obtain N signature prediction results output by the deep learning neural network model in response to the signature image training sample;
a first comparison sub-module, configured to compare, through a softmax loss function, whether the N signature prediction results are consistent with the labels, wherein the softmax loss function is:
L = -(1/N) Σ_{i=1}^{N} log h_{i,yᵢ} ;
wherein N is the number of training samples, yᵢ is the labeled class of the i-th sample, hᵢ = (h_{i,1}, h_{i,2}, ..., h_{i,C}) is the predicted probability distribution for sample i, and C is the number of classes;
And the first adjusting sub-module is used for adjusting parameters of each node of the deep learning neural network model until the loss function reaches the minimum, and obtaining the trained deep learning neural network model.
Further, the preset user signature image database includes user fingerprint features, the fingerprint features and the user signature image features are in one-to-one correspondence, and the handwritten signature verification device further includes:
the second acquisition submodule is used for acquiring a fingerprint image to be verified, and the fingerprint image and the handwritten signature image to be verified are derived from the same carrier;
the first extraction submodule is used for inputting the fingerprint image into a pre-trained fingerprint feature extraction model to perform feature extraction so as to obtain fingerprint features to be checked;
the first retrieval sub-module is used for retrieving a preset user signature image database according to the nearest user signature image characteristics to obtain user fingerprint characteristics corresponding to the nearest user signature image characteristics;
the first computing sub-module is used for computing the similarity of the fingerprint characteristics between the fingerprint characteristics to be verified and the user fingerprint characteristics;
and the first determining submodule is used for comparing the fingerprint feature similarity with a preset third threshold value, and determining that the handwritten signature image to be verified passes verification when the fingerprint feature similarity is larger than the preset third threshold value and the face feature similarity is larger than the preset second threshold value.
Further, the fingerprint feature extraction model is based on a first convolutional neural network model, and the handwritten signature verification apparatus further includes:
the third acquisition sub-module is used for acquiring fingerprint image training samples, wherein the fingerprint image training samples are N fingerprint images marked with user IDs;
the second prediction submodule is used for inputting the fingerprint image training sample into a first convolutional neural network model to obtain N fingerprint prediction results output by the first convolutional neural network model in response to the fingerprint image training sample;
the second comparison sub-module is used for comparing whether the N fingerprint prediction results are consistent with the labels or not through a softmax loss function;
and the second adjusting sub-module is used for adjusting the parameters of each node of the first convolutional neural network model until the loss function reaches the minimum, and obtaining a trained fingerprint feature extraction model.
Further, the face feature extraction model is based on a second convolutional neural network model, and the handwritten signature verification apparatus further includes:
a fifth obtaining sub-module, configured to obtain a face image training sample, where the face image training sample is N face images labeled with user IDs;
The third prediction submodule is used for inputting the face image training sample into a second convolutional neural network model to obtain N face prediction results output by the second convolutional neural network model in response to the face image training sample;
the third comparison sub-module is used for comparing whether the N face prediction results are consistent with the labels or not through a softmax loss function;
and the third adjustment sub-module is used for adjusting the parameters of each node of the second convolutional neural network model until the loss function reaches the minimum, and obtaining a trained face feature extraction model.
Further, the handwritten signature verification apparatus further includes:
and the storage module is used for storing the handwritten signature image to be verified and the face image to be verified into a blockchain.
In order to solve the above technical problems, the embodiments of the present application further provide a computer device, which adopts the following technical schemes:
a computer device comprising a memory and a processor, the memory having stored therein computer readable instructions which, when executed by the processor, implement the steps of the handwritten signature verification method described above.
In order to solve the above technical problems, embodiments of the present application further provide a computer readable storage medium, which adopts the following technical solutions:
A computer readable storage medium having stored thereon computer readable instructions which, when executed by a processor, implement the steps of the handwritten signature verification method described above.
Compared with the prior art, the embodiment of the application has the following main beneficial effects: the method comprises the steps of obtaining a handwritten signature image to be verified and a face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified are derived from the same carrier; inputting the hand-written signature image to be verified into a pre-trained deep learning neural network model for feature extraction, and obtaining hand-written signature image features; searching a preset user signature image database by adopting an annoy search algorithm according to the hand-written signature image characteristics to acquire user signature image characteristics closest to the hand-written signature image characteristics; calculating signature feature similarity between the handwritten signature image features and the closest user signature image features; comparing the signature feature similarity with a preset first threshold value, and searching a preset user signature image database according to the nearest user signature image feature when the signature feature similarity is larger than the preset first threshold value to obtain the nearest user face feature, wherein the user signature image features in the preset user signature image database are in one-to-one correspondence with the user face features; inputting the face image to be checked into a pre-trained face feature extraction model to extract the face features, and obtaining the face features to be checked; calculating the similarity of the face features between the face features to be checked and the nearest face features of the user; comparing the facial feature similarity with a preset second threshold value, and determining that the handwritten signature image to be verified passes verification when the facial feature similarity is larger than the preset second threshold value. 
The validity of the handwritten signature is determined by comparing the features of the handwritten signature image with a preset user signature image database, which largely avoids the risk of the signature being forged; and the combination of the Annoy algorithm with similarity calculation balances both the speed and the accuracy of handwritten signature verification.
Drawings
For a clearer description of the solution in the present application, a brief description will be given below of the drawings that are needed in the description of the embodiments of the present application, it being obvious that the drawings in the following description are some embodiments of the present application, and that other drawings may be obtained from these drawings without inventive effort for a person of ordinary skill in the art.
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow chart of one embodiment of a handwritten signature verification method in accordance with the present application;
FIG. 3 is a flow chart of one embodiment of handwritten signature verification;
FIG. 4 is a schematic diagram of the structure of one embodiment of a handwritten signature verification apparatus in accordance with the present application;
FIG. 5 is a schematic structural diagram of one embodiment of a computer device according to the present application.
Description of the embodiments
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs; the terminology used in the description of the applications herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application; the terms "comprising" and "having" and any variations thereof in the description and claims of the present application and in the description of the figures above are intended to cover non-exclusive inclusions. The terms first, second and the like in the description and in the claims or in the above-described figures, are used for distinguishing between different objects and not necessarily for describing a sequential or chronological order.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
In order to better understand the technical solutions of the present application, the following description will clearly and completely describe the technical solutions in the embodiments of the present application with reference to the accompanying drawings.
As shown in fig. 1, a system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103 to receive or send messages or the like. Various communication client applications, such as a web browser application, a shopping class application, a search class application, an instant messaging tool, a mailbox client, social platform software, etc., may be installed on the terminal devices 101, 102, 103.
The terminal devices 101, 102, 103 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablet computers, electronic book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop and desktop computers, and the like.
The server 105 may be a server providing various services, such as a background server providing support for pages displayed on the terminal devices 101, 102, 103.
It should be noted that the handwritten signature verification method provided in the embodiments of the present application is generally executed by the server or the terminal device; correspondingly, the handwritten signature verification apparatus is generally arranged in the server or the terminal device.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to fig. 2, a flow chart of one embodiment of a method of handwritten signature verification in accordance with the present application is shown. The handwritten signature verification method comprises the following steps:
Step S201, a handwritten signature image to be verified and a face image to be verified are obtained, wherein the handwritten signature image to be verified and the face image to be verified are derived from the same carrier.
In this embodiment, the electronic device on which the handwritten signature verification method runs (e.g., the server or terminal device shown in FIG. 1) can obtain the handwritten signature image to be verified and the face image to be verified through a wired or wireless connection. It should be noted that the wireless connection may include, but is not limited to, 3G/4G connections, WiFi connections, Bluetooth connections, WiMAX connections, ZigBee connections, UWB (ultra-wideband) connections, and other wireless connection means now known or developed in the future.
The user handwrites a signature on electronic equipment fitted with a camera while a face image is captured simultaneously; alternatively, when the user signs on the screen of the electronic equipment through a preset handwriting signature module, a face image is captured at the same time by the camera of the equipment. In either case the handwritten signature image to be verified and the face image to be verified are obtained. A previously recorded signature video can also be imported and parsed to obtain the handwritten signature image to be verified and the face image to be verified.
Step S202, inputting the hand-written signature image to be verified into a pre-trained deep learning neural network model for feature extraction, and obtaining hand-written signature image features.
In this embodiment, the handwritten signature image to be verified is input to a pre-trained deep learning neural network model for feature extraction, and the pre-trained deep learning neural network learns the handwritten signature images of different users, so that the deep learning neural network can extract high-dimensional features of the handwritten signature images of different users.
Step S203, searching a preset user signature image database using the Annoy search algorithm according to the handwritten signature image features, to obtain the user signature image features closest to the handwritten signature image features.
In this embodiment, in some application scenarios, the user signature image database is preset, and as long as the handwritten signature image to be verified matches one of the entries in the preset user signature image database, the handwritten signature to be verified is considered valid. In this scenario, the preset user signature image database is searched according to the handwritten signature image features; to perform fast feature retrieval among massive IDs, the Annoy (Approximate Nearest Neighbors Oh Yeah) algorithm is adopted, which provides fast and stable retrieval. The principle of Annoy is as follows: two points are randomly selected as initial centers and a k-means process with cluster number 2 is executed, producing two converged cluster centers; the hyperplane equidistant from these two centers divides the data space into two subspaces. The division continues recursively within each subspace until each subspace contains at most K data nodes. After repeated recursive divisions, the original data forms a binary-tree-like structure: the leaf nodes at the bottom record the original data nodes, while the intermediate nodes record the splitting hyperplanes. With such a tree, the time complexity of querying the point closest to a given point is sub-linear. In practice, this can be implemented through Annoy's Python API.
Step S204, calculating a signature feature similarity between the hand-written signature image feature and the closest user signature image feature.
Because the Annoy algorithm cannot maximize retrieval speed and precision at the same time — the more layers the constructed binary tree has, the higher the precision but the slower the retrieval — and because it only retrieves an approximate nearest point, it alone does not satisfy the high precision requirement of this method. Therefore, after the closest user signature image features are found with the Annoy algorithm, the similarity between the handwritten signature image features and the closest user signature image features is calculated, so that both retrieval speed and retrieval precision are taken into account. The similarity is calculated as the Euclidean distance between the two feature vectors.
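The Euclidean-distance comparison can be written down in a few lines; note that mapping the distance to a similarity in (0, 1] via 1/(1+d) is an illustrative choice of this sketch, since the patent does not fix a particular distance-to-similarity mapping:

```python
import math

def euclidean_distance(a, b):
    # Straight-line distance between two feature vectors of equal length.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def similarity(a, b):
    # Smaller distance -> higher similarity; identical vectors give 1.0.
    return 1.0 / (1.0 + euclidean_distance(a, b))

d = euclidean_distance([0.0, 3.0], [4.0, 0.0])  # 3-4-5 right triangle: distance 5.0
s = similarity([1.0, 2.0], [1.0, 2.0])          # identical vectors: similarity 1.0
```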
Step S205, comparing the signature feature similarity with a preset first threshold, and searching a preset user signature image database according to the nearest user signature image feature when the signature feature similarity is larger than the preset first threshold to obtain the nearest user face feature, wherein the user signature image features in the preset user signature image database are in one-to-one correspondence with the user face features;
Step S206, inputting the face image to be verified into a pre-trained face feature extraction model for face feature extraction, to obtain the face features to be verified.
In this embodiment, the face image to be verified is input into a pre-trained face feature extraction model for face feature extraction. The pre-trained face feature extraction model is based on a second convolutional neural network model that has learned face images of different users, so that the second convolutional neural network model can extract high-dimensional features of face images of different users.
Step S207, calculating a face feature similarity between the face features to be verified and the closest user face features;
The similarity is calculated as the Euclidean distance between the two feature vectors.
Step S208, comparing the face feature similarity with a preset second threshold, and determining that the handwritten signature image to be verified passes verification when the face feature similarity is greater than the preset second threshold.
When the face feature similarity between the face features to be verified and the closest user face features is greater than the preset second threshold, the handwritten signature image to be verified and the face image to be verified are both considered consistent with the data in the preset user signature image database, and verification passes.
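Steps S204 to S208 amount to a two-stage gate: the face comparison only matters once the signature similarity clears the first threshold. A minimal sketch of this decision logic (the threshold values are illustrative assumptions; the patent leaves them as preset parameters):

```python
def verify(sig_sim, face_sim, first_threshold=0.8, second_threshold=0.8):
    """Return True only if both the signature stage and the face stage pass.

    sig_sim:  similarity between the handwritten signature image features
              and the closest user signature image features (step S204)
    face_sim: similarity between the face features to be verified and the
              closest user face features (step S207)
    """
    if sig_sim <= first_threshold:      # step S205 gate fails: stop early
        return False
    return face_sim > second_threshold  # step S208 decision

ok = verify(sig_sim=0.92, face_sim=0.95)        # both stages pass
rejected = verify(sig_sim=0.70, face_sim=0.99)  # signature gate fails
```

The early return mirrors the method's flow: the face features of the closest user are only looked up after the signature similarity exceeds the first threshold.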
The method comprises: obtaining a handwritten signature image to be verified and a face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified are derived from the same carrier; inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction to obtain handwritten signature image features; searching a preset user signature image database using the Annoy search algorithm according to the handwritten signature image features to obtain the user signature image features closest to the handwritten signature image features; calculating a signature feature similarity between the handwritten signature image features and the closest user signature image features; comparing the signature feature similarity with a preset first threshold, and when the signature feature similarity is greater than the preset first threshold, searching the preset user signature image database according to the closest user signature image features to obtain the closest user face features, wherein the user signature image features in the preset user signature image database correspond one-to-one with the user face features; inputting the face image to be verified into a pre-trained face feature extraction model for face feature extraction to obtain the face features to be verified; calculating a face feature similarity between the face features to be verified and the closest user face features; and comparing the face feature similarity with a preset second threshold, and determining that the handwritten signature image to be verified passes verification when the face feature similarity is greater than the preset second threshold.
The validity of the handwritten signature is determined by comparing the handwritten signature image features with a preset user signature image database, which largely avoids the risk of the signature being forged; and by combining the Annoy algorithm with similarity calculation, both the speed and the accuracy of handwritten signature verification are taken into account.
In some optional implementations of this embodiment, before step 202, the electronic device may further perform the following steps:
acquiring signature image training samples, wherein the signature image training samples are N handwritten signature images marked with user IDs;
inputting the signature image training sample into a deep learning neural network model to obtain N signature prediction results output by the deep learning neural network model in response to the signature image training sample;
comparing whether the N signature predictions and the labels are consistent by a softmax loss function, wherein the softmax loss function is:
L = -(1/N) * sum_{i=1}^{N} log( exp(h_{yi}) / sum_{j=1}^{C} exp(h_j) );
wherein N is the number of training samples, yi is the labeled class of the i-th sample, h = (h1, h2, ..., hC) is the prediction vector for sample i, and C is the total number of classes;
and adjusting parameters of each node of the deep learning neural network model until the loss function reaches the minimum, and obtaining the trained deep learning neural network model.
The deep learning neural network model can be regarded as an image feature extraction model connected to an output layer, where the output layer is a softmax output layer used to classify the input handwritten signature image according to the features extracted by the image feature extraction model. During training, the softmax loss is used to compare whether the predicted results are consistent with the labeled results; when the softmax loss reaches its minimum, training of the deep learning neural network model ends, and the trained deep learning neural network model has the ability to extract high-dimensional features of signature images.
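The softmax loss used during training can be written down directly from the formula above; the following NumPy sketch uses made-up toy logits and labels purely for illustration:

```python
import numpy as np

def softmax_loss(logits, labels):
    """Mean cross-entropy over N samples.

    logits: (N, C) array of raw network outputs h = (h1, ..., hC)
    labels: (N,) array of labeled class indices y_i
    """
    # Subtract the row max before exponentiating, for numerical stability.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    n = logits.shape[0]
    # Average negative log-probability assigned to each correct class.
    return -log_probs[np.arange(n), labels].mean()

# Two samples, three classes; the network is fairly confident of the right class,
# so the loss is small but positive.
logits = np.array([[4.0, 0.0, 0.0],
                   [0.0, 4.0, 0.0]])
loss = softmax_loss(logits, np.array([0, 1]))
```

Minimizing this quantity over the network parameters, as step described above, drives the predicted class toward the labeled user ID.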
In some optional implementations, the preset user signature image database includes user fingerprint features, and the fingerprint features correspond to the user signature image features one by one, and before step S208, the electronic device may perform the following steps:
acquiring a fingerprint image to be verified, wherein the fingerprint image and the handwritten signature image to be verified originate from the same carrier;
inputting the fingerprint image into a pre-trained fingerprint feature extraction model for feature extraction to obtain fingerprint features to be checked;
searching a preset user signature image database according to the nearest user signature image characteristics to obtain user fingerprint characteristics corresponding to the nearest user signature image characteristics;
calculating the similarity of the fingerprint features between the fingerprint features to be checked and the user fingerprint features;
comparing the fingerprint feature similarity with a preset third threshold value, and determining that the handwritten signature image to be verified passes verification when the fingerprint feature similarity is larger than the preset third threshold value and the face feature similarity is larger than the preset second threshold value.
In some scenarios, the signing of an important document requires not only a handwritten signature but also the signer's fingerprint. To prevent imitation of handwriting, the fingerprint is verified at the same time. Fingerprint feature extraction is performed by a pre-trained fingerprint feature extraction model, which is based on a first convolutional neural network model.
In the preset user signature image database, the user fingerprint features correspond one-to-one with the user signature image features. The corresponding user fingerprint features are obtained according to the closest user signature image features, and the fingerprint feature similarity between the fingerprint features to be verified and the user fingerprint features is calculated, for example as the Euclidean distance between the two feature vectors.
Comparing the fingerprint feature similarity with a preset third threshold value, and determining that the handwritten signature image to be verified passes verification when the fingerprint feature similarity is larger than the preset third threshold value and the face feature similarity is larger than the preset second threshold value.
And when the signature is verified, the fingerprint is verified at the same time, so that the condition that handwriting is imitated but is considered as an effective signature can be avoided, and the accuracy of signature verification is improved.
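With the fingerprint branch added, the final decision becomes a conjunction: after the signature gate of step S205, verification passes only when both the fingerprint similarity exceeds the third threshold and the face similarity exceeds the second threshold. A sketch with illustrative threshold values:

```python
def verify_with_fingerprint(face_sim, fp_sim,
                            second_threshold=0.8, third_threshold=0.8):
    # Verification passes only when BOTH the fingerprint feature similarity
    # exceeds the preset third threshold AND the face feature similarity
    # exceeds the preset second threshold.
    return fp_sim > third_threshold and face_sim > second_threshold

passed = verify_with_fingerprint(face_sim=0.90, fp_sim=0.85)
forged = verify_with_fingerprint(face_sim=0.90, fp_sim=0.40)  # imitated handwriting,
                                                              # wrong fingerprint
```

The second call illustrates the case this variant is designed to catch: handwriting and face may be convincingly imitated or coerced, but the fingerprint check still fails.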
In some alternative implementations, before the step of inputting the fingerprint image into a pre-trained fingerprint feature extraction model to perform feature extraction, the electronic device may perform the following steps:
acquiring fingerprint image training samples, wherein the fingerprint image training samples are N fingerprint images marked with user IDs;
Inputting the fingerprint image training sample into a first convolutional neural network model to obtain N fingerprint prediction results output by the first convolutional neural network model in response to the fingerprint image training sample;
comparing whether the N fingerprint prediction results are consistent with the labels or not through a softmax loss function;
and adjusting parameters of each node of the first convolutional neural network model until the loss function reaches the minimum, and obtaining a trained fingerprint feature extraction model.
The training of the first convolutional neural network is supervised: fingerprint images labeled with user identities are input into the first convolutional neural network, and the parameters of each node of the first convolutional neural network are adjusted so that the fingerprint prediction results output by the network are consistent with the labeled results. The output layer of the first convolutional neural network is a softmax output layer, and convergence is measured by the softmax loss. When the softmax loss reaches its minimum, training of the first convolutional neural network ends, and the structure before the output layer of the trained first convolutional neural network forms the fingerprint feature extraction model.
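The idea that the feature extractor is "the structure before the output layer" can be shown with a toy two-layer network in NumPy (the layer sizes and random weights are illustrative; a real model would be a trained convolutional network):

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy two-layer network: hidden layer -> softmax output layer.
W1 = rng.normal(size=(8, 4))  # hidden layer weights (input dim 8 -> feature dim 4)
W2 = rng.normal(size=(4, 3))  # output layer weights (feature dim 4 -> 3 classes)

def features(x):
    # "Structure before the output layer": the hidden activation is the feature.
    return np.maximum(x @ W1, 0.0)  # ReLU hidden layer

def predict(x):
    # The softmax output layer is only needed during training/classification.
    h = features(x) @ W2
    e = np.exp(h - h.max())
    return e / e.sum()  # class probabilities

x = rng.normal(size=8)
feat = features(x)   # 4-dimensional feature vector used for retrieval/matching
probs = predict(x)   # class distribution, discarded after training
```

After training, only `features` is kept: its output is what gets stored in the database and compared by Euclidean distance.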
In some optional implementations, before the step of inputting the face image to be verified into a pre-trained face feature extraction model to perform feature extraction to obtain the face feature to be verified, the electronic device may perform the following steps:
Acquiring face image training samples, wherein the face image training samples are N face images marked with user IDs;
inputting the face image training sample into a second convolutional neural network model to obtain N face prediction results output by the second convolutional neural network model in response to the face image training sample;
comparing whether the N face prediction results are consistent with the labels or not through a softmax loss function;
and adjusting parameters of each node of the second convolutional neural network model until the loss function reaches the minimum, and obtaining a trained face feature extraction model.
The training of the second convolutional neural network is supervised: face images labeled with user identities are input into the second convolutional neural network, and the parameters of each node of the second convolutional neural network are adjusted so that the face prediction results output by the network are consistent with the labeled results. The output layer of the second convolutional neural network is a softmax output layer, and convergence is measured by the softmax loss. When the softmax loss reaches its minimum, training of the second convolutional neural network ends, and the structure before the output layer of the trained second convolutional neural network forms the face feature extraction model.
It should be emphasized that, to further ensure the privacy and security of the handwritten signature information, the handwritten signature image to be verified may also be stored in a node of a blockchain.
The blockchain referred to in this application is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, and encryption algorithms. A blockchain is essentially a decentralized database: a chain of data blocks linked by cryptographic means, where each data block contains a batch of network transaction information used to verify the validity of the information (anti-counterfeiting) and to generate the next block. The blockchain may include a blockchain underlying platform, a platform product services layer, an application services layer, and the like.
The subject application is operational with numerous general purpose or special purpose computer system environments or configurations. For example: personal computers, server computers, hand-held or portable devices, tablet devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like. The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by computer readable instructions stored in a computer readable storage medium that, when executed, may comprise the steps of the embodiments of the methods described above. The storage medium may be a nonvolatile storage medium such as a magnetic disk, an optical disk, a Read-Only Memory (ROM), or a random access Memory (Random Access Memory, RAM).
It should be understood that, although the steps in the flowcharts of the figures are shown in order as indicated by the arrows, these steps are not necessarily performed in order as indicated by the arrows. The steps are not strictly limited in order and may be performed in other orders, unless explicitly stated herein. Moreover, at least some of the steps in the flowcharts of the figures may include a plurality of sub-steps or stages that are not necessarily performed at the same time, but may be performed at different times, the order of their execution not necessarily being sequential, but may be performed in turn or alternately with other steps or at least a portion of the other steps or stages.
With further reference to fig. 4, as an implementation of the method shown in fig. 2, the present application provides an embodiment of a handwritten signature verification apparatus, where an embodiment of the apparatus corresponds to the embodiment of the method shown in fig. 2, and the apparatus may be specifically applied to various electronic devices.
As shown in fig. 4, the handwritten signature verification apparatus 400 according to the embodiment includes: an acquisition module 401, a first extraction module 402, a first retrieval module 403, a first calculation module 404, a second retrieval module 405, a second extraction module 406, a second calculation module 407, and a determination module 408. Wherein:
an obtaining module 401, configured to obtain a handwritten signature image to be verified and a face image to be verified, where the handwritten signature image to be verified and the face image to be verified originate from the same carrier;
a first extraction module 402, configured to input the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction, so as to obtain handwritten signature image features;
a first retrieving module 403, configured to retrieve a preset user signature image database by adopting an annoy searching algorithm according to the handwritten signature image feature, so as to obtain a user signature image feature closest to the handwritten signature image feature;
A first calculation module 404, configured to calculate a signature feature similarity between the handwritten signature image feature and the closest user signature image feature;
a second retrieving module 405, configured to compare the signature feature similarity with a preset first threshold, and retrieve a preset user signature image database according to the closest user signature image feature to obtain a closest user face feature when the signature feature similarity is greater than the preset first threshold, where the user signature image feature in the preset user signature image database corresponds to the user face feature one by one;
a second extraction module 406, configured to input the face image to be verified to a pre-trained face feature extraction model to perform face feature extraction, so as to obtain a face feature to be verified;
a second calculating module 407, configured to calculate a face feature similarity between the face feature to be checked and the closest face feature of the user;
the determining module 408 is configured to compare the facial feature similarity with a preset second threshold, and determine that the handwritten signature image to be verified passes verification when the facial feature similarity is greater than the preset second threshold.
In this embodiment, a handwritten signature image to be verified and a face image to be verified are obtained, wherein the handwritten signature image to be verified and the face image to be verified are derived from the same carrier; the handwritten signature image to be verified is input into a pre-trained deep learning neural network model for feature extraction to obtain handwritten signature image features; a preset user signature image database is searched using the Annoy search algorithm according to the handwritten signature image features to obtain the user signature image features closest to the handwritten signature image features; a signature feature similarity between the handwritten signature image features and the closest user signature image features is calculated; the signature feature similarity is compared with a preset first threshold, and when the signature feature similarity is greater than the preset first threshold, the preset user signature image database is searched according to the closest user signature image features to obtain the closest user face features, wherein the user signature image features in the preset user signature image database correspond one-to-one with the user face features; the face image to be verified is input into a pre-trained face feature extraction model for face feature extraction to obtain the face features to be verified; a face feature similarity between the face features to be verified and the closest user face features is calculated; and the face feature similarity is compared with a preset second threshold, with the handwritten signature image to be verified determined to pass verification when the face feature similarity is greater than the preset second threshold.
The validity of the handwritten signature is determined by comparing the handwritten signature image features with a preset user signature image database, which largely avoids the risk of the signature being forged; and by combining the Annoy algorithm with similarity calculation, both the speed and the accuracy of handwritten signature verification are taken into account.
In some optional implementations of this embodiment, the handwritten signature verification apparatus 400 further includes:
the first acquisition sub-module is used for acquiring signature image training samples, wherein the signature image training samples are N handwritten signature images marked with user IDs;
the first prediction submodule is used for inputting the signature image training sample into a deep learning neural network model to obtain N signature prediction results output by the deep learning neural network model in response to the signature image training sample;
a first comparison sub-module, configured to compare whether the N signature predictions and the labels are consistent through a softmax loss function, where the softmax loss function is:
L = -(1/N) * sum_{i=1}^{N} log( exp(h_{yi}) / sum_{j=1}^{C} exp(h_j) );
wherein N is the number of training samples, yi is the labeled class of the i-th sample, h = (h1, h2, ..., hC) is the prediction vector for sample i, and C is the total number of classes;
and the first adjusting sub-module is used for adjusting parameters of each node of the deep learning neural network model until the loss function reaches the minimum, and obtaining the trained deep learning neural network model.
In some optional implementations of this embodiment, the handwritten signature verification apparatus 400 further includes:
The second acquisition submodule is used for acquiring a fingerprint image to be verified, and the fingerprint image and the handwritten signature image to be verified are derived from the same carrier;
the first extraction submodule is used for inputting the fingerprint image into a pre-trained fingerprint feature extraction model to perform feature extraction so as to obtain fingerprint features to be checked;
the first retrieval sub-module is used for retrieving a preset user signature image database according to the nearest user signature image characteristics to obtain user fingerprint characteristics corresponding to the nearest user signature image characteristics;
the first computing sub-module is used for computing the similarity of the fingerprint characteristics between the fingerprint characteristics to be verified and the user fingerprint characteristics;
and the first determining submodule is used for comparing the fingerprint feature similarity with a preset third threshold value, and determining that the handwritten signature image to be verified passes verification when the fingerprint feature similarity is larger than the preset third threshold value and the face feature similarity is larger than the preset second threshold value.
In some optional implementations of this embodiment, the handwritten signature verification apparatus 400 further includes:
the third acquisition sub-module is used for acquiring fingerprint image training samples, wherein the fingerprint image training samples are N fingerprint images marked with user IDs;
The second prediction submodule is used for inputting the fingerprint image training sample into a first convolutional neural network model to obtain N fingerprint prediction results output by the first convolutional neural network model in response to the fingerprint image training sample;
the second comparison sub-module is used for comparing whether the N fingerprint prediction results are consistent with the labels or not through a softmax loss function;
and the second adjusting sub-module is used for adjusting the parameters of each node of the first convolutional neural network model until the loss function reaches the minimum, and obtaining a trained fingerprint feature extraction model.
In some optional implementations of this embodiment, the handwritten signature verification apparatus 400 further includes:
a fifth obtaining sub-module, configured to obtain a face image training sample, where the face image training sample is N face images labeled with user IDs;
the third prediction submodule is used for inputting the face image training sample into a second convolutional neural network model to obtain N face prediction results output by the second convolutional neural network model in response to the face image training sample;
the third comparison sub-module is used for comparing whether the N face prediction results are consistent with the labels or not through a softmax loss function;
And the third adjustment sub-module is used for adjusting the parameters of each node of the second convolutional neural network model until the loss function reaches the minimum, and obtaining a trained face feature extraction model.
In some optional implementations of this embodiment, the handwritten signature verification apparatus 400 further includes:
and the storage module is used for storing the handwritten signature image to be verified and the face image to be verified into a blockchain.
In order to solve the technical problems, the embodiment of the application also provides computer equipment. Referring specifically to fig. 5, fig. 5 is a basic structural block diagram of a computer device according to the present embodiment.
The computer device 5 comprises a memory 51, a processor 52, and a network interface 53 which are communicatively connected to each other via a system bus. It should be noted that only the computer device 5 with components 51-53 is shown in the figure, but it should be understood that not all of the illustrated components are required to be implemented, and that more or fewer components may be implemented instead. It will be appreciated by those skilled in the art that the computer device here is a device capable of automatically performing numerical calculations and/or information processing in accordance with predetermined or stored instructions, the hardware of which includes, but is not limited to, microprocessors, application-specific integrated circuits (Application Specific Integrated Circuit, ASIC), field-programmable gate arrays (Field-Programmable Gate Array, FPGA), digital signal processors (Digital Signal Processor, DSP), embedded devices, and so on.
The computer equipment can be a desktop computer, a notebook computer, a palm computer, a cloud server and other computing equipment. The computer equipment can perform man-machine interaction with a user through a keyboard, a mouse, a remote controller, a touch pad or voice control equipment and the like.
The memory 51 includes at least one type of readable storage medium, including flash memory, hard disk, multimedia card, card-type memory (e.g., SD or DX memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, magnetic disk, optical disk, etc. In some embodiments, the memory 51 may be an internal storage unit of the computer device 5, such as a hard disk or memory of the computer device 5. In other embodiments, the memory 51 may also be an external storage device of the computer device 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the computer device 5. Of course, the memory 51 may also comprise both an internal storage unit of the computer device 5 and an external storage device. In this embodiment, the memory 51 is typically used to store the operating system and various application software installed on the computer device 5, such as computer readable instructions of the handwritten signature verification method. Further, the memory 51 may be used to temporarily store various types of data that have been output or are to be output.
The processor 52 may be a central processing unit (Central Processing Unit, CPU), controller, microcontroller, microprocessor, or other data processing chip in some embodiments. The processor 52 is typically used to control the overall operation of the computer device 5. In this embodiment, the processor 52 is configured to execute computer readable instructions stored in the memory 51 or process data, such as computer readable instructions for executing the handwritten signature verification method.
The network interface 53 may comprise a wireless network interface or a wired network interface, which network interface 53 is typically used to establish communication connections between the computer device 5 and other electronic devices.
The method comprises: obtaining a handwritten signature image to be verified and a face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified are derived from the same carrier; inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction to obtain handwritten signature image features; searching a preset user signature image database using the Annoy search algorithm according to the handwritten signature image features to obtain the user signature image features closest to the handwritten signature image features; calculating a signature feature similarity between the handwritten signature image features and the closest user signature image features; comparing the signature feature similarity with a preset first threshold, and when the signature feature similarity is greater than the preset first threshold, searching the preset user signature image database according to the closest user signature image features to obtain the closest user face features, wherein the user signature image features in the preset user signature image database correspond one-to-one with the user face features; inputting the face image to be verified into a pre-trained face feature extraction model for face feature extraction to obtain the face features to be verified; calculating a face feature similarity between the face features to be verified and the closest user face features; and comparing the face feature similarity with a preset second threshold, and determining that the handwritten signature image to be verified passes verification when the face feature similarity is greater than the preset second threshold.
Because the validity of the handwritten signature is determined by comparing the handwritten signature image features against a preset user signature image database, the risk of the signature being falsified is largely avoided, and the combination of the annoy algorithm with similarity calculation balances both the speed and the accuracy of handwritten signature verification.
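The two-stage decision described above can be sketched as follows. This is an illustrative reading, not the patented implementation: the similarity metric (cosine) and the threshold values are assumptions, since the text leaves both unspecified.

```python
import math

def cosine_similarity(a, b):
    # Similarity between two feature vectors; the patent does not fix
    # the metric, so cosine similarity is an illustrative assumption.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def verify(sig_feat, nearest_sig_feat, face_feat, nearest_face_feat,
           sig_threshold=0.8, face_threshold=0.9):
    # Stage 1: the signature features must be close enough to the
    # nearest enrolled signature features, else verification fails.
    if cosine_similarity(sig_feat, nearest_sig_feat) <= sig_threshold:
        return False
    # Stage 2: the face image from the same carrier must match the
    # face features enrolled together with that signature.
    return cosine_similarity(face_feat, nearest_face_feat) > face_threshold
```

The early return in stage 1 mirrors the claim's ordering: the face model is only consulted after the signature similarity clears the first threshold.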
The present application also provides another embodiment, namely, a computer-readable storage medium storing computer-readable instructions executable by at least one processor to cause the at least one processor to perform the steps of the handwritten signature verification method as described above.
From the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments may be implemented by software plus a necessary general-purpose hardware platform, or alternatively by hardware, although in many cases the former is preferred. Based on this understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and comprising several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the method described in the embodiments of the present application.
It is apparent that the embodiments described above are only some, not all, of the embodiments of the present application; the drawings show preferred embodiments but do not limit the patent scope of the application. The application may be embodied in many different forms; these embodiments are provided so that the disclosure will be thorough and complete. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described in the foregoing embodiments or substitute equivalents for some of their elements. All equivalent structures made according to the specification and drawings of the application, applied directly or indirectly in other related technical fields, likewise fall within the protection scope of the application.
Claims (10)
1. A method for verifying a handwritten signature, comprising the steps of:
acquiring a handwritten signature image to be verified and a face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified are derived from the same carrier;
inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction to obtain handwritten signature image features;
retrieving a preset user signature image database by adopting an annoy search algorithm according to the handwritten signature image features to obtain the user signature image features closest to the handwritten signature image features;
calculating signature feature similarity between the handwritten signature image features and the closest user signature image features;
comparing the signature feature similarity with a preset first threshold value, and searching a preset user signature image database according to the nearest user signature image feature when the signature feature similarity is larger than the preset first threshold value to obtain the nearest user face feature, wherein the user signature image features in the preset user signature image database are in one-to-one correspondence with the user face features;
inputting the face image to be verified into a pre-trained face feature extraction model for face feature extraction to obtain face features to be verified;
calculating the face feature similarity between the face features to be verified and the nearest user face features;
comparing the facial feature similarity with a preset second threshold value, and determining that the handwritten signature image to be verified passes verification when the facial feature similarity is larger than the preset second threshold value.
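The retrieval step of claim 1 names the annoy (Approximate Nearest Neighbors Oh Yeah) algorithm, which indexes feature vectors with random-projection trees for fast approximate lookup. The sketch below substitutes an exhaustive scan so it stays self-contained; in practice the annoy library's index would replace this loop, and the Euclidean distance metric is an assumption.

```python
def nearest_neighbor(query, database):
    # Exhaustive stand-in for the annoy search: scan every enrolled
    # user's signature features and keep the one with the smallest
    # Euclidean distance to the query features.
    best_id, best_dist = None, float("inf")
    for user_id, feat in database.items():
        dist = sum((q - f) ** 2 for q, f in zip(query, feat)) ** 0.5
        if dist < best_dist:
            best_id, best_dist = user_id, dist
    return best_id, best_dist
```

Because the database keys the signature features by user, the returned user ID also selects the face features that claim 1 says are in one-to-one correspondence with them.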
2. The handwritten signature verification method according to claim 1, wherein before the step of inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction and obtaining handwritten signature image features, the method further comprises:
acquiring signature image training samples, wherein the signature image training samples are N handwritten signature images marked with user IDs;
inputting the signature image training sample into a deep learning neural network model to obtain N signature prediction results output by the deep learning neural network model in response to the signature image training sample;
comparing, through a softmax loss function, whether the N signature prediction results are consistent with the labels, wherein the softmax loss function is:

$$L = -\frac{1}{N}\sum_{i=1}^{N}\log\frac{e^{h_{y_i}}}{\sum_{c=1}^{C}e^{h_c}}$$

wherein N is the number of training samples, y_i is the labeled result corresponding to the i-th sample, h = (h_1, h_2, ..., h_C) is the prediction result for sample i, and C is the number of all classifications;
and adjusting parameters of each node of the deep learning neural network model until the loss function reaches the minimum, and obtaining the trained deep learning neural network model.
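The training loss of claim 2 is the standard softmax cross-entropy averaged over the N samples; a minimal numeric sketch (the function name and the raw-score inputs are illustrative, not from the patent):

```python
import math

def softmax_loss(predictions, labels):
    # Mean cross-entropy over N samples: for each sample, apply
    # softmax to its raw scores h = (h_1, ..., h_C) and take -log of
    # the probability assigned to the labeled class y_i.
    total = 0.0
    for h, y in zip(predictions, labels):
        denom = sum(math.exp(hc) for hc in h)
        total += -math.log(math.exp(h[y]) / denom)
    return total / len(labels)
```

With uniform scores over two classes the loss is log 2 per sample, and it shrinks toward zero as the score of the labeled class dominates, which is what driving the loss "to the minimum" during parameter adjustment means.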
3. The method for verifying a handwritten signature according to claim 1, wherein the preset user signature image database contains user fingerprint features, the fingerprint features are in one-to-one correspondence with user signature image features, and before the step of comparing the face feature similarity with a preset second threshold value and determining that the handwritten signature image to be verified passes the verification when the face feature similarity is greater than the preset second threshold value, the method further comprises:
acquiring a fingerprint image to be verified, wherein the fingerprint image and the handwritten signature image to be verified originate from the same carrier;
inputting the fingerprint image into a pre-trained fingerprint feature extraction model for feature extraction to obtain fingerprint features to be verified;
searching a preset user signature image database according to the nearest user signature image characteristics to obtain user fingerprint characteristics corresponding to the nearest user signature image characteristics;
calculating the fingerprint feature similarity between the fingerprint features to be verified and the user fingerprint features;
comparing the fingerprint feature similarity with a preset third threshold value, and determining that the handwritten signature image to be verified passes verification when the fingerprint feature similarity is larger than the preset third threshold value and the face feature similarity is larger than the preset second threshold value.
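The decision rule of claim 3 conjoins the two biometric checks: a minimal sketch, with threshold values chosen purely for illustration:

```python
def verify_with_fingerprint(fp_sim, face_sim,
                            fp_threshold=0.85, face_threshold=0.9):
    # Claim 3 tightens the decision of claim 1: the handwritten
    # signature passes only when BOTH the fingerprint similarity and
    # the face similarity clear their respective preset thresholds.
    return fp_sim > fp_threshold and face_sim > face_threshold
```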
4. A handwritten signature verification method according to claim 3, wherein said fingerprint feature extraction model is based on a first convolutional neural network model, and before said step of inputting said fingerprint image into a pre-trained fingerprint feature extraction model for feature extraction, further comprising:
acquiring fingerprint image training samples, wherein the fingerprint image training samples are N fingerprint images marked with user IDs;
inputting the fingerprint image training sample into a first convolutional neural network model to obtain N fingerprint prediction results output by the first convolutional neural network model in response to the fingerprint image training sample;
comparing whether the N fingerprint prediction results are consistent with the labels or not through a softmax loss function;
and adjusting parameters of each node of the first convolutional neural network model until the loss function reaches the minimum, and obtaining a trained fingerprint feature extraction model.
5. The method for verifying a handwritten signature according to claim 1, wherein the face feature extraction model is based on a second convolutional neural network model, and before the step of inputting the face image to be verified into a pre-trained face feature extraction model to perform feature extraction, the method further comprises:
acquiring face image training samples, wherein the face image training samples are N face images marked with user IDs;
inputting the face image training sample into a second convolutional neural network model to obtain N face prediction results output by the second convolutional neural network model in response to the face image training sample;
comparing whether the N face prediction results are consistent with the labels or not through a softmax loss function;
and adjusting parameters of each node of the second convolutional neural network model until the loss function reaches the minimum, and obtaining a trained face feature extraction model.
6. The method for verifying a handwritten signature according to claim 1, wherein after the step of acquiring a handwritten signature image to be verified and a face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified originate from the same carrier, the method further comprises:
and storing the handwritten signature image to be verified and the face image to be verified into a blockchain.
7. A handwritten signature verification apparatus, comprising:
the acquisition module is used for acquiring the handwritten signature image to be verified and the face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified are derived from the same carrier;
the first extraction module is used for inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model to perform feature extraction, so as to obtain the features of the handwritten signature image;
the first retrieval module is used for retrieving a preset user signature image database by adopting an annoy search algorithm according to the handwritten signature image features to obtain the user signature image features closest to the handwritten signature image features;
a first calculation module for calculating the signature feature similarity between the handwritten signature image features and the closest user signature image features;
the second retrieval module is used for comparing the signature feature similarity with a preset first threshold value, and retrieving a preset user signature image database according to the nearest user signature image features to obtain the nearest user face features when the signature feature similarity is larger than the preset first threshold value, wherein the user signature image features in the preset user signature image database are in one-to-one correspondence with the user face features;
the second extraction module is used for inputting the face image to be verified into a pre-trained face feature extraction model for face feature extraction to obtain the face features to be verified;
the second calculation module is used for calculating the face feature similarity between the face features to be verified and the nearest user face features;
and the determining module is used for comparing the facial feature similarity with a preset second threshold value, and determining that the handwritten signature image to be verified passes verification when the facial feature similarity is larger than the preset second threshold value.
8. The handwritten signature verification apparatus of claim 7, further comprising:
the first acquisition sub-module is used for acquiring signature image training samples, wherein the signature image training samples are N handwritten signature images marked with user IDs;
the first prediction submodule is used for inputting the signature image training sample into a deep learning neural network model to obtain N signature prediction results output by the deep learning neural network model in response to the signature image training sample;
a first comparison sub-module, configured to compare, through a softmax loss function, whether the N signature prediction results are consistent with the labels, wherein the softmax loss function is:

$$L = -\frac{1}{N}\sum_{i=1}^{N}\log\frac{e^{h_{y_i}}}{\sum_{c=1}^{C}e^{h_c}}$$

wherein N is the number of training samples, y_i is the labeled result corresponding to the i-th sample, h = (h_1, h_2, ..., h_C) is the prediction result for sample i, and C is the number of all classifications;
and the first adjusting sub-module is used for adjusting parameters of each node of the deep learning neural network model until the loss function reaches the minimum, and obtaining the trained deep learning neural network model.
9. A computer device comprising a memory and a processor, the memory having stored therein computer readable instructions which, when executed by the processor, implement the steps of the handwritten signature verification method as recited in any one of claims 1 to 6.
10. A computer readable storage medium having stored thereon computer readable instructions which when executed by a processor implement the steps of the handwritten signature verification method according to any of claims 1 to 6.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011609053.5A CN112733645B (en) | 2020-12-30 | 2020-12-30 | Handwritten signature verification method, handwritten signature verification device, computer equipment and storage medium |
PCT/CN2021/091266 WO2022142032A1 (en) | 2020-12-30 | 2021-04-30 | Handwritten signature verification method and apparatus, computer device, and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011609053.5A CN112733645B (en) | 2020-12-30 | 2020-12-30 | Handwritten signature verification method, handwritten signature verification device, computer equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112733645A CN112733645A (en) | 2021-04-30 |
CN112733645B true CN112733645B (en) | 2023-08-01 |
Family
ID=75610874
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011609053.5A Active CN112733645B (en) | 2020-12-30 | 2020-12-30 | Handwritten signature verification method, handwritten signature verification device, computer equipment and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112733645B (en) |
WO (1) | WO2022142032A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113255582A (en) * | 2021-06-21 | 2021-08-13 | 中国银行股份有限公司 | Handwriting identification method and device based on deep neural network and block chain |
CN113705560A (en) * | 2021-09-01 | 2021-11-26 | 平安医疗健康管理股份有限公司 | Data extraction method, device and equipment based on image recognition and storage medium |
US12051256B2 (en) | 2021-09-13 | 2024-07-30 | Microsoft Technology Licensing, Llc | Entry detection and recognition for custom forms |
CN117437650A (en) * | 2023-12-20 | 2024-01-23 | 山东山大鸥玛软件股份有限公司 | Handwriting signature comparison method, system, device and medium based on deep learning |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107302433A (en) * | 2016-04-15 | 2017-10-27 | 平安科技(深圳)有限公司 | Method of calibration, verification server and the user terminal of electronic signature |
CN109409254A (en) * | 2018-10-10 | 2019-03-01 | 成都优易数据有限公司 | A kind of electronic contract handwritten signature identification method based on twin neural network |
CN111428557A (en) * | 2020-02-18 | 2020-07-17 | 深圳壹账通智能科技有限公司 | Method and device for automatically checking handwritten signature based on neural network model |
WO2020211387A1 (en) * | 2019-04-18 | 2020-10-22 | 深圳壹账通智能科技有限公司 | Electronic contract displaying method and apparatus, electronic device, and computer readable storage medium |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103995995A (en) * | 2013-04-22 | 2014-08-20 | 厦门维觉电子科技有限公司 | Multimedia signature identification method and system |
CN105631272B (en) * | 2016-02-02 | 2018-05-11 | 云南大学 | A kind of identity identifying method of multiple security |
CN106779665A (en) * | 2016-11-23 | 2017-05-31 | 广东微模式软件股份有限公司 | A kind of POS enchashment methods based on human body biological characteristics identification with anti-repudiation technology |
CN108388813A (en) * | 2018-02-28 | 2018-08-10 | 中国平安财产保险股份有限公司 | Electronic endorsement method, user equipment, storage medium and device |
CN108595927B (en) * | 2018-04-04 | 2023-09-19 | 北京市商汤科技开发有限公司 | Identity authentication, unlocking and payment method and device, storage medium, product and equipment |
CN109523392B (en) * | 2018-10-19 | 2024-06-28 | 中国平安财产保险股份有限公司 | Signature file generation method, device, computer equipment and storage medium |
- 2020-12-30: CN application CN202011609053.5A filed; granted as CN112733645B (status: active)
- 2021-04-30: WO application PCT/CN2021/091266 filed (published as WO2022142032A1)
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107302433A (en) * | 2016-04-15 | 2017-10-27 | 平安科技(深圳)有限公司 | Method of calibration, verification server and the user terminal of electronic signature |
CN109409254A (en) * | 2018-10-10 | 2019-03-01 | 成都优易数据有限公司 | A kind of electronic contract handwritten signature identification method based on twin neural network |
WO2020211387A1 (en) * | 2019-04-18 | 2020-10-22 | 深圳壹账通智能科技有限公司 | Electronic contract displaying method and apparatus, electronic device, and computer readable storage medium |
CN111428557A (en) * | 2020-02-18 | 2020-07-17 | 深圳壹账通智能科技有限公司 | Method and device for automatically checking handwritten signature based on neural network model |
Also Published As
Publication number | Publication date |
---|---|
WO2022142032A1 (en) | 2022-07-07 |
CN112733645A (en) | 2021-04-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112733645B (en) | Handwritten signature verification method, handwritten signature verification device, computer equipment and storage medium | |
CN108629043B (en) | Webpage target information extraction method, device and storage medium | |
CN108564954B (en) | Deep neural network model, electronic device, identity verification method, and storage medium | |
CN113435583B (en) | Federal learning-based countermeasure generation network model training method and related equipment thereof | |
CN109543516A (en) | Signing intention judgment method, device, computer equipment and storage medium | |
CN112395979B (en) | Image-based health state identification method, device, equipment and storage medium | |
US20230032728A1 (en) | Method and apparatus for recognizing multimedia content | |
CN111898550B (en) | Expression recognition model building method and device, computer equipment and storage medium | |
CN112330331A (en) | Identity verification method, device and equipment based on face recognition and storage medium | |
CN112417887B (en) | Sensitive word and sentence recognition model processing method and related equipment thereof | |
CN112995414B (en) | Behavior quality inspection method, device, equipment and storage medium based on voice call | |
CN112668482B (en) | Face recognition training method, device, computer equipment and storage medium | |
CN113343898B (en) | Mask shielding face recognition method, device and equipment based on knowledge distillation network | |
CN113626704A (en) | Method, device and equipment for recommending information based on word2vec model | |
Thao et al. | Self-enhancing gps-based authentication using corresponding address | |
CN116935449A (en) | Fingerprint image matching model training method, fingerprint matching method and related medium | |
US11810388B1 (en) | Person re-identification method and apparatus based on deep learning network, device, and medium | |
CN111552865A (en) | User interest portrait method and related equipment | |
Sokolova et al. | Computation-efficient face recognition algorithm using a sequential analysis of high dimensional neural-net features | |
CN114282019A (en) | Target multimedia data searching method and device, computer equipment and storage medium | |
CN112417886B (en) | Method, device, computer equipment and storage medium for extracting intention entity information | |
CN117237757A (en) | Face recognition model training method and device, electronic equipment and medium | |
Punyani et al. | A comparison study of face, gait and speech features for age estimation | |
CN113259369B (en) | Data set authentication method and system based on machine learning member inference attack | |
CN114513578A (en) | Outbound method, device, computer equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||