CN112733645A - Handwritten signature verification method and device, computer equipment and storage medium - Google Patents

Handwritten signature verification method and device, computer equipment and storage medium

Info

Publication number
CN112733645A
CN112733645A
Authority
CN
China
Prior art keywords
image
verified
signature
face
signature image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011609053.5A
Other languages
Chinese (zh)
Other versions
CN112733645B (en)
Inventor
何小臻
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd
Priority to CN202011609053.5A (CN112733645B)
Priority to PCT/CN2021/091266 (WO2022142032A1)
Publication of CN112733645A
Application granted
Publication of CN112733645B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/30 - Writer recognition; Reading and verifying signatures
    • G06V40/33 - Writer recognition; Reading and verifying signatures based only on signature image, e.g. static signature recognition
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/22 - Matching criteria, e.g. proximity measures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/1347 - Preprocessing; Feature extraction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 - Feature extraction; Face representation
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Human Computer Interaction (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Multimedia (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Probability & Statistics with Applications (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The embodiment of the application belongs to the field of artificial intelligence and relates to a handwritten signature verification method, apparatus, computer device and storage medium, wherein the method comprises the following steps: acquiring a handwritten signature image to be verified and a face image to be verified; inputting the handwritten signature image into a deep learning neural network model for feature extraction to obtain handwritten signature image features; retrieving a preset user signature image database and acquiring the closest user signature image features; calculating the signature feature similarity between the two; comparing the signature feature similarity with a first threshold and, when it is greater than the first threshold, searching the user signature image database to obtain the closest user face features; calculating the face feature similarity between the face image to be verified and those face features; and comparing the face feature similarity with a second threshold, and determining that the handwritten signature image to be verified passes verification when it is greater than the second threshold. By simultaneously comparing both the handwritten signature image and the face image against preset data, the validity of the signature is determined and impersonation of the signature can be prevented.

Description

Handwritten signature verification method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of artificial intelligence technologies, and in particular, to a method and an apparatus for verifying a handwritten signature, a computer device, and a storage medium.
Background
The handwritten signature is an important means of verifying a user's genuine intention to sign. The rapid development of internet applications such as internet finance, B2B electronic commerce, tourism and education has driven demand for online electronic signatures, and more and more internet platforms are actively seeking legal and effective online handwritten electronic signature verification schemes so that internet business activities can comply with the law.
At present, most mainstream handwritten signature verification schemes adopt handwriting recognition: the recognition result is compared with the name based on the spatial relation of the handwriting trajectory or its pixels, so the risk of a signature being impersonated cannot be avoided.
Disclosure of Invention
The embodiment of the application aims to provide a handwritten signature verification method, a handwritten signature verification apparatus, a computer device and a storage medium, so as to solve the problem of a signature being impersonated.
In order to solve the above technical problem, an embodiment of the present application provides a handwritten signature verification method, which adopts the following technical solutions:
acquiring a handwritten signature image to be verified and a face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified are from the same carrier;
inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction to obtain the features of the handwritten signature image;
retrieving a preset user signature image database by adopting an Annoy search algorithm according to the handwritten signature image features, and acquiring the user signature image features closest to the handwritten signature image features;
calculating signature feature similarity between the handwritten signature image features and the closest user signature image features;
comparing the signature feature similarity with a preset first threshold, and when the signature feature similarity is greater than the preset first threshold, retrieving a preset user signature image database according to the closest user signature image feature to obtain the closest user face feature, wherein the user signature image features in the preset user signature image database correspond to the user face features one to one;
inputting the face image to be verified into a pre-trained face feature extraction model for face feature extraction to obtain the face feature to be verified;
calculating the face feature similarity between the face feature to be verified and the closest user face feature;
and comparing the face feature similarity with a preset second threshold, and determining that the handwritten signature image to be verified passes verification when the face feature similarity is greater than the preset second threshold.
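Schematically, the two-stage decision described in the steps above can be sketched in a few lines of Python. This is an illustrative sketch only: the specification does not fix the similarity measure or the threshold values, so cosine similarity and the numeric thresholds below are assumptions, and the function names are hypothetical.

```python
import math

def cosine_similarity(a, b):
    # Similarity of two feature vectors; 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify_signature(sig_feat, closest_sig_feat, face_feat, closest_face_feat,
                     t1=0.8, t2=0.9):
    # Stage 1: the signature features must be close enough to the
    # retrieved database entry (first threshold).
    if cosine_similarity(sig_feat, closest_sig_feat) <= t1:
        return False
    # Stage 2: the face captured at signing time must match the face
    # features stored for the same user (second threshold).
    return cosine_similarity(face_feat, closest_face_feat) > t2
```

Only when both thresholds are cleared is the handwritten signature image considered verified, which is what prevents a similar-looking forgery from passing on the signature check alone.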
Further, before the step of inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction, obtaining the features of the handwritten signature image, the method further includes:
acquiring signature image training samples, wherein the signature image training samples are N handwritten signature images marked with user IDs;
inputting the signature image training sample into a deep learning neural network model to obtain N signature prediction results output by the deep learning neural network model in response to the signature image training sample;
comparing whether the N signature prediction results are consistent with the labels through a softmax loss function, wherein the softmax loss function is as follows:
$$L = -\frac{1}{N}\sum_{i=1}^{N}\log\frac{e^{h_{y_i}}}{\sum_{j=1}^{C} e^{h_j}}$$
where $N$ is the number of training samples, $y_i$ is the labeled result of the $i$-th sample, $h = (h_1, h_2, \ldots, h_C)$ is the prediction result for sample $i$, and $C$ is the number of classes;
and adjusting parameters of each node of the deep learning neural network model until the loss function reaches the minimum, and obtaining the trained deep learning neural network model.
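As a concrete check of the loss defined above, here is a minimal pure-Python computation of the softmax cross-entropy (illustrative only; the patent does not publish its training code, and the function name is hypothetical):

```python
import math

def softmax_loss(predictions, labels):
    # predictions: list of N score vectors h = (h1, ..., hC)
    # labels: list of N class indices y_i
    # Returns the mean negative log-likelihood under the softmax distribution.
    total = 0.0
    for h, y in zip(predictions, labels):
        m = max(h)                           # subtract max for numerical stability
        exps = [math.exp(v - m) for v in h]
        log_prob = (h[y] - m) - math.log(sum(exps))
        total -= log_prob
    return total / len(predictions)
```

Minimizing this quantity over the network parameters is exactly the "adjust parameters until the loss function reaches the minimum" step; deep learning frameworks provide it as a built-in (e.g. PyTorch's `CrossEntropyLoss`).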
Further, the preset user signature image database includes user fingerprint features, the fingerprint features correspond to the user signature image features one to one, and the method further includes, before the step of comparing the face feature similarity with a preset second threshold value, and when the face feature similarity is greater than the preset second threshold value, determining that the handwritten signature image to be verified passes verification:
acquiring a fingerprint image to be verified, wherein the fingerprint image and the handwritten signature image to be verified originate from the same carrier;
inputting the fingerprint image into a pre-trained fingerprint feature extraction model for feature extraction to obtain the fingerprint features to be verified;
retrieving a preset user signature image database according to the closest user signature image characteristic to obtain a user fingerprint characteristic corresponding to the closest user signature image characteristic;
calculating fingerprint feature similarity between the fingerprint features to be verified and the user fingerprint features;
and comparing the fingerprint feature similarity with a preset third threshold, and when the fingerprint feature similarity is greater than the preset third threshold and the face feature similarity is greater than the preset second threshold, determining that the handwritten signature image to be verified passes verification.
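With the fingerprint branch added, the final decision becomes a conjunction of thresholded similarities. A schematic sketch (the threshold values and the function name are assumptions, not values from the specification):

```python
def multi_factor_verdict(sig_sim, fp_sim, face_sim,
                         t1=0.8, t3=0.85, t2=0.9):
    # The signature similarity gates the rest: if it fails the first
    # threshold, neither biometric check is reached.
    if sig_sim <= t1:
        return False
    # Both the fingerprint and the face similarities must clear
    # their respective thresholds for verification to pass.
    return fp_sim > t3 and face_sim > t2
```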
Further, the fingerprint feature extraction model is based on a first convolutional neural network model, and before the step of inputting the fingerprint image into a pre-trained fingerprint feature extraction model for feature extraction to obtain the fingerprint features to be verified, the method further includes:
acquiring fingerprint image training samples, wherein the fingerprint image training samples are N fingerprint images marked with user IDs;
inputting the fingerprint image training sample into a first convolution neural network model to obtain N fingerprint prediction results output by the first convolution neural network model in response to the fingerprint image training sample;
comparing whether the N fingerprint prediction results are consistent with the labels or not through a softmax loss function;
and adjusting parameters of each node of the first convolution neural network model until the loss function reaches the minimum, and obtaining a trained fingerprint feature extraction model.
Further, the face feature extraction model is based on a second convolutional neural network model, and before the step of inputting the face image to be verified into a pre-trained face feature extraction model for feature extraction to obtain the face feature to be verified, the method further includes:
acquiring face image training samples, wherein the face image training samples are N face images marked with user IDs;
inputting the face image training sample into a second convolutional neural network model to obtain N face prediction results output by the second convolutional neural network model in response to the face image training sample;
comparing whether the N face prediction results are consistent with the labels through a softmax loss function;
and adjusting parameters of each node of the second convolutional neural network model until the loss function reaches the minimum, and obtaining a trained face feature extraction model.
Further, after the step of obtaining the handwritten signature image to be verified and the face image to be verified, where the handwritten signature image to be verified and the face image to be verified originate from the same carrier, the method further includes:
and storing the handwritten signature image to be verified and the face image to be verified into a block chain.
In order to solve the above technical problem, an embodiment of the present application further provides a handwritten signature verification apparatus, which adopts the following technical solutions:
the system comprises an acquisition module, a verification module and a verification module, wherein the acquisition module is used for acquiring a handwritten signature image to be verified and a face image to be verified, and the handwritten signature image to be verified and the face image to be verified are from the same carrier;
the first extraction module is used for inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction to obtain the features of the handwritten signature image;
the first retrieval module is used for retrieving a preset user signature image database by adopting an Annoy search algorithm according to the handwritten signature image features to acquire the user signature image features closest to the handwritten signature image features;
the first calculation module is used for calculating the signature feature similarity between the handwritten signature image features and the closest user signature image features;
the second retrieval module is used for comparing the signature feature similarity with a preset first threshold, and retrieving a preset user signature image database according to the closest user signature image feature to obtain the closest user face feature when the signature feature similarity is larger than the preset first threshold, wherein the user signature image features in the preset user signature image database correspond to the user face features one to one;
the second extraction module is used for inputting the face image to be verified into a pre-trained face feature extraction model for face feature extraction to obtain the face feature to be verified;
the second calculation module is used for calculating the face feature similarity between the face feature to be verified and the closest user face feature;
and the determining module is used for comparing the face feature similarity with a preset second threshold value, and when the face feature similarity is greater than the preset second threshold value, determining that the handwritten signature image to be verified passes verification.
Further, the handwritten signature verification apparatus further includes:
the first acquisition submodule is used for acquiring signature image training samples, wherein the signature image training samples are N handwritten signature images marked with user IDs;
the first prediction submodule is used for inputting the signature image training samples into a deep learning neural network model to obtain N signature prediction results output by the deep learning neural network model in response to the signature image training samples;
a first comparison module, configured to compare whether the N signature prediction results and the labels are consistent through a softmax loss function, where the softmax loss function is:
$$L = -\frac{1}{N}\sum_{i=1}^{N}\log\frac{e^{h_{y_i}}}{\sum_{j=1}^{C} e^{h_j}}$$
where $N$ is the number of training samples, $y_i$ is the labeled result of the $i$-th sample, $h = (h_1, h_2, \ldots, h_C)$ is the prediction result for sample $i$, and $C$ is the number of classes;
and the first adjusting submodule is used for adjusting the parameters of each node of the deep learning neural network model until the loss function is the minimum, so that the trained deep learning neural network model is obtained.
Further, the preset user signature image database contains user fingerprint features, the fingerprint features are in one-to-one correspondence with the user signature image features, and the handwritten signature verification device further includes:
the second acquisition submodule is used for acquiring a fingerprint image to be verified, and the fingerprint image and the handwritten signature image to be verified originate from the same carrier;
the first extraction submodule is used for inputting the fingerprint image into a pre-trained fingerprint feature extraction model for feature extraction to obtain fingerprint features to be verified;
the first retrieval submodule is used for retrieving a preset user signature image database according to the closest user signature image characteristic to obtain a user fingerprint characteristic corresponding to the closest user signature image characteristic;
the first calculation submodule is used for calculating the fingerprint feature similarity between the fingerprint feature to be verified and the user fingerprint feature;
and the first determining submodule is used for comparing the fingerprint feature similarity with a preset third threshold, and when the fingerprint feature similarity is greater than the preset third threshold and the face feature similarity is greater than a preset second threshold, determining that the handwritten signature image to be verified passes verification.
Further, the fingerprint feature extraction model is based on a first convolutional neural network model, and the handwritten signature verification device further includes:
the third acquisition submodule is used for acquiring fingerprint image training samples, wherein the fingerprint image training samples are N fingerprint images marked with user IDs;
the second prediction sub-module is used for inputting the fingerprint image training sample into a first convolution neural network model to obtain N fingerprint prediction results output by the first convolution neural network model in response to the fingerprint image training sample;
the second comparison submodule is used for comparing whether the N fingerprint prediction results are consistent with the labels or not through a softmax loss function;
and the second adjusting submodule is used for adjusting the parameters of each node of the first convolution neural network model until the loss function reaches the minimum, and obtaining the trained fingerprint feature extraction model.
Further, the face feature extraction model is based on a second convolutional neural network model, and the handwritten signature verification device further includes:
a fifth obtaining submodule, configured to obtain face image training samples, where the face image training samples are N face images labeled with user IDs;
the third prediction sub-module is used for inputting the face image training sample into a second convolutional neural network model to obtain N face prediction results output by the second convolutional neural network model in response to the face image training sample;
the third comparison submodule is used for comparing whether the N face prediction results are consistent with the labels through a softmax loss function;
and the third adjusting submodule is used for adjusting the parameters of each node of the second convolutional neural network model until the loss function reaches the minimum, and obtaining the trained face feature extraction model.
Further, the handwritten signature verification apparatus further includes:
and the storage module is used for storing the handwritten signature image to be verified and the face image to be verified into a block chain.
In order to solve the above technical problem, an embodiment of the present application further provides a computer device, which adopts the following technical solutions:
a computer device comprising a memory and a processor, the memory having stored therein computer readable instructions, the processor implementing the steps of the handwritten signature verification method described above when executing the computer readable instructions.
In order to solve the above technical problem, an embodiment of the present application further provides a computer-readable storage medium, which adopts the following technical solutions:
a computer readable storage medium having computer readable instructions stored thereon which, when executed by a processor, implement the steps of the handwritten signature verification method described above.
Compared with the prior art, the embodiment of the application mainly has the following beneficial effects: the method comprises the steps of obtaining a handwritten signature image to be verified and a face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified are from the same carrier; inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction to obtain the features of the handwritten signature image; searching a preset user signature image database by adopting an Annoy search algorithm according to the features of the handwritten signature image, and acquiring the features of the user signature image which are closest to the features of the handwritten signature image; calculating signature feature similarity between the handwritten signature image features and the closest user signature image features; comparing the signature feature similarity with a preset first threshold, and when the signature feature similarity is greater than the preset first threshold, retrieving a preset user signature image database according to the closest user signature image feature to obtain the closest user face feature, wherein the user signature image features in the preset user signature image database correspond to the user face features one to one; inputting the face image to be verified into a pre-trained face feature extraction model for face feature extraction to obtain the face feature to be verified; calculating the face feature similarity between the face feature to be verified and the closest user face feature; and comparing the face feature similarity with a preset second threshold, and determining that the handwritten signature image to be verified passes verification when the face feature similarity is greater than the preset second threshold.
The handwritten signature image features are compared against a preset user signature image database to determine the validity of the handwritten signature, which can largely avoid the risk of a signature being impersonated; and the combination of the Annoy algorithm with similarity calculation balances the speed and the accuracy of handwritten signature verification.
Drawings
In order to more clearly illustrate the solution of the present application, the drawings needed for describing the embodiments of the present application will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and that other drawings can be obtained by those skilled in the art without inventive effort.
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a handwritten signature verification method according to the present application;
FIG. 3 is a flow diagram of one embodiment of handwritten signature verification;
FIG. 4 is a schematic diagram illustrating one embodiment of a handwritten signature verification apparatus according to the present application;
FIG. 5 is a schematic block diagram of one embodiment of a computer device according to the present application.
Detailed Description
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs; the terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application; the terms "including" and "having," and any variations thereof, in the description and claims of this application and the description of the above figures are intended to cover non-exclusive inclusions. The terms "first," "second," and the like in the description and claims of this application or in the above-described drawings are used for distinguishing between different objects and not for describing a particular order.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. The terminal devices 101, 102, 103 may have various communication client applications installed thereon, such as a web browser application, a shopping application, a search application, an instant messaging tool, a mailbox client, social platform software, and the like.
The terminal devices 101, 102, 103 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, e-book readers, MP3 players (Moving Picture Experts Group Audio Layer III), MP4 players (Moving Picture Experts Group Audio Layer IV), laptop portable computers, desktop computers, and the like.
The server 105 may be a server providing various services, such as a background server providing support for pages displayed on the terminal devices 101, 102, 103.
It should be noted that the handwritten signature verification method provided in the embodiments of the present application is generally executed by the server/terminal device, and correspondingly, the handwritten signature verification apparatus is generally disposed in the server/terminal device.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continuing reference to FIG. 2, a flow diagram of one embodiment of a method of handwritten signature verification in accordance with the present application is shown. The handwritten signature verification method comprises the following steps:
step S201, acquiring a handwritten signature image to be verified and a face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified are from the same carrier.
In this embodiment, the electronic device on which the handwritten signature verification method operates (such as the server/terminal device shown in FIG. 1) may acquire the handwritten signature image to be verified and the face image to be verified through a wired connection or a wireless connection. It should be noted that the wireless connection may include, but is not limited to, 3G/4G, WiFi, Bluetooth, WiMAX, ZigBee, and UWB (ultra wideband) connections, as well as other wireless connection means now known or developed in the future.
When the user produces a handwritten signature on an electronic device equipped with a camera, for example by signing on the device's screen through a preset handwritten-signature module, the device's camera captures the face image at the same time; in this way the handwritten signature image to be verified and the face image to be verified are obtained together. Alternatively, a previously shot signature video may be imported and parsed to obtain the handwritten signature image to be verified and the face image to be verified.
And S202, inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction, and obtaining the features of the handwritten signature image.
In this embodiment, the handwritten signature image to be verified is input into a pre-trained deep learning neural network model for feature extraction. Because the pre-trained network has learned the handwritten signature images of different users, it is able to extract high-dimensional features from the handwritten signature images of different users.
And step S203, retrieving a preset user signature image database by adopting an annoy search algorithm according to the characteristics of the handwritten signature image, and acquiring the characteristics of the user signature image which are closest to the characteristics of the handwritten signature image.
In this embodiment, in some application scenarios the user signature image database is preset, and as long as the handwritten signature image to be verified matches one of the records in this database, the handwritten signature to be verified is considered valid. In this scenario, the preset user signature image database is retrieved according to the features of the handwritten signature image. To perform fast feature search across a massive number of IDs, the Annoy (Approximate Nearest Neighbors Oh Yeah) algorithm is adopted, which provides fast and stable search capability. The principle of Annoy is as follows: two points are randomly selected, a k-means process with a cluster count of 2 is executed with these two points as the initial center points, and two converged cluster centers are finally generated, so that the equidistant hyperplane between the two cluster centers divides the data space into two subspaces. The partitioned subspaces are then divided by continued recursive iteration until each subspace contains at most K data nodes. Through repeated recursive partitioning, the original data ultimately forms a binary-tree-like structure: the leaf nodes at the bottom of the tree record the original data nodes, while the intermediate nodes record the information of the splitting hyperplanes. The time complexity of querying the point closest to a given point is thus sub-linear. In practice, this can be implemented through Annoy's Python API.
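The recursive partitioning principle just described can be sketched in plain Python. This is a deliberately simplified illustration, not the Annoy library itself (which exposes the real implementation through its Python API, e.g. `AnnoyIndex`): the k-means refinement of the two random centers is omitted here, and the two sampled points are used directly as the cluster centers.

```python
import random

def build_tree(points, ids, max_leaf=4):
    # Recursively split the data space with the equidistant hyperplane
    # between two randomly chosen points (the 2-means refinement step of
    # real Annoy is omitted for brevity).
    if len(ids) <= max_leaf:
        return {"leaf": list(ids)}
    a, b = random.sample(ids, 2)
    ca, cb = points[a], points[b]
    left, right = [], []
    for i in ids:
        da = sum((x - y) ** 2 for x, y in zip(points[i], ca))
        db = sum((x - y) ** 2 for x, y in zip(points[i], cb))
        (left if da <= db else right).append(i)
    if not left or not right:  # degenerate split; stop recursing
        return {"leaf": list(ids)}
    return {"centers": (ca, cb),
            "l": build_tree(points, left, max_leaf),
            "r": build_tree(points, right, max_leaf)}

def query(tree, v):
    # Descend from the root to one leaf; the number of comparisons is
    # proportional to the tree depth, hence sub-linear in the data size.
    node = tree
    while "leaf" not in node:
        ca, cb = node["centers"]
        da = sum((x - y) ** 2 for x, y in zip(v, ca))
        db = sum((x - y) ** 2 for x, y in zip(v, cb))
        node = node["l"] if da <= db else node["r"]
    return node["leaf"]
```

The leaf returned by `query` holds a small set of candidate IDs; the exact nearest neighbor among them is then found by the similarity calculation of the next step.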
And step S204, calculating the signature characteristic similarity between the handwritten signature image characteristic and the closest user signature image characteristic.
The Annoy algorithm cannot achieve both retrieval speed and retrieval precision at the same time: the more layers the constructed binary tree has, the higher the precision but the slower the retrieval, and only the closest point is obtained by retrieval. Since this method has a high precision requirement, after the closest user signature image features are retrieved with Annoy, the similarity between the handwritten signature image features and the closest user signature image features is calculated, so that both retrieval speed and precision can be taken into account. The similarity is computed from the Euclidean distance between the two feature vectors.
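A minimal sketch of this distance-based similarity is shown below. Note that the text only states that the similarity is computed from the Euclidean distance; the particular mapping of distance into a bounded score used here is an illustrative assumption.

```python
import math

def euclidean_distance(u, v):
    # Straight-line distance between two feature vectors of equal length.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def feature_similarity(u, v):
    # Map the distance into (0, 1]; identical vectors score 1.0.
    # This mapping is an assumption made for illustration only.
    return 1.0 / (1.0 + euclidean_distance(u, v))
```

The resulting score can then be compared against the preset first threshold of step S205.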
Step S205, comparing the signature feature similarity with a preset first threshold, and when the signature feature similarity is greater than the preset first threshold, retrieving a preset user signature image database according to the closest user signature image feature to obtain the closest user face feature, wherein the user signature image features in the preset user signature image database correspond to the user face features one to one;
and step 206, inputting the face image to be verified into a pre-trained face feature extraction model for face feature extraction, so as to obtain the face feature to be verified.
In this embodiment, a face image to be verified is input to a pre-trained face feature extraction model for face feature extraction, the pre-trained face feature extraction model is based on a second convolutional neural network model, and the second convolutional neural network model learns face images of different users, so that the second convolutional neural network model can extract high-dimensional features of the face images of the different users.
Step 207, calculating the face feature similarity between the face feature to be verified and the nearest user face feature;
the similarity is computed from the Euclidean distance between the two feature vectors.
And 208, comparing the face feature similarity with a preset second threshold, and determining that the handwritten signature image to be verified passes verification when the face feature similarity is greater than the preset second threshold.
And when the human face similarity between the human face features to be verified and the closest user human face features is larger than a preset second threshold value, the handwritten signature image to be verified and the human face image to be verified are considered to be consistent with data in a preset user signature image database, and verification is passed.
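The two-stage threshold decision of steps S205 and S208 can be sketched as a short gating function. The threshold values used here are illustrative assumptions; the patent only specifies that each similarity must exceed its preset threshold.

```python
def verify_signature(sig_similarity, face_similarity,
                     first_threshold=0.8, second_threshold=0.8):
    # Step S205: the signature similarity gate must pass first.
    if sig_similarity <= first_threshold:
        return False  # signature matches no record closely enough
    # Step S208: then the face similarity is compared against the
    # face features retrieved for the matched signature record.
    return face_similarity > second_threshold
```

Only when both gates pass is the handwritten signature image to be verified considered verified.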
The method comprises the steps of obtaining a handwritten signature image to be verified and a face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified are from the same carrier; inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction to obtain the features of the handwritten signature image; searching a preset user signature image database by adopting an annoy search algorithm according to the characteristics of the hand-written signature image, and acquiring the characteristics of the user signature image which are closest to the characteristics of the hand-written signature image; calculating signature feature similarity between the handwritten signature image features and the closest user signature image features; comparing the signature feature similarity with a preset first threshold, and when the signature feature similarity is greater than the preset first threshold, retrieving a preset user signature image database according to the closest user signature image feature to obtain the closest user face feature, wherein the user signature image features in the preset user signature image database correspond to the user face features one to one; inputting the face image to be verified into a pre-trained face feature extraction model for face feature extraction to obtain the face feature to be verified; calculating the face feature similarity between the face feature to be verified and the closest user face feature; and comparing the face feature similarity with a preset second threshold, and determining that the handwritten signature image to be verified passes verification when the face feature similarity is greater than the preset second threshold. 
Comparing the handwritten signature image features with a preset user signature image database determines the validity of the handwritten signature and largely avoids the risk of the signature being forged, while combining the Annoy algorithm with similarity calculation allows both the speed and the precision of handwritten signature verification to be taken into account.
In some optional implementations of this embodiment, before step 202, the electronic device may further perform the following steps:
acquiring signature image training samples, wherein the signature image training samples are N handwritten signature images marked with user IDs;
inputting the signature image training sample into a deep learning neural network model to obtain N signature prediction results output by the deep learning neural network model in response to the signature image training sample;
comparing whether the N signature prediction results are consistent with the labels through a softmax loss function, wherein the softmax loss function is as follows:
L = -\frac{1}{N}\sum_{i=1}^{N}\log\frac{e^{h_{y_i}}}{\sum_{j=1}^{C}e^{h_j}}

where N is the number of training samples, y_i is the labeled result corresponding to the i-th sample, h = (h_1, h_2, ..., h_C) is the predicted result of sample i, and C is the number of classes;
and adjusting parameters of each node of the deep learning neural network model until the loss function reaches the minimum, and obtaining the trained deep learning neural network model.
The deep learning neural network model can be regarded as an image feature extraction model connected to an output layer, where the output layer is a softmax output layer used to identify the input handwritten signature image from the features extracted by the image feature extraction model. During training, the softmax loss is used to compare whether the prediction result is consistent with the labeled result; when the loss reaches its minimum, the deep learning neural network model is fully trained and has the capability of extracting high-dimensional features from signature images.
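The loss computed by the softmax output layer above can be written out directly in plain Python, following the formula's definitions (N samples, C classes, label y_i, score vector h per sample):

```python
import math

def softmax_loss(scores, labels):
    # Mean cross-entropy over N samples. `scores[i]` is the length-C
    # prediction vector h = (h1, ..., hC) for sample i, and `labels[i]`
    # is its annotated class index y_i.
    total = 0.0
    for h, y in zip(scores, labels):
        z = sum(math.exp(hj) for hj in h)       # softmax normalizer
        total += -math.log(math.exp(h[y]) / z)  # negative log-likelihood
    return total / len(scores)
```

A uniform prediction over C classes gives a loss of log(C), and the loss decreases as the network's score for the labeled class grows, which is why minimizing it drives the predictions toward the labels.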
In some optional implementations, the preset user signature image database further includes user fingerprint features, which correspond one to one with the user signature image features. Before step S208, the electronic device may perform the following steps:
acquiring a fingerprint image to be verified, wherein the fingerprint image and the handwritten signature image to be verified originate from the same carrier;
inputting the fingerprint image into a pre-trained fingerprint feature extraction model for feature extraction to obtain the fingerprint features to be verified;
retrieving a preset user signature image database according to the closest user signature image characteristic to obtain a user fingerprint characteristic corresponding to the closest user signature image characteristic;
calculating fingerprint feature similarity between the fingerprint features to be verified and the user fingerprint features;
and comparing the fingerprint feature similarity with a preset third threshold, and when the fingerprint feature similarity is greater than the preset third threshold and the face feature similarity is greater than the preset second threshold, determining that the handwritten signature image to be verified passes verification.
In some scenarios, the signing of important documents requires not only a handwritten signature but also the signer's fingerprint. To prevent imitation of handwriting, the fingerprint is recognized at the same time. Fingerprint feature extraction is performed through a pre-trained fingerprint feature extraction model, which is based on a first convolutional neural network model.
In the preset user signature image database, the user fingerprint features correspond one to one with the user signature image features. The corresponding user fingerprint features are obtained according to the closest user signature image features, and the fingerprint feature similarity between the fingerprint features to be verified and the user fingerprint features is then calculated, again as the Euclidean distance between the two feature vectors.
And comparing the fingerprint feature similarity with a preset third threshold, and when the fingerprint feature similarity is greater than the preset third threshold and the face feature similarity is greater than the preset second threshold, determining that the handwritten signature image to be verified passes verification.
When the signature is verified, the fingerprint is verified at the same time, the situation that the handwriting is simulated but is considered as a valid signature can be avoided, and the accuracy of signature verification is improved.
In some optional implementation manners, before the step of inputting the fingerprint image into a pre-trained fingerprint feature extraction model for feature extraction to obtain a fingerprint feature to be verified, the electronic device may perform the following steps:
acquiring fingerprint image training samples, wherein the fingerprint image training samples are N fingerprint images marked with user IDs;
inputting the fingerprint image training sample into a first convolution neural network model to obtain N fingerprint prediction results output by the first convolution neural network model in response to the fingerprint image training sample;
comparing whether the N fingerprint prediction results are consistent with the labels or not through a softmax loss function;
and adjusting parameters of each node of the first convolution neural network model until the loss function reaches the minimum, and obtaining a trained fingerprint feature extraction model.
The first convolutional neural network is trained with supervised training: fingerprint images labeled with user identities are input into the network, and the parameters of each node are adjusted so that the fingerprint prediction results output by the network are consistent with the labeled results. The output layer of the first convolutional neural network is a softmax output layer, and the softmax loss measures whether the network has converged; when the softmax value reaches its minimum, training of the first convolutional neural network is finished, and the structure before the output layer of the trained network forms the fingerprint feature extraction model.
In some optional implementation manners, before the step of inputting the face image to be verified into a pre-trained face feature extraction model for feature extraction to obtain the face feature to be verified, the electronic device may perform the following steps:
acquiring face image training samples, wherein the face image training samples are N face images marked with user IDs;
inputting the face image training sample into a second convolutional neural network model to obtain N face prediction results output by the second convolutional neural network model in response to the face image training sample;
comparing whether the N person face prediction results are consistent with the labels or not through a softmax loss function;
and adjusting parameters of each node of the second convolutional neural network model until the loss function reaches the minimum, and obtaining a trained face feature extraction model.
The second convolutional neural network is likewise trained with supervised training: face images labeled with user identities are input into the network, and the parameters of each node are adjusted so that the face prediction results output by the network are consistent with the labeled results. The output layer of the second convolutional neural network is a softmax output layer, and the softmax loss measures whether the network has converged; when the softmax value reaches its minimum, training of the second convolutional neural network is finished, and the structure before the output layer of the trained network forms the face feature extraction model.
It should be emphasized that, in order to further ensure the privacy and security of the handwritten signature information, the handwritten signature image to be verified may also be stored in a node of a blockchain.
The blockchain referred to in this application is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, and encryption algorithms. A blockchain is essentially a decentralized database: a series of data blocks associated by cryptographic methods, each containing information about a batch of network transactions, used to verify the validity (anti-counterfeiting) of the information and to generate the next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
The application is operational with numerous general purpose or special purpose computing system environments or configurations. For example: personal computers, server computers, hand-held or portable devices, tablet-type devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like. The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware associated with computer readable instructions, which can be stored in a computer readable storage medium, and when executed, the processes of the embodiments of the methods described above can be included. The storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disk, a Read-Only Memory (ROM), or a Random Access Memory (RAM).
It should be understood that, although the steps in the flowcharts of the figures are shown in the order indicated by the arrows, they are not necessarily performed in that order; unless explicitly stated herein, there is no strict order restriction, and the steps may be performed in other orders. Moreover, at least some of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and not necessarily in sequence; they may be performed in turn or alternately with other steps, or with at least some of the sub-steps or stages of other steps.
With further reference to fig. 4, as an implementation of the method shown in fig. 2, the present application provides an embodiment of a handwritten signature verification apparatus, where the embodiment of the apparatus corresponds to the embodiment of the method shown in fig. 2, and the apparatus may be applied to various electronic devices.
As shown in fig. 4, the handwritten signature verification apparatus 400 according to this embodiment includes: an obtaining module 401, a first extracting module 402, a first retrieving module 403, a first calculating module 404, a second retrieving module 405, a second extracting module 406, a second calculating module 407, and a determining module 408. Wherein:
an obtaining module 401, configured to obtain a handwritten signature image to be verified and a face image to be verified, where the handwritten signature image to be verified and the face image to be verified originate from a same carrier;
a first extraction module 402, configured to input the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction, so as to obtain a feature of the handwritten signature image;
a first retrieval module 403, configured to retrieve a preset user signature image database by using an annoy search algorithm according to the feature of the handwritten signature image, and obtain a user signature image feature closest to the feature of the handwritten signature image;
a first calculating module 404, configured to calculate a signature feature similarity between the handwritten signature image feature and the closest user signature image feature;
a second retrieving module 405, configured to compare the signature feature similarity with a preset first threshold, and when the signature feature similarity is greater than the preset first threshold, retrieve a preset user signature image database according to the closest user signature image feature to obtain a closest user facial feature, where the user signature image features in the preset user signature image database correspond to the user facial features one to one;
a second extraction module 406, configured to input the facial image to be verified to a pre-trained facial feature extraction model for facial feature extraction, so as to obtain a facial feature to be verified;
a second calculating module 407, configured to calculate a face feature similarity between the face feature to be verified and the closest user face feature;
the determining module 408 is configured to compare the face feature similarity with a preset second threshold, and determine that the handwritten signature image to be verified passes verification when the face feature similarity is greater than the preset second threshold.
In this embodiment, a handwritten signature image to be verified and a face image to be verified are obtained, where the handwritten signature image to be verified and the face image to be verified originate from the same carrier; inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction to obtain the features of the handwritten signature image; searching a preset user signature image database by adopting an annoy search algorithm according to the characteristics of the hand-written signature image, and acquiring the characteristics of the user signature image which are closest to the characteristics of the hand-written signature image; calculating signature feature similarity between the handwritten signature image features and the closest user signature image features; comparing the signature feature similarity with a preset first threshold, and when the signature feature similarity is greater than the preset first threshold, retrieving a preset user signature image database according to the closest user signature image feature to obtain the closest user face feature, wherein the user signature image features in the preset user signature image database correspond to the user face features one to one; inputting the face image to be verified into a pre-trained face feature extraction model for face feature extraction to obtain the face feature to be verified; calculating the face feature similarity between the face feature to be verified and the closest user face feature; and comparing the face feature similarity with a preset second threshold, and determining that the handwritten signature image to be verified passes verification when the face feature similarity is greater than the preset second threshold. 
The handwritten signature image characteristics are compared with a preset user signature image database to determine the effectiveness of the handwritten signature, the risk that the signature is falsely used can be avoided to a greater extent, and the speed and the precision of the handwritten signature verification can be considered through the combination of the annoy algorithm and the similarity calculation.
In some optional implementations of this embodiment, the handwritten signature verification apparatus 400 further includes:
the first acquisition submodule is used for acquiring signature image training samples, wherein the signature image training samples are N handwritten signature images marked with user IDs;
the first prediction submodule is used for inputting the signature image training samples into a deep learning neural network model to obtain N signature prediction results output by the deep learning neural network model in response to the signature image training samples;
a first comparison pair module, configured to compare whether the N signature prediction results and the labels are consistent through a softmax loss function, where the softmax loss function is:
L = -\frac{1}{N}\sum_{i=1}^{N}\log\frac{e^{h_{y_i}}}{\sum_{j=1}^{C}e^{h_j}}

where N is the number of training samples, y_i is the labeled result corresponding to the i-th sample, h = (h_1, h_2, ..., h_C) is the predicted result of sample i, and C is the number of classes;
and the first adjusting submodule is used for adjusting the parameters of each node of the deep learning neural network model until the loss function is the minimum, so that the trained deep learning neural network model is obtained.
In some optional implementations of this embodiment, the handwritten signature verification apparatus 400 further includes:
the second acquisition submodule is used for acquiring a fingerprint image to be verified, and the fingerprint image and the handwritten signature image to be verified originate from the same carrier;
the first extraction submodule is used for inputting the fingerprint image into a pre-trained fingerprint feature extraction model for feature extraction to obtain fingerprint features to be verified;
the first retrieval submodule is used for retrieving a preset user signature image database according to the closest user signature image characteristic to obtain a user fingerprint characteristic corresponding to the closest user signature image characteristic;
the first calculation submodule is used for calculating the fingerprint feature similarity between the fingerprint feature to be verified and the user fingerprint feature;
and the first determining submodule is used for comparing the fingerprint feature similarity with a preset third threshold, and when the fingerprint feature similarity is greater than the preset third threshold and the face feature similarity is greater than a preset second threshold, determining that the handwritten signature image to be verified passes verification.
In some optional implementations of this embodiment, the handwritten signature verification apparatus 400 further includes:
the third acquisition submodule is used for acquiring fingerprint image training samples, wherein the fingerprint image training samples are N fingerprint images marked with user IDs;
the second prediction sub-module is used for inputting the fingerprint image training sample into a first convolution neural network model to obtain N fingerprint prediction results output by the first convolution neural network model in response to the fingerprint image training sample;
the second comparison submodule is used for comparing whether the N fingerprint prediction results are consistent with the labels or not through a softmax loss function;
and the second adjusting submodule is used for adjusting the parameters of each node of the first convolution neural network model until the loss function reaches the minimum, and obtaining the trained fingerprint feature extraction model.
In some optional implementations of this embodiment, the handwritten signature verification apparatus 400 further includes:
a fifth obtaining submodule, configured to obtain face image training samples, where the face image training samples are N face images labeled with user IDs;
the third prediction sub-module is used for inputting the face image training sample into a second convolutional neural network model to obtain N face prediction results output by the second convolutional neural network model in response to the face image training sample;
the third comparison submodule is used for comparing whether the N personal face prediction results are consistent with the labels or not through a softmax loss function;
and the third adjusting submodule is used for adjusting the parameters of each node of the second convolutional neural network model until the loss function reaches the minimum, and obtaining the trained face feature extraction model.
In some optional implementations of this embodiment, the handwritten signature verification apparatus 400 further includes:
and the storage module is used for storing the handwritten signature image to be verified and the face image to be verified into a block chain.
In order to solve the technical problem, an embodiment of the present application further provides a computer device. Referring to fig. 5, fig. 5 is a block diagram of a basic structure of a computer device according to the present embodiment.
The computer device 5 comprises a memory 51, a processor 52, and a network interface 53 communicatively connected to each other via a system bus. It is noted that only a computer device 5 having components 51-53 is shown, but it should be understood that not all of the shown components are required; more or fewer components may be implemented instead. As will be understood by those skilled in the art, the computer device is a device capable of automatically performing numerical calculation and/or information processing according to preset or stored instructions, and its hardware includes, but is not limited to, a microprocessor, an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), an embedded device, and the like.
The computer device can be a desktop computer, a notebook, a palm computer, a cloud server and other computing devices. The computer equipment can carry out man-machine interaction with a user through a keyboard, a mouse, a remote controller, a touch panel or voice control equipment and the like.
The memory 51 includes at least one type of readable storage medium including a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, etc. In some embodiments, the memory 51 may be an internal storage unit of the computer device 5, such as a hard disk or a memory of the computer device 5. In other embodiments, the memory 51 may also be an external storage device of the computer device 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the computer device 5. Of course, the memory 51 may also comprise both an internal storage unit of the computer device 5 and an external storage device thereof. In this embodiment, the memory 51 is generally used for storing an operating system installed in the computer device 5 and various application software, such as computer readable instructions of a handwritten signature verification method. Further, the memory 51 may also be used to temporarily store various types of data that have been output or are to be output.
The processor 52 may, in some embodiments, be a central processing unit (CPU), controller, microcontroller, microprocessor, or other data-processing chip. The processor 52 is typically used to control the overall operation of the computer device 5. In this embodiment, the processor 52 is configured to execute the computer-readable instructions stored in the memory 51 or to process data, for example to execute the computer-readable instructions of the handwritten signature verification method.
The network interface 53 may comprise a wireless network interface or a wired network interface, and the network interface 53 is generally used for establishing communication connections between the computer device 5 and other electronic devices.
The method comprises the steps of: acquiring a handwritten signature image to be verified and a face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified originate from the same carrier; inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction to obtain handwritten signature image features; retrieving a preset user signature image database by using the Annoy search algorithm according to the handwritten signature image features, and obtaining the user signature image features closest to the handwritten signature image features; calculating the signature feature similarity between the handwritten signature image features and the closest user signature image features; comparing the signature feature similarity with a preset first threshold and, when the signature feature similarity is greater than the preset first threshold, retrieving the preset user signature image database according to the closest user signature image features to obtain the closest user face features, wherein the user signature image features in the preset user signature image database correspond one-to-one to the user face features; inputting the face image to be verified into a pre-trained face feature extraction model for face feature extraction to obtain the face features to be verified; calculating the face feature similarity between the face features to be verified and the closest user face features; and comparing the face feature similarity with a preset second threshold, and determining that the handwritten signature image to be verified passes verification when the face feature similarity is greater than the preset second threshold.
Comparing the handwritten signature image features against a preset user signature image database determines the validity of the handwritten signature, which largely avoids the risk of the signature being forged or misappropriated; combining the Annoy algorithm with similarity calculation achieves both speed and accuracy in handwritten signature verification.
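The two-stage threshold comparison described above can be sketched as follows. This is an illustrative Python sketch, not the patented implementation: cosine similarity and the threshold values are assumptions, the feature extractors are assumed to already yield fixed-length vectors, and the Annoy nearest-neighbour search is replaced by a brute-force scan so the example is self-contained.

```python
import math

def cosine_similarity(a, b):
    # The patent does not fix a similarity metric; cosine is an assumption.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def verify_signature(sig_feat, face_feat, database, t_sig=0.9, t_face=0.9):
    """Two-stage check from the method summary. `database` maps
    user_id -> (signature_features, face_features), reflecting the
    one-to-one correspondence the patent requires. The patent performs
    the nearest-neighbour step with the Annoy search algorithm; a
    brute-force scan stands in for it here."""
    # Stage 1: closest enrolled signature feature vs. first threshold.
    best_id, best_sim = None, -1.0
    for user_id, (db_sig, _) in database.items():
        sim = cosine_similarity(sig_feat, db_sig)
        if sim > best_sim:
            best_id, best_sim = user_id, sim
    if best_sim <= t_sig:
        return False
    # Stage 2: face feature of the matched user vs. second threshold.
    return cosine_similarity(face_feat, database[best_id][1]) > t_face
```

A signature that matches an enrolled one but is presented alongside a non-matching face image is rejected, which is the anti-misappropriation property described above.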
The present application further provides another embodiment, which is to provide a computer-readable storage medium storing computer-readable instructions executable by at least one processor to cause the at least one processor to perform the steps of the handwritten signature verification method as described above.
When executed by the at least one processor, the computer-readable instructions implement the same steps and achieve the same effects as described above for the computer-device embodiment, from acquiring the handwritten signature image and face image to be verified (both originating from the same carrier) through to determining, by the two-stage threshold comparison of signature feature similarity and face feature similarity, that the handwritten signature image to be verified passes verification.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
It is to be understood that the above-described embodiments are merely illustrative and not restrictive, and that the appended drawings illustrate preferred embodiments of the application without limiting its scope. This application may be embodied in many different forms; the embodiments are provided so that the disclosure of the application will be thorough and complete. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described in the foregoing embodiments or substitute equivalents for some of their features. All equivalent structures made by using the contents of the specification and drawings of the present application, whether applied directly or indirectly in other related technical fields, fall within the protection scope of the present application.

Claims (10)

1. A handwritten signature verification method is characterized by comprising the following steps:
acquiring a handwritten signature image to be verified and a face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified are from the same carrier;
inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction to obtain the features of the handwritten signature image;
retrieving a preset user signature image database by using the Annoy search algorithm according to the handwritten signature image features, and obtaining the user signature image features closest to the handwritten signature image features;
calculating signature feature similarity between the handwritten signature image features and the closest user signature image features;
comparing the signature feature similarity with a preset first threshold and, when the signature feature similarity is greater than the preset first threshold, retrieving the preset user signature image database according to the closest user signature image features to obtain the closest user face features, wherein the user signature image features in the preset user signature image database correspond one-to-one to the user face features;
inputting the face image to be verified into a pre-trained face feature extraction model for face feature extraction to obtain the face feature to be verified;
calculating the face feature similarity between the face feature to be verified and the closest user face feature;
and comparing the face feature similarity with a preset second threshold, and determining that the handwritten signature image to be verified passes verification when the face feature similarity is greater than the preset second threshold.
2. The handwritten signature verification method according to claim 1, wherein before the step of inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction to obtain the handwritten signature image features, the method further comprises:
acquiring signature image training samples, wherein the signature image training samples are N handwritten signature images marked with user IDs;
inputting the signature image training sample into a deep learning neural network model to obtain N signature prediction results output by the deep learning neural network model in response to the signature image training sample;
comparing whether the N signature prediction results are consistent with the labels through a softmax loss function, wherein the softmax loss function is as follows:
L = -\frac{1}{N}\sum_{i=1}^{N}\log\frac{e^{h_{y_i}}}{\sum_{j=1}^{C}e^{h_j}}
where N is the number of training samples, y_i is the labeled result corresponding to the i-th sample, h = (h_1, h_2, ..., h_C) is the predicted result for sample i, and C is the number of classes;
and adjusting parameters of each node of the deep learning neural network model until the loss function reaches the minimum, and obtaining the trained deep learning neural network model.
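The softmax loss of claim 2 can be computed directly from the class scores. A minimal pure-Python sketch (the numerical-stability shift by the maximum score is a standard implementation detail, not part of the claim):

```python
import math

def softmax_loss(logits, labels):
    """Mean softmax cross-entropy over N samples:
    L = -(1/N) * sum_i log( exp(h_{y_i}) / sum_j exp(h_j) ).
    logits: N score vectors h = (h_1, ..., h_C); labels: N class ids y_i."""
    total = 0.0
    for h, y in zip(logits, labels):
        m = max(h)  # shift by the max score for numerical stability
        log_denom = m + math.log(sum(math.exp(v - m) for v in h))
        total += log_denom - h[y]  # equals -log softmax(h)[y]
    return total / len(logits)

# A confident, correct prediction gives a near-zero loss; a uniform
# prediction over C classes gives log(C).
print(round(softmax_loss([[0.0, 0.0]], [0]), 4))  # log(2) ≈ 0.6931
```

Minimizing this loss over the user-ID labels is what drives the network's penultimate-layer activations to act as discriminative signature features.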
3. The handwritten signature verification method according to claim 1, wherein the preset user signature image database further contains user fingerprint features in one-to-one correspondence with the user signature image features, and before the step of comparing the face feature similarity with a preset second threshold and determining that the handwritten signature image to be verified passes verification when the face feature similarity is greater than the preset second threshold, the method further comprises:
acquiring a fingerprint image to be verified, wherein the fingerprint image and the handwritten signature image to be verified originate from the same carrier;
inputting the fingerprint image into a pre-trained fingerprint feature extraction model for feature extraction to obtain the fingerprint features to be verified;
retrieving a preset user signature image database according to the closest user signature image characteristic to obtain a user fingerprint characteristic corresponding to the closest user signature image characteristic;
calculating fingerprint feature similarity between the fingerprint features to be verified and the user fingerprint features;
and comparing the fingerprint feature similarity with a preset third threshold, and when the fingerprint feature similarity is greater than the preset third threshold and the face feature similarity is greater than the preset second threshold, determining that the handwritten signature image to be verified passes verification.
4. The handwritten signature verification method according to claim 3, wherein the fingerprint feature extraction model is based on a first convolutional neural network model, and before the step of inputting the fingerprint image into a pre-trained fingerprint feature extraction model for feature extraction to obtain the fingerprint features to be verified, the method further comprises:
acquiring fingerprint image training samples, wherein the fingerprint image training samples are N fingerprint images marked with user IDs;
inputting the fingerprint image training sample into a first convolution neural network model to obtain N fingerprint prediction results output by the first convolution neural network model in response to the fingerprint image training sample;
comparing whether the N fingerprint prediction results are consistent with the labels or not through a softmax loss function;
and adjusting parameters of each node of the first convolution neural network model until the loss function reaches the minimum, and obtaining a trained fingerprint feature extraction model.
5. The handwritten signature verification method according to claim 1, wherein the face feature extraction model is based on a second convolutional neural network model, and before the step of inputting the face image to be verified into a pre-trained face feature extraction model for feature extraction to obtain the face feature to be verified, the method further comprises:
acquiring face image training samples, wherein the face image training samples are N face images marked with user IDs;
inputting the face image training sample into a second convolutional neural network model to obtain N face prediction results output by the second convolutional neural network model in response to the face image training sample;
comparing whether the N face prediction results are consistent with the labels through a softmax loss function;
and adjusting parameters of each node of the second convolutional neural network model until the loss function reaches the minimum, and obtaining a trained face feature extraction model.
6. The handwritten signature verification method according to claim 1, further comprising, after the step of acquiring a handwritten signature image to be verified and a face image to be verified that originate from the same carrier:
and storing the handwritten signature image to be verified and the face image to be verified into a block chain.
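Claim 6 stores the images to be verified in a blockchain. As a simplified illustration only (a real deployment would use a distributed ledger with consensus; the field names, and the idea of chaining image digests rather than raw image bytes, are assumptions), a minimal hash chain looks like:

```python
import hashlib
import json

def make_block(prev_hash, payload):
    # One block of a minimal hash chain: the block's hash covers its
    # payload and the previous block's hash, so tampering with any
    # stored record breaks every later link in the chain.
    block = {"prev_hash": prev_hash, "payload": payload}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

# Chain digests of the handwritten signature image and face image to be
# verified (storing digests instead of raw images is an assumption).
genesis = make_block("0" * 64, {"note": "genesis"})
record = make_block(genesis["hash"], {
    "signature_image_sha256": hashlib.sha256(b"signature-image-bytes").hexdigest(),
    "face_image_sha256": hashlib.sha256(b"face-image-bytes").hexdigest(),
})
```

Because each block's hash commits to its predecessor, the stored images gain the tamper-evidence property that motivates blockchain storage in the claim.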
7. A handwritten signature verification device, comprising:
the system comprises an acquisition module, a verification module and a verification module, wherein the acquisition module is used for acquiring a handwritten signature image to be verified and a face image to be verified, and the handwritten signature image to be verified and the face image to be verified are from the same carrier;
the first extraction module is used for inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction to obtain the features of the handwritten signature image;
the first retrieval module is used for retrieving a preset user signature image database by adopting the Annoy search algorithm according to the handwritten signature image features, to acquire the user signature image features closest to the handwritten signature image features;
the first calculation module is used for calculating the signature feature similarity between the handwritten signature image features and the closest user signature image features;
the second retrieval module is used for comparing the signature feature similarity with a preset first threshold and, when the signature feature similarity is greater than the preset first threshold, retrieving the preset user signature image database according to the closest user signature image features to obtain the closest user face features, wherein the user signature image features in the preset user signature image database correspond one-to-one to the user face features;
the second extraction module is used for inputting the face image to be verified into a pre-trained face feature extraction model for face feature extraction to obtain the face feature to be verified;
the second calculation module is used for calculating the face feature similarity between the face feature to be verified and the closest user face feature;
and the determining module is used for comparing the face feature similarity with a preset second threshold value, and when the face feature similarity is greater than the preset second threshold value, determining that the handwritten signature image to be verified passes verification.
8. The handwritten signature verification device according to claim 7, further comprising:
the first acquisition submodule is used for acquiring signature image training samples, wherein the signature image training samples are N handwritten signature images marked with user IDs;
the first prediction submodule is used for inputting the signature image training samples into a deep learning neural network model to obtain N signature prediction results output by the deep learning neural network model in response to the signature image training samples;
a first comparison pair module, configured to compare whether the N signature prediction results and the labels are consistent through a softmax loss function, where the softmax loss function is:
L = -\frac{1}{N}\sum_{i=1}^{N}\log\frac{e^{h_{y_i}}}{\sum_{j=1}^{C}e^{h_j}}
where N is the number of training samples, y_i is the labeled result corresponding to the i-th sample, h = (h_1, h_2, ..., h_C) is the predicted result for sample i, and C is the number of classes;
and the first adjusting submodule is used for adjusting the parameters of each node of the deep learning neural network model until the loss function is the minimum, so that the trained deep learning neural network model is obtained.
9. A computer device comprising a memory having computer readable instructions stored therein and a processor which when executed implements the steps of the handwritten signature verification method of any one of claims 1 to 6.
10. A computer-readable storage medium having computer-readable instructions stored thereon which, when executed by a processor, implement the steps of the handwritten signature verification method of any of claims 1 to 6.
CN202011609053.5A 2020-12-30 2020-12-30 Handwritten signature verification method, handwritten signature verification device, computer equipment and storage medium Active CN112733645B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011609053.5A CN112733645B (en) 2020-12-30 2020-12-30 Handwritten signature verification method, handwritten signature verification device, computer equipment and storage medium
PCT/CN2021/091266 WO2022142032A1 (en) 2020-12-30 2021-04-30 Handwritten signature verification method and apparatus, computer device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011609053.5A CN112733645B (en) 2020-12-30 2020-12-30 Handwritten signature verification method, handwritten signature verification device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112733645A true CN112733645A (en) 2021-04-30
CN112733645B CN112733645B (en) 2023-08-01

Family

ID=75610874

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011609053.5A Active CN112733645B (en) 2020-12-30 2020-12-30 Handwritten signature verification method, handwritten signature verification device, computer equipment and storage medium

Country Status (2)

Country Link
CN (1) CN112733645B (en)
WO (1) WO2022142032A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117437650A (en) * 2023-12-20 2024-01-23 山东山大鸥玛软件股份有限公司 Handwriting signature comparison method, system, device and medium based on deep learning

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107302433A (en) * 2016-04-15 2017-10-27 平安科技(深圳)有限公司 Method of calibration, verification server and the user terminal of electronic signature
CN109409254A (en) * 2018-10-10 2019-03-01 成都优易数据有限公司 A kind of electronic contract handwritten signature identification method based on twin neural network
CN111428557A (en) * 2020-02-18 2020-07-17 深圳壹账通智能科技有限公司 Method and device for automatically checking handwritten signature based on neural network model
WO2020211387A1 (en) * 2019-04-18 2020-10-22 深圳壹账通智能科技有限公司 Electronic contract displaying method and apparatus, electronic device, and computer readable storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103995995A (en) * 2013-04-22 2014-08-20 厦门维觉电子科技有限公司 Multimedia signature identification method and system
CN105631272B (en) * 2016-02-02 2018-05-11 云南大学 A kind of identity identifying method of multiple security
CN106779665A (en) * 2016-11-23 2017-05-31 广东微模式软件股份有限公司 A kind of POS enchashment methods based on human body biological characteristics identification with anti-repudiation technology
CN108388813A (en) * 2018-02-28 2018-08-10 中国平安财产保险股份有限公司 Electronic endorsement method, user equipment, storage medium and device
CN108595927B (en) * 2018-04-04 2023-09-19 北京市商汤科技开发有限公司 Identity authentication, unlocking and payment method and device, storage medium, product and equipment
CN109523392A (en) * 2018-10-19 2019-03-26 中国平安财产保险股份有限公司 Signature file generation method, device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN112733645B (en) 2023-08-01
WO2022142032A1 (en) 2022-07-07

Similar Documents

Publication Publication Date Title
CN113127633B (en) Intelligent conference management method and device, computer equipment and storage medium
CN112395979B (en) Image-based health state identification method, device, equipment and storage medium
CN112231569A (en) News recommendation method and device, computer equipment and storage medium
CN112863683A (en) Medical record quality control method and device based on artificial intelligence, computer equipment and storage medium
US20230032728A1 (en) Method and apparatus for recognizing multimedia content
CN112632278A (en) Labeling method, device, equipment and storage medium based on multi-label classification
CN112330331A (en) Identity verification method, device and equipment based on face recognition and storage medium
CN112287069A (en) Information retrieval method and device based on voice semantics and computer equipment
CN112308237A (en) Question and answer data enhancement method and device, computer equipment and storage medium
CN113420690A (en) Vein identification method, device and equipment based on region of interest and storage medium
CN112528029A (en) Text classification model processing method and device, computer equipment and storage medium
CN114241459B (en) Driver identity verification method and device, computer equipment and storage medium
CN112995414B (en) Behavior quality inspection method, device, equipment and storage medium based on voice call
CN114282258A (en) Screen capture data desensitization method and device, computer equipment and storage medium
CN114359582A (en) Small sample feature extraction method based on neural network and related equipment
CN112733645B (en) Handwritten signature verification method, handwritten signature verification device, computer equipment and storage medium
CN113569118A (en) Self-media pushing method and device, computer equipment and storage medium
CN116935449A (en) Fingerprint image matching model training method, fingerprint matching method and related medium
CN113343898B (en) Mask shielding face recognition method, device and equipment based on knowledge distillation network
CN115730237A (en) Junk mail detection method and device, computer equipment and storage medium
CN112395450B (en) Picture character detection method and device, computer equipment and storage medium
CN112417886A (en) Intention entity information extraction method and device, computer equipment and storage medium
CN113688268B (en) Picture information extraction method, device, computer equipment and storage medium
CN113792549B (en) User intention recognition method, device, computer equipment and storage medium
CN112949317B (en) Text semantic recognition method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant