WO2022142032A1 - Handwritten signature verification method and apparatus, computer device and storage medium

Handwritten signature verification method and apparatus, computer device and storage medium

Info

Publication number
WO2022142032A1
Authority
WO
WIPO (PCT)
Prior art keywords
verified
image
feature
signature image
fingerprint
Prior art date
Application number
PCT/CN2021/091266
Other languages
English (en)
Chinese (zh)
Inventor
何小臻
Original Assignee
平安科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 平安科技(深圳)有限公司
Publication of WO2022142032A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/30 - Writer recognition; Reading and verifying signatures
    • G06V40/33 - Writer recognition; Reading and verifying signatures based only on signature image, e.g. static signature recognition
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/22 - Matching criteria, e.g. proximity measures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/1347 - Preprocessing; Feature extraction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 - Feature extraction; Face representation
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Definitions

  • the present application relates to the technical field of artificial intelligence, and in particular, to a handwritten signature verification method, device, computer equipment and storage medium.
  • A handwritten signature is an important means of verifying the user's true intention to sign.
  • The rapid development of Internet applications such as Internet finance, B2B e-commerce, tourism, and education has also driven demand for online electronic signatures. More and more Internet platforms are therefore actively seeking legal and effective online handwritten electronic signature verification solutions.
  • The inventor found that mainstream handwritten signature verification schemes mostly use handwriting recognition, based on the handwriting trajectory or the spatial relationship of pixels, and then compare the recognition result with the name, which cannot avoid the risk of the signature being used fraudulently.
  • the purpose of the embodiments of the present application is to propose a handwritten signature verification method, device, computer equipment and storage medium, so as to solve the problem of fraudulent use of signatures.
  • the embodiments of the present application provide a method for verifying handwritten signatures, which adopts the following technical solutions:
  • annoy search algorithm is used to retrieve the preset user signature image database, and the user signature image feature closest to the handwritten signature image feature is obtained;
  • the facial feature similarity is compared with a preset second threshold, and when the facial feature similarity is greater than the preset second threshold, it is determined that the handwritten signature image to be verified has passed the verification.
  • the embodiment of the present application also provides a handwritten signature verification device, which adopts the following technical solutions:
  • an acquisition module for acquiring a handwritten signature image to be verified and a face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified originate from the same carrier;
  • the first extraction module is used for inputting the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction to obtain the handwritten signature image features;
  • a first retrieval module configured to use annoy search algorithm to retrieve a preset user signature image database according to the handwritten signature image features, and obtain the user signature image features that are closest to the handwritten signature image features;
  • a first calculation module used to calculate the signature feature similarity between the handwritten signature image feature and the closest user signature image feature
  • the second retrieval module is configured to compare the signature feature similarity with a preset first threshold, and when the signature feature similarity is greater than the preset first threshold, search according to the closest user signature image feature A preset user signature image database to obtain the closest user face features, wherein the user signature image features in the preset user signature image database are in one-to-one correspondence with user face features;
  • the second extraction module is used for inputting the face image to be verified into a pre-trained face feature extraction model for face feature extraction to obtain the face feature to be verified;
  • the second computing module is used to calculate the facial feature similarity between the to-be-verified facial feature and the closest user facial feature
  • the determining module is used to compare the similarity of the facial features with a preset second threshold, and when the similarity of the facial features is greater than the preset second threshold, determine that the handwritten signature image to be verified has passed verify.
  • the embodiment of the present application also provides a computer device, which adopts the following technical solutions:
  • a computer device includes a memory and a processor, wherein computer-readable instructions are stored in the memory, and the processor implements the following steps when executing the computer-readable instructions:
  • annoy search algorithm is used to retrieve the preset user signature image database, and the user signature image feature closest to the handwritten signature image feature is obtained;
  • the facial feature similarity is compared with a preset second threshold, and when the facial feature similarity is greater than the preset second threshold, it is determined that the handwritten signature image to be verified has passed the verification.
  • the embodiments of the present application also provide a computer-readable storage medium, which adopts the following technical solutions:
  • a computer-readable storage medium where computer-readable instructions are stored on the computer-readable storage medium, and when the computer-readable instructions are executed by a processor, the processor is caused to perform the following steps:
  • annoy search algorithm is used to retrieve the preset user signature image database, and the user signature image feature closest to the handwritten signature image feature is obtained;
  • the facial feature similarity is compared with a preset second threshold, and when the facial feature similarity is greater than the preset second threshold, it is determined that the handwritten signature image to be verified has passed the verification.
  • The embodiments of the present application mainly have the following beneficial effects: by comparing the handwritten signature image features with a preset user signature image database, the validity of the handwritten signature can be determined and the risk of the signature being fraudulently used can be largely avoided; through the combination of the annoy algorithm and similarity calculation, both the speed and the accuracy of handwritten signature verification can be taken into account.
  • FIG. 1 is an exemplary system architecture diagram to which the present application can be applied;
  • FIG. 2 is a flow chart of an embodiment of the handwritten signature verification method of the present application;
  • FIG. 3 is a flow chart of a specific embodiment of handwritten signature verification;
  • FIG. 4 is a schematic structural diagram of an embodiment of a handwritten signature verification device according to the present application.
  • FIG. 5 is a schematic structural diagram of an embodiment of a computer device according to the present application.
  • the system architecture 100 may include terminal devices 101 , 102 , and 103 , a network 104 and a server 105 .
  • the network 104 is a medium used to provide a communication link between the terminal devices 101 , 102 , 103 and the server 105 .
  • the network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
  • the user can use the terminal devices 101, 102, 103 to interact with the server 105 through the network 104 to receive or send messages and the like.
  • Various communication client applications may be installed on the terminal devices 101 , 102 and 103 , such as web browser applications, shopping applications, search applications, instant messaging tools, email clients, social platform software, and the like.
  • the terminal devices 101, 102, and 103 can be various electronic devices that have a display screen and support web browsing, including but not limited to smartphones, tablet computers, e-book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop computers, desktop computers, and the like.
  • the server 105 may be a server that provides various services, such as a background server that provides support for the pages displayed on the terminal devices 101 , 102 , and 103 .
  • the handwritten signature verification method provided by the embodiments of the present application is generally performed by a server/terminal device , and accordingly, the handwritten signature verification device is generally set in the server/terminal device .
  • The numbers of terminal devices, networks and servers in FIG. 1 are merely illustrative; there can be any number of terminal devices, networks and servers according to implementation needs.
  • the described handwritten signature verification method comprises the following steps:
  • Step S201 acquiring a handwritten signature image to be verified and a face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified originate from the same carrier.
  • The electronic device (for example, the server/terminal device shown in FIG. 1) on which the handwritten signature verification method runs can obtain the handwritten signature image to be verified and the face image to be verified through a wired connection or a wireless connection.
  • the above wireless connection methods may include but are not limited to 3G/4G connection, WiFi connection, Bluetooth connection, WiMAX connection, Zigbee connection, UWB (ultra wideband) connection, and other wireless connection methods currently known or developed in the future .
  • For example, while the user writes the signature on the electronic device, the camera of the electronic device simultaneously captures an image of the user's face, so that the handwritten signature image to be verified and the face image to be verified are obtained. It is also possible to import a captured signature video and analyze it to obtain the handwritten signature image to be verified and the face image to be verified.
  • Step S202 the handwritten signature image to be verified is input into a pre-trained deep learning neural network model for feature extraction to obtain the handwritten signature image features.
  • The handwritten signature image to be verified is input into a pre-trained deep learning neural network model for feature extraction. The pre-trained deep learning neural network has learned the handwritten signature images of different users, so that it can extract high-dimensional features of the handwritten signature images of different users.
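  • As an illustration only, a minimal sketch of such a feature extractor is given below, assuming PyTorch and a pretrained ResNet-18 backbone; the disclosure does not prescribe a particular architecture, so the model, input size and preprocessing are assumptions.

```python
# Illustrative sketch only: a generic pre-trained CNN used as the deep learning
# feature extractor for a handwritten signature image.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()        # drop the classification head, keep the embedding
backbone.eval()

preprocess = T.Compose([
    T.Grayscale(num_output_channels=3),  # signature scans are often single-channel
    T.Resize((224, 224)),
    T.ToTensor(),
])

def extract_signature_features(image_path: str) -> torch.Tensor:
    """Return a high-dimensional feature vector for one signature image."""
    img = preprocess(Image.open(image_path)).unsqueeze(0)   # add batch dimension
    with torch.no_grad():
        return backbone(img).squeeze(0)                     # e.g. a 512-d embedding
```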
  • Step S203 according to the handwritten signature image features, use annoy search algorithm to retrieve the preset user signature image database, and obtain the user signature image features that are closest to the handwritten signature image features.
  • the user signature image database is preset, and as long as the handwritten signature image to be verified matches one of the data in the preset user signature image database, the handwritten signature to be verified is considered valid .
  • the preset user signature image database is retrieved according to the characteristics of the handwritten signature image.
  • In the binary tree built by annoy, the leaf nodes at the bottom record the original data points, while the other intermediate nodes record the information of the splitting hyperplanes.
  • The time complexity of querying the point closest to a given point is therefore sub-linear. Specifically, this can be implemented through the annoy Python API, as sketched below.
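  • A minimal sketch of this retrieval step with the annoy Python package follows; the feature dimension, number of trees, file name and example vectors are illustrative assumptions, not values taken from the disclosure.

```python
# Illustrative sketch of approximate nearest-neighbour retrieval with the annoy
# Python API; dimensions and parameters are assumptions for demonstration only.
import numpy as np
from annoy import AnnoyIndex

DIM = 512                                   # assumed feature dimension
rng = np.random.default_rng(0)

# Stand-in for the preset user signature image database: item id -> user id.
user_ids = ["user_a", "user_b", "user_c"]
index = AnnoyIndex(DIM, "euclidean")
for item_id, user_id in enumerate(user_ids):
    stored_feature = rng.standard_normal(DIM).tolist()  # placeholder for a stored feature
    index.add_item(item_id, stored_feature)

index.build(10)                             # 10 trees: more trees -> higher precision
index.save("signature_index.ann")

# Retrieve the stored signature feature closest to the feature to be verified.
query_feature = rng.standard_normal(DIM).tolist()       # placeholder for the query feature
closest_item = index.get_nns_by_vector(query_feature, 1)[0]
print("closest user:", user_ids[closest_item])
```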
  • Step S204 Calculate the signature feature similarity between the handwritten signature image feature and the closest user signature image feature.
  • By first retrieving candidates with annoy and then calculating the similarity between the handwritten signature image features and the closest user signature image features, both retrieval speed and precision can be taken into account.
  • The similarity is calculated from the Euclidean distance between the two feature vectors (see the sketch below).
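  • One common way to map the Euclidean distance to a bounded similarity score and compare it with the first threshold is sketched below; the mapping and the threshold value are illustrative assumptions.

```python
# Illustrative: converting the Euclidean distance between two feature vectors into
# a similarity score in (0, 1], then comparing it with the preset first threshold.
import numpy as np

def euclidean_similarity(a: np.ndarray, b: np.ndarray) -> float:
    distance = float(np.linalg.norm(a - b))
    return 1.0 / (1.0 + distance)            # identical vectors give a similarity of 1.0

FIRST_THRESHOLD = 0.8                        # illustrative value only
sig_feat = np.zeros(512)
closest_feat = np.zeros(512)                 # placeholders for real feature vectors
signature_passes = euclidean_similarity(sig_feat, closest_feat) > FIRST_THRESHOLD
```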
  • Step S205, compare the signature feature similarity with a preset first threshold, and when the signature feature similarity is greater than the preset first threshold, retrieve the preset user signature image database according to the closest user signature image feature to obtain the closest user face feature, wherein the user signature image features in the preset user signature image database are in one-to-one correspondence with the user face features.
  • Step S206, input the face image to be verified into a pre-trained face feature extraction model for face feature extraction to obtain the face feature to be verified.
  • The face image to be verified is input into a pre-trained face feature extraction model for face feature extraction. The pre-trained face feature extraction model is based on a second convolutional neural network model, which has learned the face images of different users so that it can extract high-dimensional features of the face images of different users.
  • Step S207, calculate the facial feature similarity between the facial feature to be verified and the closest user facial feature.
  • the similarity is calculated by calculating the Euclidean distance between two feature vectors.
  • Step S208, compare the facial feature similarity with a preset second threshold, and when the facial feature similarity is greater than the preset second threshold, determine that the handwritten signature image to be verified has passed the verification.
  • When the facial feature similarity between the facial feature to be verified and the closest user facial feature is greater than the preset second threshold, the handwritten signature image to be verified and the face image to be verified are considered consistent with the data in the preset user signature image database, and the verification is passed.
  • This application determines the validity of the handwritten signature by comparing the features of the handwritten signature image with the preset user signature image database, which can largely avoid the risk of the signature being fraudulently used; the combination of the annoy algorithm and similarity calculation takes into account both the speed and the precision of signature verification.
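  • Putting steps S203 to S208 together, a compact sketch of the two-threshold decision is given below; it reuses the euclidean_similarity helper and the annoy index from the earlier sketches, and face_db is a hypothetical mapping from item id to the corresponding stored user face feature.

```python
# Illustrative end-to-end flow for steps S203-S208 (assumes the annoy index and the
# euclidean_similarity helper defined in the earlier sketches).
import numpy as np

def verify(sig_feat, face_feat, index, face_db, first_thr=0.8, second_thr=0.8):
    """Return True only when both the signature and the face checks pass."""
    item_id = index.get_nns_by_vector(list(sig_feat), 1)[0]  # closest stored signature
    stored_sig = np.array(index.get_item_vector(item_id))
    if euclidean_similarity(np.array(sig_feat), stored_sig) <= first_thr:
        return False                                         # signature check failed
    stored_face = face_db[item_id]                           # one-to-one correspondence
    return euclidean_similarity(np.array(face_feat), stored_face) > second_thr
```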
  • the above-mentioned electronic device may further perform the following steps:
  • acquire signature image training samples, where the signature image training samples are N handwritten signature images marked with user IDs;
  • where N is the number of training samples, y_i is the marked (labeled) result for the i-th sample, and h = (h_1, h_2, ..., h_C) is the prediction result for sample i, with C being the number of all categories;
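  • A softmax cross-entropy loss of the following standard form is consistent with the symbols described above (this is an assumed reconstruction; z_c denotes the model's raw output for category c and is an added symbol):

```latex
% Assumed standard softmax cross-entropy form, reconstructed from the symbol definitions.
L_{\mathrm{softmax}} = -\frac{1}{N} \sum_{i=1}^{N} \log h^{(i)}_{y_i},
\qquad
h^{(i)}_{c} = \frac{e^{z^{(i)}_{c}}}{\sum_{j=1}^{C} e^{z^{(i)}_{j}}}
```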
  • the parameters of each node of the deep learning neural network model are adjusted until the loss function reaches a minimum, and a trained deep learning neural network model is obtained.
  • the deep learning neural network model can be regarded as an image feature extraction model connected to the output layer.
  • the output layer is the softmax output layer.
  • the softmax output layer is used to identify the input handwritten signature image according to the features extracted by the aforementioned image feature extraction model.
  • The softmax loss is used to measure whether the predicted results are consistent with the marked results. When the softmax loss reaches its minimum value, the training of the deep learning neural network model ends, and the trained deep learning neural network model has the ability to extract high-dimensional features of the signature image.
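  • A training sketch consistent with this description is given below; PyTorch, the ResNet-18 backbone, the optimizer and the number of categories are assumptions for illustration, not details taken from the disclosure.

```python
# Illustrative supervised training loop: train with a softmax cross-entropy loss on
# user-ID labels, then drop the output layer so that what remains serves as the
# signature image feature extraction model described above.
import torch
import torch.nn as nn
import torchvision.models as models

NUM_USERS = 1000                                 # C, the number of categories (assumed)

model = models.resnet18(num_classes=NUM_USERS)   # final fc layer plays the softmax role
criterion = nn.CrossEntropyLoss()                # softmax + negative log-likelihood
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_one_epoch(loader):
    """loader yields (signature_image_batch, user_id_batch) pairs."""
    model.train()
    for images, user_ids in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), user_ids)   # compare predictions with labels
        loss.backward()
        optimizer.step()                            # adjust the parameters of each node

# Once the loss stops decreasing, remove the output layer; the remaining structure
# serves as the trained feature extraction model.
model.fc = nn.Identity()
feature_extractor = model.eval()
```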
  • the preset user signature image database includes user fingerprint features, and the fingerprint features correspond to the user signature image features one-to-one.
  • the above electronic device may perform the following steps:
  • the fingerprint feature similarity is compared with a preset third threshold, and when the fingerprint feature similarity is greater than the preset third threshold and the facial feature similarity is greater than the preset second threshold, it is determined that the handwritten signature image to be verified has passed the verification.
  • the signing of important documents requires not only a handwritten signature, but also the signer's fingerprints.
  • the fingerprint is identified at the same time.
  • The fingerprint features are extracted through a pre-trained fingerprint feature extraction model, and the fingerprint feature extraction model is based on a first convolutional neural network model.
  • The user fingerprint features are in one-to-one correspondence with the user signature image features, so the corresponding user fingerprint feature is obtained according to the closest user signature image feature, and the fingerprint feature similarity between the fingerprint feature to be verified and the user fingerprint feature is calculated.
  • the similarity between fingerprint features can be calculated by Euclidean distance between two feature vectors.
  • the fingerprint feature similarity is compared with a preset third threshold, and when the fingerprint feature similarity is greater than the preset third threshold and the facial feature similarity is greater than the preset second threshold, it is determined that the handwritten signature image to be verified has passed the verification.
  • While the signature is being verified, the fingerprint is verified at the same time, which can avoid the situation in which imitated handwriting is treated as a valid signature, and thus improves the accuracy of signature verification.
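  • The earlier verification sketch can be extended with this fingerprint branch as follows; fingerprint_db is a hypothetical mapping, keyed like face_db, that holds one stored fingerprint feature per signature entry, and the third threshold value is illustrative.

```python
# Illustrative extension of the verification flow: the signature, fingerprint and
# face checks must all pass (reuses euclidean_similarity and the annoy index).
import numpy as np

def verify_with_fingerprint(sig_feat, face_feat, fp_feat, index, face_db,
                            fingerprint_db, first_thr=0.8, second_thr=0.8,
                            third_thr=0.8):
    item_id = index.get_nns_by_vector(list(sig_feat), 1)[0]
    stored_sig = np.array(index.get_item_vector(item_id))
    if euclidean_similarity(np.array(sig_feat), stored_sig) <= first_thr:
        return False                                      # signature retrieval failed
    fp_ok = euclidean_similarity(np.array(fp_feat), fingerprint_db[item_id]) > third_thr
    face_ok = euclidean_similarity(np.array(face_feat), face_db[item_id]) > second_thr
    return fp_ok and face_ok
```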
  • the electronic device may perform the following steps:
  • acquire fingerprint image training samples, where the fingerprint image training samples are N fingerprint images marked with user IDs;
  • the parameters of each node of the first convolutional neural network model are adjusted until the loss function reaches a minimum, and a trained fingerprint feature extraction model is obtained.
  • The training of the first convolutional neural network adopts supervised training: fingerprint images marked with user identities are input into the first convolutional neural network, and the parameters of each node of the first convolutional neural network are adjusted so that the fingerprint prediction results output by the network are consistent with the labeling results.
  • the output layer of the first convolutional neural network uses the softmax output layer.
  • The softmax loss is used to measure whether the first convolutional neural network converges. When the softmax loss value reaches its minimum, the training of the first convolutional neural network ends, and the structure before the output layer of the trained first convolutional neural network constitutes the fingerprint feature extraction model.
  • Before the above-mentioned step of inputting the face image to be verified into a pre-trained face feature extraction model for feature extraction to obtain the face feature to be verified, the electronic device may perform the following steps:
  • the parameters of each node of the second convolutional neural network model are adjusted until the loss function reaches a minimum, and a trained facial feature extraction model is obtained.
  • The training of the second convolutional neural network adopts supervised training: face images marked with user identities are input into the second convolutional neural network, and the parameters of each node of the second convolutional neural network are adjusted so that the face prediction results output by the network are consistent with the labeling results.
  • The output layer of the second convolutional neural network uses the softmax output layer, and the softmax loss is used to measure whether the second convolutional neural network converges. When the softmax loss value reaches its minimum, the training of the second convolutional neural network ends, and the structure before the output layer of the trained second convolutional neural network constitutes the face feature extraction model.
  • the above-mentioned handwritten signature image to be verified can also be stored in a node of a blockchain.
  • the blockchain referred to in this application is a new application mode of computer technology such as distributed data storage, point-to-point transmission, consensus mechanism, and encryption algorithm.
  • A blockchain is essentially a decentralized database: a series of data blocks linked by cryptographic methods. Each data block contains a batch of network transaction information, which is used to verify the validity of the information (anti-counterfeiting) and to generate the next block.
  • the blockchain can include the underlying platform of the blockchain, the platform product service layer, and the application service layer.
  • the present application may be used in numerous general purpose or special purpose computer system environments or configurations, for example: personal computers, server computers, handheld or portable devices, tablet devices, multiprocessor systems, microprocessor-based systems, set-top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments including any of the above systems or devices, and the like.
  • This application may be described in the general context of computer-executable instructions, such as computer-readable instruction modules, being executed by a computer.
  • modules of computer-readable instructions include routines, computer-readable instructions, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • the application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
  • modules of computer readable instructions may be located in both local and remote computer storage media including storage devices.
  • the aforementioned storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disk, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM) or the like.
  • the present application provides an embodiment of a handwritten signature verification device.
  • the device embodiment corresponds to the method embodiment shown in FIG. 2 .
  • Specifically, the device can be applied to various electronic devices.
  • the handwritten signature verification device 400 in this embodiment includes: an acquisition module 401, a first extraction module 402, a first retrieval module 403, a first calculation module 404, a second retrieval module 405, a second extraction module 406, a second calculation module 407, and a determination module 408.
  • the obtaining module 401 is used for obtaining the handwritten signature image to be verified and the face image to be verified, wherein the handwritten signature image to be verified and the face image to be verified originate from the same carrier;
  • the first extraction module 402 is used to input the handwritten signature image to be verified into a pre-trained deep learning neural network model for feature extraction to obtain the handwritten signature image features;
  • the first retrieval module 403 is configured to use annoy search algorithm to retrieve the preset user signature image database according to the handwritten signature image features, and obtain the user signature image features that are closest to the handwritten signature image features;
  • the first calculation module 404 is used to calculate the signature feature similarity between the handwritten signature image feature and the closest user signature image feature;
  • the second retrieval module 405 is configured to compare the signature feature similarity with a preset first threshold, and when the signature feature similarity is greater than the preset first threshold, according to the closest user signature image feature Retrieve a preset user signature image database to obtain the closest user face feature, wherein the user signature image feature in the preset user signature image database corresponds to the user face feature one-to-one;
  • the second extraction module 406 is used for inputting the face image to be verified into a pre-trained face feature extraction model for face feature extraction to obtain the face feature to be verified;
  • the second calculation module 407 is used to calculate the facial feature similarity between the facial feature to be verified and the closest user facial feature
  • a determination module 408, configured to compare the facial feature similarity with a preset second threshold, and to determine that the handwritten signature image to be verified has passed the verification when the facial feature similarity is greater than the preset second threshold.
  • the validity of the handwritten signature is determined by comparing the features of the handwritten signature image with the preset user signature image database, which can largely avoid the risk of the signature being fraudulently used.
  • The combination of the annoy algorithm and similarity calculation takes into account both the speed and the accuracy of handwritten signature verification.
  • the handwritten signature verification device 400 further includes:
  • the first acquisition submodule is used to acquire signature image training samples, where the signature image training samples are N handwritten signature images marked with user IDs;
  • a first prediction submodule configured to input the signature image training sample into a deep learning neural network model, and obtain N signature prediction results output by the deep learning neural network model in response to the signature image training sample;
  • the first comparison sub-module is used to compare whether the N signature prediction results are consistent with the annotations through a softmax loss function, wherein the softmax loss function is:
  • where N is the number of training samples, y_i is the marked (labeled) result for the i-th sample, and h = (h_1, h_2, ..., h_C) is the prediction result for sample i, with C being the number of all categories;
  • the first adjustment sub-module is used to adjust the parameters of each node of the deep learning neural network model, and ends when the loss function reaches a minimum, and a trained deep learning neural network model is obtained.
  • the handwritten signature verification device 400 further includes:
  • the second acquisition sub-module is used to acquire the fingerprint image to be verified, and the fingerprint image and the handwritten signature image to be verified originate from the same carrier;
  • the first extraction submodule is used for inputting the fingerprint image into a pre-trained fingerprint feature extraction model for feature extraction to obtain fingerprint features to be verified;
  • a first retrieval submodule used for retrieving a preset user signature image database according to the closest user signature image feature, to obtain a user fingerprint feature corresponding to the closest user signature image feature;
  • a first calculation submodule for calculating the fingerprint feature similarity between the fingerprint feature to be verified and the user fingerprint feature
  • the first determination sub-module is used to compare the fingerprint feature similarity with a preset third threshold, and when the fingerprint feature similarity is greater than the preset third threshold and the facial feature similarity is greater than the preset second threshold, it is determined that the handwritten signature image to be verified has passed the verification.
  • the handwritten signature verification device 400 further includes:
  • the third acquisition sub-module is used to acquire fingerprint image training samples, and the fingerprint image training samples are N fingerprint images marked with user IDs;
  • a second prediction submodule configured to input the fingerprint image training sample into a first convolutional neural network model, and obtain N fingerprint prediction results output by the first convolutional neural network model in response to the fingerprint image training sample;
  • the second comparison sub-module is used to compare, through the softmax loss function, whether the N fingerprint prediction results are consistent with the labels;
  • the second adjustment sub-module is used to adjust the parameters of each node of the first convolutional neural network model, and ends when the loss function reaches a minimum, and a trained fingerprint feature extraction model is obtained.
  • the handwritten signature verification device 400 further includes:
  • the fifth acquisition sub-module is used to acquire face image training samples, and the face image training samples are N face images marked with user IDs;
  • the third prediction submodule is configured to input the face image training sample into the second convolutional neural network model, and obtain N face predictions output by the second convolutional neural network model in response to the face image training sample result;
  • the third comparison sub-module is used to compare whether the N face prediction results are consistent with the labeling through the softmax loss function
  • the third adjustment sub-module is used to adjust the parameters of each node of the second convolutional neural network model, and ends when the loss function reaches a minimum, to obtain a trained facial feature extraction model.
  • the handwritten signature verification device 400 further includes:
  • the storage module is used to store the to-be-verified handwritten signature image and the to-be-verified face image in the blockchain.
  • FIG. 5 is a block diagram of a basic structure of a computer device according to this embodiment.
  • the computer device 5 includes a memory 51, a processor 52, and a network interface 53 that communicate with each other through a system bus. It should be pointed out that only the computer device 5 with components 51-53 is shown in the figure, but it should be understood that it is not required to implement all of the shown components, and more or fewer components may be implemented instead. Those skilled in the art can understand that the computer device here is a device that can automatically perform numerical calculation and/or information processing according to preset or stored instructions, and its hardware includes but is not limited to microprocessors, application-specific integrated circuits (ASIC), field-programmable gate arrays (FPGA), digital signal processors (DSP), embedded devices, and the like.
  • the computer equipment may be a desktop computer, a notebook computer, a palmtop computer, a cloud server and other computing equipment.
  • the computer device can perform human-computer interaction with the user through a keyboard, a mouse, a remote control, a touch pad or a voice control device.
  • the memory 51 includes at least one type of readable storage medium, and the readable storage medium includes flash memory, hard disk, multimedia card, card-type memory (for example, SD or DX memory, etc.), random access memory (RAM), static Random Access Memory (SRAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), Programmable Read Only Memory (PROM), Magnetic Memory, Magnetic Disk, Optical Disk, etc.
  • the memory 51 may be an internal storage unit of the computer device 5 , such as a hard disk or a memory of the computer device 5 .
  • the memory 51 may also be an external storage device of the computer device 5, such as a plug-in hard disk, a smart memory card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, flash memory card (Flash Card), etc.
  • the memory 51 may also include both the internal storage unit of the computer device 5 and its external storage device.
  • the memory 51 is generally used to store the operating system and various application software installed on the computer device 5 , such as computer-readable instructions for a handwritten signature verification method.
  • the memory 51 can also be used to temporarily store various types of data that have been output or will be output.
  • the processor 52 may be a central processing unit (Central Processing Unit, CPU), a controller, a microcontroller, a microprocessor, or other data processing chips. This processor 52 is typically used to control the overall operation of the computer device 5 . In this embodiment, the processor 52 is configured to execute computer-readable instructions or process data stored in the memory 51, for example, computer-readable instructions for executing the handwritten signature verification method.
  • the network interface 53 may include a wireless network interface or a wired network interface, and the network interface 53 is generally used to establish a communication connection between the computer device 5 and other electronic devices.
  • By comparing the handwritten signature image features with the preset user signature image database, the validity of the handwritten signature can be determined and the risk of signature fraud can be largely avoided; the combination of the annoy algorithm and similarity calculation takes into account both the speed and the accuracy of the verification.
  • the present application also provides another embodiment, that is, to provide a computer-readable storage medium, where the computer-readable storage medium stores computer-readable instructions, and the computer-readable instructions can be executed by at least one processor to The at least one processor is caused to perform the steps of the handwritten signature verification method as described above.
  • the computer-readable storage medium may be non-volatile or volatile.
  • By comparing the handwritten signature image features with the preset user signature image database, the validity of the handwritten signature can be determined and the risk of signature fraud can be largely avoided; the combination of the annoy algorithm and similarity calculation takes into account both the speed and the accuracy of the verification.
  • the method of the above embodiment can be implemented by means of software plus a necessary general hardware platform, and of course can also be implemented by hardware, but in many cases the former is the better implementation.
  • the technical solution of the present application can be embodied in the form of a software product in essence or in a part that contributes to the prior art, and the computer software product is stored in a storage medium (such as ROM/RAM, magnetic disk, CD-ROM), including several instructions to make a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) execute the methods described in the various embodiments of this application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Human Computer Interaction (AREA)
  • Molecular Biology (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computational Linguistics (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Collating Specific Patterns (AREA)

Abstract

Disclosed are a handwritten signature verification method and apparatus, a computer device and a storage medium. The method comprises the following steps: acquiring a handwritten signature image to be verified and a face image to be verified; inputting the handwritten signature image into a deep learning neural network model for feature extraction to obtain handwritten signature image features; searching a preset user signature image database to obtain the closest user signature image features; calculating the signature feature similarity between the two, comparing it with a first threshold and, when it is greater than the first threshold, searching the user signature image database again to obtain the closest user face features; calculating the facial feature similarity between the two, comparing it with a second threshold and, when it is greater than the second threshold, determining that the handwritten signature image to be verified has passed the verification. The validity of the signature is determined by simultaneously comparing the similarity of the handwritten signature image and of the face image with pre-stored data, which prevents fraudulent use of the signature.
PCT/CN2021/091266 2020-12-30 2021-04-30 Procédé et appareil de vérification de signature manuscrite, dispositif informatique et support de stockage WO2022142032A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011609053.5A CN112733645B (zh) 2020-12-30 2020-12-30 手写签名校验方法、装置、计算机设备及存储介质
CN202011609053.5 2020-12-30

Publications (1)

Publication Number Publication Date
WO2022142032A1 true WO2022142032A1 (fr) 2022-07-07

Family

ID=75610874

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/091266 WO2022142032A1 (fr) 2020-12-30 2021-04-30 Procédé et appareil de vérification de signature manuscrite, dispositif informatique et support de stockage

Country Status (2)

Country Link
CN (1) CN112733645B (fr)
WO (1) WO2022142032A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117437650A (zh) * 2023-12-20 2024-01-23 山东山大鸥玛软件股份有限公司 基于深度学习的手写体签名比对方法、系统、装置及介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103995995A (zh) * 2013-04-22 2014-08-20 厦门维觉电子科技有限公司 多媒体签字识别方法及系统
CN105631272A (zh) * 2016-02-02 2016-06-01 云南大学 一种多重保险的身份认证方法
CN106779665A (zh) * 2016-11-23 2017-05-31 广东微模式软件股份有限公司 一种基于人体生物特征识别与防抵赖技术的pos取现方法
CN108388813A (zh) * 2018-02-28 2018-08-10 中国平安财产保险股份有限公司 电子化签名方法、用户设备、存储介质及装置
CN109523392A (zh) * 2018-10-19 2019-03-26 中国平安财产保险股份有限公司 签名文件生成方法、装置、计算机设备和存储介质
US20200218794A1 (en) * 2018-04-04 2020-07-09 Beijing Sensetime Technology Development Co., Ltd. Identity authentication, unlocking, and payment methods and apparatuses, storage media, products, and devices

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107302433A (zh) * 2016-04-15 2017-10-27 平安科技(深圳)有限公司 电子签名的校验方法、校验服务器及用户终端
CN109409254A (zh) * 2018-10-10 2019-03-01 成都优易数据有限公司 一种基于孪生神经网络的电子合同手写签名鉴定方法
CN110210194A (zh) * 2019-04-18 2019-09-06 深圳壹账通智能科技有限公司 电子合同显示方法、装置、电子设备及存储介质
CN111428557A (zh) * 2020-02-18 2020-07-17 深圳壹账通智能科技有限公司 基于神经网络模型的手写签名的自动校验的方法和装置


Also Published As

Publication number Publication date
CN112733645B (zh) 2023-08-01
CN112733645A (zh) 2021-04-30

Similar Documents

Publication Publication Date Title
WO2022126970A1 (fr) Procédé et dispositif d'identification de risques de fraude financière, dispositif informatique et support de stockage
US11727053B2 (en) Entity recognition from an image
WO2021143267A1 (fr) Procédé de traitement de modèle de classification à grain fin basé sur la détection d'image, et dispositifs associés
US11409789B2 (en) Determining identity in an image that has multiple people
CN112528025A (zh) 基于密度的文本聚类方法、装置、设备及存储介质
WO2022174491A1 (fr) Procédé et appareil fondés sur l'intelligence artificielle pour le contrôle qualité des dossiers médicaux, dispositif informatique et support de stockage
CN113435583B (zh) 基于联邦学习的对抗生成网络模型训练方法及其相关设备
WO2022105118A1 (fr) Procédé et appareil d'identification d'état de santé basés sur une image, dispositif et support de stockage
WO2022134584A1 (fr) Procédé et appareil de vérification d'image de bien immobilier, dispositif informatique et support de stockage
WO2021237570A1 (fr) Procédé et appareil d'audit d'image, dispositif, et support de stockage
CN112330331A (zh) 基于人脸识别的身份验证方法、装置、设备及存储介质
US20230032728A1 (en) Method and apparatus for recognizing multimedia content
CN112995414B (zh) 基于语音通话的行为质检方法、装置、设备及存储介质
CN112668482B (zh) 人脸识别训练方法、装置、计算机设备及存储介质
CN110855648A (zh) 一种网络攻击的预警控制方法及装置
US20230222762A1 (en) Adversarially robust visual fingerprinting and image provenance models
US11810388B1 (en) Person re-identification method and apparatus based on deep learning network, device, and medium
WO2022142032A1 (fr) Procédé et appareil de vérification de signature manuscrite, dispositif informatique et support de stockage
CN114282258A (zh) 截屏数据脱敏方法、装置、计算机设备及存储介质
WO2022105120A1 (fr) Procédé et appareil de détection de texte à partir d'une image, dispositif informatique et support de mémoire
CN112966150A (zh) 一种视频内容抽取的方法、装置、计算机设备及存储介质
CN112036501A (zh) 基于卷积神经网络的图片的相似度检测方法及其相关设备
CN115250200B (zh) 服务授权认证方法及其相关设备
CN113688268B (zh) 图片信息抽取方法、装置、计算机设备及存储介质
CN113792549B (zh) 一种用户意图识别的方法、装置、计算机设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21912794

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21912794

Country of ref document: EP

Kind code of ref document: A1