CN106529395B - Signature image identification method based on depth confidence network and k mean cluster - Google Patents
- Publication number: CN106529395B (application number CN201610840672.2A)
- Authority: CN (China)
- Prior art date: 2016-09-22
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/30—Writer recognition; Reading and verifying signatures
- G06V40/33—Writer recognition; Reading and verifying signatures based only on signature image, e.g. static signature recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/23—Clustering techniques
- G06F18/232—Non-hierarchical techniques
- G06F18/2321—Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
- G06F18/23213—Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/30—Writer recognition; Reading and verifying signatures
- G06V40/37—Writer recognition; Reading and verifying signatures based only on signature signals such as velocity or pressure, e.g. dynamic signature recognition
- G06V40/382—Preprocessing; Feature extraction
Abstract
The present invention relates to a signature image authentication method based on a deep belief network and k-means clustering, comprising the following steps. S1, train the deep belief network: input handwritten signature images as a training set, extract a first feature vector from each input handwritten signature image with the deep belief network, cluster the first feature vectors into a single class with the k-means algorithm, and obtain the distance df from the cluster centre to the farthest vector in the class, thereby training the deep belief network. S2, authenticate a signature image: input the handwritten signature image to be authenticated, extract its second feature vector with the trained deep belief network, and compute the distance d from the second feature vector to the cluster centre of the training-set feature vectors; if d >= df, the input handwritten signature image is judged a forged signature, otherwise a genuine one. The invention authenticates signature images automatically, without human intervention, while accelerating authentication and improving its accuracy.
Description
Technical field
The present invention relates to image authentication, and more particularly to a signature image authentication method based on a deep belief network and k-means clustering, belonging to the fields of deep learning and pattern recognition.
Background technique
At present, many occasions require signature authentication. As a form of identity authentication, signature authentication is widely applied, for example in financial and legal contexts. Signatures are mainly authenticated in two ways: 1) manual authentication, which is time-consuming and whose results are strongly affected by subjective judgement; 2) computer authentication, in which a human designs an image-feature scheme for signatures, the corresponding image features are extracted, and authentication is performed by comparing the extracted features. However, because signatures vary freely, designing suitable image features is difficult and no effective generic signature feature scheme can be devised; the accuracy of such feature-comparison methods is therefore hard to improve.
Summary of the invention
The present invention aims to overcome the problems of the existing signature authentication techniques described above. Traditional manual authentication of signature images depends on the examiner's experience; computer authentication requires manually designed signature image features, which is complex, needs different image features for different signatures, places special requirements on the signature images, and achieves limited accuracy. To address these drawbacks, a signature image authentication method based on a deep belief network and k-means clustering is proposed. The invention authenticates signature images automatically, without human intervention, while accelerating authentication and improving its accuracy.
The technical solution adopted to achieve the object of the invention is a signature image authentication method based on a deep belief network and k-means clustering, comprising the following steps:

S1, train the deep belief network: input handwritten signature images as a training set, extract a first feature vector from each input handwritten signature image with the deep belief network, cluster the first feature vectors into a single class with the k-means algorithm, and obtain the distance df from the cluster centre to the farthest vector in the class, thereby training the deep belief network.

S2, authenticate a signature image: input the handwritten signature image to be authenticated, extract its second feature vector with the trained deep belief network, and compute the distance d from the second feature vector to the cluster centre of the training-set feature vectors. If d >= df, the input handwritten signature image to be authenticated is a forged signature image; otherwise it is a genuine signature image.
The invention has the following advantages. An inversion neural network is proposed and combined with a common convolutional autoencoder to form a new deep belief network. This network can extract a feature vector, rather than feature sub-images, directly from an image of arbitrary size, extending the feature-extraction capability of deep belief networks. The inversion neural network is trained with an evolution strategy; since evolution strategies search well for globally optimal solutions and need no gradient information, the trained inversion network is more widely applicable and better at extracting effective feature vectors from the feature sub-images. The extracted signature feature vectors are clustered with the k-means algorithm, and a newly input handwritten image is authenticated by comparing the distance between its feature vector and the cluster centre. This converts the genuine-versus-forged decision into a clustering problem, improving authentication accuracy and effectively reducing over-fitting. Compared with similar methods, the invention increases both the speed and the accuracy of handwriting authentication.
Detailed description of the invention
Fig. 1 is the flow chart of the signature image authentication method based on a deep belief network and k-means clustering according to the present invention.
Fig. 2 is the flow chart of a preferred embodiment of training the deep belief network in Fig. 1.
Fig. 3 is the flow chart of a preferred embodiment of authenticating a signature image in Fig. 1.
Specific embodiment
The technical solution of the present invention is further described below with reference to the embodiments and the accompanying drawings.

Fig. 1 shows the overall flow chart of the signature image authentication method based on a deep belief network and k-means clustering. The procedure is divided into S1, training the deep belief network, in which the signature images of the training set are input and the authentication network is trained, and S2, authenticating a signature image, in which the image to be authenticated is input into the trained authentication network and the result is obtained.

This embodiment assumes that 100 genuine signature images in the RGB colour space are input. Based on the training flow chart of Fig. 2, the specific steps of S1, in which the training-set signature images are input and the authentication network is trained, are as follows:
S1.1, convert the 100 RGB signature images to 8-bit greyscale images, resize all input images to a uniform 256*256 pixels, and normalise all pixel values.
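The preprocessing of S1.1 can be sketched as follows. This is a minimal illustration assuming NumPy; the luminance weights and the nearest-neighbour resize are my own choices, since the patent only specifies 8-bit greyscale, a uniform 256*256 size, and normalisation:

```python
import numpy as np

def preprocess(rgb, size=256):
    """Greyscale-convert, resize (nearest neighbour) and normalise one RGB image.

    `rgb` is an (H, W, 3) uint8 array.  The luminance weights below are the
    common ITU-R 601 values, used here only as an illustrative assumption.
    """
    grey = rgb @ np.array([0.299, 0.587, 0.114])   # collapse 3 channels to grey
    h, w = grey.shape
    rows = np.arange(size) * h // size             # nearest-neighbour row map
    cols = np.arange(size) * w // size             # nearest-neighbour column map
    resized = grey[rows][:, cols]                  # uniform size*size image
    return resized / 255.0                         # normalise pixels into [0, 1]

img = preprocess(np.random.randint(0, 256, (300, 400, 3), dtype=np.uint8))
print(img.shape)
```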
S1.2, the user initialises the parameters of the convolutional autoencoder according to the input image information. The convolutional autoencoder has 4 layers in total: 2 convolutional layers and 2 subsampling layers, ordered as layer 1 convolution, layer 2 subsampling, layer 3 convolution, layer 4 subsampling. The layer-1 convolution kernels are 33*33 in size and 6 in number; the layer-2 subsampling windows are 4*4 in size and 6 in number; the layer-3 convolution kernels are 33*33 in size and 16 in number; the layer-4 subsampling windows are 4*4 in size and 16 in number.
S1.3, the user initialises the parameters of the gradient descent algorithm: the learning rate is 0.03 and the maximum number of iterations is 15.
S1.4, input the 100 training-set signature images one by one into the convolutional autoencoder and train it with the gradient descent algorithm to obtain the feature sub-images of every training image. Each training image yields 16 feature sub-images of size 6*6, and the set of feature sub-images is input into the inversion neural network.
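The layer sizes of S1.2 chain consistently from the 256*256 input of S1.1 to the 16 feature sub-images of 6*6 in S1.4. A quick sketch of the arithmetic, assuming 'valid' (unpadded) convolutions and non-overlapping subsampling windows, which the patent does not state explicitly:

```python
def valid_conv(size, kernel):
    return size - kernel + 1      # output side of an unpadded convolution

def subsample(size, window):
    return size // window         # output side of non-overlapping pooling

s = 256                           # uniform input size from S1.1
s = valid_conv(s, 33)             # layer 1: 33*33 kernels
s = subsample(s, 4)               # layer 2: 4*4 windows
s = valid_conv(s, 33)             # layer 3: 33*33 kernels
s = subsample(s, 4)               # layer 4: 4*4 windows
dims = 16 * s * s                 # 16 sub-images flattened into one vector
print(s, dims)
```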
S1.5, the user initialises the inversion neural network parameters according to the input feature sub-image information. The inversion network has 2 layers: layer 1 has 300 neurons, layer 2 has 120 neurons, and the activation function of every neuron is the sigmoid function.
S1.6, the user initialises the parameters of the evolution strategy: a (1+4) strategy is used, the mutation probability is 0.06, the maximum number of generations is 50000, and the mutation operator is Gaussian mutation. The weights and biases of the inversion neural network form an individual of the evolution strategy; the individual length is 300*16*6*6 + 120*300 + 300 + 120 = 209220.
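The individual length of S1.6 counts every weight and every bias of the two layers, and can be checked directly:

```python
# layer-1 weights: 300 neurons, each with 16*6*6 = 576 inputs
# layer-2 weights: 120 neurons, each with 300 inputs
# plus one bias per neuron in each layer
length = 300 * 16 * 6 * 6 + 120 * 300 + 300 + 120
print(length)
```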
S1.7, the mean per-dimension error between the data returned by the layer-1 neurons of the inversion neural network and the originally input feature vector serves as the evaluation function of the evolution strategy. The evaluation function consists of 3 parts:

1) Forward transmission of the feature vector. First, the feature sub-images extracted by the convolutional autoencoder for each input image are arranged, pixel by pixel, into a feature vector of 16*6*6 = 576 dimensions. This vector is input to the layer-1 neurons, whose results are then input to the layer-2 neurons to compute the forward output of the whole inversion network; the process is identical to the forward pass of an ordinary neural network. The output of the i-th neuron in layer 1 is

y_i^(1) = f( Σ_{j=1}^{576} w_ij^(1) · x_j + b_i^(1) ),

where y_i^(1) is the output of the i-th neuron in layer 1, w_ij^(1) is the j-th weight of the i-th neuron in layer 1, x_j is the j-th dimension of the input feature vector, and b_i^(1) is the bias of the i-th neuron in layer 1; in the forward output of this layer, i = 1, 2, ..., 300.

The output of the i-th neuron in layer 2 is

y_i^(2) = f( Σ_{j=1}^{300} w_ij^(2) · y_j^(1) + b_i^(2) ),

where y_i^(2) is the output of the i-th neuron in layer 2, w_ij^(2) is the j-th weight of the i-th neuron in layer 2, y_j^(1) is the output of the j-th neuron in layer 1, and b_i^(2) is the bias of the i-th neuron in layer 2; in the forward output of this layer, i = 1, 2, ..., 120.

2) Backward transfer of the forward output of the inversion neural network. In this embodiment, the layer-2 neurons feed the data backwards to the layer-1 neurons, and the layer-1 neurons process the backward input and emit the inverted vector. The backward output of the i-th neuron in layer 1 is

r_i^(1) = f( Σ_{j=1}^{120} w_ji^(2) · y_j^(2) + b_i^(1) ),

where r_i^(1) is the backward output of the i-th neuron in layer 1, w_ji^(2) is the i-th weight of the j-th neuron in layer 2, y_j^(2) is the output of the j-th neuron in layer 2, and b_i^(1) is the bias of the i-th neuron in layer 1; in this inversion, i = 1, 2, ..., 300.

The layer-1 neurons then emit the inverted vector to the outside of the inversion network; its i-th dimension is

r_i = f( Σ_{j=1}^{300} w_ji^(1) · r_j^(1) ),

where r_i is the i-th dimension of the backward output vector, w_ji^(1) is the i-th weight of the j-th neuron in layer 1, and r_j^(1) is the backward output of the j-th neuron in layer 1; in this inversion, i = 1, 2, ..., 576.

3) Compute the mean per-dimension error between the original input vector and the backward output vector, and assess the feature-extraction quality of the inversion network from this error. The evaluation function is

E = (1/576) Σ_{i=1}^{576} | x_i − r_i |,

where x_i is the i-th dimension of the input feature vector extracted by the convolutional autoencoder and r_i is the i-th dimension of the backward output vector of the inversion network.
S1.8, train the inversion neural network with the evolution strategy; the specific steps are as follows:
S1.8.1, randomly initialise 4 individuals and evaluate them all;
S1.8.2, select the best individual; go to step S1.8.5 if the termination condition is reached, otherwise go to step S1.8.3;
S1.8.3, apply Gaussian mutation to every dimension of the best individual according to the mutation probability; mutating the current best individual generates 4 new individuals, and all newly generated individuals are evaluated;
S1.8.4, the previous best individual and the 4 new individuals generated by mutation constitute 1+4 individuals; go to step S1.8.2;
S1.8.5, output the best individual. Training of the inversion neural network is complete, and the first feature vector of every signature image is finally output.
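The (1+4) loop of S1.8 can be sketched on a toy objective; a simple quadratic stands in for the network evaluation function, and the mutation scale `sigma` is an illustrative assumption the patent does not specify:

```python
import random

def es_1_plus_4(evaluate, dim, p_mut=0.06, sigma=0.3, generations=500, seed=1):
    """(1+lambda) evolution strategy with lambda = 4 and Gaussian mutation."""
    rng = random.Random(seed)
    best = [rng.uniform(-1, 1) for _ in range(dim)]       # initial individual
    best_fit = evaluate(best)
    for _ in range(generations):                          # S1.8.2: stop when done
        children = []
        for _ in range(4):                                # S1.8.3: four mutants
            child = [g + rng.gauss(0, sigma) if rng.random() < p_mut else g
                     for g in best]                       # per-dimension mutation
            children.append((evaluate(child), child))
        fit, child = min(children)                        # S1.8.4: 1+4 selection
        if fit < best_fit:                                # keep parent if better
            best, best_fit = child, fit
    return best, best_fit                                 # S1.8.5: best individual

# toy objective: squared distance from the all-ones vector
sol, fit = es_1_plus_4(lambda v: sum((g - 1.0) ** 2 for g in v), dim=5)
print(fit)
```

Because selection is elitist, the best fitness never worsens, which mirrors the loop between steps S1.8.2 and S1.8.4.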
S1.9, the user initialises the k-means clustering parameters: the maximum number of iterations is 2000 and the number of clusters is 1, i.e. all input data are gathered into a single class. The 100 feature vectors output by the inversion neural network are clustered into one class, denoted K, with the k-means algorithm, generating the cluster centre C. The feature vector in K farthest from C is found, and the distance from C to this vector is recorded as df. At this point, training of the signature authentication network is fully completed and the authentication network is built.
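With a single cluster, k-means degenerates: after one assignment step the centre C is simply the mean of all feature vectors, and df is the largest distance from C to any training vector. A minimal sketch with NumPy (the random vectors stand in for the 100 extracted feature vectors):

```python
import numpy as np

def train_threshold(features):
    """One-class k-means of S1.9: centre = mean, df = farthest member distance."""
    c = features.mean(axis=0)                      # cluster centre C
    dists = np.linalg.norm(features - c, axis=1)   # distance of each vector to C
    return c, float(dists.max())                   # C and the threshold df

rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, (100, 120))           # stand-in feature vectors
C, df = train_threshold(train)
print(df)
```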
Based on the authentication flow chart of Fig. 3, the steps of S2, in which the signature image to be authenticated is input into the trained authentication network and the result is obtained, are as follows:
S2.1, pre-process the signature image to be authenticated: convert it to greyscale, resize it to the uniform size, and normalise all pixel values.
S2.2, input the pre-processed image into the trained deep belief network described above and extract the corresponding image feature vector.
S2.3, compute the distance from the feature vector of the signature image to be authenticated to the cluster centre C and denote it d. Compare d with df: if d >= df, the signature image to be authenticated is judged forged; conversely, if d < df, it is judged genuine.
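The decision of S2.3 is then a single distance comparison against the stored centre C and threshold df; a minimal sketch with illustrative 3-dimensional values:

```python
import numpy as np

def authenticate(feature, centre, df):
    """Return True for a genuine signature (d < df), False for a forgery."""
    d = float(np.linalg.norm(feature - centre))    # distance d to centre C
    return d < df                                  # S2.3 threshold comparison

centre = np.zeros(3)                               # toy cluster centre C
df = 1.0                                           # toy threshold
genuine = authenticate(np.array([0.1, 0.2, 0.1]), centre, df)  # inside radius
forged = authenticate(np.array([2.0, 2.0, 2.0]), centre, df)   # outside radius
print(genuine, forged)
```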
The performance of the present method when authenticating the handwritten signature images of "Li Wenbin" is shown in the following table:

Table 1. Performance data of this embodiment

Item | Value
---|---
Authentication speed | 1 s
Authentication accuracy | 100%

The table above shows that the present authentication method is fast and accurate.
The specific embodiment described herein is merely an example of the spirit of the present invention. Those skilled in the art to which the invention belongs can make various modifications or additions to the described embodiment, or substitute it in a similar way, without departing from the spirit of the invention or exceeding the scope of the appended claims.
Claims (8)
1. A signature image authentication method based on a deep belief network and k-means clustering, characterised by comprising the following steps:
S1, train the deep belief network: input handwritten signature images as a training set, extract a first feature vector from each input handwritten signature image with the deep belief network, cluster the first feature vectors into one class with the k-means algorithm, and obtain the distance df from the cluster centre to the farthest vector in the class, thereby training the deep belief network; specifically comprising:
S1.1, input a plurality of genuine signature images as the training set, resize all genuine signature images to the same size, convert the three-channel RGB colour space of each image to a greyscale image, and normalise the grey pixel values;
S1.2, a convolutional autoencoder and an inversion neural network constitute the deep belief network used to extract image feature vectors; the user sets the parameters of the convolutional autoencoder according to the input image size, the parameters including the numbers of convolutional and subsampling layers used by the convolutional autoencoder, the number and size of the convolution kernels in each convolutional layer, and the number and size of the subsampling windows in each subsampling layer;
S1.3, the convolutional autoencoder training algorithm is gradient descent; the user initialises the parameters of the gradient descent algorithm, the parameters including the learning rate and the maximum number of iterations;
S1.4, input each image of the training set one by one into the convolutional autoencoder and train the autoencoder with gradient descent to obtain the set of feature sub-images of each image, and input the obtained feature sub-image sets into the inversion neural network;
S1.5, according to the size and number of the input feature sub-images, the user sets the parameters of the inversion neural network, these parameters including the number of layers of the inversion network, the number of neurons used in each layer, and the choice of activation function of each neuron;
S1.6, the inversion neural network training algorithm is an evolution strategy; the user initialises the evolution strategy parameters, the parameters including the population size, mutation probability, maximum number of generations and mutation operator; the weights and biases of the inversion neural network serve as the individual of the evolution strategy;
S1.7, the mean per-dimension error between the data returned by the input layer of the inversion neural network and the originally input feature vector serves as the evaluation function of the evolution strategy;
S1.8, train the inversion neural network with the evolution strategy;
S1.9, the user initialises the k-means clustering parameters, the parameters including the maximum number of iterations and the number of clusters, wherein for signature authentication the number of clusters is fixed at 1; all feature vectors output by the inversion neural network are clustered into one class, denoted K, with the k-means algorithm, generating the cluster centre C; the first feature vector in K farthest from C is found and the distance from C to that first feature vector is recorded as df; at this point, training of the signature authentication network is fully completed and the authentication network is built;
S2, authenticate a signature image: input the handwritten signature image to be authenticated, extract its second feature vector with the trained deep belief network, and compute the distance d from the second feature vector to the cluster centre of the training-set feature vectors; if d >= df, the input handwritten signature image to be authenticated is a forged signature image, otherwise it is a genuine signature image.
2. The signature image authentication method based on a deep belief network and k-means clustering according to claim 1, characterised in that at least 100 genuine signature images are input as the training set in step S1.1.
3. The signature image authentication method based on a deep belief network and k-means clustering according to claim 1, characterised in that the deep belief network of step S1.2 is composed of the convolutional autoencoder and the inversion neural network: the convolutional autoencoder extracts the set of feature sub-images of every signature image, the inversion neural network extracts the feature vector of every signature image from that set of feature sub-images, and together the two constitute one complete deep belief network.
4. The signature image authentication method based on a deep belief network and k-means clustering according to claim 1, characterised in that the evaluation function of the inversion neural network in step S1.7 consists of 3 parts, respectively:
1) forward transmission of the input feature vector, computing the forward output of the inversion neural network: first, the set of feature sub-images extracted by the convolutional autoencoder for each input image is arranged, pixel by pixel, into a feature vector of t*m*n dimensions, where t is the number of feature sub-images per image and m*n is the size of each feature sub-image; the feature vector is then input into the inversion neural network and processed layer by layer to produce the forward output of the inversion neural network;
2) backward transfer of the forward output, computing the backward output of the inversion neural network, as follows: first, the neurons in the output layer N of the inversion network transfer the forward output back into the neurons of the previous hidden layer P; let the number of neurons in output layer N be Nt, the number of neurons in hidden layer P be Pt, and f(x) the neuron activation function; the backward output of the i-th neuron in hidden layer P is

r_i^(P) = f( Σ_{j=1}^{Nt} w_ji^(N) · y_j^(N) + b_i^(P) ),

where r_i^(P) is the backward output of the i-th neuron in hidden layer P, w_ji^(N) is the i-th weight of the j-th neuron in output layer N, y_j^(N) is the output of the j-th neuron in output layer N, and b_i^(P) is the bias of the i-th neuron in hidden layer P, i = 1, 2, ..., Pt;

then the hidden layers emit backward outputs layer by layer: the backward output of the neurons in hidden layer s is input to the neurons in hidden layer L; let the number of neurons in hidden layer s be St and the number of neurons in hidden layer L be Lt; the backward output of the i-th neuron in hidden layer L is

r_i^(L) = f( Σ_{j=1}^{St} w_ji^(s) · r_j^(s) + b_i^(L) ),

where r_i^(L) is the backward output of the i-th neuron in hidden layer L, w_ji^(s) is the i-th weight of the j-th neuron in hidden layer s, r_j^(s) is the backward output of the j-th neuron in hidden layer s, and b_i^(L) is the bias of the i-th neuron in layer L, i = 1, 2, ..., Lt;

finally, the input layer emits the backward output to the outside of the inversion neural network, the dimension of the backward output vector being equal to that of the original input vector; let the number of neurons in input layer H be Ht and the dimension of the backward output vector be D; the i-th dimension of the backward output is

r_i = f( Σ_{j=1}^{Ht} w_ji^(H) · r_j^(H) ),

where r_i is the i-th dimension of the backward output vector, w_ji^(H) is the i-th weight of the j-th neuron in input layer H, and r_j^(H) is the backward output of the j-th neuron in input layer H, i = 1, 2, ..., D;

3) computing the mean per-dimension error between the original input vector and the backward output vector and assessing the feature-extraction quality of the inversion network from that error; the evaluation function is

E = (1/D) Σ_{i=1}^{D} | x_i − r_i |,

where x_i is the i-th dimension of the input feature vector extracted by the convolutional autoencoder, r_i is the i-th dimension of the backward output vector of the inversion neural network, and D is the dimension of the original input feature vector.
5. The signature image authentication method based on a deep belief network and k-means clustering according to claim 4, characterised in that the training algorithm of the inversion neural network is an evolution strategy, and the inversion computation used when the evolution strategy evaluates the inversion neural network proceeds as follows:
first, the neurons in the output layer N of the inversion neural network transfer the forward output back into the neurons of the previous hidden layer P; let the number of neurons in output layer N be Nt, the number of neurons in hidden layer P be Pt, and f(x) the neuron activation function; the backward output of the i-th neuron in hidden layer P is

r_i^(P) = f( Σ_{j=1}^{Nt} w_ji^(N) · y_j^(N) + b_i^(P) ),

where r_i^(P) is the backward output of the i-th neuron in hidden layer P, w_ji^(N) is the i-th weight of the j-th neuron in output layer N, y_j^(N) is the output of the j-th neuron in output layer N, and b_i^(P) is the bias of the i-th neuron in hidden layer P, i = 1, 2, ..., Pt;

then the hidden layers emit backward outputs layer by layer: the backward output of the neurons in hidden layer s is input to the neurons in hidden layer L; let the number of neurons in hidden layer s be St and the number of neurons in hidden layer L be Lt; the backward output of the i-th neuron in hidden layer L is

r_i^(L) = f( Σ_{j=1}^{St} w_ji^(s) · r_j^(s) + b_i^(L) ),

where r_i^(L) is the backward output of the i-th neuron in hidden layer L, w_ji^(s) is the i-th weight of the j-th neuron in hidden layer s, r_j^(s) is the backward output of the j-th neuron in hidden layer s, and b_i^(L) is the bias of the i-th neuron in layer L, i = 1, 2, ..., Lt;

finally, the input layer emits the backward output to the outside of the inversion neural network, the dimension of the backward output vector being equal to that of the original input vector; let the number of neurons in input layer H be Ht and the dimension of the backward output vector be D; the i-th dimension of the backward output is

r_i = f( Σ_{j=1}^{Ht} w_ji^(H) · r_j^(H) ),

where r_i is the i-th dimension of the backward output vector, w_ji^(H) is the i-th weight of the j-th neuron in input layer H, and r_j^(H) is the backward output of the j-th neuron in input layer H, i = 1, 2, ..., D.
6. The signature image authentication method based on a deep belief network and k-means clustering according to claim 1, characterised in that step S1.8 comprises:
S1.8.1, randomly initialise λ individuals and evaluate them all;
S1.8.2, select the best individual; go to step S1.8.5 if the termination condition is reached, otherwise go to step S1.8.3;
S1.8.3, apply Gaussian mutation to every dimension of the best individual according to the mutation probability; mutating the current best individual generates λ new individuals, and all newly generated individuals are evaluated;
S1.8.4, the previous best individual and the λ new individuals generated by mutation constitute 1+λ individuals; go to step S1.8.2;
S1.8.5, output the best individual;
training of the inversion neural network is complete, and the first feature vector of every signature image is finally output.
7. The signature image authentication method based on a deep belief network and k-means clustering according to any one of claims 1 to 6, characterised in that S2, authenticating the signature image, comprises:
S2.1, pre-process the signature image to be authenticated, including converting the image to greyscale, resizing it to the uniform size, and normalising all image pixels;
S2.2, input the pre-processed image into the aforementioned trained deep belief network and extract the corresponding second image feature vector;
S2.3, compute the distance from the second feature vector of the signature image to be authenticated to the cluster centre C, denote it d, and compare d with df: if d >= df, the signature image to be authenticated is judged forged; conversely, if d < df, it is judged genuine.
8. The signature image authentication method based on a deep belief network and k-means clustering according to claim 1, characterised in that the feature vector of the image is extracted by the inversion neural network, and the original image can be restored from the extracted feature vector.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610840672.2A CN106529395B (en) | 2016-09-22 | 2016-09-22 | Signature image identification method based on depth confidence network and k mean cluster |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106529395A CN106529395A (en) | 2017-03-22 |
CN106529395B true CN106529395B (en) | 2019-07-12 |
Family
ID=58343917
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610840672.2A Active CN106529395B (en) | 2016-09-22 | 2016-09-22 | Signature image identification method based on depth confidence network and k mean cluster |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106529395B (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019165939A1 (en) * | 2018-02-27 | 2019-09-06 | 上海寒武纪信息科技有限公司 | Computing device, and related product |
CN110196735A (en) * | 2018-02-27 | 2019-09-03 | 上海寒武纪信息科技有限公司 | A kind of computing device and Related product |
CN109583332B (en) * | 2018-11-15 | 2021-07-27 | 北京三快在线科技有限公司 | Face recognition method, face recognition system, medium, and electronic device |
CN109299306B (en) * | 2018-12-14 | 2021-09-07 | 央视国际网络无锡有限公司 | Image retrieval method and device |
CN113269136B (en) * | 2021-06-17 | 2023-11-21 | 南京信息工程大学 | Off-line signature verification method based on triplet loss |
CN113326809A (en) * | 2021-06-30 | 2021-08-31 | 重庆大学 | Off-line signature identification method and system based on three-channel neural network |
CN116484247B (en) * | 2023-06-21 | 2023-09-05 | 北京点聚信息技术有限公司 | Intelligent signed data processing system |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104200239A (en) * | 2014-09-09 | 2014-12-10 | Changzhou Campus, Hohai University | Signature authenticity identification system and method based on image feature fusion |
CN105320961A (en) * | 2015-10-16 | 2016-02-10 | Chongqing University of Posts and Telecommunications | Handwritten numeral recognition method based on convolutional neural network and support vector machine |
CN105893952A (en) * | 2015-12-03 | 2016-08-24 | Wuxi Duwei Smart City Technology Co., Ltd. | Handwritten signature identification method based on PCA |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8209080B2 (en) * | 2009-04-27 | 2012-06-26 | Toyota Motor Engineering & Manufacturing North America, Inc. | System for determining most probable cause of a problem in a plant |
CA2788145C (en) * | 2010-02-17 | 2015-05-19 | Photoccino Ltd. | System and method for creating a collection of images |
- 2016-09-22: Application CN201610840672.2A filed in China (CN); granted as patent CN106529395B (status: Active)
Non-Patent Citations (1)
Title |
---|
Research on Key Technologies of an Offline Chinese Signature Verification System; Zhu Haoyue; China Master's Theses Full-text Database, Information Science and Technology; 2006-09-15; Vol. 2006, No. 09; pp. 28-29, 45-51, 61 |
Also Published As
Publication number | Publication date |
---|---|
CN106529395A (en) | 2017-03-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106529395B (en) | Signature image identification method based on deep belief network and k-means clustering | |
CN107977609B (en) | Finger vein identity authentication method based on CNN | |
CN106326886B (en) | Finger vein image quality assessment method based on convolutional neural network | |
CN103605972B (en) | Face verification method in unconstrained environments based on block deep neural network | |
CN106372581B (en) | Method for constructing and training a face recognition feature extraction network | |
CN103824054B (en) | Face attribute recognition method based on cascaded deep neural networks | |
CN107220635A (en) | Face liveness detection method based on multiple spoofing modes | |
CN103839041B (en) | Customer feature recognition method and device | |
CN109101938B (en) | Multi-label age estimation method based on convolutional neural network | |
CN110399821B (en) | Customer satisfaction acquisition method based on facial expression recognition | |
CN108427921A (en) | Face recognition method based on convolutional neural network | |
CN110427832A (en) | Neural-network-based finger vein recognition method for small data sets | |
CN107194376A (en) | Mask-spoofing convolutional neural network training method and face liveness detection method | |
CN107292267A (en) | Photo-spoofing convolutional neural network training method and face liveness detection method | |
CN106485214A (en) | Eye and mouth state recognition method based on convolutional neural network | |
CN109190566A (en) | Finger vein recognition method fusing local coding and a CNN model | |
CN106503661B (en) | Face gender recognition method based on a fireworks deep belief network | |
CN107301396A (en) | Video-spoofing convolutional neural network training method and face liveness detection method | |
CN102799872B (en) | Image processing method based on face image characteristics | |
CN105956570B (en) | Smiling face recognition method based on lip features and deep learning | |
CN106778512A (en) | Face recognition method under unconstrained conditions based on LBP and deep learning | |
CN107066951A (en) | Spontaneous facial expression recognition method and system | |
CN112560710B (en) | Method for constructing a finger vein recognition system, and finger vein recognition system | |
CN106203373B (en) | Face liveness detection method based on a deep visual bag-of-words model | |
CN114926892A (en) | Fundus image matching method and system based on deep learning, and readable medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||