CN106529395A - Signature image recognition method based on deep belief network and k-means clustering - Google Patents

Signature image recognition method based on deep belief network and k-means clustering

Info

Publication number
CN106529395A
CN106529395A, CN201610840672.2A, CN106529395B
Authority
CN
China
Prior art keywords
neuron
layer
inverting
signature image
output
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610840672.2A
Other languages
Chinese (zh)
Other versions
CN106529395B (en)
Inventor
刘罡
饶鉴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Creative Wisdom Technology (Wuhan) Co Ltd
Original Assignee
Creative Wisdom Technology (Wuhan) Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Creative Wisdom Technology (wuhan) Co Ltd filed Critical Creative Wisdom Technology (wuhan) Co Ltd
Priority to CN201610840672.2A priority Critical patent/CN106529395B/en
Publication of CN106529395A publication Critical patent/CN106529395A/en
Application granted granted Critical
Publication of CN106529395B publication Critical patent/CN106529395B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/30 Writer recognition; Reading and verifying signatures
    • G06V 40/33 Writer recognition; Reading and verifying signatures based only on signature image, e.g. static signature recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/23 Clustering techniques
    • G06F 18/232 Non-hierarchical techniques
    • G06F 18/2321 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F 18/23213 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/30 Writer recognition; Reading and verifying signatures
    • G06V 40/37 Writer recognition; Reading and verifying signatures based only on signature signals such as velocity or pressure, e.g. dynamic signature recognition
    • G06V 40/382 Preprocessing; Feature extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a signature image recognition method based on a deep belief network and k-means clustering. The method comprises the following steps: S1) training the deep belief network: handwritten signature images are input as a training set; the deep belief network extracts a first feature vector from each input handwritten signature image; the k-means clustering algorithm clusters the first feature vectors into one class, yielding the cluster center and the distance df from the cluster center to the farthest vector in the class, which completes the training of the deep belief network; and S2) recognizing a signature image: the handwritten signature image to be recognized is input; the trained deep belief network extracts its second feature vector; the distance d from the second feature vector to the cluster center of the training-set feature vectors is calculated; if d is equal to or greater than df, the input handwritten signature image is recognized as a forged signature image, otherwise as a genuine signature image. According to the invention, signature images can be recognized automatically without manual intervention, the recognition speed is increased, and the recognition accuracy is also improved.

Description

Signature image authentication method based on a deep belief network and k-means clustering
Technical field
The present invention relates to image authentication, and more particularly to a signature image authentication method based on a deep belief network and k-means clustering, belonging to the fields of deep learning and pattern recognition.
Background technology
At present, many occasions require signatures to be authenticated, and signature authentication, as a form of identity verification, is widely used, for example in financial signature verification and legal signature verification. Signature authentication is mainly carried out in two ways: 1) manual authentication, which is time-consuming and whose results are strongly affected by subjective judgment; 2) computer authentication, in which a person designs signature-image feature patterns, the corresponding image features are extracted, and authentication is performed by comparing the extracted features. However, because signatures are written freely, designing suitable image feature patterns is difficult and no effective generic feature pattern for signature images can be designed; consequently, the accuracy of feature-comparison authentication methods is hard to improve effectively.
Summary of the invention
The object of the present invention is to overcome the problems of the existing signature authentication techniques described above. Traditional manual authentication of signature images and computer-based authentication both have shortcomings: manual authentication relies on human experience, while computer authentication requires manually designed, rather complex signature-image feature patterns, needs different image features to be set for different signatures, imposes special requirements on the signature images, and does not achieve high accuracy. The invention therefore proposes a signature image authentication method based on a deep belief network and k-means clustering. The method can authenticate signature images automatically without human intervention, speeds up authentication, and improves authentication accuracy.
The technical solution adopted to achieve the object of the invention is a signature image authentication method based on a deep belief network and k-means clustering, comprising the following steps:
S1, training the deep belief network: handwritten signature images are input as a training set; the deep belief network extracts the first feature vector of each input handwritten signature image; the first feature vectors are clustered into one class with the k-means clustering algorithm; the cluster center and the distance df from the cluster center to the farthest vector in the class are obtained, which completes the training of the deep belief network;
S2, authenticating a signature image: the handwritten signature image to be authenticated is input; the trained deep belief network extracts its second feature vector; the distance d from the second feature vector to the cluster center of the training-set feature vectors is calculated; if d >= df, the input handwritten signature image to be authenticated is a forged signature image, otherwise it is a genuine signature image.
The invention has the following advantages. The invention proposes an inversion neural network and combines it with a convolutional autoencoder to form a new deep belief network. This deep belief network can extract a feature vector, rather than feature sub-images, directly from an image of arbitrary size, which extends the feature-extraction capability of the deep belief network. The inversion neural network is trained with an evolution strategy; because an evolution strategy has a strong ability to search for the global optimum and needs no gradient information, the trained inversion neural network is more widely applicable and better at extracting effective feature vectors from feature sub-images. The k-means clustering algorithm clusters the extracted signature-image feature vectors, and authentication is performed by comparing the distance between the feature vector of the newly input handwritten image and the cluster center; the method thus converts the genuine/forged decision into a clustering problem, which improves authentication accuracy and effectively reduces over-fitting. Compared with methods of the same kind, the invention increases the speed of handwriting authentication and improves authentication accuracy.
Description of the drawings
Fig. 1 is the flow chart of the signature image authentication method of the present invention based on a deep belief network and k-means clustering.
Fig. 2 is the flow chart of a preferred embodiment of training the deep belief network in Fig. 1.
Fig. 3 is the flow chart of a preferred embodiment of authenticating a signature image in Fig. 1.
Specific embodiment
The technical solution of the present invention is described in further detail below through embodiments and with reference to the accompanying drawings.
Fig. 1 shows the overall flow chart of the signature image authentication method of the present invention based on a deep belief network and k-means clustering. The whole authentication process is divided into: S1, training the deep belief network, in which signature images are input as a training set to train the authentication network; and S2, authenticating a signature image, in which the signature image to be authenticated is input into the trained authentication network and the authentication result of the network is obtained.
The present embodiment assumes that 100 genuine signature images in the RGB color space are input. Based on the training flow chart of the authentication network in Fig. 2, the concrete training steps of step S1 of the present invention, in which the signature images input as the training set are used to train the authentication network, are as follows:
S1.1, the 100 signature images in the RGB color space are converted into 8-bit gray-scale images, the sizes of all input images are unified to 256*256 pixels, and the pixels of all images are normalized.
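The preprocessing of step S1.1 can be sketched as follows. This is a minimal illustration in Python using Pillow and NumPy; the helper name preprocess_signature and the image-path handling are assumptions for illustration and are not part of the patent.

```python
import numpy as np
from PIL import Image

def preprocess_signature(path, size=(256, 256)):
    """Convert an RGB signature image to an 8-bit gray-scale image,
    unify its size to 256*256 pixels, and normalize the pixels to [0, 1]."""
    img = Image.open(path).convert("L")   # 8-bit gray-scale
    img = img.resize(size)                # unify size to 256*256
    return np.asarray(img, dtype=np.float32) / 255.0   # normalize pixels

# Example: preprocess the 100 training images (paths are hypothetical)
# train_set = [preprocess_signature(p) for p in training_image_paths]
```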
S1.2, the user initializes the parameters of the convolutional autoencoder according to the input image information. The convolutional autoencoder has 4 layers, of which 2 are convolutional layers and 2 are sub-sampling layers, arranged as: layer 1 convolution, layer 2 sub-sampling, layer 3 convolution, layer 4 sub-sampling. The convolution template size of layer 1 is 33*33 and the number of convolution templates is 6; the sampling template size of layer 2 is 4*4 and the number of sampling templates is 6; the convolution template size of layer 3 is 33*33 and the number of convolution templates is 16; the sampling template size of layer 4 is 4*4 and the number of sampling templates is 16.
S1.3, the user initializes the parameters of the gradient descent algorithm: the learning rate of gradient descent is 0.03 and the maximum number of iterations is 15.
S1.4, the 100 signature images of the training set are input one by one into the convolutional autoencoder, which is trained with the gradient descent algorithm to obtain the feature sub-images of each training-set image. Each training-set image yields 16 feature sub-images, each of size 6*6, and the set of feature sub-images is input into the inversion neural network.
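As a size check, the encoder path of the convolutional autoencoder of steps S1.2 to S1.4 can be sketched as follows (Python/PyTorch). Only the encoder is shown; the decoder, the gradient-descent training loop with learning rate 0.03 and 15 iterations, the class name ConvEncoder, and the choice of sigmoid activations and average pooling are assumptions for illustration, not taken from the patent.

```python
import torch
import torch.nn as nn

class ConvEncoder(nn.Module):
    """Encoder half of the convolutional autoencoder of step S1.2:
    conv 33*33 (6 maps) -> pool 4*4 -> conv 33*33 (16 maps) -> pool 4*4."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 6, kernel_size=33)    # 256 -> 224
        self.pool1 = nn.AvgPool2d(4)                    # 224 -> 56
        self.conv2 = nn.Conv2d(6, 16, kernel_size=33)   # 56  -> 24
        self.pool2 = nn.AvgPool2d(4)                    # 24  -> 6

    def forward(self, x):
        x = torch.sigmoid(self.conv1(x))
        x = self.pool1(x)
        x = torch.sigmoid(self.conv2(x))
        return self.pool2(x)    # 16 feature sub-images of size 6*6

# One 256*256 gray-scale image -> 16 feature sub-images of size 6*6:
# feats = ConvEncoder()(torch.randn(1, 1, 256, 256))   # shape (1, 16, 6, 6)
```

The arithmetic matches the embodiment: 256 - 33 + 1 = 224, 224 / 4 = 56, 56 - 33 + 1 = 24, 24 / 4 = 6, giving 16 feature sub-images of size 6*6.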
S1.5, the user initializes the parameters of the inversion neural network according to the input feature sub-image information. The inversion neural network has 2 layers: layer 1 has 300 neurons and layer 2 has 120 neurons; the activation function of each neuron is the sigmoid function.
S1.6, the user initializes the parameters of the evolution strategy: a 1+4 strategy is used, the mutation probability is 0.06, the maximum number of generations is 50000, and the mutation operator is Gaussian mutation. The weights and biases of the inversion neural network form the individual of the evolution strategy, and the length of an individual is 300*16*6*6 + 120*300 + 300 + 120 = 209220.
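The length of an individual follows directly from the network sizes of steps S1.5 and S1.6. A minimal sketch of packing the inversion-network parameters into one evolution-strategy individual is given below (Python/NumPy); the variable names and the function pack_individual are illustrative assumptions.

```python
import numpy as np

D_IN, H1, H2 = 16 * 6 * 6, 300, 120   # input dimension 576, layer sizes 300 and 120

def pack_individual(w1, b1, w2, b2):
    """Flatten the weights and biases of the 2-layer inversion network
    into one evolution-strategy individual of length 209220."""
    return np.concatenate([w1.ravel(), w2.ravel(), b1, b2])

w1 = np.zeros((H1, D_IN))   # 300 * 576 = 172800 weights of layer 1
b1 = np.zeros(H1)           # 300 biases of layer 1
w2 = np.zeros((H2, H1))     # 120 * 300 = 36000 weights of layer 2
b2 = np.zeros(H2)           # 120 biases of layer 2

assert pack_individual(w1, b1, w2, b2).size == 209220   # 172800 + 36000 + 300 + 120
```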
S1.7, the per-dimension mean error between the data returned by layer 1 of the inversion neural network and the originally input feature vector is used as the evaluation function of the evolution strategy. The evaluation function consists of 3 parts (a sketch of these three parts is given after part 3 below):
1) Forward transmission of the feature vector. First, the feature sub-images extracted by the convolutional autoencoder for each input image are concatenated pixel by pixel into a feature vector of 16*6*6 = 576 dimensions, and this feature vector is input to the neurons of layer 1 for processing; the outputs of the layer-1 neurons are then input to the layer-2 neurons to compute the forward output of the whole inversion neural network, a process identical to the forward data transmission of an ordinary neural network. The output of the i-th neuron of layer 1 is computed as $y^{(1)}_i = f\big(\sum_{j=1}^{576} w^{(1)}_{ij} x_j + b^{(1)}_i\big)$, where $y^{(1)}_i$ is the output of the i-th neuron of layer 1, $w^{(1)}_{ij}$ is the j-th weight of the i-th neuron of layer 1, $x_j$ is the j-th dimension of the input feature vector, $b^{(1)}_i$ is the bias of the i-th neuron of layer 1, and in the forward output of this layer i = 1, 2, ..., 300.
The output of the i-th neuron of layer 2 is computed as $y^{(2)}_i = f\big(\sum_{j=1}^{300} w^{(2)}_{ij} y^{(1)}_j + b^{(2)}_i\big)$, where $y^{(2)}_i$ is the output of the i-th neuron of layer 2, $w^{(2)}_{ij}$ is the j-th weight of the i-th neuron of layer 2, $y^{(1)}_j$ is the output of the j-th neuron of layer 1, $b^{(2)}_i$ is the bias of the i-th neuron of layer 2, and in the forward output of this layer i = 1, 2, ..., 120.
2) Backward transfer of the forward output of the inversion neural network. In the present embodiment, the layer-2 neurons feed the data backward into the layer-1 neurons, and after processing the backward input the layer-1 neurons output the inverted vector. The backward output of the i-th neuron of layer 1 is computed as $\tilde{y}^{(1)}_i = f\big(\sum_{j=1}^{120} w^{(2)}_{ji} y^{(2)}_j + b^{(1)}_i\big)$, where $\tilde{y}^{(1)}_i$ is the backward output of the i-th neuron of layer 1, $w^{(2)}_{ji}$ is the i-th weight of the j-th neuron of layer 2, $y^{(2)}_j$ is the output of the j-th neuron of layer 2, $b^{(1)}_i$ is the bias of the i-th neuron of layer 1, and in the inversion of this layer i = 1, 2, ..., 300.
The backward outputs of the layer-1 neurons leave the inversion neural network as the inverted vector, computed as $\tilde{x}_i = f\big(\sum_{j=1}^{300} w^{(1)}_{ji} \tilde{y}^{(1)}_j\big)$, where $\tilde{x}_i$ is the i-th dimension of the backward output vector, $w^{(1)}_{ji}$ is the i-th weight of the j-th neuron of layer 1, $\tilde{y}^{(1)}_j$ is the backward output of the j-th neuron of layer 1, and in the inversion of this layer i = 1, 2, ..., 576.
3) The per-dimension mean error between the originally input vector and the backward output vector is computed, and the feature-extraction quality of the inversion neural network is assessed through this mean error. The evaluation function is $E = \frac{1}{576}\sum_{i=1}^{576} \lvert x_i - \tilde{x}_i \rvert$, where $x_i$ is the i-th dimension of the input feature vector extracted by the convolutional autoencoder and $\tilde{x}_i$ is the i-th dimension of the backward output vector of the inversion neural network.
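Under the assumptions stated in steps S1.5 to S1.7 (sigmoid activations, 576-300-120 layer sizes), the three parts of the evaluation function can be sketched as follows (Python/NumPy). The function names forward_pass, inverse_pass and evaluate are illustrative and not taken from the patent, and the reconstructed error is written as a per-dimension mean absolute error, following the wording of step S1.7.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_pass(x, w1, b1, w2, b2):
    """Part 1: forward transmission through the 576-300-120 inversion network."""
    y1 = sigmoid(w1 @ x + b1)          # layer 1, 300 outputs
    y2 = sigmoid(w2 @ y1 + b2)         # layer 2, 120 outputs (the feature vector)
    return y1, y2

def inverse_pass(y2, w1, b1, w2):
    """Part 2: backward transfer of the forward output back to the input space."""
    y1_back = sigmoid(w2.T @ y2 + b1)  # layer 2 -> layer 1, 300 backward outputs
    x_back = sigmoid(w1.T @ y1_back)   # layer 1 -> outside, 576-dimensional inverted vector
    return x_back

def evaluate(x, w1, b1, w2, b2):
    """Part 3: per-dimension mean error between the input and the inverted vector."""
    _, y2 = forward_pass(x, w1, b1, w2, b2)
    x_back = inverse_pass(y2, w1, b1, w2)
    return np.mean(np.abs(x - x_back))
```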
S1.8, the evolution strategy trains the inversion neural network; the concrete steps are as follows (a sketch of this loop is given after step S1.8.5):
S1.8.1, 4 individuals are initialized at random and all individuals are evaluated;
S1.8.2, the best individual is selected; if the termination condition is reached, go to step S1.8.5, otherwise go to step S1.8.3;
S1.8.3, for every dimension of the best individual, Gaussian mutation is applied according to the mutation probability; mutating the current best individual produces 4 new individuals, and all newly generated individuals are evaluated;
S1.8.4, the best individual of the previous generation and the 4 new individuals produced by mutation form the 1+4 individuals; go to step S1.8.2;
S1.8.5, the best individual is output; the training of the inversion neural network is complete, and the first feature vector of each signature image is finally output.
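A minimal sketch of the (1+4) evolution-strategy loop of step S1.8 is given below, reusing the evaluate function sketched after step S1.7. The mutation step size sigma and the helper name train_es are assumptions made for illustration; the patent only fixes the mutation probability 0.06, the 1+4 scheme, Gaussian mutation, and the 50000-generation limit.

```python
import numpy as np

def train_es(fitness, dim, n_offspring=4, p_mut=0.06,
             max_gen=50000, sigma=0.1, rng=np.random.default_rng(0)):
    """(1+lambda) evolution strategy with Gaussian mutation, minimizing `fitness`."""
    pop = rng.standard_normal((n_offspring, dim))     # S1.8.1: random initial individuals
    fits = np.array([fitness(p) for p in pop])
    best, best_fit = pop[fits.argmin()], fits.min()   # S1.8.2: select the best individual
    for _ in range(max_gen):                          # termination condition
        for _ in range(n_offspring):                  # S1.8.3: 4 mutated offspring
            mask = rng.random(dim) < p_mut            # mutate each dimension with prob. 0.06
            child = best + mask * rng.normal(0.0, sigma, dim)   # Gaussian mutation
            child_fit = fitness(child)
            if child_fit < best_fit:                  # S1.8.4: keep the best of the 1+4
                best, best_fit = child, child_fit
    return best                                       # S1.8.5: output the best individual

# fitness(ind) would unpack `ind` into (w1, b1, w2, b2) and average
# evaluate(x, w1, b1, w2, b2) over the feature vectors of all 100 training images.
```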
S1.9, the user initializes the parameters of the k-means clustering algorithm: the maximum number of iterations of the k-means algorithm is 2000 and the number of clusters is 1, i.e. all input data are clustered into 1 class. The 100 feature vectors output by the inversion neural network are clustered into 1 class with the k-means algorithm, denoted K, producing the cluster center C; the feature vector in class K farthest from the cluster center C is found, and the distance from the cluster center C to this feature vector is recorded as df. At this point the training of the signature-image authentication network is fully complete and the authentication network has been built.
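With one cluster, the cluster center is simply the mean of the training feature vectors, and df is the largest distance from the center to any training vector. A minimal sketch follows (Python/NumPy); Euclidean distance and the helper name fit_cluster are assumptions, since the patent does not name the distance metric.

```python
import numpy as np

def fit_cluster(features):
    """Cluster the training feature vectors into one class (step S1.9):
    with k = 1 the cluster center is the mean, and df is the distance
    from the center to the farthest feature vector of the class."""
    features = np.asarray(features)        # e.g. shape (100, 120)
    center = features.mean(axis=0)         # cluster center C
    distances = np.linalg.norm(features - center, axis=1)
    return center, distances.max()         # (C, df)

# center_C, d_f = fit_cluster(train_feature_vectors)
```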
Based on the authentication flow chart of Fig. 3, the steps of S2 of the present invention, in which the signature image to be authenticated is input into the trained authentication network and the authentication result is obtained, are as follows:
S2.1, the signature image to be authenticated is preprocessed, including converting the image to gray-scale, unifying the image size and normalizing all image pixels.
S2.2, the preprocessed image is input into the previously trained deep belief network, which extracts the corresponding image feature vector;
S2.3, the distance from the feature vector of the signature image to be authenticated to the cluster center C is computed and denoted d. Comparing d and df: if d >= df, the signature image to be authenticated is judged to be forged; conversely, if d < df, the signature image to be authenticated is judged to be genuine.
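The decision rule of step S2.3 reduces to a single distance comparison. A minimal sketch is given below, reusing the hypothetical helpers from the earlier sketches; the function name is_genuine is illustrative.

```python
import numpy as np

def is_genuine(feature_vector, center_C, d_f):
    """Step S2.3: a signature is judged genuine if its feature vector lies
    strictly closer to the cluster center C than the training radius df."""
    d = np.linalg.norm(np.asarray(feature_vector) - center_C)
    return d < d_f     # d >= df -> forged, d < df -> genuine

# verdict = "genuine" if is_genuine(second_feature_vector, center_C, d_f) else "forged"
```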
The performance data obtained when handwritten signature images of the signer "Li Wenbin" are authenticated with the method of the present embodiment are shown in the following table:
Table 1. Performance data of the present embodiment
Authentication speed: 1 s
Authentication accuracy: 100%
The table shows that this authentication method is fast and highly accurate.
The specific embodiment described herein is merely an illustration of the spirit of the present invention. Persons skilled in the art to which the invention belongs can make various modifications, supplements or similar substitutions to the described specific embodiment without departing from the spirit of the invention or exceeding the scope defined by the appended claims.

Claims (9)

1. A signature image authentication method based on a deep belief network and k-means clustering, characterized by comprising the following steps:
S1, training the deep belief network: handwritten signature images are input as a training set; the deep belief network extracts the first feature vector of each input handwritten signature image; the first feature vectors are clustered into 1 class with the k-means clustering algorithm; the cluster center and the distance df from the cluster center to the farthest vector in the class are obtained, which completes the training of the deep belief network;
S2, authenticating a signature image: the handwritten signature image to be authenticated is input; the trained deep belief network extracts its second feature vector; the distance d from the second feature vector to the cluster center of the training-set feature vectors is calculated; if d >= df, the input handwritten signature image to be authenticated is a forged signature image, otherwise it is a genuine signature image.
2. The signature image authentication method based on a deep belief network and k-means clustering according to claim 1, characterized in that the training of the deep belief network in S1 comprises:
S1.1, a plurality of genuine signature images are input as the training set, the sizes of all genuine signature images are unified to the same size, the three-channel RGB color space of each image is converted to a gray-scale image, and the gray-scale pixel values are normalized;
S1.2, a convolutional autoencoder and an inversion neural network constitute the deep belief network used to extract the image feature vectors; according to the input image size, the user sets the parameters of the convolutional autoencoder, including the number of convolutional layers and sub-sampling layers of the convolutional autoencoder, the number and size of the convolution templates used in each convolutional layer, and the number and size of the sampling templates used in each sub-sampling layer;
S1.3, the training algorithm of the convolutional autoencoder is the gradient descent algorithm; the user initializes the parameters of the gradient descent algorithm, including the learning rate and the maximum number of iterations;
S1.4, each image of the training set is then input one by one into the convolutional autoencoder, which is trained with the gradient descent algorithm to obtain the set of feature sub-images of each image, and the obtained sets of feature sub-images are input into the inversion neural network;
S1.5, according to the size and number of the input feature sub-images, the user sets the parameters of the inversion neural network, including the number of layers of the inversion neural network, the number of neurons used in each layer, and the activation function chosen for each neuron;
S1.6, the training algorithm of the inversion neural network is an evolution strategy; the user initializes the parameters of the evolution strategy, including the population size, the mutation probability, the maximum number of generations and the mutation operator; the weights and biases of the inversion neural network form the individual of the evolution strategy;
S1.7, the per-dimension mean error between the data returned by the input layer of the inversion neural network and the originally input feature vector is used as the evaluation function of the evolution strategy;
S1.8, the evolution strategy trains the inversion neural network;
S1.9, the user initializes the parameters of the k-means clustering algorithm, including the maximum number of iterations and the number of clusters; in signature authentication the number of clusters is fixed at 1; all the feature vectors output by the inversion neural network are clustered into 1 class with the k-means clustering algorithm, denoted K, producing the cluster center C; the first feature vector in class K farthest from the cluster center C is found and the distance from the cluster center C to this first feature vector is recorded as df; at this point the training of the signature-image authentication network is fully complete and the authentication network has been built.
3. The signature image authentication method based on a deep belief network and k-means clustering according to claim 2, characterized in that: at least 100 genuine signature images are input as the training set in step S1.1.
4. The signature image authentication method based on a deep belief network and k-means clustering according to claim 2, characterized in that: the deep belief network described in step S1.2 is composed of the convolutional autoencoder and the inversion neural network; the convolutional autoencoder extracts the set of feature sub-images of each signature image, the inversion neural network extracts the feature vector of each signature image from its set of feature sub-images, and the two together constitute one complete deep belief network.
5. The signature image authentication method based on a deep belief network and k-means clustering according to claim 2, characterized in that in step S1.7 the evaluation function of the inversion neural network is computed from 3 parts, which are respectively:
1) forward transmission of the input feature vector data, computing the forward output of the inversion neural network: first, the set of feature sub-images of each input image extracted by the convolutional autoencoder is concatenated pixel by pixel into a feature vector of t*m*n dimensions, where t is the number of feature sub-images per image and m*n is the size of each feature sub-image; the feature vector is then input into the inversion neural network and processed layer by layer to produce the forward output of the inversion neural network;
2) backward transfer of the forward output data of the inversion neural network, computing the backward output of the inversion neural network, which is calculated as follows: first, the neurons of the output layer N of the inversion neural network feed the forward output data back to the neurons of the preceding hidden layer P; let the number of neurons in the output layer N be Nt, the number of neurons in the hidden layer P be Pt, and f(x) be the neuron activation function; the backward output of the i-th neuron of hidden layer P is computed as $\tilde{y}^P_i = f\big(\sum_{j=1}^{N_t} w^N_{ji} y^N_j + b^P_i\big)$, where $\tilde{y}^P_i$ is the backward output of the i-th neuron of hidden layer P, $w^N_{ji}$ is the i-th weight of the j-th neuron of output layer N, $y^N_j$ is the output of the j-th neuron of output layer N, $b^P_i$ is the bias of the i-th neuron of hidden layer P, and i = 1, 2, ..., Pt;
then the hidden layers are output backward layer by layer: the backward outputs of the neurons of hidden layer S are fed back to the neurons of hidden layer L; let the number of neurons in hidden layer S be St and the number of neurons in hidden layer L be Lt; the backward output of the i-th neuron of hidden layer L is computed as $\tilde{y}^L_i = f\big(\sum_{j=1}^{S_t} w^S_{ji} \tilde{y}^S_j + b^L_i\big)$, where $\tilde{y}^L_i$ is the backward output of the i-th neuron of hidden layer L, $w^S_{ji}$ is the i-th weight of the j-th neuron of hidden layer S, $\tilde{y}^S_j$ is the backward output of the j-th neuron of hidden layer S, $b^L_i$ is the bias of the i-th neuron of hidden layer L, and i = 1, 2, ..., Lt;
finally, the backward output data of the input layer leave the inversion neural network, the dimension of the backward output vector being equal to the dimension of the originally input vector; let the number of neurons in the input layer H be Ht and the dimension of the backward output vector be D; each dimension of the backward output data is computed as $\tilde{x}_i = f\big(\sum_{j=1}^{H_t} w^H_{ji} \tilde{y}^H_j\big)$, where $\tilde{x}_i$ is the i-th dimension of the backward output vector, $w^H_{ji}$ is the i-th weight of the j-th neuron of input layer H, $\tilde{y}^H_j$ is the backward output of the j-th neuron of input layer H, and i = 1, 2, ..., D;
3) the per-dimension mean error between the originally input vector and the backward output vector is computed, and the feature-extraction quality of the inversion neural network is assessed through this mean error; the evaluation function is $E = \frac{1}{D}\sum_{i=1}^{D} \lvert x_i - \tilde{x}_i \rvert$, where $x_i$ is the i-th dimension of the input feature vector extracted by the convolutional autoencoder, $\tilde{x}_i$ is the i-th dimension of the backward output vector of the inversion neural network, and D is the dimension of the originally input feature vector.
6. The signature image authentication method based on a deep belief network and k-means clustering according to claim 5, characterized in that the training algorithm of the inversion neural network is an evolution strategy, and the inversion calculation used in the evolution strategy when evaluating the inversion neural network proceeds as follows:
first, the neurons of the output layer N of the inversion neural network feed the forward output data back to the neurons of the preceding hidden layer P; let the number of neurons in the output layer N be Nt, the number of neurons in the hidden layer P be Pt, and f(x) be the neuron activation function; the backward output of the i-th neuron of hidden layer P is computed as $\tilde{y}^P_i = f\big(\sum_{j=1}^{N_t} w^N_{ji} y^N_j + b^P_i\big)$, where $\tilde{y}^P_i$ is the backward output of the i-th neuron of hidden layer P, $w^N_{ji}$ is the i-th weight of the j-th neuron of output layer N, $y^N_j$ is the output of the j-th neuron of output layer N, $b^P_i$ is the bias of the i-th neuron of hidden layer P, and i = 1, 2, ..., Pt;
then the hidden layers are output backward layer by layer: the backward outputs of the neurons of hidden layer S are fed back to the neurons of hidden layer L; let the number of neurons in hidden layer S be St and the number of neurons in hidden layer L be Lt; the backward output of the i-th neuron of hidden layer L is computed as $\tilde{y}^L_i = f\big(\sum_{j=1}^{S_t} w^S_{ji} \tilde{y}^S_j + b^L_i\big)$, where $\tilde{y}^L_i$ is the backward output of the i-th neuron of hidden layer L, $w^S_{ji}$ is the i-th weight of the j-th neuron of hidden layer S, $\tilde{y}^S_j$ is the backward output of the j-th neuron of hidden layer S, $b^L_i$ is the bias of the i-th neuron of hidden layer L, and i = 1, 2, ..., Lt;
finally, the backward output data of the input layer leave the inversion neural network, the dimension of the backward output vector being equal to the dimension of the originally input vector; let the number of neurons in the input layer H be Ht and the dimension of the backward output vector be D; each dimension of the backward output data is computed as $\tilde{x}_i = f\big(\sum_{j=1}^{H_t} w^H_{ji} \tilde{y}^H_j\big)$, where $\tilde{x}_i$ is the i-th dimension of the backward output vector, $w^H_{ji}$ is the i-th weight of the j-th neuron of input layer H, $\tilde{y}^H_j$ is the backward output of the j-th neuron of input layer H, and i = 1, 2, ..., D.
7. The signature image authentication method based on a deep belief network and k-means clustering according to claim 2, characterized in that step S1.8 comprises:
S1.8.1, λ individuals are initialized at random and all individuals are evaluated;
S1.8.2, the best individual is selected; if the termination condition is reached, go to step S1.8.5, otherwise go to step S1.8.3;
S1.8.3, for every dimension of the best individual, Gaussian mutation is applied according to the mutation probability; mutating the current best individual produces λ new individuals, and all newly generated individuals are evaluated;
S1.8.4, the best individual of the previous generation and the λ new individuals produced by mutation form 1+λ individuals; go to step S1.8.2;
S1.8.5, the best individual is output;
the training of the inversion neural network is complete, and the first feature vector of each signature image is finally output.
8. The signature image authentication method based on a deep belief network and k-means clustering according to any one of claims 1 to 7, characterized in that the authentication of a signature image in S2 comprises:
S2.1, the signature image to be authenticated is preprocessed, including converting the image to gray-scale, unifying the image size and normalizing all image pixels;
S2.2, the preprocessed image is input into the previously trained deep belief network, which extracts the corresponding second image feature vector;
S2.3, the distance from the second feature vector of the signature image to be authenticated to the cluster center C is computed and denoted d; comparing d and df, if d >= df the signature image to be authenticated is judged to be forged, and conversely, if d < df, the signature image to be authenticated is judged to be genuine.
9. The signature image authentication method based on a deep belief network and k-means clustering according to claim 1, characterized in that: the feature vector of an image is extracted by the inversion neural network, and the original image can be restored from the extracted feature vector.
CN201610840672.2A 2016-09-22 2016-09-22 Signature image identification method based on a deep belief network and k-means clustering Active CN106529395B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610840672.2A CN106529395B (en) 2016-09-22 2016-09-22 Signature image identification method based on a deep belief network and k-means clustering

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610840672.2A CN106529395B (en) 2016-09-22 2016-09-22 Signature image identification method based on a deep belief network and k-means clustering

Publications (2)

Publication Number Publication Date
CN106529395A true CN106529395A (en) 2017-03-22
CN106529395B CN106529395B (en) 2019-07-12

Family

ID=58343917

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610840672.2A Active CN106529395B (en) 2016-09-22 2016-09-22 Signature image identification method based on a deep belief network and k-means clustering

Country Status (1)

Country Link
CN (1) CN106529395B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109299306A (en) * 2018-12-14 2019-02-01 央视国际网络无锡有限公司 Image search method and device
CN109583332A (en) * 2018-11-15 2019-04-05 北京三快在线科技有限公司 Face identification method, face identification system, medium and electronic equipment
CN110196735A (en) * 2018-02-27 2019-09-03 上海寒武纪信息科技有限公司 A kind of computing device and Related product
WO2019165939A1 (en) * 2018-02-27 2019-09-06 上海寒武纪信息科技有限公司 Computing device, and related product
CN113269136A (en) * 2021-06-17 2021-08-17 南京信息工程大学 Offline signature verification method based on triplet loss
CN113326809A (en) * 2021-06-30 2021-08-31 重庆大学 Off-line signature identification method and system based on three-channel neural network
CN116484247A (en) * 2023-06-21 2023-07-25 北京点聚信息技术有限公司 Intelligent signed data processing system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100274433A1 (en) * 2009-04-27 2010-10-28 Toyota Motor Engineering & Manufacturing North America, Inc. System for determining most probable cause of a problem in a plant
CN104200239A (en) * 2014-09-09 2014-12-10 河海大学常州校区 Image feature fusion identification based signature authentic identification system and method
US20150339542A1 (en) * 2010-02-17 2015-11-26 Shutterfly, Inc. System and method for creating a collection of images
CN105320961A (en) * 2015-10-16 2016-02-10 重庆邮电大学 Handwriting numeral recognition method based on convolutional neural network and support vector machine
CN105893952A (en) * 2015-12-03 2016-08-24 无锡度维智慧城市科技股份有限公司 Hand-written signature identifying method based on PCA method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100274433A1 (en) * 2009-04-27 2010-10-28 Toyota Motor Engineering & Manufacturing North America, Inc. System for determining most probable cause of a problem in a plant
US20150339542A1 (en) * 2010-02-17 2015-11-26 Shutterfly, Inc. System and method for creating a collection of images
CN104200239A (en) * 2014-09-09 2014-12-10 河海大学常州校区 Image feature fusion identification based signature authentic identification system and method
CN105320961A (en) * 2015-10-16 2016-02-10 重庆邮电大学 Handwriting numeral recognition method based on convolutional neural network and support vector machine
CN105893952A (en) * 2015-12-03 2016-08-24 无锡度维智慧城市科技股份有限公司 Hand-written signature identifying method based on PCA method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
朱浩悦 (Zhu Haoyue): "Research on Key Technologies of an Offline Chinese Signature Verification System", China Excellent Master's and Doctoral Dissertations Full-text Database (Master's), Information Science and Technology *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110196735A (en) * 2018-02-27 2019-09-03 上海寒武纪信息科技有限公司 A kind of computing device and Related product
WO2019165939A1 (en) * 2018-02-27 2019-09-06 上海寒武纪信息科技有限公司 Computing device, and related product
CN109583332A (en) * 2018-11-15 2019-04-05 北京三快在线科技有限公司 Face identification method, face identification system, medium and electronic equipment
CN109583332B (en) * 2018-11-15 2021-07-27 北京三快在线科技有限公司 Face recognition method, face recognition system, medium, and electronic device
CN109299306A (en) * 2018-12-14 2019-02-01 央视国际网络无锡有限公司 Image search method and device
CN109299306B (en) * 2018-12-14 2021-09-07 央视国际网络无锡有限公司 Image retrieval method and device
CN113269136A (en) * 2021-06-17 2021-08-17 南京信息工程大学 Offline signature verification method based on triplet loss
CN113269136B (en) * 2021-06-17 2023-11-21 南京信息工程大学 Off-line signature verification method based on triplet loss
CN113326809A (en) * 2021-06-30 2021-08-31 重庆大学 Off-line signature identification method and system based on three-channel neural network
CN116484247A (en) * 2023-06-21 2023-07-25 北京点聚信息技术有限公司 Intelligent signed data processing system
CN116484247B (en) * 2023-06-21 2023-09-05 北京点聚信息技术有限公司 Intelligent signed data processing system

Also Published As

Publication number Publication date
CN106529395B (en) 2019-07-12

Similar Documents

Publication Publication Date Title
CN106529395B (en) Signature image identification method based on a deep belief network and k-means clustering
CN107977609B (en) Finger vein identity authentication method based on CNN
CN106326886B (en) Finger vein image quality appraisal procedure based on convolutional neural networks
WO2021134871A1 (en) Forensics method for synthesized face image based on local binary pattern and deep learning
CN107423700B (en) Method and device for verifying testimony of a witness
CN106372581B (en) Method for constructing and training face recognition feature extraction network
Liu et al. Offline signature verification using a region based deep metric learning network
CN110399821B (en) Customer satisfaction acquisition method based on facial expression recognition
CN106650786A (en) Image recognition method based on multi-column convolutional neural network fuzzy evaluation
CN113221655B (en) Face spoofing detection method based on feature space constraint
CN104268593A (en) Multiple-sparse-representation face recognition method for solving small sample size problem
CN109190566A (en) A kind of fusion local code and CNN model finger vein identification method
Blumenstein et al. The 4NSigComp2010 off-line signature verification competition: Scenario 2
CN107785061A (en) Autism-spectrum disorder with children mood ability interfering system
CN108664843A (en) Live subject recognition methods, equipment and computer readable storage medium
CN108875907A (en) A kind of fingerprint identification method and device based on deep learning
CN107220598A (en) Iris Texture Classification based on deep learning feature and Fisher Vector encoding models
CN114926892A (en) Fundus image matching method and system based on deep learning and readable medium
CN114596608A (en) Double-stream video face counterfeiting detection method and system based on multiple clues
CN116385832A (en) Bimodal biological feature recognition network model training method
CN106203373A (en) A kind of human face in-vivo detection method based on deep vision word bag model
Guo et al. Multifeature extracting CNN with concatenation for image denoising
CN115984930A (en) Micro expression recognition method and device and micro expression recognition model training method
Khan et al. Automatic signature verifier using Gaussian gated recurrent unit neural network
CN104112145B (en) Facial expression recognizing method based on PWLD and D S evidence theories

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant