CN110298233A - Palm print recognition method, apparatus, computer device and storage medium - Google Patents

Palm print recognition method, apparatus, computer device and storage medium

Info

Publication number
CN110298233A
CN110298233A (application CN201910401000.5A)
Authority
CN
China
Prior art keywords
palm
key point
feature map
picture
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910401000.5A
Other languages
Chinese (zh)
Other versions
CN110298233B (en)
Inventor
侯丽
霍晓燕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd
Priority to CN201910401000.5A
Publication of CN110298233A
Application granted
Publication of CN110298233B
Active legal status
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • G06V10/449Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
    • G06V10/451Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters with interaction between the filter responses, e.g. cortical complex cells
    • G06V10/454Integrating the filters into a hierarchical structure, e.g. convolutional neural networks [CNN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1347Preprocessing; Feature extraction


Abstract

The invention belongs to the field of artificial intelligence. Embodiments of the invention disclose a palm print recognition method, apparatus, computer device and storage medium. The method includes the following steps: obtaining a palm picture to be recognized; inputting the palm picture into a pre-trained deep alignment neural network model, and obtaining the palm key points and the feature map that the deep alignment neural network model outputs in response to the palm picture; cropping the feature map according to the palm key points to obtain a cropped feature map; calculating the matching degree between the cropped feature map and each archived feature map in a preset archive library to obtain M feature map matching values; and determining the identity number corresponding to the palm picture according to the M feature map matching values. Palm posture and position are unrestricted during recognition, which improves flexibility; and because the cropped feature map is compared with the archived feature maps, interference is largely avoided and recognition accuracy is improved.

Description

Palm print recognition method, apparatus, computer device and storage medium
Technical field
The present invention belongs to the field of artificial intelligence, and more particularly relates to a palm print recognition method, apparatus, computer device and storage medium.
Background technique
Among common biometric identification technologies, fingerprint recognition is the most widely used, and iris recognition offers very high accuracy. However, the capture regions of both biometric features are small, so high-resolution images are needed to achieve satisfactory recognition accuracy, and the imaging systems of ordinary mobile devices can hardly collect fingerprint or iris images of sufficient resolution for identification. By contrast, the regions of the hand shape and the palm print are larger, and the related recognition technologies do not require very high image resolution. A palm print refers to the palm image from the fingertips to the wrist, which contains many features usable for identification, such as principal lines, wrinkles, fine textures and bifurcation points. Palm print recognition is also a non-intrusive recognition method that users accept more readily, and it places low demands on the capture equipment.
Current contactless palm key point extraction methods generally use auxiliary markers, i.e., more than two small columns fixed at the gaps between the finger roots. Such methods apply only to fixed devices and cannot be used for contactless palm key point extraction in actual scenarios such as cameras or video. The palm must point straight up and cannot be tilted much, so the constraints are numerous and the recognition accuracy is not high.
Summary of the invention
The present invention provides a palm print recognition method, apparatus, computer device and storage medium, to solve the problems that palm print recognition is heavily constrained and its accuracy is not high.
To solve the above technical problems, the present invention proposes a palm print recognition method, including the following steps:
obtaining a palm picture to be recognized;
inputting the palm picture into a pre-trained deep alignment neural network model, and obtaining the palm key points and the feature map that the deep alignment neural network model outputs in response to the palm picture, wherein the number of palm key points is N, and N is a positive integer greater than 1;
cropping the feature map according to the palm key points to obtain a cropped feature map;
calculating the matching degree between the cropped feature map and each archived feature map in a preset archive library to obtain M feature map matching values, wherein the preset archive library contains M archived feature maps each labelled with an identity number, and M is a positive integer greater than 1;
determining the identity number corresponding to the palm picture according to the M feature map matching values.
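The claimed recognition flow can be sketched end to end as follows. This is a minimal illustration, not the patent's implementation: the model, the bounding-box cropping rule and the cosine-similarity matching are all stand-in assumptions, and every function name is a placeholder.

```python
import numpy as np

def recognize_palm(image, model, archive, threshold=0.8):
    """Sketch of the claimed flow.

    model(image) -> (keypoints, feature_map)   # deep alignment network stand-in
    archive      -> list of (identity, archived_feature_map) pairs (M entries)
    """
    keypoints, feature_map = model(image)                # N palm key points + feature map
    cropped = crop_by_keypoints(feature_map, keypoints)  # trim away background
    # One matching value per archived feature map (all assumed equally sized).
    scores = [match_score(cropped, ref) for _, ref in archive]
    best = int(np.argmax(scores))
    if scores[best] >= threshold:
        return archive[best][0]                          # identity number
    return None                                          # recognition failure

def crop_by_keypoints(feature_map, keypoints):
    # Keep the bounding box of the key points (illustrative cropping rule).
    xs, ys = keypoints[:, 0].astype(int), keypoints[:, 1].astype(int)
    return feature_map[ys.min():ys.max() + 1, xs.min():xs.max() + 1]

def match_score(a, b):
    # Cosine similarity between flattened, equally sized feature maps.
    a, b = a.ravel(), b.ravel()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```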
Optionally, the deep alignment neural network model is trained as follows:
obtaining training samples, the training samples being a set of palm print pictures labelled with palm key points, wherein each sample is labelled with N palm key points;
inputting the training samples into the deep alignment neural network model, and obtaining the palm key points that the deep alignment neural network model predicts in response to the training samples;
calculating the sum of squared distances between the labelled palm key points and the predicted palm key points;
adjusting the parameters of each node of the deep alignment neural network model, and ending training when the sum of squared distances between the labelled palm key points and the predicted palm key points is minimal.
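The training criterion in the steps above — the sum of squared distances between annotated and predicted key points — is an ordinary landmark regression loss. A minimal numpy version (the optimizer used to adjust the node parameters is not specified in the text and is omitted here):

```python
import numpy as np

def landmark_loss(predicted, annotated):
    """Sum of squared Euclidean distances between corresponding palm key
    points; training stops when this quantity reaches its minimum.

    predicted, annotated: (N, 2) arrays of key point coordinates.
    """
    diffs = np.asarray(predicted, dtype=float) - np.asarray(annotated, dtype=float)
    return float(np.sum(diffs ** 2))
```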
Optionally, the deep alignment neural network model includes a convolutional neural network layer, a transform estimation layer, a heat map generation layer and a feature extraction layer. The step of inputting the training samples into the deep alignment neural network model and obtaining the palm key points that the model predicts in response to the training samples consists of K stages, K being a positive integer greater than 1, wherein stage k includes the following steps:
when k = 1, inputting the training sample into the convolutional neural network layer to obtain the key point correction value ΔS_1 of stage 1, and adding it to a preset key point initial value S_0 to obtain the palm key point predicted value S_1 of stage 1;
when k = 2 to K, inputting the palm key point predicted value S_{k-1} output by stage k-1 into the transform estimation layer to obtain the transform matrix T_k and the inverse transform matrix T_k^{-1} of stage k;
multiplying the transform matrix T_k of stage k with the training sample to obtain the palm transform picture T_k(I) of stage k;
inputting the palm transform picture T_k(I) of stage k into the feature extraction layer to obtain the feature map F_k of stage k;
multiplying the transform matrix T_k of stage k with the palm key point predicted value S_{k-1} of stage k-1 to obtain T_k(S_{k-1});
inputting T_k(S_{k-1}) into the heat map generation layer to obtain the heat map H_k of stage k, wherein the heat map is generated according to the formula:

H_k(x, y) = 1 / (1 + min_i ||(x, y) - S_i||)

wherein S_i is the i-th key point of T_k(S_{k-1}), and (x, y) are the coordinates of a point in the training sample picture;
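The heat map equation does not survive in this text. Assuming it takes the inverse-distance form used in deep alignment networks — H(x, y) = 1 / (1 + min_i ||(x, y) - S_i||), which matches the variables S_i and (x, y) named in the text — it can be sketched as:

```python
import numpy as np

def landmark_heatmap(landmarks, height, width):
    """H(x, y) = 1 / (1 + min_i ||(x, y) - S_i||): a single-channel map
    that equals 1 on each landmark and decays with distance to the
    nearest one."""
    ys, xs = np.mgrid[0:height, 0:width]
    nearest = np.full((height, width), np.inf)
    for sx, sy in landmarks:
        d = np.sqrt((xs - sx) ** 2 + (ys - sy) ** 2)
        nearest = np.minimum(nearest, d)   # distance to the closest landmark
    return 1.0 / (1.0 + nearest)
```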
inputting the palm transform picture T_k(I) of stage k, the feature map F_k of stage k and the heat map H_k of stage k into the convolutional neural network layer to obtain the key point correction value ΔS_k of stage k;
substituting the key point correction value ΔS_k of stage k into the following formula to calculate the palm key point predicted value S_k of stage k:

S_k = T_k^{-1}(T_k(S_{k-1}) + ΔS_k)

wherein S_{k-1} is the key point predicted value of stage k-1, and T_k and T_k^{-1} are the transform matrix and inverse transform matrix of stage k.
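The stage update can be written out directly once the transform is represented as a homogeneous matrix. The sketch below assumes the update S_k = T_k^{-1}(T_k(S_{k-1}) + ΔS_k), i.e. the CNN correction is added in the aligned frame and then mapped back, which is consistent with the roles the text assigns to T_k and T_k^{-1}:

```python
import numpy as np

def stage_update(S_prev, T, delta_S):
    """S_k = T^{-1}(T(S_{k-1}) + delta_S): apply the stage transform,
    add the CNN correction in the aligned frame, then map back with
    the inverse transform.

    S_prev, delta_S: (N, 2) landmark arrays; T: 3x3 homogeneous matrix.
    """
    def apply(M, pts):
        homo = np.hstack([pts, np.ones((len(pts), 1))])
        return (homo @ M.T)[:, :2]

    T_inv = np.linalg.inv(T)
    return apply(T_inv, apply(T, S_prev) + delta_S)
```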
Optionally, the step of determining the identity number corresponding to the palm picture according to the M feature map matching values includes the following steps:
comparing the M feature map matching values with a preset threshold value;
when L of the M feature map matching values are greater than or equal to the preset threshold value, L being a positive integer greater than 0, determining that the identity number labelled on the archived feature map corresponding to the maximum of the L values is the identity number of the palm picture;
when all of the M feature map matching values are less than the preset threshold value, returning recognition failure information.
Optionally, the step of determining the identity number corresponding to the palm picture according to the M feature map matching values includes the following steps:
comparing the magnitudes of the M feature map matching values, and determining that the identity number labelled on the archived feature map corresponding to the maximum of the M feature map matching values is the identity number of the palm picture.
Optionally, each archived feature map in the preset archive library is associated with corresponding key point auxiliary information, and the step of determining the identity number corresponding to the palm picture according to the M feature map matching values includes the following steps:
comparing the magnitudes of the M feature map matching values, and comparing the identity numbers labelled on the archived feature maps corresponding to the R largest of the M feature map matching values, wherein R is a positive integer greater than 0 and less than M;
when the identity numbers labelled on the archived feature maps corresponding to the R values are identical, determining that this identity number is the identity number of the palm picture;
when the identity numbers labelled on the archived feature maps corresponding to the R values differ, calculating the matching degree between the key points of the palm picture and the key point auxiliary information of the archived feature maps corresponding to the R values, obtaining R key point matching values;
calculating, according to the R values and the corresponding R key point matching values, the comprehensive matching degree between the palm picture and the archived feature maps corresponding to the R values, wherein the comprehensive matching degree is calculated according to the following formula:

P(v) = a*P_f(v) + b*P_s(v)

wherein P(v) is the comprehensive matching degree, P_f(v) is the feature map matching value, P_s(v) is the key point matching value, a and b are preset weights, and v is a positive integer from 1 to R;
determining that the identity number of the palm picture is the identity number labelled on the archived feature map corresponding to the maximum of the comprehensive matching degrees.
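The comprehensive matching step follows directly from the formula P(v) = a*P_f(v) + b*P_s(v). In this sketch the weights are illustrative; the patent only says a and b are preset:

```python
def combined_match(feature_scores, keypoint_scores, identities, a=0.7, b=0.3):
    """P(v) = a*Pf(v) + b*Ps(v) over the R candidates; returns the
    identity whose combined score is largest, together with that score."""
    combined = [a * pf + b * ps
                for pf, ps in zip(feature_scores, keypoint_scores)]
    best = max(range(len(combined)), key=combined.__getitem__)
    return identities[best], combined[best]
```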
Optionally, the key point auxiliary information of an archived feature map is the coordinates of at least 3 key points of the archived feature map, and the step of calculating the matching degree between the key points of the palm picture and the key point auxiliary information of the archived feature maps corresponding to the R values, obtaining R key point matching values, includes the following steps:
calculating the key point distance ratios of the archived feature map according to the coordinates of the key points of the archived feature map;
calculating the key point distance ratios of the palm picture according to the key points of the palm picture;
calculating the distance ratio similarity from the key point distance ratios of the archived feature map and the key point distance ratios of the palm picture, obtaining the key point matching value between the palm picture and the archived feature map.
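One plausible reading of the distance-ratio steps above is sketched below. Every normalisation choice here is an assumption — the text does not say which distances are ratioed or how the ratio similarity is scored:

```python
import numpy as np

def distance_ratios(points):
    """All pairwise key point distances, normalised by the first one,
    so the ratio vector is invariant to scale."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.asarray([np.linalg.norm(pts[i] - pts[j])
                    for i in range(n) for j in range(i + 1, n)])
    return d / d[0]

def ratio_similarity(query_pts, archive_pts):
    """Key point matching value: 1 when the two ratio vectors coincide,
    decreasing with their mean absolute difference."""
    rq, ra = distance_ratios(query_pts), distance_ratios(archive_pts)
    return float(1.0 / (1.0 + np.mean(np.abs(rq - ra))))
```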
To solve the above problems, the present invention also provides a palm print recognition apparatus, including:
an obtaining module, for obtaining a palm picture to be recognized;
a processing module, for inputting the palm picture into a pre-trained deep alignment neural network model and obtaining the palm key points and the feature map that the deep alignment neural network model outputs in response to the palm picture, wherein the number of palm key points is N, and N is a positive integer greater than 1;
a cropping module, for cropping the feature map according to the palm key points to obtain a cropped feature map;
a calculation module, for calculating the matching degree between the cropped feature map and each archived feature map in a preset archive library to obtain M feature map matching values, wherein the preset archive library contains M archived feature maps each labelled with an identity number, and M is a positive integer greater than 1;
an execution module, for determining the identity number corresponding to the palm picture according to the M feature map matching values.
Optionally, the processing module further includes:
a first obtaining submodule, for obtaining training samples, the training samples being a set of palm print pictures labelled with palm key points, wherein each sample is labelled with N palm key points;
a first processing submodule, for inputting the training samples into the deep alignment neural network model and obtaining the palm key points that the deep alignment neural network model predicts in response to the training samples;
a first calculation submodule, for calculating the sum of squared distances between the labelled palm key points and the predicted palm key points;
a first adjustment submodule, for adjusting the parameters of each node of the deep alignment neural network model, training ending when the sum of squared distances between the labelled palm key points and the predicted palm key points is minimal.
Optionally, the deep alignment neural network model includes a convolutional neural network layer, a transform estimation layer, a heat map generation layer and a feature extraction layer, and the first processing submodule includes K sub-processors, K being a positive integer greater than 1:
when k = 1, sub-processor 1 is used to input the training sample into the convolutional neural network layer to obtain the key point correction value ΔS_1 of stage 1, and add it to a preset key point initial value S_0 to obtain the palm key point predicted value S_1 of stage 1;
when k = 2 to K, sub-processor k is used to input the palm key point predicted value S_{k-1} output by stage k-1 into the transform estimation layer to obtain the transform matrix T_k and the inverse transform matrix T_k^{-1} of stage k;
multiply the transform matrix T_k of stage k with the training sample to obtain the palm transform picture T_k(I) of stage k;
input the palm transform picture T_k(I) of stage k into the feature extraction layer to obtain the feature map F_k of stage k;
multiply the transform matrix T_k of stage k with the palm key point predicted value S_{k-1} of stage k-1 to obtain T_k(S_{k-1});
input T_k(S_{k-1}) into the heat map generation layer to obtain the heat map H_k of stage k, wherein the heat map is generated according to the formula:

H_k(x, y) = 1 / (1 + min_i ||(x, y) - S_i||)

wherein S_i is the i-th key point of T_k(S_{k-1}), and (x, y) are the coordinates of a point in the training sample picture;
input the palm transform picture T_k(I) of stage k, the feature map F_k of stage k and the heat map H_k of stage k into the convolutional neural network layer to obtain the key point correction value ΔS_k of stage k;
substitute the key point correction value ΔS_k of stage k into the following formula to calculate the palm key point predicted value S_k of stage k:

S_k = T_k^{-1}(T_k(S_{k-1}) + ΔS_k)

wherein S_{k-1} is the key point predicted value of stage k-1, and T_k and T_k^{-1} are the transform matrix and inverse transform matrix of stage k.
Optionally, the execution module further includes:
a first comparison submodule, for comparing the M feature map matching values with a preset threshold value;
a first determination submodule, for, when L of the M feature map matching values are greater than or equal to the preset threshold value, L being a positive integer greater than 0, determining that the identity number labelled on the archived feature map corresponding to the maximum of the L values is the identity number of the palm picture;
a first return submodule, for returning recognition failure information when all of the M feature map matching values are less than the preset threshold value.
Optionally, the execution module further includes:
a second determination submodule, for comparing the magnitudes of the M feature map matching values and determining that the identity number labelled on the archived feature map corresponding to the maximum of the M feature map matching values is the identity number of the palm picture.
Optionally, each archived feature map in the preset archive library is associated with corresponding key point auxiliary information, and the execution module further includes:
a first comparison submodule, for comparing the magnitudes of the M feature map matching values and comparing the identity numbers labelled on the archived feature maps corresponding to the R largest of the M feature map matching values, wherein R is a positive integer greater than 0 and less than M;
a second determination submodule, for, when the identity numbers labelled on the archived feature maps corresponding to the R values are identical, determining that this identity number is the identity number of the palm picture;
a second calculation submodule, for, when the identity numbers labelled on the archived feature maps corresponding to the R values differ, calculating the matching degree between the key points of the palm picture and the key point auxiliary information of the archived feature maps corresponding to the R values, obtaining R key point matching values;
a third calculation submodule, for calculating, according to the R values and the corresponding R key point matching values, the comprehensive matching degree between the palm picture and the archived feature maps corresponding to the R values, wherein the comprehensive matching degree is calculated according to the following formula:

P(v) = a*P_f(v) + b*P_s(v)

wherein P(v) is the comprehensive matching degree, P_f(v) is the feature map matching value, P_s(v) is the key point matching value, a and b are preset weights, and v is a positive integer from 1 to R;
a third determination submodule, for determining that the identity number of the palm picture is the identity number labelled on the archived feature map corresponding to the maximum of the comprehensive matching degrees.
Optionally, the key point auxiliary information of an archived feature map is the coordinates of at least 3 key points of the archived feature map, and the second calculation submodule further includes:
a fourth calculation submodule, for calculating the key point distance ratios of the archived feature map according to the coordinates of the key points of the archived feature map;
a fifth calculation submodule, for calculating the key point distance ratios of the palm picture according to the key points of the palm picture;
a sixth calculation submodule, for calculating the distance ratio similarity from the key point distance ratios of the archived feature map and the key point distance ratios of the palm picture, obtaining the key point matching value between the palm picture and the archived feature map.
To solve the above technical problems, an embodiment of the present invention also provides a computer device, including a memory and a processor, the memory storing computer-readable instructions which, when executed by the processor, cause the processor to execute the steps of the palm print recognition method described above.
To solve the above technical problems, an embodiment of the present invention also provides a computer-readable storage medium storing computer-readable instructions which, when executed by a processor, cause the processor to execute the steps of the palm print recognition method described above.
The embodiments of the present invention have the following beneficial effects: a palm picture to be recognized is obtained; the palm picture is input into a pre-trained deep alignment neural network model, and the palm key points and the feature map that the model outputs in response to the palm picture are obtained; the feature map is cropped according to the palm key points to obtain a cropped feature map; the matching degree between the cropped feature map and each archived feature map in a preset archive library is calculated to obtain M feature map matching values; and the identity number corresponding to the palm picture is determined according to the M feature map matching values. Palm posture and position are unrestricted during recognition, which improves flexibility; and because the cropped feature map is compared with the archived feature maps, interference is largely avoided and recognition accuracy is improved.
Detailed description of the invention
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a basic flow diagram of a palm print recognition method according to an embodiment of the present invention;
Fig. 2 is a training flow diagram of the deep alignment neural network model according to an embodiment of the present invention;
Fig. 3 is a basic structural block diagram of a palm print recognition apparatus according to an embodiment of the present invention;
Fig. 4 is a basic structural block diagram of a computer device according to an embodiment of the present invention.
Specific embodiment
To enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings of the embodiments.
Some processes described in the specification, the claims and the above drawings contain multiple operations occurring in a particular order, but it should be clearly understood that these operations may be executed out of the order in which they appear herein or in parallel. Operation serial numbers such as 101 and 102 are only used to distinguish different operations; the serial numbers themselves do not represent any execution order. In addition, these processes may include more or fewer operations, and these operations may be executed in order or in parallel. It should be noted that terms such as "first" and "second" herein are used to distinguish different messages, devices, modules and the like; they do not represent a sequence, nor do they limit "first" and "second" to different types.
The technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the drawings of the embodiments. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those skilled in the art without creative effort shall fall within the protection scope of the present invention.
Embodiment
Those skilled in the art will appreciate that "terminal" and "terminal device" as used herein include both devices with only a wireless signal receiver and no transmitting capability, and devices with receiving and transmitting hardware capable of two-way communication over a bidirectional communication link. Such devices may include: cellular or other communication devices, with or without a single-line or multi-line display; PCS (Personal Communications Service) devices, which may combine voice, data processing, fax and/or data communication capabilities; PDAs (Personal Digital Assistants), which may include a radio-frequency receiver, a pager, Internet/intranet access, a web browser, a notepad, a calendar and/or a GPS (Global Positioning System) receiver; and conventional laptop and/or palmtop computers or other devices that have and/or include a radio-frequency receiver. "Terminal" and "terminal device" as used herein may be portable, transportable, installed in a vehicle (air, sea and/or land), or suited and/or configured to operate locally and/or in distributed form at any location on earth and/or in space. "Terminal" and "terminal device" as used herein may also be communication terminals, Internet terminals or music/video playback terminals, for example PDAs, MIDs (Mobile Internet Devices) and/or mobile phones with music/video playback functions, or devices such as smart TVs and set-top boxes.
The terminal in this embodiment is a terminal as described above.
Specifically, referring to Fig. 1, Fig. 1 is a basic flow diagram of the palm print recognition method of this embodiment.
As shown in Fig. 1, a palm print recognition method includes the following steps:
S101: obtaining a palm picture to be recognized.
The palm picture to be recognized is obtained by a device with an image capture function, including handheld devices with a camera, mobile devices, and the like.
S102: inputting the palm picture into a pre-trained deep alignment neural network model, and obtaining the palm key points and the feature map that the deep alignment neural network model outputs in response to the palm picture, wherein the number of palm key points is N, and N is a positive integer greater than 1.
The palm picture is input into the pre-trained deep alignment neural network model, and the palm key points and the feature map output by the model in response to the palm picture are obtained. There are multiple palm key points, the number depending on what the deep alignment neural network model learned from the training samples.
The deep alignment neural network model includes a convolutional neural network layer, a transform estimation layer, a heat map generation layer and a feature extraction layer. For its training, refer to Fig. 2. As shown in Fig. 2, training the deep alignment neural network model includes the following steps:
S111, training sample is obtained, the training sample is the palmmprint pictures for being labelled with palm key point, wherein every The palm key point of a sample mark is N number of;
Training sample is palmmprint pictures, and each sample is labelled with the coordinate of palm key point, the palm marked here Key point includes but is not limited to that the little finger of toe of palm refers to root points outside, and little finger of toe finger tip refers to bifurcation point between little finger of toe and the third finger, nameless Finger tip refers to that bifurcation point, middle fingertip refer to bifurcation point, index finger tip, index finger outside between middle finger and index finger between the third finger and middle finger Point, tiger's jaw, thumb finger tip, thumb refer to root point, the starting point of three main lines of palmmprint, midpoint, terminal.
S112: inputting the training sample into the deep alignment neural network model and obtaining the palm keypoints predicted by the deep alignment neural network model in response to the training sample.
The process by which the deep alignment neural network model outputs predicted palm keypoints in response to a training sample comprises multiple stages.
Stage 1 includes the following steps:
The training sample is input into the convolutional neural network layer to obtain the stage-1 keypoint correction ΔS1, which is added to the keypoint initial value S0 to obtain the stage-1 palm keypoint prediction S1 = S0 + ΔS1. The keypoint initial value S0 is the mean of the annotated keypoint coordinates over the training samples.
Stage k includes the following steps:
The palm keypoint prediction Sk-1 output by stage k-1 is input into the transform estimation layer to obtain the stage-k transformation matrix Tk and its inverse Tk^-1. The transform estimation layer generates the transformation matrix used to align the pose of the training picture; specifically, the transformation matrix Tk is applied to the training sample to obtain the stage-k palm-transformed picture Tk(I), i.e. the picture after palm pose alignment.
The palm-transformed picture Tk(I) is then input into the feature extraction layer to obtain the stage-k feature map Fk.
There are many feature extraction methods. One example is the histogram of oriented gradients (HOG) method, which builds features by computing and accumulating gradient orientation histograms over local regions of the image. The main idea of this method is that the appearance and shape of a local object in an image can be well described by the density distribution of gradients or edge directions. A concrete implementation processes an image as follows:
1) Grayscale conversion (treating the image as a three-dimensional image in x, y and z (gray level));
2) Color-space standardization (normalization) of the input image using gamma correction, to adjust the image contrast, reduce the influence of local shadows and illumination changes, and suppress noise interference;
3) Computing the gradient (magnitude and orientation) of each pixel of the image, mainly to capture contour information while further weakening the interference of illumination;
4) Dividing the image into small cells (e.g. 6×6 pixels per cell);
5) Accumulating the gradient histogram (counts over orientation bins) of each cell, which forms the descriptor of that cell;
6) Grouping every few cells into a block (e.g. 3×3 cells per block) and concatenating the descriptors of all cells in a block to obtain the HOG descriptor of that block;
7) Concatenating the HOG descriptors of all blocks in the image to obtain the HOG descriptor of the whole image.
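The seven steps above can be sketched in a few lines of NumPy. This is a minimal illustration of steps 1–5 only (the block grouping and normalization of steps 6–7 are omitted for brevity), not the exact extractor used by the model:

```python
import numpy as np

def hog_descriptor(img, cell=6, bins=9):
    """Minimal HOG sketch: per-cell gradient-orientation histograms.
    Block normalization (steps 6-7) is omitted for brevity."""
    img = img.astype(np.float64)
    # step 2: gamma/contrast normalization
    if img.max() > 0:
        img = np.sqrt(img / img.max())
    # step 3: per-pixel gradient magnitude and orientation (central differences)
    gx = np.zeros_like(img); gy = np.zeros_like(img)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]
    gy[1:-1, :] = img[2:, :] - img[:-2, :]
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0  # unsigned orientation
    # steps 4-5: divide into cells and histogram each cell, weighted by magnitude
    h, w = img.shape
    feats = []
    for r in range(0, h - cell + 1, cell):
        for c in range(0, w - cell + 1, cell):
            m = mag[r:r + cell, c:c + cell].ravel()
            a = ang[r:r + cell, c:c + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            feats.append(hist)
    return np.concatenate(feats)

desc = hog_descriptor(np.random.rand(24, 24), cell=6, bins=9)
print(desc.shape)  # 16 cells x 9 bins -> (144,)
```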
The stage-k transformation matrix Tk is then applied to the stage-(k-1) palm keypoint prediction Sk-1 to obtain Tk(Sk-1), which is input into the heatmap generation layer to obtain the stage-k heatmap Hk, where the heatmap formula is:
Hk(x, y) = 1 / (1 + min_i ||(x, y) − Si||)
where Si is the i-th landmark of Tk(Sk-1) and (x, y) is the coordinate of a point in the training sample picture.
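The heatmap formula above can be evaluated directly over a pixel grid; a minimal sketch, assuming landmarks are given as (x, y) coordinate pairs:

```python
import numpy as np

def landmark_heatmap(landmarks, height, width):
    """Heatmap layer sketch: Hk(x, y) = 1 / (1 + min_i ||(x, y) - Si||).
    The peak value 1 occurs at each landmark position."""
    ys, xs = np.mgrid[0:height, 0:width]
    grid = np.stack([xs, ys], axis=-1).astype(np.float64)        # (H, W, 2)
    pts = np.asarray(landmarks, dtype=np.float64)                # (L, 2)
    # distance from every pixel to every landmark, then take the minimum
    d = np.linalg.norm(grid[:, :, None, :] - pts[None, None, :, :], axis=-1)
    return 1.0 / (1.0 + d.min(axis=2))

H = landmark_heatmap([(2.0, 3.0), (10.0, 5.0)], height=16, width=16)
print(H.shape, H.max())  # (16, 16) 1.0
```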
The stage-k palm-transformed picture Tk(I), the stage-k feature map Fk and the stage-k heatmap Hk are input into the convolutional neural network layer to obtain the stage-k keypoint correction ΔSk.
The stage-k keypoint correction ΔSk is substituted into the following formula to compute the stage-k palm keypoint prediction Sk:
Sk = Tk^-1(Tk(Sk-1) + ΔSk)
where Sk-1 is the stage-(k-1) keypoint prediction, and Tk and Tk^-1 are the stage-k transformation matrix and its inverse.
To balance computation and accuracy, the above stage is typically executed in a loop 3 times (K = 3); after 3 iterations, S3 is the palm keypoint prediction output by the deep alignment neural network in response to the training sample.
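A toy sketch of the stage update Sk = Tk^-1(Tk(Sk-1) + ΔSk) iterated over K = 3 stages. The CNN correction is replaced by a constant stand-in, and the transform is a bare 2×2 matrix (the translation component of a full similarity transform is omitted for brevity):

```python
import numpy as np

def stage_update(S_prev, delta, T, T_inv):
    """One refinement stage: Sk = Tk^-1(Tk(Sk-1) + dSk).
    T / T_inv are 2x2 matrices applied to (N, 2) landmark arrays."""
    return (T_inv @ ((T @ S_prev.T) + delta.T)).T

# toy run: identity transform, so three stages simply accumulate corrections
S = np.zeros((4, 2))                        # stand-in for initial value S0
for k in range(3):                          # K = 3 stages, as suggested above
    correction = np.full((4, 2), 0.5)       # stand-in for the CNN output dSk
    S = stage_update(S, correction, np.eye(2), np.eye(2))
print(S[0])  # [1.5 1.5]
```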
S113: computing the sum of squared distances between the annotated palm keypoints and the predicted palm keypoints.
Here, the distance between the annotated palm keypoints and the predicted palm keypoints refers to the distance between corresponding keypoints. For example, if the annotated palm keypoints include the middle-finger tip and the index-finger tip, the distances computed are the distance from the annotated middle-finger tip to the predicted middle-finger tip, and the distance from the annotated index-finger tip to the predicted index-finger tip.
S114: adjusting the parameters of each node of the deep alignment neural network model; training ends when the sum of squared distances between the annotated palm keypoints and the predicted palm keypoints is minimized.
The parameters of each node of the deep alignment neural network model are adjusted so as to minimize the sum of squared distances between the annotated and predicted palm keypoints. The embodiment of the present invention uses gradient descent, an optimization algorithm used recursively in machine learning and artificial intelligence to approach a minimum of the objective.
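The training objective of steps S113–S114 — the sum of squared distances between corresponding annotated and predicted keypoints — can be written as:

```python
import numpy as np

def landmark_loss(pred, labeled):
    """Sum of squared distances between predicted and annotated keypoints,
    i.e. the quantity minimized by gradient descent in steps S113-S114."""
    return float(((np.asarray(pred) - np.asarray(labeled)) ** 2).sum())

pred = [(1.0, 2.0), (3.0, 4.0)]
truth = [(1.0, 2.0), (3.0, 5.0)]
print(landmark_loss(pred, truth))  # 1.0
```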
S103: cropping the feature map according to the palm keypoints to obtain a cropped feature map.
The feature map is cropped according to the palm keypoints; specifically, the feature map obtained in step S102 is cropped with the rectangle that contains all the keypoints.
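Cropping with the rectangle that contains all keypoints might look like this (the `margin` parameter is an illustrative extra, not from the source):

```python
import numpy as np

def crop_to_keypoints(feature_map, keypoints, margin=0):
    """Crop a (H, W, C) feature map to the bounding rectangle of the
    keypoints, as in step S103 (keypoints are (x, y) pairs)."""
    kp = np.asarray(keypoints)
    x0, y0 = np.floor(kp.min(axis=0)).astype(int) - margin
    x1, y1 = np.ceil(kp.max(axis=0)).astype(int) + margin
    h, w = feature_map.shape[:2]
    x0, y0 = max(x0, 0), max(y0, 0)           # clamp to the map bounds
    x1, y1 = min(x1, w - 1), min(y1, h - 1)
    return feature_map[y0:y1 + 1, x0:x1 + 1]

fmap = np.random.rand(32, 32, 8)
cropped = crop_to_keypoints(fmap, [(4.0, 6.0), (20.0, 10.0), (12.0, 25.0)])
print(cropped.shape)  # (20, 17, 8)
```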
S104: computing the matching degree between the cropped feature map and each archived feature map in a preset template library to obtain M feature-map matching values, where the preset template library contains M archived feature maps each labeled with an identity number, M being a positive integer greater than 1.
The preset template library holds multiple archived feature maps, each labeled with an identity number, against which the feature map of the palm picture to be recognized is compared in order to determine the identity number of the palm picture.
Computing the matching degree between feature maps amounts to computing the similarity between two matrices. Matrix similarity can be computed by element-wise comparison: if the elements at the same position in the two matrices are identical, the similar-point count is incremented by 1; after scanning all elements, the similar-point count divided by the number of matrix elements gives the similarity of the two matrices.
Alternatively, the feature vectors of the matrices can be computed, and the cosine similarity between the two feature vectors used to determine the similarity of the two matrices.
Matching against all archived feature maps in the template library by the above methods yields M feature-map matching values.
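The cosine-similarity variant of the matching degree can be sketched as follows, assuming the feature vector is simply the flattened feature map:

```python
import numpy as np

def cosine_match(feat_a, feat_b):
    """Matching degree between two feature maps via cosine similarity
    of their flattened feature vectors."""
    a, b = feat_a.ravel(), feat_b.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

a = np.ones((4, 4))
print(cosine_match(a, a))                    # 1.0 -- identical maps match perfectly
print(round(cosine_match(a, np.eye(4)), 2))  # 0.5
```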
S105: determining the identity number corresponding to the palm picture according to the M feature-map matching values.
The identity number corresponding to the palm picture can be determined from the M feature-map matching values in several ways. Specifically, the M feature-map matching values are compared with a preset threshold; when L values among the M feature-map matching values are greater than and/or equal to the preset threshold, where L is a positive integer greater than 0, the identity number labeled on the archived feature map corresponding to the largest of the L values is determined to be the identity number of the palm picture to be recognized.
When all M feature-map matching values are less than the preset threshold, recognition-failure information is returned.
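The threshold logic of this step can be sketched as follows (the identity labels and the 0.8 threshold are illustrative values, not from the source):

```python
def decide_identity(match_values, ids, threshold):
    """Step S105 sketch: accept the best match only if it clears the
    preset threshold, otherwise report recognition failure (None)."""
    passing = [(v, i) for v, i in zip(match_values, ids) if v >= threshold]
    if not passing:
        return None  # all values below threshold -> recognition failure
    return max(passing)[1]  # identity of the largest passing value

print(decide_identity([0.42, 0.91, 0.77], ["u1", "u2", "u3"], 0.8))  # u2
print(decide_identity([0.42, 0.51, 0.60], ["u1", "u2", "u3"], 0.8))  # None
```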
In some embodiments, the sizes of the M feature-map matching values are compared, and the identity number labeled on the archived feature map corresponding to the largest of the M feature-map matching values is determined to be the identity number of the palm picture.
In the embodiment of the present invention, to improve recognition accuracy, each archived feature map in the preset template library is associated with corresponding keypoint auxiliary information, and the identity number of the palm picture to be recognized is determined according to the following steps:
S121: comparing the sizes of the M feature-map matching values, and comparing the identity numbers labeled on the archived feature maps corresponding to the R largest values among the M feature-map matching values, where R is a positive integer greater than 0 and less than M;
S122: when the identity numbers labeled on the archived feature maps corresponding to the R values are identical, determining that identity number to be the identity number of the palm picture;
S123: when the identity numbers labeled on the archived feature maps corresponding to the R values differ, computing the matching degree between the keypoints of the palm picture and the keypoint auxiliary information of the archived feature maps corresponding to the R values, obtaining R keypoint matching values.
In the embodiment of the present invention, the keypoint auxiliary information of an archived feature map is the coordinates of at least 3 keypoints of the archived feature map, and the step of computing the matching degree between the keypoints of the palm picture and the keypoint auxiliary information of the archived feature maps corresponding to the R values to obtain R keypoint matching values includes the following steps:
S131: computing the keypoint distance ratios of the archived feature map according to the coordinates of its keypoints.
The keypoint distance ratios are computed from the keypoint coordinates. For example, if 3 keypoints A, B and C are labeled on an archived feature map, the pairwise distances AB, BC and AC are computed first, then the distance ratios AB:BC, BC:AC and AB:AC, denoted d1, d2 and d3 for convenience.
S132: computing the keypoint distance ratios of the palm picture according to its keypoints.
Similarly, if the keypoints of the palm picture are A*, B* and C*, where A* is the keypoint at the same palm position as A, the keypoint distance ratios of the palm picture are A*B*:B*C*, B*C*:A*C* and A*B*:A*C*, denoted d1*, d2* and d3*.
S133: computing the distance-ratio similarity from the keypoint distance ratios of the archived feature map and those of the palm picture, obtaining the keypoint matching value between the palm picture and the archived feature map.
The keypoint matching value is computed according to the following formula:
Ps = (1/N) Σ_i min(di, di*) / max(di, di*)
where di* is the i-th keypoint distance ratio of the palm picture, di is the i-th keypoint distance ratio of the archived feature map, and the sum runs over the N distance ratios derived from the keypoints.
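Steps S131–S133 might be sketched as follows. The cyclic ratio ordering and the min/max ratio-similarity average are assumptions made for illustration; the source lists the ratios AB:BC, BC:AC, AB:AC but does not spell out the similarity formula:

```python
import numpy as np
from itertools import combinations

def distance_ratios(points):
    """Pairwise distances turned into ratios d1, d2, d3, ... as in steps
    S131/S132 (a cyclic ratio scheme is used here for illustration)."""
    d = [np.linalg.norm(np.subtract(p, q)) for p, q in combinations(points, 2)]
    return [d[i] / d[(i + 1) % len(d)] for i in range(len(d))]

def keypoint_match(archive_pts, palm_pts):
    """Ratio-to-ratio similarity in (0, 1]; 1 means identical structure."""
    r_a, r_p = distance_ratios(archive_pts), distance_ratios(palm_pts)
    return sum(min(a, p) / max(a, p) for a, p in zip(r_a, r_p)) / len(r_a)

A = [(0, 0), (3, 0), (0, 4)]
B = [(0, 0), (6, 0), (0, 8)]   # same shape, twice the scale
print(round(keypoint_match(A, B), 3))  # 1.0 -- ratios are scale-invariant
```

Because only ratios enter the comparison, the match is invariant to the overall scale of the palm in the picture, which is one reason distance ratios capture palm structure well.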
Computing the keypoint matching degree from distance ratios captures the structural characteristics of the palm very well, improves recognition accuracy, and is easy to compute.
S124: computing, according to the R values and the corresponding R keypoint matching values, the comprehensive matching degree between the palm picture and the archived feature maps corresponding to the R values, where the comprehensive matching degree is computed according to the following formula:
P(v) = aPf(v) + bPs(v)
where P(v) is the comprehensive matching degree, Pf(v) is the feature-map matching value, Ps(v) is the keypoint matching value, a and b are preset weights, and v is a positive integer from 1 to R.
The feature-map matching value and the keypoint matching value are weighted and summed to compute the comprehensive matching degree between the palm picture and the archived feature maps corresponding to the R values.
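The weighted combination P(v) = aPf(v) + bPs(v) in code; the weights 0.7/0.3 and the candidate scores are illustrative, not preset values from the source:

```python
def comprehensive_match(pf, ps, a=0.7, b=0.3):
    """P(v) = a*Pf(v) + b*Ps(v): weighted sum of the feature-map
    matching value Pf and the keypoint matching value Ps."""
    return a * pf + b * ps

# two ambiguous candidates: the keypoint term breaks the tie in favor of u2
candidates = {"u1": (0.90, 0.60), "u2": (0.85, 0.95)}   # id: (Pf, Ps)
scores = {i: comprehensive_match(pf, ps) for i, (pf, ps) in candidates.items()}
print(max(scores, key=scores.get))  # u2
```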
S125: determining the identity number of the palm picture to be the identity number labeled on the archived feature map corresponding to the largest value among the comprehensive matching degrees.
The identity number labeled on the archived feature map corresponding to the largest comprehensive matching degree is determined to be the identity number of the palm picture. Because the comprehensive matching degree incorporates the keypoint matching value as an additional factor, the accuracy of palmprint recognition can be further improved.
To solve the above technical problem, an embodiment of the present invention further provides a palmprint recognition device. Referring specifically to Fig. 3, Fig. 3 is a basic structural block diagram of the palmprint recognition device of the present embodiment.
As shown in Fig. 3, a palmprint recognition device comprises: an acquisition module 210, a processing module 220, a cropping module 230, a computation module 240 and an execution module 250. The acquisition module 210 is configured to obtain a palm picture to be recognized. The processing module 220 is configured to input the palm picture into a pre-trained deep alignment neural network model and obtain the palm keypoints and feature map output by the deep alignment neural network model in response to the palm picture, where there are N palm keypoints, N being a positive integer greater than 1. The cropping module 230 is configured to crop the feature map according to the palm keypoints to obtain a cropped feature map. The computation module 240 is configured to compute the matching degree between the cropped feature map and each archived feature map in a preset template library to obtain M feature-map matching values, where the preset template library contains M archived feature maps each labeled with an identity number, M being a positive integer greater than 1. The execution module 250 is configured to determine, according to the M feature-map matching values, the identity number corresponding to the palm picture.
The embodiment of the present invention obtains a palm picture to be recognized; inputs the palm picture into a pre-trained deep alignment neural network model and obtains the palm keypoints and feature map output by the model in response to the palm picture; crops the feature map according to the palm keypoints to obtain a cropped feature map; computes the matching degree between the cropped feature map and each archived feature map in a preset template library to obtain M feature-map matching values; and determines, according to the M feature-map matching values, the identity number corresponding to the palm picture. During palmprint recognition, the palm pose and position are unrestricted, which improves flexibility; and comparing the cropped feature map with the archived feature maps largely avoids interference and improves recognition accuracy.
In some embodiments, the processing module 220 further includes: a first acquisition submodule configured to obtain training samples, where the training samples are palmprint pictures annotated with palm keypoints, each sample being annotated with N palm keypoints; a first processing submodule configured to input the training sample into the deep alignment neural network model and obtain the palm keypoints predicted by the deep alignment neural network model in response to the training sample; a first computation submodule configured to compute the sum of squared distances between the annotated palm keypoints and the predicted palm keypoints; and a first adjustment submodule configured to adjust the parameters of each node of the deep alignment neural network model, training ending when the sum of squared distances between the annotated palm keypoints and the predicted palm keypoints is minimized.
In some embodiments, the deep alignment neural network model includes a convolutional neural network layer, a transform estimation layer, a heatmap generation layer and a feature extraction layer, and the first processing submodule includes K sub-processors, K being a positive integer greater than 1:
When k = 1, sub-processor 1 is configured to input the training sample into the convolutional neural network layer, obtain the stage-1 keypoint correction ΔS1, and add it to the preset keypoint initial value S0 to obtain the stage-1 palm keypoint prediction S1.
When k = 2 to K, sub-processor k is configured to: input the palm keypoint prediction Sk-1 output by stage k-1 into the transform estimation layer to obtain the stage-k transformation matrix Tk and its inverse Tk^-1; apply the stage-k transformation matrix Tk to the training sample to obtain the stage-k palm-transformed picture Tk(I); input the stage-k palm-transformed picture Tk(I) into the feature extraction layer to obtain the stage-k feature map Fk; apply the stage-k transformation matrix Tk to the stage-(k-1) palm keypoint prediction Sk-1 to obtain Tk(Sk-1); and input Tk(Sk-1) into the heatmap generation layer to obtain the stage-k heatmap Hk, where the heatmap formula is:
Hk(x, y) = 1 / (1 + min_i ||(x, y) − Si||)
where Si is the i-th landmark of Tk(Sk-1) and (x, y) is the coordinate of a point in the training sample picture.
The stage-k palm-transformed picture Tk(I), the stage-k feature map Fk and the stage-k heatmap Hk are input into the convolutional neural network layer to obtain the stage-k keypoint correction ΔSk; the stage-k keypoint correction ΔSk is substituted into the following formula to compute the stage-k palm keypoint prediction Sk:
Sk = Tk^-1(Tk(Sk-1) + ΔSk)
where Sk-1 is the stage-(k-1) keypoint prediction, and Tk and Tk^-1 are the stage-k transformation matrix and its inverse.
In some embodiments, the execution module 250 further includes: a first comparison submodule configured to compare the M feature-map matching values with a preset threshold; a first determination submodule configured to determine, when L values among the M feature-map matching values are greater than and/or equal to the preset threshold, where L is a positive integer greater than 0, that the identity number labeled on the archived feature map corresponding to the largest of the L values is the identity number of the palm picture; and a first return submodule configured to return recognition-failure information when all M feature-map matching values are less than the preset threshold.
In some embodiments, the execution module 250 further includes: a second determination submodule configured to compare the sizes of the M feature-map matching values and determine that the identity number labeled on the archived feature map corresponding to the largest of the M feature-map matching values is the identity number of the palm picture.
In some embodiments, each archived feature map in the preset template library is associated with corresponding keypoint auxiliary information, and the execution module further includes: a first comparison submodule configured to compare the sizes of the M feature-map matching values and compare the identity numbers labeled on the archived feature maps corresponding to the R largest values among the M feature-map matching values, where R is a positive integer greater than 0 and less than M; a second determination submodule configured to determine, when the identity numbers labeled on the archived feature maps corresponding to the R values are identical, that identity number to be the identity number of the palm picture; a second computation submodule configured to compute, when the identity numbers labeled on the archived feature maps corresponding to the R values differ, the matching degree between the keypoints of the palm picture and the keypoint auxiliary information of the archived feature maps corresponding to the R values, obtaining R keypoint matching values; and a third computation submodule configured to compute, according to the R values and the corresponding R keypoint matching values, the comprehensive matching degree between the palm picture and the archived feature maps corresponding to the R values, where the comprehensive matching degree is computed according to the following formula:
P(v) = aPf(v) + bPs(v)
where P(v) is the comprehensive matching degree, Pf(v) is the feature-map matching value, Ps(v) is the keypoint matching value, a and b are preset weights, and v is a positive integer from 1 to R.
The execution module further includes a third determination submodule configured to determine that the identity number of the palm picture is the identity number labeled on the archived feature map corresponding to the largest value among the comprehensive matching degrees.
In some embodiments, the keypoint auxiliary information of an archived feature map is the coordinates of at least 3 keypoints of the archived feature map, and the second computation submodule further includes: a fourth computation submodule configured to compute the keypoint distance ratios of the archived feature map according to the coordinates of its keypoints; a fifth computation submodule configured to compute the keypoint distance ratios of the palm picture according to its keypoints; and a sixth computation submodule configured to compute the distance-ratio similarity from the keypoint distance ratios of the archived feature map and those of the palm picture, obtaining the keypoint matching value between the palm picture and the archived feature map.
To solve the above technical problem, an embodiment of the present invention further provides a computer device. Referring specifically to Fig. 4, Fig. 4 is a basic structural block diagram of the computer device of the present embodiment.
As shown in Fig. 4, which is a schematic diagram of the internal structure of the computer device, the computer device includes a processor, a non-volatile storage medium, a memory and a network interface connected through a system bus. The non-volatile storage medium of the computer device stores an operating system, a database and computer-readable instructions; the database may store a sequence of control information, and when the computer-readable instructions are executed by the processor, the processor may be caused to implement a palmprint recognition method. The processor of the computer device provides computing and control capability and supports the operation of the entire computer device. The memory of the computer device may store computer-readable instructions which, when executed by the processor, may cause the processor to perform a palmprint recognition method. The network interface of the computer device is used for communicating with a terminal. Those skilled in the art will understand that the structure shown in Fig. 4 is merely a block diagram of the part of the structure relevant to the solution of the present application and does not limit the computer device to which the solution of the present application is applied; a specific computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In the present embodiment, the processor executes the specific content of the acquisition module 210, processing module 220, cropping module 230, computation module 240 and execution module 250 of Fig. 3, and the memory stores the program code and the various kinds of data required to execute the above modules. The network interface is used for data transmission to and from a user terminal or a server. The memory in the present embodiment stores the program code and data required to execute all submodules of the palmprint recognition method, and the server can invoke the program code and data of the server to execute the functions of all submodules.
The computer device obtains a palm picture to be recognized; inputs the palm picture into a pre-trained deep alignment neural network model and obtains the palm keypoints and feature map output by the model in response to the palm picture; crops the feature map according to the palm keypoints to obtain a cropped feature map; computes the matching degree between the cropped feature map and each archived feature map in a preset template library to obtain M feature-map matching values; and determines, according to the M feature-map matching values, the identity number corresponding to the palm picture. During palmprint recognition, the palm pose and position are unrestricted, which improves flexibility; and comparing the cropped feature map with the archived feature maps largely avoids interference and improves recognition accuracy.
The present invention further provides a storage medium storing computer-readable instructions which, when executed by one or more processors, cause the one or more processors to perform the steps of the palmprint recognition method of any of the above embodiments.
Those of ordinary skill in the art will appreciate that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing the relevant hardware. The computer program can be stored in a computer-readable storage medium, and when executed, the program may include the processes of the embodiments of each of the above methods. The aforementioned storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disc or a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM), etc.
It should be understood that although the steps in the flowcharts of the drawings are shown in sequence as indicated by the arrows, these steps are not necessarily executed in the order indicated by the arrows. Unless explicitly stated herein, the execution of these steps is not strictly limited in order, and they may be executed in other orders. Moreover, at least some of the steps in the flowcharts of the drawings may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different times, and whose execution order is not necessarily sequential; they may be executed in turn or alternately with at least part of the sub-steps or stages of other steps.
The above are only some embodiments of the present invention. It should be noted that, for those of ordinary skill in the art, various improvements and modifications may be made without departing from the principle of the present invention, and these improvements and modifications shall also be regarded as falling within the protection scope of the present invention.

Claims (10)

1. A palmprint recognition method, characterized by comprising the following steps:
obtaining a palm picture to be recognized;
inputting the palm picture into a pre-trained deep alignment neural network model, and obtaining the palm keypoints and the feature map output by the deep alignment neural network model in response to the palm picture, where there are N palm keypoints, N being a positive integer greater than 1;
cropping the feature map according to the palm keypoints to obtain a cropped feature map;
computing the matching degree between the cropped feature map and each archived feature map in a preset template library to obtain M feature-map matching values, where the preset template library contains M archived feature maps each labeled with an identity number, M being a positive integer greater than 1;
determining, according to the M feature-map matching values, the identity number corresponding to the palm picture.
2. The palmprint recognition method according to claim 1, characterized in that the deep alignment neural network model is trained according to the following steps:
obtaining training samples, where the training samples are palmprint pictures annotated with palm keypoints, each sample being annotated with N palm keypoints;
inputting the training sample into the deep alignment neural network model and obtaining the palm keypoints predicted by the deep alignment neural network model in response to the training sample;
computing the sum of squared distances between the annotated palm keypoints and the predicted palm keypoints;
adjusting the parameters of each node of the deep alignment neural network model, training ending when the sum of squared distances between the annotated palm keypoints and the predicted palm keypoints is minimized.
3. palm grain identification method according to claim 2, which is characterized in that the depth match neural network model includes The training sample, is input to by convolutional neural networks layer, transformation estimation layer, temperature figure generation layer and feature extraction layer described Depth match neural network model obtains the palm that the depth match neural network model responds the training sample and predicts In the step of key point, including K stage, K are the positive integer greater than 1, wherein k-th of stage includes the following steps:
As k=1, the training sample is input to the convolutional neural networks layer, obtains the key point correction value Δ in stage 1 S1, with preset key point initial value S0It is added, obtains the palm key point predicted value S of stage k1
As k=2~K, palm key forecast value S that stage k-1 is exportedk-1It is input to the transformation estimation layer, obtains the stage The transformation matrix T of kkAnd inversion matrix Tk -1
By the transformation matrix T of the stage kkIt is multiplied to obtain the palm transformation picture T of stage k with the training samplek(I);
The palm of the stage k is converted into picture Tk(I) it is input to the feature extraction layer, obtains the characteristic pattern F of stage kk
By the transformation matrix T of the stage kkWith the palm key point predicted value S of the stage k-1k-1Multiplication obtains Tk(Sk-1);
inputting Tk(Sk−1) into the heat map generation layer to obtain the heat map Hk of stage k, wherein the heat map is computed as:

H(x, y) = 1 / (1 + min_i ||(x, y) − Si||)

wherein Si is the i-th landmark of Tk(Sk−1), and (x, y) is the coordinate of a point in the training sample picture;
inputting the palm transform picture Tk(I), the feature map Fk and the heat map Hk of stage k into the convolutional neural network layer to obtain the key point correction value ΔSk of stage k;
substituting the key point correction value ΔSk of stage k into the following formula to calculate the palm key point predicted value Sk of stage k:

Sk = Tk⁻¹(Tk(Sk−1) + ΔSk)

wherein Sk−1 is the key point predicted value of stage k−1, and Tk and Tk⁻¹ are the transform matrix and inverse matrix of stage k.
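The stage-k computation can be illustrated numerically. The sketch below assumes a 2x2 matrix for Tk, assumes the heat map form H(x, y) = 1/(1 + min_i ||(x, y) − Si||), which is a common formulation for landmark heat maps of this kind, and substitutes a constant correction value for the convolutional layer's output; every concrete value is an illustrative assumption:

```python
import numpy as np

def heat_map(landmarks, height, width):
    """Heat map H(x, y) = 1 / (1 + min_i ||(x, y) - S_i||) over an image grid."""
    ys, xs = np.mgrid[0:height, 0:width]
    grid = np.stack([xs, ys], axis=-1).astype(float)   # (H, W, 2), (x, y) order
    dists = np.linalg.norm(grid[:, :, None, :] - landmarks[None, None, :, :],
                           axis=-1)                    # distance to each landmark
    return 1.0 / (1.0 + dists.min(axis=-1))

def stage_update(S_prev, T_k, delta_S):
    """S_k = T_k^{-1}(T_k(S_{k-1}) + delta_S_k), key points stored as row vectors."""
    transformed = S_prev @ T_k.T + delta_S             # T_k(S_{k-1}) plus correction
    return transformed @ np.linalg.inv(T_k).T          # map back with T_k^{-1}

# Illustrative values: three key points, a 90-degree rotation as the stage
# transform, and a constant correction standing in for the CNN output.
S_prev = np.array([[2.0, 3.0], [5.0, 1.0], [4.0, 4.0]])
T_k = np.array([[0.0, -1.0], [1.0, 0.0]])
delta_S = np.ones_like(S_prev)

H_k = heat_map(S_prev, height=8, width=8)
S_k = stage_update(S_prev, T_k, delta_S)
```

Because the correction is applied in the transformed frame and then mapped back, a rotation for Tk changes the direction in which a given correction moves each key point.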
4. The palmprint recognition method according to claim 1, wherein the step of determining the identity number corresponding to the palm picture according to the M feature map matching values comprises the following steps:
comparing the M feature map matching values with a preset threshold value;
when L of the M feature map matching values are greater than or equal to the preset threshold value, L being a positive integer greater than 0, determining that the identity number annotated on the archived feature map corresponding to the maximum of the L values is the identity number of the palm picture;
when all of the M feature map matching values are less than the preset threshold value, returning recognition failure information.
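The threshold logic above reduces to a small decision function. A minimal sketch, assuming matching values in [0, 1] and string identity numbers (both assumptions chosen for illustration):

```python
def identify(matching_values, identity_numbers, threshold):
    """Return the identity number of the best-matching archived feature map,
    or None (recognition failure) when no value reaches the threshold."""
    candidates = [(value, ident)
                  for value, ident in zip(matching_values, identity_numbers)
                  if value >= threshold]
    if not candidates:
        return None                     # all values below threshold
    return max(candidates)[1]           # identity of the largest qualifying value

# Example: only the second and third values pass the 0.6 threshold.
result = identify([0.3, 0.9, 0.7], ["A", "B", "C"], threshold=0.6)
```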
5. The palmprint recognition method according to claim 1, wherein the step of determining the identity number corresponding to the palm picture according to the M feature map matching values comprises the following step:
comparing the magnitudes of the M feature map matching values, and determining that the identity number annotated on the archived feature map corresponding to the maximum of the M feature map matching values is the identity number of the palm picture.
6. The palmprint recognition method according to claim 1, wherein each archived feature map in the preset template library is associated with corresponding key point auxiliary information, and the step of determining the identity number corresponding to the palm picture according to the M feature map matching values comprises the following steps:
comparing the magnitudes of the M feature map matching values, and comparing the identity numbers annotated on the archived feature maps corresponding to the R largest of the M feature map matching values, wherein R is a positive integer greater than 0 and less than M;
when the identity numbers annotated on the archived feature maps corresponding to the R values are identical, determining that this identity number is the identity number of the palm picture;
when the identity numbers annotated on the archived feature maps corresponding to the R values differ, calculating the matching degree between the key points of the palm picture and the key point auxiliary information of the archived feature maps corresponding to the R values, to obtain R key point matching values;
according to the R values and the corresponding R key point matching values, calculating the comprehensive matching degree between the palm picture and the archived feature maps corresponding to the R values, wherein the comprehensive matching degree is calculated according to the following formula:

P(v) = aPf(v) + bPs(v)

wherein P(v) is the comprehensive matching degree, Pf(v) is the feature map matching value, Ps(v) is the key point matching value, a and b are preset weights, and v is a positive integer from 1 to R;
determining that the identity number of the palm picture is the identity number annotated on the archived feature map corresponding to the maximum of the comprehensive matching degrees.
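The weighted fusion P(v) = aPf(v) + bPs(v) can be sketched directly; the weights a and b below are arbitrary example values, not values taken from the patent:

```python
def comprehensive_match(feature_values, keypoint_values, identities, a=0.7, b=0.3):
    """Compute P(v) = a*Pf(v) + b*Ps(v) for each candidate and return the
    identity number with the largest comprehensive matching degree."""
    scores = [a * pf + b * ps
              for pf, ps in zip(feature_values, keypoint_values)]
    return identities[scores.index(max(scores))]

# Example: candidate "A" wins on the feature map alone, but the key point
# matching value tips the comprehensive score toward "B".
best = comprehensive_match([0.8, 0.75], [0.2, 0.9], ["A", "B"])
```

Tuning a and b trades off how much the feature map similarity versus the key point geometry drives the final decision.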
7. The palmprint recognition method according to claim 6, wherein the key point auxiliary information of an archived feature map comprises the coordinates of at least 3 key points of the archived feature map, and the step of calculating the matching degree between the key points of the palm picture and the key point auxiliary information of the archived feature maps corresponding to the R values, to obtain R key point matching values, comprises the following steps:
calculating the key point distance ratios of the archived feature map according to the coordinates of the key points of the archived feature map;
calculating the key point distance ratios of the palm picture according to the key points of the palm picture;
calculating the distance-ratio similarity according to the key point distance ratios of the archived feature map and those of the palm picture, to obtain the key point matching value between the palm picture and the archived feature map.
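The distance-ratio comparison can be sketched with pairwise distances normalized by one reference pair, which makes the signature scale invariant. The similarity function 1/(1 + error) is an illustrative assumption, since the claim does not fix a formula:

```python
import math
from itertools import combinations

def distance_ratios(points):
    """Pairwise key point distances divided by the first pair's distance,
    yielding a scale-invariant signature of the key point layout."""
    d = [math.dist(p, q) for p, q in combinations(points, 2)]
    return [x / d[0] for x in d]

def ratio_similarity(ratios_a, ratios_b):
    """Similarity in (0, 1]: equals 1 when the distance ratios coincide."""
    err = sum(abs(x - y) for x, y in zip(ratios_a, ratios_b))
    return 1.0 / (1.0 + err)

# A palm picture and an archived feature map with the same key point layout
# at twice the scale produce identical ratios and a perfect score.
palm = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
archived = [(0.0, 0.0), (8.0, 0.0), (0.0, 6.0)]
score = ratio_similarity(distance_ratios(palm), distance_ratios(archived))
```

Because only ratios are compared, a uniformly scaled palm (e.g. captured at a different distance from the camera) still matches its archived counterpart.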
8. A palmprint recognition device, comprising:
an acquisition module, configured to acquire a palm picture to be recognized;
a processing module, configured to input the palm picture into a pre-trained deep matching neural network model and obtain the palm key points and the feature map output by the deep matching neural network model in response to the palm picture, wherein there are N palm key points, N being a positive integer greater than 1;
a reduction module, configured to reduce the feature map according to the palm key points to obtain a reduced feature map;
a calculation module, configured to calculate the matching degree between the reduced feature map and each archived feature map in a preset template library, wherein the preset template library contains M archived feature maps annotated with identity numbers, M being a positive integer greater than 1, to obtain M feature map matching values;
an execution module, configured to determine the identity number corresponding to the palm picture according to the M feature map matching values.
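The five modules of this device claim can be sketched as an injected-callable pipeline; every callable below is a stub standing in for the patent's component, purely to show the data flow between modules (all names and dummy values are assumptions):

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

@dataclass
class PalmprintDevice:
    """Module layout of the device claim, one callable per module."""
    acquire: Callable[[], list]                     # acquisition module
    model: Callable[[list], Tuple[list, list]]      # processing module
    reduce: Callable[[list, list], list]            # reduction module
    match: Callable[[list], List[float]]            # calculation module
    decide: Callable[[List[float]], Optional[str]]  # execution module

    def recognize(self) -> Optional[str]:
        picture = self.acquire()
        key_points, feature_map = self.model(picture)
        reduced = self.reduce(feature_map, key_points)
        values = self.match(reduced)                # M feature map matching values
        return self.decide(values)                  # identity number, or None

# Dummy wiring: stubs only, to exercise the module chain end to end.
device = PalmprintDevice(
    acquire=lambda: [[0.1, 0.2], [0.3, 0.4]],
    model=lambda img: ([(1.0, 1.0)], img),
    reduce=lambda fmap, kps: fmap[:1],
    match=lambda fmap: [0.2, 0.9, 0.5],
    decide=lambda vals: ("ID-%d" % vals.index(max(vals)))
                        if max(vals) >= 0.6 else None,
)
ident = device.recognize()
```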
9. A computer device comprising a memory and a processor, wherein computer-readable instructions are stored in the memory, and the computer-readable instructions, when executed by the processor, cause the processor to perform the steps of the palmprint recognition method according to any one of claims 1 to 7.
10. A computer-readable storage medium storing computer-readable instructions, wherein the computer-readable instructions, when executed by a processor, implement the steps of the palmprint recognition method according to any one of claims 1 to 7.
CN201910401000.5A 2019-05-15 2019-05-15 Palmprint recognition method, palmprint recognition device, computer equipment and storage medium Active CN110298233B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910401000.5A CN110298233B (en) 2019-05-15 2019-05-15 Palmprint recognition method, palmprint recognition device, computer equipment and storage medium


Publications (2)

Publication Number Publication Date
CN110298233A true CN110298233A (en) 2019-10-01
CN110298233B CN110298233B (en) 2024-04-09

Family

ID=68026836

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910401000.5A Active CN110298233B (en) 2019-05-15 2019-05-15 Palmprint recognition method, palmprint recognition device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110298233B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111507279A * 2020-04-21 2020-08-07 北京智能工场科技有限公司 Palm print recognition method based on UNet++ network
CN111523402A (en) * 2020-04-01 2020-08-11 车智互联(北京)科技有限公司 Video processing method, mobile terminal and readable storage medium
CN112132099A (en) * 2020-09-30 2020-12-25 腾讯科技(深圳)有限公司 Identity recognition method, palm print key point detection model training method and device

Citations (8)

Publication number Priority date Publication date Assignee Title
US20150253863A1 (en) * 2014-03-06 2015-09-10 Avago Technologies General Ip (Singapore) Pte. Ltd. Image Processor Comprising Gesture Recognition System with Static Hand Pose Recognition Based on First and Second Sets of Features
CN105654035A (en) * 2015-12-21 2016-06-08 湖南拓视觉信息技术有限公司 Three-dimensional face recognition method and data processing device applying three-dimensional face recognition method
US20170091595A1 (en) * 2015-09-29 2017-03-30 Huami Inc. Method, apparatus and system for biometric identification
CN106603563A (en) * 2016-12-30 2017-04-26 厦门市美亚柏科信息股份有限公司 Information safety realization method and system based on biometric features identification
CN107341473A (en) * 2017-07-04 2017-11-10 深圳市利众信息科技有限公司 Palm characteristic recognition method, palm characteristic identificating equipment and storage medium
CN108960125A * 2018-06-29 2018-12-07 河北工业大学 Three-dimensional palmprint recognition method
CN109259748A * 2018-08-17 2019-01-25 西安电子科技大学 System and method for extracting a heart rate signal from face video processed on a mobile phone
CN109345553A * 2018-08-31 2019-02-15 厦门中控智慧信息技术有限公司 Palm and palm key point detection method, apparatus and terminal device


Cited By (4)

Publication number Priority date Publication date Assignee Title
CN111523402A (en) * 2020-04-01 2020-08-11 车智互联(北京)科技有限公司 Video processing method, mobile terminal and readable storage medium
CN111523402B (en) * 2020-04-01 2023-12-12 车智互联(北京)科技有限公司 Video processing method, mobile terminal and readable storage medium
CN111507279A * 2020-04-21 2020-08-07 北京智能工场科技有限公司 Palm print recognition method based on UNet++ network
CN112132099A (en) * 2020-09-30 2020-12-25 腾讯科技(深圳)有限公司 Identity recognition method, palm print key point detection model training method and device

Also Published As

Publication number Publication date
CN110298233B (en) 2024-04-09

Similar Documents

Publication Publication Date Title
CN109949255B (en) Image reconstruction method and device
CN108986140B (en) Target scale self-adaptive tracking method based on correlation filtering and color detection
WO2021022521A1 (en) Method for processing data, and method and device for training neural network model
CN111191622A (en) Posture recognition method and system based on thermodynamic diagram and offset vector and storage medium
CN110276780A Multi-object tracking method, device, electronic equipment and storage medium
CN110298233A (en) Palm grain identification method, device, computer equipment and storage medium
CN108898168A (en) The compression method and system of convolutional neural networks model for target detection
CN110287775B (en) Palm image clipping method, palm image clipping device, computer equipment and storage medium
CN110084221A Serialized face key point detection method with intermediate supervision based on deep learning
CN110765882B (en) Video tag determination method, device, server and storage medium
CN111507184B (en) Human body posture detection method based on parallel cavity convolution and body structure constraint
CN110751039A (en) Multi-view 3D human body posture estimation method and related device
CN111126249A (en) Pedestrian re-identification method and device combining big data and Bayes
CN108197669A (en) The feature training method and device of convolutional neural networks
CN110414593A (en) Image processing method and device, processor, electronic equipment and storage medium
CN112800882A (en) Mask face posture classification method based on weighted double-flow residual error network
CN116704615A (en) Information processing method and device, computer equipment and computer readable storage medium
CN112862095B (en) Self-distillation learning method and device based on feature analysis and readable storage medium
CN114049491A (en) Fingerprint segmentation model training method, fingerprint segmentation device, fingerprint segmentation equipment and fingerprint segmentation medium
CN117373064A (en) Human body posture estimation method based on self-adaptive cross-dimension weighting, computer equipment and storage medium
CN116563588A (en) Image clustering method and device, electronic equipment and storage medium
CN110069647A (en) Image tag denoising method, device, equipment and computer readable storage medium
US20230196095A1 (en) Pure integer quantization method for lightweight neural network (lnn)
CN115830705A (en) Human body action recognition method, system and equipment based on WiFi channel state information imaging and readable storage medium
CN113408539B (en) Data identification method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant