CN104484803A - Mobile phone payment method employing three-dimensional human face recognition based on neural network - Google Patents

Mobile phone payment method employing three-dimensional human face recognition based on neural network

Info

Publication number
CN104484803A
CN104484803A CN201410677322.XA
Authority
CN
China
Prior art keywords
neural network
face
dimensional
human face
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410677322.XA
Other languages
Chinese (zh)
Inventor
张会林
孙利华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Fufeng Technology Co Ltd
Original Assignee
Suzhou Fufeng Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Fufeng Technology Co Ltd filed Critical Suzhou Fufeng Technology Co Ltd
Priority to CN201410677322.XA priority Critical patent/CN104484803A/en
Publication of CN104484803A publication Critical patent/CN104484803A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/38Payment protocols; Details thereof
    • G06Q20/40Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
    • G06Q20/401Transaction verification
    • G06Q20/4014Identity check for transactions
    • G06Q20/40145Biometric identity checks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/30Payment architectures, schemes or protocols characterised by the use of specific devices or networks
    • G06Q20/32Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
    • G06Q20/322Aspects of commerce using mobile devices [M-devices]
    • G06Q20/3227Aspects of commerce using mobile devices [M-devices] using secure elements embedded in M-devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/166Detection; Localisation; Normalisation using acquisition arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Security & Cryptography (AREA)
  • Finance (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a mobile phone payment method that performs three-dimensional face recognition with a neural network. The method comprises the following steps: S01, obtain face recognition standard samples and form training samples; S02, determine the parameters of a BP (back-propagation) neural network model; S03, capture a face image with the phone's camera; S04, project the three-dimensional image into a two-dimensional space; S05, preprocess the face image obtained in step S03, filter out noise, and binarize the image; S06, extract face image features; S07, obtain the sample parameters of the face image using a fuzzy logic compensation method; S08, compare and match the sample parameters obtained in step S07 against the standard sample parameters. Because three-dimensional face matching is carried out by a neural network, the method offers high security; the preprocessing facilitates fast matching of the face image against the standard sample parameters, so the method responds quickly, the algorithm is simple, matching is computationally efficient, and no complex neural network computation is required.

Description

Mobile phone payment method based on neural-network three-dimensional face recognition
Technical field
The invention belongs to the field of intelligent payment technology, and in particular relates to a mobile phone payment method based on three-dimensional face recognition with a neural network.
Background art
At present, face recognition is gradually entering the security field for user identity verification in mobile payment and mobile wallets. Because mobile payment places high demands on security, two-dimensional face recognition cannot meet them, and three-dimensional face recognition must be used. In prior-art mobile phone payment methods with three-dimensional face recognition, the recognition is based on mathematical prior probabilities: the computation is complex, memory consumption is large, the recognition response is slow, and harsh requirements are imposed on shooting angle and illumination intensity, so the methods cannot be applied in practice.
Summary of the invention
In order to solve the problems of the prior art, the invention provides a mobile phone payment method based on neural-network three-dimensional face recognition. Sample training is carried out with a neural network, so the computation is simple and parameter configuration is fast, meeting the efficiency requirements of mobile payment; three-dimensional face matching is carried out with the neural network, so security is high.
The technical solution adopted by the present invention is as follows:
A mobile phone payment method based on neural-network three-dimensional face recognition, comprising the following steps:
S01: obtain face recognition standard samples, comprising a frontal face, a left side face and a right side face, and store the face recognition standard sample parameters to form training samples;
S02: determine the parameters of a BP neural network model. The BP neural network model comprises an input layer, a hidden layer and an output layer; its parameters comprise the input layer neurons, the hidden layer neurons and the output layer neurons. The input layer receives the standard sample parameters; the hidden layer neurons are the nodes of the hidden layer, whose number is adjusted during training according to the training accuracy; the output layer neuron outputs the matching degree between the face image acquired in real time and the standard sample parameters;
S03: acquire a face image with the mobile phone camera;
S04: project the three-dimensional image into a two-dimensional space using a nonlinear color transformation;
S05: preprocess the face image obtained in step S03, filter out noise, and binarize the image;
S06: extract face image features;
S07: obtain the sample parameters of the face image from the features extracted in step S06, using a fuzzy logic compensation method;
S08: compare and match the sample parameters obtained in step S07 against the standard sample parameters stored in step S01. When the matching degree exceeds the matching threshold, the mobile payment system is entered; otherwise the payment fails.
The face recognition standard sample parameters of step S01 comprise the aspect ratio of the face contour and the position coordinates and corresponding length and width parameters of the left and right eyes, the nose and the mouth.
Step S04 specifically comprises: projecting the color value of each pixel from the three-dimensional color space into a two-dimensional subspace, in which the pixels representing skin color cluster together.
Step S05 specifically comprises the following steps:
(501) sort the brightness of all pixels in the image from high to low;
(502) let the maximum brightness be T and the preset reference brightness be k·T, where k is a percentage of the maximum brightness with a value range of 0 to 100%;
(503) if the ratio of the number of pixels whose brightness exceeds the preset reference brightness to the total number of pixels in the image reaches a preset ratio limit, adjust the preset reference brightness to be the reference white, whose gray value is 255, and rescale the gray values of all other pixels in the image by the ratio between the preset reference brightness and the reference white.
Step S06 specifically comprises: dividing the binarized image obtained in step S05 into grid blocks and extracting four classes of features, namely the normalized gray value, the block mean, the block standard deviation, and the absolute value of the gray difference between adjacent blocks. From these four features, the aspect ratio of the face contour and the position coordinates and corresponding length and width parameters of the left and right eyes, the nose and the mouth are obtained.
Step S06 further comprises self-learning recognition of the face image: images with different facial expressions are standardized by the sample training method.
The sample parameters of the face image in step S07 comprise the aspect ratio of the face contour and the position coordinates and corresponding length and width parameters of the left and right eyes, the nose and the mouth.
Compared with the prior art, the beneficial effects of the invention include:
Three-dimensional face matching is carried out with a neural network, so security is high. At the same time, illumination preprocessing and binarization are applied to the acquired face image, which facilitates fast matching of the face image against the standard sample parameters; the response efficiency is high, the algorithm is simple, no complex neural network computation is required, and the matching computation is efficient.
Brief description of the drawings
Fig. 1 is a schematic flow diagram of the mobile phone payment method based on neural-network three-dimensional face recognition of the present invention.
Detailed description
The invention is further described below with reference to the accompanying drawing.
As shown in Fig. 1, a mobile phone payment method based on neural-network three-dimensional face recognition comprises the following steps:
S01: obtain face recognition standard samples, comprising a frontal face, a left side face and a right side face, and store the face recognition standard sample parameters to form training samples. The standard sample parameters comprise the aspect ratio of the face contour and the position coordinates and corresponding length and width parameters of the left and right eyes, the nose and the mouth.
S02: determine the parameters of the BP neural network model. The BP neural network model comprises an input layer, a hidden layer and an output layer; its parameters comprise the input layer neurons, the hidden layer neurons and the output layer neurons. The input layer receives the standard sample parameters; the hidden layer neurons are the nodes of the hidden layer, whose number is adjusted during training according to the training accuracy; the output layer neuron outputs the matching degree between the face image acquired in real time and the standard sample parameters.
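The BP model described in step S02 can be sketched as a small fully connected network. Everything concrete below — the layer sizes, the sigmoid activations, the learning rate, and the `BPNetwork` class itself — is an illustrative assumption; the patent only fixes the roles of the three layers (standard sample parameters in, matching degree out, hidden node count tuned to the training accuracy).

```python
import numpy as np

# Illustrative layer sizes; the patent does not specify them.
N_INPUT = 12    # e.g. coordinates and width/height of eyes, nose, mouth
N_HIDDEN = 8    # hidden nodes, adjusted during training per the patent
N_OUTPUT = 1    # matching degree in [0, 1]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class BPNetwork:
    def __init__(self, n_in=N_INPUT, n_hidden=N_HIDDEN, n_out=N_OUTPUT, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(scale=0.5, size=(n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, x):
        self.h = sigmoid(x @ self.w1 + self.b1)
        return sigmoid(self.h @ self.w2 + self.b2)

    def train_step(self, x, y, lr=0.5):
        # One gradient-descent step on squared error (the classic BP update).
        out = self.forward(x)
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ self.w2.T) * self.h * (1 - self.h)
        self.w2 -= lr * np.outer(self.h, d_out)
        self.b2 -= lr * d_out
        self.w1 -= lr * np.outer(x, d_h)
        self.b1 -= lr * d_h
        return float(((out - y) ** 2).sum())
```

In use, one training sample would be a vector of standard sample parameters with target matching degree 1.0, and the hidden width would be grown or shrunk until the training error meets the required accuracy, as the patent describes.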
S03: acquire a face image with the mobile phone camera.
S04: project the three-dimensional image into a two-dimensional space using a nonlinear color transformation. Specifically, the color value of each pixel is projected from the three-dimensional color space into a two-dimensional subspace, in which the pixels representing skin color cluster together.
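The patent does not name the nonlinear color transformation; one common projection that fits the description (each pixel's 3-D color value mapped to a 2-D subspace in which skin pixels cluster) is normalized rg chromaticity. The `skin_mask` bounds below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def to_rg_chromaticity(rgb_image):
    """Project an HxWx3 RGB image into the 2-D rg chromaticity subspace:
    (R, G, B) -> (r, g) = (R/(R+G+B), G/(R+G+B)), a nonlinear map."""
    rgb = rgb_image.astype(np.float64)
    s = rgb.sum(axis=2, keepdims=True)
    s[s == 0] = 1.0               # avoid division by zero on black pixels
    chroma = rgb / s
    return chroma[..., 0], chroma[..., 1]   # r and g planes

def skin_mask(rgb_image, r_range=(0.36, 0.50), g_range=(0.26, 0.36)):
    """Rough skin detector: threshold a box in the rg plane (bounds assumed)."""
    r, g = to_rg_chromaticity(rgb_image)
    return (r >= r_range[0]) & (r <= r_range[1]) & \
           (g >= g_range[0]) & (g <= g_range[1])
```

Under this projection, overall brightness divides out, which is why skin tones gather into a compact region of the 2-D plane.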
S05: preprocess the face image obtained in step S03, filter out noise, and binarize the image.
Step S05 specifically comprises the following steps:
(501) sort the brightness of all pixels in the image from high to low;
(502) let the maximum brightness be T and the preset reference brightness be k·T, where k is a percentage of the maximum brightness with a value range of 0 to 100%;
(503) if the ratio of the number of pixels whose brightness exceeds the preset reference brightness to the total number of pixels in the image reaches a preset ratio limit, adjust the preset reference brightness to be the reference white, whose gray value is 255, and rescale the gray values of all other pixels in the image by the ratio between the preset reference brightness and the reference white.
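A minimal reading of sub-steps (501)–(503) can be sketched as follows. The symbols dropped from the translation are reconstructed as a fraction `k` of the maximum brightness; the `ratio_limit` value and the fixed binarization threshold are assumptions.

```python
import numpy as np

def normalise_brightness(gray, k=0.95, ratio_limit=0.01):
    """gray: 2-D array of pixel brightness. Returns a rescaled uint8 image.
    Implements the reference-white adjustment of sub-steps (502)-(503)."""
    gray = gray.astype(np.float64)
    t_max = gray.max()                   # maximum brightness (T)
    ref = k * t_max                      # preset reference brightness (k*T)
    frac_above = (gray > ref).mean()     # share of pixels brighter than ref
    if frac_above >= ratio_limit:        # (503): adopt ref as reference white
        scale = 255.0 / ref if ref > 0 else 1.0
        gray = np.clip(gray * scale, 0, 255)
    return gray.astype(np.uint8)

def binarise(gray, threshold=128):
    """Simple fixed-threshold binarisation applied before feature extraction."""
    return (gray >= threshold).astype(np.uint8)
```

The effect is that a small bright fraction of the image is promoted to reference white (gray value 255) and every other pixel is rescaled by the same factor, which evens out illumination before binarization.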
S06: extract face image features.
Step S06 specifically comprises: dividing the binarized image obtained in step S05 into grid blocks and extracting four classes of features, namely the normalized gray value, the block mean, the block standard deviation, and the absolute value of the gray difference between adjacent blocks. From these four features, the aspect ratio of the face contour and the position coordinates and corresponding length and width parameters of the left and right eyes, the nose and the mouth are obtained.
Step S06 further comprises self-learning recognition of the face image: images with different facial expressions are standardized by the sample training method.
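The grid-blocking feature extraction of step S06 could look like the following sketch. The grid size and the pairing used for the adjacent gray difference are assumptions, since the patent does not fix them.

```python
import numpy as np

def block_features(image, grid=(4, 4)):
    """Split a 2-D image into grid blocks and compute the four feature
    classes named in step S06: normalized gray value, block mean, block
    standard deviation, and absolute adjacent-block gray difference."""
    h, w = image.shape
    gh, gw = grid
    bh, bw = h // gh, w // gw
    means = np.empty(grid)
    stds = np.empty(grid)
    for i in range(gh):
        for j in range(gw):
            block = image[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw].astype(np.float64)
            means[i, j] = block.mean()      # block mean
            stds[i, j] = block.std()        # block standard deviation
    norm_gray = means / 255.0               # normalized gray value per block
    # absolute gray difference between horizontally adjacent blocks (assumed pairing)
    adj_diff = np.abs(np.diff(means, axis=1))
    return norm_gray, means, stds, adj_diff
```

Blocks sitting on strong light/dark boundaries (eyes, nostrils, mouth in the binarized image) stand out through the standard deviation and the adjacent-difference maps, which is what lets positions and extents of the facial features be localized.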
S07: obtain the sample parameters of the face image from the features extracted in step S06, using a fuzzy logic compensation method. As in step S01, the sample parameters comprise the aspect ratio of the face contour and the position coordinates and corresponding length and width parameters of the left and right eyes, the nose and the mouth.
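The patent names fuzzy logic compensation without detail (its non-patent citation is a control-theory paper on fuzzy-logic-compensated PI control). One illustrative reading, not taken from the patent: pull each measured parameter toward its standard value with a weight given by a triangular fuzzy membership of the deviation, so that small lighting or pose errors are compensated while large genuine differences pass through unchanged. The `width` and `gain` parameters below are hypothetical.

```python
def tri_membership(x, width):
    """Triangular membership centred at 0: 1 at x = 0, 0 beyond |x| >= width."""
    return max(0.0, 1.0 - abs(x) / width)

def compensate(measured, standard, width=10.0, gain=0.5):
    """Fuzzy-weighted correction of measured parameters toward the standard."""
    out = []
    for m, s in zip(measured, standard):
        mu = tri_membership(m - s, width)    # how "small" the deviation is
        out.append(m + gain * mu * (s - m))  # pull small deviations toward s
    return out
```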
S08: compare and match the sample parameters obtained in step S07 against the standard sample parameters stored in step S01. When the matching degree exceeds the matching threshold, the mobile payment system is entered; otherwise the payment fails.
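Step S08's threshold comparison might be sketched as follows. The distance-based matching degree and the 0.9 threshold are assumptions; the patent specifies only that a matching degree above a threshold admits the payment.

```python
import math

def matching_degree(sample, standard):
    """Matching degree in [0, 1]: 1 at a perfect match, falling with the
    Euclidean distance between measured and stored parameter vectors."""
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(sample, standard)))
    norm = math.sqrt(sum(b ** 2 for b in standard)) or 1.0
    return max(0.0, 1.0 - d / norm)

def authorise_payment(sample, standard, threshold=0.9):
    """Returns True (enter the payment system) iff the match clears the threshold."""
    return matching_degree(sample, standard) > threshold
```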
The above is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make several improvements and modifications without departing from the principles of the invention, and these improvements and modifications should also be regarded as falling within the scope of protection of the invention.

Claims (7)

1. A mobile phone payment method based on neural-network three-dimensional face recognition, characterized by comprising the following steps:
S01: obtain face recognition standard samples, comprising a frontal face, a left side face and a right side face, and store the face recognition standard sample parameters to form training samples;
S02: determine the parameters of a BP neural network model: said BP neural network model comprises an input layer, a hidden layer and an output layer; said BP neural network model parameters comprise the input layer neurons, the hidden layer neurons and the output layer neurons; said input layer receives the standard sample parameters; said hidden layer neurons are the nodes of the hidden layer, and the number of said hidden layer nodes is adjusted during training according to the training accuracy; said output layer neuron outputs the matching degree between the face image acquired in real time and the standard sample parameters;
S03: acquire a face image with the mobile phone camera;
S04: project the three-dimensional image into a two-dimensional space using a nonlinear color transformation;
S05: preprocess the face image obtained in step S03, filter out noise, and binarize the image;
S06: extract face image features;
S07: obtain the sample parameters of said face image from the features extracted in step S06, using a fuzzy logic compensation method;
S08: compare and match the sample parameters obtained in step S07 against the standard sample parameters stored in step S01; when the matching degree exceeds the matching threshold, the mobile payment system is entered; otherwise the payment fails.
2. The mobile phone payment method based on neural-network three-dimensional face recognition according to claim 1, characterized in that the face recognition standard sample parameters in step S01 comprise the aspect ratio of the face contour and the position coordinates and corresponding length and width parameters of the left and right eyes, the nose and the mouth.
3. The mobile phone payment method based on neural-network three-dimensional face recognition according to claim 1, characterized in that step S04 specifically comprises: projecting the color value of each pixel from the three-dimensional color space into a two-dimensional subspace, in which the pixels representing skin color cluster together.
4. The mobile phone payment method based on neural-network three-dimensional face recognition according to claim 1, characterized in that step S05 specifically comprises the following steps:
sort the brightness of all pixels in the image from high to low;
let the maximum brightness be T and the preset reference brightness be k·T, where k is a percentage of the maximum brightness with a value range of 0 to 100%;
if the ratio of the number of pixels whose brightness exceeds the preset reference brightness to the total number of pixels in the image reaches a preset ratio limit, adjust said preset reference brightness to be the reference white, the gray value of said reference white being 255, and rescale the gray values of all other pixels in the image by the ratio between the preset reference brightness and said reference white.
5. The mobile phone payment method based on neural-network three-dimensional face recognition according to claim 1, characterized in that step S06 specifically comprises the following steps:
divide the binarized image obtained in step S05 into grid blocks and extract four classes of features, said four classes of features comprising the normalized gray value, the block mean, the block standard deviation, and the absolute value of the gray difference between adjacent blocks;
obtain, from the normalized gray value, block mean, block standard deviation and absolute adjacent gray difference, the aspect ratio of the face contour and the position coordinates and corresponding length and width parameters of the left and right eyes, the nose and the mouth.
6. The mobile phone payment method based on neural-network three-dimensional face recognition according to claim 1, characterized in that step S06 further comprises self-learning recognition of the face image: images with different facial expressions are standardized by the sample training method.
7. The mobile phone payment method based on neural-network three-dimensional face recognition according to claim 1, characterized in that the sample parameters of the face image in said step S07 comprise the aspect ratio of the face contour and the position coordinates and corresponding length and width parameters of the left and right eyes, the nose and the mouth.
CN201410677322.XA 2014-11-24 2014-11-24 Mobile phone payment method employing three-dimensional human face recognition based on neural network Pending CN104484803A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410677322.XA CN104484803A (en) 2014-11-24 2014-11-24 Mobile phone payment method employing three-dimensional human face recognition based on neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410677322.XA CN104484803A (en) 2014-11-24 2014-11-24 Mobile phone payment method employing three-dimensional human face recognition based on neural network

Publications (1)

Publication Number Publication Date
CN104484803A 2015-04-01

Family

ID=52759343

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410677322.XA Pending CN104484803A (en) 2014-11-24 2014-11-24 Mobile phone payment method employing three-dimensional human face recognition based on neural network

Country Status (1)

Country Link
CN (1) CN104484803A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105225328A * 2015-08-25 2016-01-06 浙江工业大学 Mobile terminal electronic voting method and system based on facial feature recognition
CN107346423A * 2017-06-30 2017-11-14 重庆科技学院 Face recognition method based on cellular neural network auto-associative memory
CN107423678A * 2017-05-27 2017-12-01 电子科技大学 Training method for a feature-extraction convolutional neural network and face recognition method
CN108257079A (en) * 2016-12-29 2018-07-06 北京国双科技有限公司 Graphic change method and device
CN108921926A * 2018-07-02 2018-11-30 广州云从信息科技有限公司 End-to-end three-dimensional face reconstruction method based on a single image
WO2019041660A1 (en) * 2017-08-31 2019-03-07 苏州科达科技股份有限公司 Face deblurring method and device
CN110706420A (en) * 2019-08-28 2020-01-17 北京联合大学 Intelligent express cabinet implementation method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1971630A * 2006-12-01 2007-05-30 浙江工业大学 Access control device and attendance checking tool based on face recognition technology
US20140214666A1 (en) * 2008-03-13 2014-07-31 Giftya Llc System and method for managing gifts
CN104013414A (en) * 2014-04-30 2014-09-03 南京车锐信息科技有限公司 Driver fatigue detecting system based on smart mobile phone

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
吴育文 et al.: "Face orientation recognition model based on BP network", Imaging Technology *
沈凌云 et al.: "A face recognition method based on artificial neural network", Chinese Journal of Liquid Crystals and Displays *
王万良 et al.: "Networked PI control system with fuzzy logic compensation and stability analysis", Control Theory & Applications *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105225328A * 2015-08-25 2016-01-06 浙江工业大学 Mobile terminal electronic voting method and system based on facial feature recognition
CN108257079A (en) * 2016-12-29 2018-07-06 北京国双科技有限公司 Graphic change method and device
CN107423678A * 2017-05-27 2017-12-01 电子科技大学 Training method for a feature-extraction convolutional neural network and face recognition method
CN107346423A * 2017-06-30 2017-11-14 重庆科技学院 Face recognition method based on cellular neural network auto-associative memory
WO2019041660A1 (en) * 2017-08-31 2019-03-07 苏州科达科技股份有限公司 Face deblurring method and device
CN108921926A * 2018-07-02 2018-11-30 广州云从信息科技有限公司 End-to-end three-dimensional face reconstruction method based on a single image
CN108921926B * 2018-07-02 2020-10-09 云从科技集团股份有限公司 End-to-end three-dimensional face reconstruction method based on a single image
CN110706420A (en) * 2019-08-28 2020-01-17 北京联合大学 Intelligent express cabinet implementation method

Similar Documents

Publication Publication Date Title
CN104484803A (en) Mobile phone payment method employing three-dimensional human face recognition based on neural network
CN106874871B (en) Living body face double-camera identification method and identification device
CN104484669A (en) Mobile phone payment method based on three-dimensional human face recognition
CN101359365B (en) Iris positioning method based on maximum between-class variance and gray scale information
CN104268583B (en) Pedestrian re-recognition method and system based on color area features
CN112487922B (en) Multi-mode human face living body detection method and system
CN108268850B (en) Big data processing method based on image
CN103985172A (en) An access control system based on three-dimensional face identification
CN104143091B (en) Based on the single sample face recognition method for improving mLBP
CN108369644B (en) Method for quantitatively detecting human face raised line, intelligent terminal and storage medium
CN104392220A (en) Three-dimensional face recognition airport security inspection method based on cloud server
CN107644191A (en) A kind of face identification method and system, terminal and server
CN103870808A (en) Finger vein identification method
CN104408780A (en) Face recognition attendance system
CN108323203A (en) A kind of method, apparatus and intelligent terminal quantitatively detecting face skin quality parameter
CN104484652A (en) Method for fingerprint recognition
CN104063686A (en) System and method for performing interactive diagnosis on crop leaf segment disease images
CN104599367A (en) Multi-user parallel access control recognition method based on three-dimensional face image recognition
CN112101260B (en) Method, device, equipment and storage medium for identifying safety belt of operator
CN111178130A (en) Face recognition method, system and readable storage medium based on deep learning
CN103984922A (en) Face identification method based on sparse representation and shape restriction
CN107862298B (en) Winking living body detection method based on infrared camera device
CN109509299A (en) A kind of automatic vending machine with recognition of face
CN110222647B (en) Face in-vivo detection method based on convolutional neural network
CN110084587A (en) A kind of service plate automatic settlement method based on edge context

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20150401