CN108053218A - A safe mobile payment method - Google Patents
A safe mobile payment method
- Publication number: CN108053218A
- Application number: CN201711468123.8A
- Authority
- CN
- China
- Prior art keywords
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 title claims abstract description 17
- 230000001815 facial effect Effects 0.000 claims abstract description 64
- 238000012795 verification Methods 0.000 claims abstract description 14
- 210000003462 vein Anatomy 0.000 claims description 40
- 230000010355 oscillation Effects 0.000 claims description 11
- 239000000284 extract Substances 0.000 claims description 7
- 210000005069 ears Anatomy 0.000 claims description 6
- 210000004709 eyebrow Anatomy 0.000 claims description 5
- 210000000162 simple eye Anatomy 0.000 claims description 5
- 210000001367 artery Anatomy 0.000 claims description 2
- 210000004209 hair Anatomy 0.000 abstract description 2
- 230000009471 action Effects 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 230000008901 benefit Effects 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 230000004069 differentiation Effects 0.000 description 1
- 210000000887 face Anatomy 0.000 description 1
- 210000001061 forehead Anatomy 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 239000003550 marker Substances 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 230000008569 process Effects 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/38—Payment protocols; Details thereof
- G06Q20/40—Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
- G06Q20/401—Transaction verification
- G06Q20/4014—Identity check for transactions
- G06Q20/40145—Biometric identity checks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/316—User authentication by observing the pattern of computer usage, e.g. typical user behaviour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L17/00—Speaker identification or verification techniques
- G10L17/06—Decision making techniques; Pattern matching strategies
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L17/00—Speaker identification or verification techniques
- G10L17/22—Interactive procedures; Man-machine interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/14—Vascular patterns
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Security & Cryptography (AREA)
- Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Acoustics & Sound (AREA)
- Software Systems (AREA)
- Accounting & Taxation (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Business, Economics & Management (AREA)
- Strategic Management (AREA)
- Finance (AREA)
- General Health & Medical Sciences (AREA)
- Social Psychology (AREA)
- Game Theory and Decision Science (AREA)
Abstract
The present invention relates to a safe mobile payment method for securing payments made from a mobile terminal. A facial-feature parameter database of the legitimate user is established to obtain a face recognition parameter for the current legitimate user. An audio recognition match parameter for the current legitimate user is obtained from the average amplitude values and power-consumption values of the audio data recorded while the user speaks. A tilt-angle parameter for payment is preset, and the sine value of the preset tilt angle serves as the terminal's legitimate numeric payment password. Within a preset verification period, both the operator's face image and audio collected on site are recognized, preventing an illegitimate person from hoping to pass the terminal's identity verification with only an image of the legitimate user's face. A database of the finger touch pressures applied while the terminal's legitimate user enters the payment password is built, and the legitimate user's finger-touch payment fluctuation factor is derived from it, ensuring mobile payment security.
Description
Technical field
The present invention relates to the field of mobile payment, and more particularly to a safe mobile payment method.
Background technology
With the gradual popularization of mobile terminals such as smartphones and tablet computers, mobile payment based on these terminals has come to be widely embraced. Carrying only a mobile terminal, people can complete shopping, ticket booking, and other payment operations without cash, which is very convenient and has ushered in a new era of mobile payment.
However, alongside this convenience, existing mobile payment methods face security risks. Because existing mobile payment only requires the operator to enter the payment password correctly on the mobile terminal, it cannot determine whether the input actually comes from the terminal's genuine legitimate user. Once a user loses the terminal, an illegitimate person may obtain the payment password through a back door and make payments with the lost terminal, causing property loss to the terminal's genuine legitimate user.
Content of the invention
The technical problem to be solved by the invention is to provide, in view of the above prior art, a safe mobile payment method.
The technical solution adopted by the present invention to solve the above problem is a safe mobile payment method for securing payments made from a mobile terminal, characterized by comprising the following steps 1 to 13:
Step 1: set a preset acquisition period and a preset acquisition frequency. Within the preset acquisition period, the mobile terminal captures face images of its legitimate user at the preset acquisition frequency, obtaining several face images of the current legitimate user, and collects infrared temperature data of the legitimate user's face at preset acquisition positions, obtaining the legitimate user's facial infrared temperature database.
The preset acquisition period is denoted T and the preset acquisition frequency f. The face image database is denoted JPG; the total number of face images in it is M, and its i-th face image is denoted JPG_i. The legitimate user's facial infrared temperature database is denoted Infrared; it likewise contains M facial infrared temperature data, the i-th of which is denoted infrared_i, with 1 ≤ i ≤ M.
Step 2: from the established face image database, the mobile terminal extracts the current legitimate user's facial-feature parameter set from each face image, builds the current legitimate user's facial-feature parameter database, and computes the current legitimate user's face recognition match parameter.
For each of the M face images of the current user, the facial-feature parameter set comprises the total face area value, eyebrow distance value, ear distance value, mouth area value, and single-eye area value in that face image, each recorded for face image JPG_i.
The current legitimate user's face recognition match parameter is denoted σ_Face.
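The σ_Face formula itself appears in the original only as an embedded image and did not survive extraction. A minimal sketch, under the assumption that the match parameter sums the five facial features after averaging each over the M enrolled images (an illustrative combination, not the patented formula):

```python
# Hypothetical sketch: the patent's sigma_Face formula is not reproduced in
# the text, so the combination below (sum of per-feature averages over the M
# enrolled images) is an assumption for illustration only.

def face_match_parameter(features):
    """features: one dict per face image with the five step-2 feature values."""
    keys = ("face_area", "eyebrow_dist", "ear_dist", "mouth_area", "eye_area")
    m = len(features)
    # Average each feature over all M images, then sum the five averages.
    return sum(sum(f[k] for f in features) / m for k in keys)

# Two illustrative enrolled face images of the legitimate user.
enrolled = [
    {"face_area": 210.0, "eyebrow_dist": 9.5, "ear_dist": 14.2,
     "mouth_area": 12.1, "eye_area": 4.8},
    {"face_area": 208.0, "eyebrow_dist": 9.7, "ear_dist": 14.0,
     "mouth_area": 12.3, "eye_area": 4.6},
]
sigma_face = face_match_parameter(enrolled)
```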
Step 3: within the preset acquisition period, the mobile terminal records its legitimate user's audio data at the preset acquisition frequency, obtains several segments of the current legitimate user's audio together with the legitimate user's power consumption for each segment, and establishes an audio database for the current legitimate user on the terminal.
The audio database is denoted Voice and contains N audio segments in total; its j-th segment is denoted Voice_j, with the average amplitude value and power consumption value of each Voice_j recorded, 1 ≤ j ≤ N.
Step 4: from the established audio database, the mobile terminal computes the current legitimate user's audio recognition match parameter, denoted σ_Voice.
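The σ_Voice formula is likewise an image in the original. A sketch under the assumption that the parameter combines each segment's average amplitude with its power-consumption value, averaged over the N segments (an assumed form, not the patented one):

```python
# Hypothetical sketch: sigma_Voice's formula did not survive extraction; here
# the audio recognition match parameter is assumed to average, over the N
# segments, each segment's mean amplitude times its power-consumption value.

def audio_match_parameter(segments):
    """segments: list of (avg_amplitude, power_consumption) pairs, one per Voice_j."""
    n = len(segments)
    return sum(a * p for a, p in segments) / n

sigma_voice = audio_match_parameter([(0.42, 1.8), (0.39, 2.0), (0.45, 1.7)])
```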
Step 5: a tilt-angle parameter for payment is preset on the mobile terminal, and the sine value corresponding to the preset tilt angle serves as the terminal's legitimate numeric payment password. The preset tilt angle is denoted θ_0, and its corresponding sine value is sin θ_0.
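The tilt-derived password of step 5 can be sketched as follows; the example angle of 30° and the matching tolerance are assumptions for illustration, since the patent only requires the detected tilt angle to equal θ_0:

```python
import math

# Sketch of step 5: the preset tilt angle theta_0 determines the numeric
# payment password sin(theta_0). The example angle and the tolerance below
# are assumptions; the patent states only that the angles must be equal.

THETA_0_DEG = 30.0
LEGAL_PASSWORD = math.sin(math.radians(THETA_0_DEG))

def password_from_tilt(measured_deg, tol_deg=0.5):
    """Return the sine-value password when the measured tilt matches theta_0."""
    if abs(measured_deg - THETA_0_DEG) <= tol_deg:
        return math.sin(math.radians(measured_deg))
    return None
```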
Step 6: the mobile terminal stores the legitimate user's face recognition match parameter, the legitimate user's audio recognition match parameter, and the legitimate numeric payment password. When the terminal detects that an operator has started the payment program to make a payment, it presents an identity verification interface to the current operator; only after the current operator's legitimate identity is verified does the terminal present the payment interface to the current operator, otherwise the terminal force-closes the payment program. The identity verification interface specifies a preset verification period within which the current operator must shoot a face image and record audio data.
Step 7: if, within the preset verification period, the mobile terminal judges that it has both received a face image shot by the current operator and recorded the current operator's audio data, it enters the identity verification stage for the current operator and proceeds to step 8. Otherwise, if within the preset verification period the terminal receives no face image shot by the current operator or records no audio data from the current operator, it force-closes the payment program.
Let the terminal receive M' face images shot by the current operator, the r-th of which is denoted JPG'_r, 1 ≤ r ≤ M'. Let the terminal record N' audio segments from the current operator, the k-th of which is denoted Voice'_k, with its average amplitude value and power consumption value recorded, 1 ≤ k ≤ N'.
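The liveness gate of steps 6-7 reduces to a simple check; the window length and argument names below are assumptions for illustration:

```python
# Sketch of steps 6-7: within the preset verification period the terminal must
# both receive a freshly shot face image and record audio from the current
# operator, otherwise the payment program is force-closed. The default window
# length and the argument names are illustrative assumptions.

def liveness_gate(face_images, audio_segments, elapsed_s, window_s=10.0):
    """True -> proceed to step 8; False -> force-close the payment program."""
    return (elapsed_s <= window_s
            and len(face_images) > 0
            and len(audio_segments) > 0)
```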
Step 8: within a preset payment-habit acquisition period, the mobile terminal records the touch pressures its legitimate user applies with a finger to the terminal screen while entering the payment password, builds the finger touch payment pressure database corresponding to the legitimate user's entry of the payment password, and derives the legitimate user's finger-touch payment fluctuation factor.
The legitimate user's finger touch payment pressure database is denoted S_user_Finger; it contains G payment touch pressure data, the g-th of which is recorded individually, 1 ≤ g ≤ G. The legitimate user's finger-touch payment fluctuation factor is denoted χ_user_Finger.
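The χ_user_Finger formula is also an image in the original. A sketch under the assumption that the fluctuation factor measures the spread of the G recorded pressures (standard deviation here, an assumed choice):

```python
# Hypothetical sketch: chi_user_Finger's formula did not survive extraction;
# the fluctuation factor is assumed here to be the standard deviation of the
# G recorded payment touch pressures around their mean.

def touch_fluctuation_factor(pressures):
    """pressures: the G touch-pressure samples recorded during password entry."""
    g = len(pressures)
    mean = sum(pressures) / g
    return (sum((p - mean) ** 2 for p in pressures) / g) ** 0.5

chi_user = touch_fluctuation_factor([0.50, 0.54, 0.48, 0.52])
```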
Step 9: from the facial infrared temperature database built for its legitimate user, the mobile terminal derives the legitimate user's facial infrared temperature oscillation index. The terminal also captures eye-frame vein images of its legitimate user at a preset acquisition period, establishes the legitimate user's eye-frame vein image database, and derives the legitimate user's eye-frame vein feature parameter.
The legitimate user's facial infrared temperature oscillation index is denoted ξ_Infrared, where infrared_i is the i-th facial infrared temperature datum in the database Infrared.
The legitimate user's eye-frame vein image database is denoted JPG_User_EyeBox, and the legitimate user's eye-frame vein feature parameter ψ_User_EyeBox. The database JPG_User_EyeBox contains Y images; its y-th eye-frame vein image is denoted jpg_User_EyeBox,y, with length value L_jpgUser_EyeBox,y and width value W_jpgUser_EyeBox,y.
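The ξ_Infrared formula is another lost image. A sketch under the assumption that the oscillation index measures how far the M temperature samples swing around their mean (mean absolute deviation here, an assumed choice):

```python
# Hypothetical sketch: xi_Infrared's formula did not survive extraction; the
# facial infrared temperature oscillation index is assumed here to be the
# mean absolute deviation of the M temperature samples from their mean.

def infrared_oscillation_index(temps):
    """temps: the M facial infrared temperature data infrared_1..infrared_M."""
    m = len(temps)
    mean = sum(temps) / m
    return sum(abs(t - mean) for t in temps) / m

xi_infrared = infrared_oscillation_index([36.4, 36.6, 36.5, 36.7])
```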
Step 10: the mobile terminal builds the payment touch pressure database of the current operator's finger touches on the terminal screen while entering the payment password, obtains the facial infrared temperature database and eye-frame vein image database of the current operator's face, and correspondingly computes the current operator's finger-touch payment fluctuation factor, facial infrared temperature oscillation index, and eye-frame vein feature parameter.
If the terminal judges that the current operator's finger-touch payment fluctuation factor, facial infrared temperature oscillation index, and eye-frame vein feature parameter each equal the legitimate user's corresponding value, it proceeds to step 11; otherwise the terminal force-closes the payment program.
Step 11: the mobile terminal extracts the current operator's face recognition match parameter and audio recognition match parameter, denoted σ'_Face and σ'_Voice respectively.
For the current operator's face image JPG'_r, the total face area value, eyebrow distance value, ear distance value, mouth area value, and single-eye area value are extracted. The current operator's k-th audio segment is denoted Voice'_k, with its average amplitude value and power consumption value recorded, 1 ≤ k ≤ N'.
Step 12: the mobile terminal matches the stored face recognition match parameter and audio recognition match parameter of the legitimate user against the corresponding parameters of the current operator, to judge the current operator's legitimate identity:
When the legitimate user's face recognition match parameter σ_Face equals the current operator's face recognition match parameter σ'_Face, and the legitimate user's audio recognition match parameter σ_Voice equals the current operator's audio recognition match parameter σ'_Voice, the terminal judges the current operator to be its legitimate user and proceeds to step 13; otherwise the terminal force-closes the current payment program.
Step 13: the mobile terminal detects its current tilt angle; when it judges the current tilt angle equal to the terminal's preset tilt angle, the terminal automatically takes the sine value corresponding to the preset tilt angle as the payment password, safely completing the payment operation.
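Steps 10 through 13 can be sketched end to end as a single authorization check; the exact-equality matching follows the patent's wording, while the parameter names and example values below are illustrative assumptions:

```python
import math

# End-to-end sketch of steps 10-13: every stored parameter of the legitimate
# user must equal the current operator's value before the tilt-derived sine
# password completes the payment. The dict keys and example values are
# illustrative assumptions, not taken from the patent.

CHECKS = ("sigma_face", "sigma_voice", "chi_finger", "xi_infrared", "psi_eyebox")

def authorize_payment(stored, current, tilt_deg, theta0_deg):
    if any(stored[k] != current[k] for k in CHECKS):
        return None                            # steps 10/12: force-close
    if tilt_deg != theta0_deg:
        return None                            # step 13: tilt must equal theta_0
    return math.sin(math.radians(theta0_deg))  # password entered automatically

user = {"sigma_face": 249.6, "sigma_voice": 0.767, "chi_finger": 0.022,
        "xi_infrared": 0.1, "psi_eyebox": 4878.0}
```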
Compared with the prior art, the advantages of the invention are as follows.
First, by establishing a facial-feature parameter database of the legitimate user and obtaining a face recognition parameter for the current legitimate user, the invention extracts biometric parameters such as total face area value, eyebrow distance value, ear distance value, mouth area value, and single-eye area value from the user's face images, realizing identity verification based on multiple biometric features and improving recognition of the user's legitimate identity.
Second, the mobile terminal of the invention establishes an audio database for the current legitimate user and obtains an audio recognition match parameter for the current legitimate user from the average amplitude values and power-consumption values of the audio data recorded while the user speaks, using it as an identifying feature parameter of the user's identity; this ensures that the terminal accepts payment operations only from its legitimate user and avoids the payment risk of an illegitimate user operating the terminal. In addition, presetting a tilt-angle parameter for payment and using the sine value of the preset tilt angle as the terminal's legitimate numeric payment password better guarantees that the payment action performed on the terminal is issued by the terminal's genuine legitimate user, securing the user's payment at its critical moment.
Third, recognizing both the operator's face image and the audio collected on site within the preset verification period prevents an illegitimate person from hoping to pass the terminal's identity verification with only an image of the legitimate user's face, better ensures that the terminal's payment operations accept only the legitimate user's control, and improves the security of mobile payment.
Finally, building the finger touch payment pressure database corresponding to the mobile terminal's legitimate user entering the payment password, and deriving the legitimate user's finger-touch payment fluctuation factor from it, lets the terminal distinguish the identity of whoever is currently entering the payment password by means of a certain number of collected finger touch pressures, ensuring mobile payment security.
Description of the drawings
Fig. 1 is a flow diagram of the safe mobile payment method in an embodiment of the invention.
Specific embodiment
The present invention is described in further detail below in conjunction with the embodiment shown in the accompanying drawing.
As shown in Fig. 1, the safe mobile payment method of this embodiment secures payments made from a mobile terminal and comprises the following steps 1 to 13.
Step 1: set a preset acquisition period and a preset acquisition frequency. Within the set preset acquisition period, the mobile terminal captures face images of its legitimate user at the preset acquisition frequency, obtaining several face images of the current legitimate user, and collects infrared temperature data of the legitimate user's face at preset acquisition positions, obtaining the legitimate user's facial infrared temperature database. For example, the preset acquisition positions may be major positions of the user's face such as the mouth, cheeks, or forehead.
The preset acquisition period is denoted T and the preset acquisition frequency f. The current legitimate user's face image database is denoted JPG; it contains M face images, the i-th of which is denoted JPG_i.
For the preset acquisition positions, the legitimate user's facial infrared temperature database is denoted Infrared; it likewise contains M facial infrared temperature data, the i-th of which is denoted infrared_i, 1 ≤ i ≤ M.
Step 2: from the established face image database, the mobile terminal extracts the current legitimate user's facial-feature parameter set from each face image, builds the current legitimate user's facial-feature parameter database, and computes the current legitimate user's face recognition match parameter.
Specifically, for each of the M face images of the current user, the facial-feature parameter set comprises the total face area value, eyebrow distance value, ear distance value, mouth area value, and single-eye area value in that face image, each recorded for face image JPG_i. The current legitimate user's face recognition match parameter is denoted σ_Face.
The invention uses the face as the characteristic by which different users' identities are distinguished. By extracting biometric parameters such as total face area value, eyebrow distance value, ear distance value, mouth area value, and single-eye area value from the user's face images, it realizes identity verification based on multiple biometric features, improves recognition of the user's legitimate identity, and permits payment operations on the terminal only by an operator who has passed multiple verifications, thereby ensuring safety when the user pays with the mobile terminal.
Step 3: within the preset acquisition period, the mobile terminal records its legitimate user's audio data at the preset acquisition frequency, obtains several segments of the current legitimate user's audio together with the legitimate user's power consumption for each segment, and establishes an audio database for the current legitimate user on the terminal.
The audio database is denoted Voice and contains N audio segments; its j-th segment is denoted Voice_j, with its average amplitude value and power consumption value recorded, 1 ≤ j ≤ N.
Step 4: from the established audio database, the mobile terminal computes the current legitimate user's audio recognition match parameter, denoted σ_Voice.
By establishing the recorded user audio data as a database for the current legitimate user, the mobile terminal of the invention obtains an audio recognition match parameter for the current legitimate user from the average amplitude values and power-consumption values of the audio recorded while the user speaks. Used as an identifying feature parameter of the user's identity, this further ensures that the terminal accepts payment operations only from its legitimate user and avoids the payment risk of an illegitimate user operating the terminal.
Step 5: a tilt-angle parameter for payment is preset on the mobile terminal, and the sine value corresponding to the preset tilt angle serves as the terminal's legitimate numeric payment password. In this embodiment, the preset tilt angle is denoted θ_0, and its corresponding sine value is sin θ_0.
Because every operator follows personal habits when making payments with their own mobile terminal, an operator will usually habitually hold the terminal in a certain posture to pay. Based on this characteristic, the invention presets a tilt-angle parameter for payment and uses the sine value sin θ_0 of the preset tilt angle θ_0 as the terminal's legitimate numeric payment password, which better guarantees that the payment action performed on the terminal is issued by the terminal's genuine legitimate user, securing the user's payment at its critical moment.
Step 6: the mobile terminal stores the legitimate user's face recognition match parameter, the legitimate user's audio recognition match parameter, and the legitimate numeric payment password. When the terminal detects that an operator has started the payment program to make a payment, it presents an identity verification interface to the current operator; only after the current operator's legitimate identity is verified does the terminal present the payment interface to the current operator. Otherwise, the mobile payment terminal force-closes the payment program.
The identity verification interface specifies a preset verification period within which the current operator must shoot a face image and record audio data. That is, within the preset verification period the terminal requires the current operator to let it capture the operator's face image and to make a sound, verifying that the user whose face image is captured is a person with vital signs rather than a photograph. This prevents an illegitimate person from hoping to pass the terminal's identity verification with only an image of the legitimate user's face, better ensures that the terminal's payment operations accept only the legitimate user's control, and improves the security of mobile payment.
Step 7: if, within the preset verification period, the mobile terminal judges that it has both received a face image shot by the current operator and recorded the current operator's audio data, this shows the user whose face image was captured to be a person with vital signs; the terminal then enters the identity verification stage for the current operator and proceeds to step 8. Otherwise, if within the preset verification period the terminal receives no face image shot by the current operator or records no audio data from the current operator, this shows the user whose face image was captured not to have vital signs, and the terminal force-closes the payment program.
Let the terminal receive M' face images shot by the current operator, the r-th of which is denoted JPG'_r, 1 ≤ r ≤ M'. Let the terminal record N' audio segments from the current operator; the k-th of these N' segments is denoted Voice'_k, with its average amplitude value and power consumption value recorded, 1 ≤ k ≤ N'.
Step 8: within a preset payment-habit acquisition period, the mobile terminal records the touch pressures its legitimate user applies with a finger to the terminal screen while entering the payment password, builds the finger touch payment pressure database corresponding to the legitimate user's entry of the payment password, and derives the legitimate user's finger-touch payment fluctuation factor.
The legitimate user's finger touch payment pressure database is denoted S_user_Finger; it contains G payment touch pressure data, 1 ≤ g ≤ G. The legitimate user's finger-touch payment fluctuation factor is denoted χ_user_Finger.
Because different users apply different touch pressures to the terminal screen when entering the payment password with a finger, step 8 of this embodiment builds the finger touch payment pressure database corresponding to the terminal's legitimate user entering the payment password and derives the legitimate user's finger-touch payment fluctuation factor, so that the identity of whoever is currently entering the payment password can be distinguished from a certain number of finger touch pressures collected by the terminal.
Moreover, even when the terminal's legitimate user merely touches the payment password input interface unintentionally, the derived finger-touch payment fluctuation factor of the legitimate user lets the invention judge the event at that time to be the legitimate user's misoperation, so as not to cause economic loss to the legitimate user.
Step 9, the mobile terminal obtains the facial infrared temperature fluctuation index of the legitimate user from the facial infrared temperature database it has built; the mobile terminal also captures eye-socket vein images of its legitimate user at a preset collection period, establishes the eye-socket vein image database of the legitimate user, and obtains the eye-socket vein feature parameter of the legitimate user; Wherein, the facial infrared temperature fluctuation index of the legitimate user is labeled ξ_Infrared, as given by the formula in step 9 of claim 1;
Wherein, infrared_i is the i-th facial infrared temperature datum in the facial infrared temperature database Infrared;
The eye-socket vein image database of the legitimate user is labeled JPG_{User_EyeBox}, and the eye-socket vein feature parameter of the legitimate user is labeled ψ_{User_EyeBox}; the number of images in the eye-socket vein image database JPG_{User_EyeBox} is Y;
The y-th eye-socket vein image in JPG_{User_EyeBox} is labeled jpg_{User_EyeBox,y}; its length value is labeled L_{jpg_{User_EyeBox,y}} and its width value is labeled W_{jpg_{User_EyeBox,y}}; 1 ≤ y ≤ Y;
Step 10, the mobile terminal builds the payment touch pressure database of the current operator entering the payment password with a finger on the mobile terminal screen, obtains the facial infrared temperature database and the eye-socket vein image database of the current operator, and correspondingly calculates the finger-touch payment fluctuation factor, facial infrared temperature fluctuation index, and eye-socket vein feature parameter of the current operator;
Wherein, the calculation procedures for the current operator's finger-touch payment fluctuation factor, facial infrared temperature fluctuation index, and eye-socket vein feature parameter are identical to the corresponding procedures for the legitimate user; it is only necessary to replace the legitimate user's parameters with the corresponding parameters of the current operator;
When the mobile terminal judges that the current operator's finger-touch payment fluctuation factor, facial infrared temperature fluctuation index, and eye-socket vein feature parameter are all correspondingly equal to those of the legitimate user, it proceeds to step 11; otherwise, the identity of the current operator is suspect and a payment risk may exist, so the mobile terminal forcibly closes the payment program;
Step 11, the mobile terminal separately extracts the face recognition match parameter and the audio recognition match parameter of the current operator; Wherein, the face recognition match parameter of the current operator is labeled σ'_Face and the audio recognition match parameter of the current operator is labeled σ'_Voice, both computed by the formulas in step 11 of claim 1;
Wherein, in the current operator's face image JPG'_r, the facial total area value is labeled S_{JPG'_r}, the eyebrow distance value D_{JPG'_r}, the ears distance value W_{JPG'_r}, the mouth area value S'_{JPG'_r}, and the single-eye area value S''_{JPG'_r};
Wherein, the k-th audio segment of the current operator is labeled Voice'_k; its average amplitude value is labeled A_{Voice'_k} and its power consumption value is E_{Voice'_k}, 1 ≤ k ≤ N';
Step 12, the mobile terminal compares the stored face recognition match parameter and audio recognition match parameter of the legitimate user with the face recognition match parameter and audio recognition match parameter of the current operator, respectively, to judge whether the current operator has a legitimate identity:
When the face recognition match parameter σ_Face of the legitimate user equals the face recognition match parameter σ'_Face of the current operator, and the audio recognition match parameter σ_Voice of the legitimate user equals the audio recognition match parameter σ'_Voice of the current operator, the mobile terminal judges that the current operator is its legitimate user; the operator's identity is then considered secure and trusted, and the method proceeds to step 13; otherwise, the mobile terminal judges that the current operator is not its legitimate user and a payment risk exists, so the mobile terminal forcibly closes the current payment program to protect the funds of its legitimate user;
Step 13, the mobile terminal detects its current tilt angle; when it judges that the current tilt angle equals the preset tilt angle, this indicates that the legitimate user of the mobile terminal has entered the correct payment password, and the mobile terminal automatically uses the sine value of the preset tilt angle as the payment password, so that the payment operation is completed securely.
Although preferred embodiments of the present invention have been described in detail above, it should be clearly understood that various modifications and variations will occur to those skilled in the art. Any modification, equivalent substitution, or improvement made within the spirit and principles of the present invention shall be included within its protection scope.
Claims (1)
1. A secure mobile payment method for implementing secure payment on a mobile terminal, characterized by comprising the following steps 1 to 13:
Step 1, a preset acquisition time period and a preset acquisition frequency are set; within the preset acquisition time period, the mobile terminal captures face images of its legitimate user at the preset acquisition frequency, obtaining several face images of the current legitimate user, and, within the same period, collects facial infrared temperature data of its legitimate user at the preset acquisition frequency, obtaining the facial infrared temperature database of the legitimate user;
Wherein, the preset acquisition time period is labeled T, the preset acquisition frequency is labeled f, and the face image database is labeled JPG; the total number of face images in the face image database JPG is M, and the i-th face image in JPG is labeled JPG_i; the facial infrared temperature database of the legitimate user is labeled Infrared; the total number of facial infrared temperature data in Infrared is likewise M, and the i-th facial infrared temperature datum in Infrared is labeled infrared_i; 1 ≤ i ≤ M;
Step 2, from the face image database it has established, the mobile terminal extracts the facial feature parameter set of the current legitimate user in each face image, builds the facial feature parameter database of the current legitimate user, and calculates the face recognition match parameter of the current legitimate user;
Wherein, for each of the M face images of the current user, the facial feature parameter set comprises the facial total area value, eyebrow distance value, ears distance value, mouth area value, and single-eye area value in the current face image; in face image JPG_i, the facial total area value is labeled S_{JPG_i}, the eyebrow distance value D_{JPG_i}, the ears distance value W_{JPG_i}, the mouth area value S'_{JPG_i}, and the single-eye area value S''_{JPG_i};
The face recognition match parameter of the current legitimate user is labeled σ_Face:
$$\sigma_{Face}=\frac{\sum_{i=1}^{M}\left|D_{JPG_{i}}^{2}-W_{JPG_{i}}^{2}\right|}{\sum_{i=1}^{M}\left|\frac{S'_{JPG_{i}}}{S_{JPG_{i}}}\right|+\sum_{i=1}^{M}\left|\frac{S''_{JPG_{i}}}{S_{JPG_{i}}}\right|}\cdot\frac{\sum_{i=1}^{M}\sqrt{\left(D_{JPG_{i}}\cdot W_{JPG_{i}}\right)-\frac{1}{M}\sum_{i=1}^{M}\left(D_{JPG_{i}}\cdot W_{JPG_{i}}\right)}}{\sum_{i=1}^{M}\sqrt{\left(S'_{JPG_{i}}\cdot S''_{JPG_{i}}\right)-\frac{1}{M}\sum_{i=1}^{M}\left(S'_{JPG_{i}}\cdot S''_{JPG_{i}}\right)}};$$
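As a plausibility check, the σ_Face formula above can be sketched in Python. The function name, the argument lists, and the sample numbers are illustrative assumptions, not part of the patent; note also that the formula takes square roots of deviations from a mean, which are negative for below-mean samples, so this sketch applies abs() under the root to keep the result real:

```python
import math

def sigma_face(D, W, S, S1, S2):
    # One entry per face image:
    # D: eyebrow distances, W: ears distances, S: facial total areas,
    # S1: mouth areas, S2: single-eye areas.
    M = len(D)
    num = sum(abs(D[i] ** 2 - W[i] ** 2) for i in range(M))
    den = (sum(abs(S1[i] / S[i]) for i in range(M))
           + sum(abs(S2[i] / S[i]) for i in range(M)))
    mean_dw = sum(D[i] * W[i] for i in range(M)) / M
    mean_ss = sum(S1[i] * S2[i] for i in range(M)) / M
    # abs() keeps each radicand non-negative (assumption, see lead-in)
    top = sum(math.sqrt(abs(D[i] * W[i] - mean_dw)) for i in range(M))
    bot = sum(math.sqrt(abs(S1[i] * S2[i] - mean_ss)) for i in range(M))
    return (num / den) * (top / bot)
```

With two hypothetical images, `sigma_face([3.0, 4.0], [1.0, 2.0], [10.0, 10.0], [2.0, 4.0], [1.0, 2.0])` evaluates each fraction exactly as in the displayed formula.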
Step 3, within the preset acquisition time period, the mobile terminal collects audio data of its legitimate user at the preset acquisition frequency, obtains multiple segments of audio data of the current legitimate user together with the power consumption value corresponding to each segment, and establishes the audio database of the current legitimate user on the mobile terminal;
Wherein, the audio database is labeled Voice, and the total number of audio segments in the audio database Voice is N; the j-th audio segment in the audio database Voice is labeled Voice_j, its average amplitude value is labeled A_{Voice_j}, and its power consumption value is E_{Voice_j}; 1 ≤ j ≤ N;
Step 4, the mobile terminal calculates the audio recognition match parameter of the current legitimate user from the established audio database; Wherein, the audio recognition match parameter of the current legitimate user is labeled σ_Voice:
$$\sigma_{Voice}=\frac{\sum_{j=1}^{N}\left|A_{Voice_{j}}+E_{Voice_{j}}\right|}{\sum_{j=1}^{N}\left|\frac{A_{Voice_{j}}}{E_{Voice_{j}}}\right|+\sum_{j=1}^{N}\left|\frac{E_{Voice_{j}}}{A_{Voice_{j}}}\right|}\cdot\sum_{j=1}^{N}\sqrt{\left(A_{Voice_{j}}\cdot E_{Voice_{j}}\right)-\frac{1}{N}\sum_{j=1}^{N}\left(A_{Voice_{j}}\cdot E_{Voice_{j}}\right)};$$
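Under the same caveat about negative radicands as for σ_Face, σ_Voice can be sketched as follows; the function name and sample values are assumptions for illustration:

```python
import math

def sigma_voice(A, E):
    # A: average amplitude per audio segment, E: power consumption per segment.
    N = len(A)
    num = sum(abs(A[j] + E[j]) for j in range(N))
    den = (sum(abs(A[j] / E[j]) for j in range(N))
           + sum(abs(E[j] / A[j]) for j in range(N)))
    mean_ae = sum(A[j] * E[j] for j in range(N)) / N
    # abs() under the root is an assumption to keep the value real
    tail = sum(math.sqrt(abs(A[j] * E[j] - mean_ae)) for j in range(N))
    return (num / den) * tail
```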
Step 5, a payment tilt angle is preset on the mobile terminal, and the sine value corresponding to the preset tilt angle serves as the legitimate numeric payment password of the mobile terminal; Wherein, the preset tilt angle is labeled θ_0, and the sine value corresponding to the tilt angle θ_0 is sin θ_0;
Step 6, the mobile terminal stores the face recognition match parameter and audio recognition match parameter of the legitimate user together with the legitimate numeric payment password; when the mobile terminal detects that an operator has started the payment program to make a payment, it presents an identity verification interface to the current operator; only after the current operator passes this legitimacy verification does the mobile terminal present the payment interface to the current operator, and otherwise the mobile terminal forcibly closes the payment program; Wherein, the identity verification interface requires the current operator to shoot a face image and to record audio data within a preset verification time period;
Step 7, when, within the preset verification time period, the mobile terminal judges that it has received the face image shot by the current operator and has collected the audio data of the current operator, the mobile terminal enters the identity verification phase for the current operator and proceeds to step 8; otherwise, when within the preset verification time period the mobile terminal has not received the face image shot by the current operator or has not collected the audio data of the current operator, the mobile terminal forcibly closes the payment program;
Wherein, let the total number of face images of the current operator received by the mobile terminal be M'; the r-th face image of the current operator received by the mobile terminal is labeled JPG'_r; 1 ≤ r ≤ M';
Let the number of audio segments of the current operator collected by the mobile terminal be N'; the k-th audio segment among the N' segments is labeled Voice'_k, its average amplitude value is labeled A_{Voice'_k}, and its power consumption value is E_{Voice'_k}; 1 ≤ k ≤ N';
Step 8, during a preset payment-habit acquisition time period, the mobile terminal acquires the touch pressure its legitimate user applies with a finger to the mobile terminal screen when entering the payment password, builds the finger-touch payment pressure database corresponding to the legitimate user's password entry, and obtains the finger-touch payment fluctuation factor of the legitimate user;
Wherein, the finger-touch payment pressure database of the legitimate user is labeled S_{user_Finger}; the number of payment touch pressure records in S_{user_Finger} is G, and the g-th payment touch pressure record in S_{user_Finger} is labeled Pre^g_{user_Finger}; 1 ≤ g ≤ G;
Wherein, the finger-touch payment fluctuation factor of the legitimate user is labeled χ_{user_Finger}:
$$\chi_{user\_Finger}=\sqrt{\frac{1}{G}\sum_{g=1}^{G}\left(Pre^{g}_{user\_Finger}-\overline{Pre_{user\_Finger}}\right)^{2}};\qquad\overline{Pre_{user\_Finger}}=\frac{1}{G}\sum_{g=1}^{G}Pre^{g}_{user\_Finger};$$
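The fluctuation factor is simply the population standard deviation of the G recorded touch pressures; a minimal sketch (function name and sample data assumed):

```python
import math

def chi_user_finger(pressures):
    # Population standard deviation of the recorded payment touch pressures.
    G = len(pressures)
    mean = sum(pressures) / G
    return math.sqrt(sum((p - mean) ** 2 for p in pressures) / G)
```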
Step 9, the mobile terminal obtains the facial infrared temperature fluctuation index of the legitimate user from the facial infrared temperature database it has built;
The mobile terminal also captures eye-socket vein images of its legitimate user at a preset collection period, establishes the eye-socket vein image database of the legitimate user, and obtains the eye-socket vein feature parameter of the legitimate user; Wherein, the facial infrared temperature fluctuation index of the legitimate user is labeled ξ_Infrared:
$$\xi_{Infrared}=\frac{1}{M-1}\sum_{i=1}^{M-1}\left(\frac{infrared_{i}-\overline{infrared}}{infrared_{i+1}-infrared_{i}}\right)^{2};\qquad\overline{infrared}=\frac{1}{M-1}\sum_{i=1}^{M-1}infrared_{i};$$
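A sketch of ξ_Infrared; it assumes consecutive temperature samples differ (otherwise the denominator infrared_{i+1} − infrared_i is zero), and the sample values in the test are illustrative:

```python
def xi_infrared(temps):
    # temps: the M facial infrared temperature samples; the index i of the
    # formula runs from 1 to M-1, i.e. range(M - 1) in 0-based Python.
    M = len(temps)
    mean = sum(temps[:M - 1]) / (M - 1)
    return sum(((temps[i] - mean) / (temps[i + 1] - temps[i])) ** 2
               for i in range(M - 1)) / (M - 1)
```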
Wherein, infrared_i is the i-th facial infrared temperature datum in the facial infrared temperature database Infrared;
The eye-socket vein image database of the legitimate user is labeled JPG_{User_EyeBox}, and the eye-socket vein feature parameter of the legitimate user is labeled ψ_{User_EyeBox}; the number of images in the eye-socket vein image database JPG_{User_EyeBox} is Y; the y-th eye-socket vein image in JPG_{User_EyeBox} is labeled jpg_{User_EyeBox,y}; its length value is labeled L_{jpg_{User_EyeBox,y}} and its width value is labeled W_{jpg_{User_EyeBox,y}};
$$\psi_{User\_EyeBox}=\frac{1}{2}\left[\frac{\sum_{y=1}^{Y}\left(L_{jpg_{User\_EyeBox,y}}\cdot W_{jpg_{User\_EyeBox,y}}\right)}{\sum_{y=1}^{Y}L_{jpg_{User\_EyeBox,y}}}+\frac{\sum_{y=1}^{Y}\left(L_{jpg_{User\_EyeBox,y}}\cdot W_{jpg_{User\_EyeBox,y}}\right)}{\sum_{y=1}^{Y}W_{jpg_{User\_EyeBox,y}}}\right];$$
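ψ_User_EyeBox averages the total image area normalized once by the summed lengths and once by the summed widths; a minimal sketch with assumed names:

```python
def psi_user_eyebox(lengths, widths):
    # lengths / widths: L and W values of the Y eye-socket vein images.
    lw = sum(l * w for l, w in zip(lengths, widths))
    return 0.5 * (lw / sum(lengths) + lw / sum(widths))
```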
Step 10, the mobile terminal builds the payment touch pressure database of the current operator entering the payment password with a finger on the mobile terminal screen, obtains the facial infrared temperature database and the eye-socket vein image database of the current operator, and correspondingly calculates the finger-touch payment fluctuation factor, facial infrared temperature fluctuation index, and eye-socket vein feature parameter of the current operator;
When the mobile terminal judges that the current operator's finger-touch payment fluctuation factor, facial infrared temperature fluctuation index, and eye-socket vein feature parameter are all correspondingly equal to those of the legitimate user, it proceeds to step 11; otherwise, the mobile terminal forcibly closes the payment program;
Step 11, the mobile terminal separately extracts the face recognition match parameter and the audio recognition match parameter of the current operator; Wherein, the face recognition match parameter of the current operator is labeled σ'_Face, and the audio recognition match parameter of the current operator is labeled σ'_Voice:
$$\sigma'_{Face}=\frac{\sum_{r=1}^{M'}\left|D_{JPG'_{r}}^{2}-W_{JPG'_{r}}^{2}\right|}{\sum_{r=1}^{M'}\left|\frac{S'_{JPG'_{r}}}{S_{JPG'_{r}}}\right|+\sum_{r=1}^{M'}\left|\frac{S''_{JPG'_{r}}}{S_{JPG'_{r}}}\right|}\cdot\frac{\sum_{r=1}^{M'}\sqrt{\left(D_{JPG'_{r}}\cdot W_{JPG'_{r}}\right)-\frac{1}{M'}\sum_{r=1}^{M'}\left(D_{JPG'_{r}}\cdot W_{JPG'_{r}}\right)}}{\sum_{r=1}^{M'}\sqrt{\left(S'_{JPG'_{r}}\cdot S''_{JPG'_{r}}\right)-\frac{1}{M'}\sum_{r=1}^{M'}\left(S'_{JPG'_{r}}\cdot S''_{JPG'_{r}}\right)}};$$
Wherein, in the current operator's face image JPG'_r, the facial total area value is labeled S_{JPG'_r}, the eyebrow distance value D_{JPG'_r}, the ears distance value W_{JPG'_r}, the mouth area value S'_{JPG'_r}, and the single-eye area value S''_{JPG'_r};
$$\sigma'_{Voice}=\frac{\sum_{k=1}^{N'}\left|A_{Voice'_{k}}+E_{Voice'_{k}}\right|}{\sum_{k=1}^{N'}\left|\frac{A_{Voice'_{k}}}{E_{Voice'_{k}}}\right|+\sum_{k=1}^{N'}\left|\frac{E_{Voice'_{k}}}{A_{Voice'_{k}}}\right|}\cdot\sum_{k=1}^{N'}\sqrt{\left(A_{Voice'_{k}}\cdot E_{Voice'_{k}}\right)-\frac{1}{N'}\sum_{k=1}^{N'}\left(A_{Voice'_{k}}\cdot E_{Voice'_{k}}\right)};$$
Wherein, the k-th audio segment of the current operator is labeled Voice'_k; its average amplitude value is labeled A_{Voice'_k}, and its power consumption value is E_{Voice'_k}; 1 ≤ k ≤ N';
Step 12, the mobile terminal compares the stored face recognition match parameter and audio recognition match parameter of the legitimate user with the face recognition match parameter and audio recognition match parameter of the current operator, respectively, to judge whether the current operator has a legitimate identity:
When the face recognition match parameter σ_Face of the legitimate user equals the face recognition match parameter σ'_Face of the current operator, and the audio recognition match parameter σ_Voice of the legitimate user equals the audio recognition match parameter σ'_Voice of the current operator, the mobile terminal judges that the current operator is its legitimate user and proceeds to step 13; otherwise, the mobile terminal forcibly closes the current payment program;
Step 13, the mobile terminal detects its current tilt angle; when it judges that the current tilt angle equals the preset tilt angle of the mobile terminal, the mobile terminal automatically uses the sine value corresponding to the preset tilt angle as the payment password, so that the payment operation is completed securely.
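The tilt-angle check of step 13 can be sketched as follows. The preset angle θ_0 and the measured tilt are hypothetical values; the comparison uses a small tolerance as an assumption, since real accelerometer readings are noisy, whereas the claim itself states exact equality:

```python
import math

PRESET_TILT_DEG = 30.0  # hypothetical preset tilt angle theta_0

def payment_password(current_tilt_deg, tol=0.5):
    # Return the sine-value payment password when the measured tilt matches
    # the preset tilt angle (within tol degrees); otherwise None, i.e. refuse.
    if abs(current_tilt_deg - PRESET_TILT_DEG) <= tol:
        return math.sin(math.radians(PRESET_TILT_DEG))
    return None
```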
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711468123.8A CN108053218A (en) | 2017-12-29 | 2017-12-29 | A kind of safe method of mobile payment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711468123.8A CN108053218A (en) | 2017-12-29 | 2017-12-29 | A kind of safe method of mobile payment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108053218A true CN108053218A (en) | 2018-05-18 |
Family
ID=62129145
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711468123.8A Pending CN108053218A (en) | 2017-12-29 | 2017-12-29 | A kind of safe method of mobile payment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108053218A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103164645A (en) * | 2011-12-09 | 2013-06-19 | 康佳集团股份有限公司 | Information security management method and mobile terminal |
CN103679451A (en) * | 2012-09-18 | 2014-03-26 | 上海语联信息技术有限公司 | Specific person voice-based real-time online payment technology and application thereof |
CN104134029A (en) * | 2014-07-31 | 2014-11-05 | 中山市品汇创新专利技术开发有限公司 | Multichannel E-bank safety certification method based on voice recognition verification |
CN104200366A (en) * | 2014-09-15 | 2014-12-10 | 长沙市梦马软件有限公司 | Voice payment authentication method and system |
CN105023155A (en) * | 2015-07-29 | 2015-11-04 | 广东欧珀移动通信有限公司 | Payment method, network equipment and terminal equipment |
CN105160530A (en) * | 2015-07-31 | 2015-12-16 | 努比亚技术有限公司 | Mobile terminal and information processing method |
CN105761074A (en) * | 2016-02-03 | 2016-07-13 | 浙江万里学院 | Self pick-up management method for business logistics commodity based on NFC payment |
CN105989497A (en) * | 2016-03-07 | 2016-10-05 | 李明 | Payment method and system |
CN107516071A (en) * | 2017-07-31 | 2017-12-26 | 芜湖市振华戎科智能科技有限公司 | Face identification system for safety check |
Events

- 2017-12-29: CN application CN201711468123.8A filed (published as CN108053218A); status: Pending
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109286831A (en) * | 2018-10-08 | 2019-01-29 | 宁波大学 | A kind of intelligent video wireless security connection control method |
CN109408087A (en) * | 2018-10-08 | 2019-03-01 | 宁波大学 | A kind of mobile intelligent terminal registration center upgrade method |
CN109286831B (en) * | 2018-10-08 | 2021-04-16 | 宁波大学 | Intelligent video wireless safety access control method |
CN109408087B (en) * | 2018-10-08 | 2021-10-26 | 宁波大学 | Mobile intelligent terminal registration center upgrading method |
CN112258193A (en) * | 2019-08-16 | 2021-01-22 | 创新先进技术有限公司 | Payment method and device |
CN112258193B (en) * | 2019-08-16 | 2024-01-30 | 创新先进技术有限公司 | Payment method and device |
WO2021128847A1 (en) * | 2019-12-25 | 2021-07-01 | 深圳壹账通智能科技有限公司 | Terminal interaction method and apparatus, computer device, and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
TWI676947B (en) | Payment authentication method and authentication system based on face recognition and HCE | |
CN108875491B (en) | Data updating method, authentication equipment and system for face unlocking authentication and nonvolatile storage medium | |
JP4862447B2 (en) | Face recognition system | |
CN108053218A (en) | A kind of safe method of mobile payment | |
CN102722696B (en) | Identity authentication method of identity card and holder based on multi-biological characteristics | |
US20130275309A1 (en) | Electronic-payment authentication process with an eye-positioning method for unlocking a pattern lock | |
CN110313008A (en) | Method and smart card suitable for the registration of gradually fingerprint | |
CN103310339A (en) | Identity recognition device and method as well as payment system and method | |
CN104462922A (en) | Method for verifying authorization on basis of biological recognition | |
CN105975837B (en) | Calculate equipment, biological feather recognition method and template register method | |
CN106056054A (en) | Fingerprint identification method and terminal | |
CN111095246B (en) | Method and electronic device for authenticating user | |
CN109858220A (en) | Electronic contract signs method, apparatus, terminal device and storage medium | |
CN109478290A (en) | The method that user is authenticated or identified based on finger scan | |
CN111815833A (en) | Hotel access control authentication system based on intelligent identification and encryption technology | |
CN106022754B (en) | Mobile wallet near-field communication credit card payment method | |
TWI754964B (en) | Authentication system, authentication device, authentication method, and program product | |
KR101853270B1 (en) | Authentication method for portable secure authentication apparatus using fingerprint | |
CN109960907A (en) | A kind of method for identifying ID and equipment | |
Prasanthi et al. | Palm vein biometric technology: An approach to upgrade security in ATM transactions | |
KR101960801B1 (en) | smart device with biometrics registration function and methods for registering biometric information | |
CN205680188U (en) | ATM device | |
CN115481381A (en) | Testimony comparison method, device and medium for self-service terminal | |
Li et al. | The evolution of biometrics | |
JP6891355B1 (en) | Authentication system, authentication device, authentication method, and program |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20180518 |