CN112487885A - Payment method, payment device, electronic equipment and readable storage medium - Google Patents

Payment method, payment device, electronic equipment and readable storage medium

Info

Publication number
CN112487885A
CN112487885A
Authority
CN
China
Prior art keywords
target user
image
payment
information
face
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011277942.6A
Other languages
Chinese (zh)
Inventor
李佳程
白冉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Citic Bank Corp Ltd
Original Assignee
China Citic Bank Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Citic Bank Corp Ltd
Priority to CN202011277942.6A
Publication of CN112487885A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 20/00 Payment architectures, schemes or protocols
    • G06Q 20/38 Payment protocols; Details thereof
    • G06Q 20/40 Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
    • G06Q 20/401 Transaction verification
    • G06Q 20/4014 Identity check for transactions
    • G06Q 20/40145 Biometric identity checks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/174 Facial expression recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G06V 40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language

Abstract

The application provides a payment method, a payment device, an electronic device and a readable storage medium, applied to the field of image processing. The method includes: collecting an image of a target user; determining expression information and gesture information of the target user based on the image of the target user; comparing the determined expression information and gesture information of the target user with pre-stored expression information and gesture information for payment of the target user; and determining whether to pay based on the comparison result of the expression information and the gesture information. Payment is carried out only when the user's expression and gesture match the preset payment expression and gesture, which prevents payment from being made without the user's awareness and improves payment security.

Description

Payment method, payment device, electronic equipment and readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular to a payment method, a payment device, an electronic device, and a readable storage medium.
Background
With the rapid development and gradual maturity of internet and artificial intelligence technology, people's lifestyles have become increasingly efficient, convenient and intelligent. Taking daily consumption as an example, people can now pay by face recognition after shopping at offline merchant stores.
However, face recognition payment has a notable drawback: a user can easily be made to pay by face scanning while unwilling or coerced. Once another person obtains the user's payment terminal, they only need to point the terminal at the user's face; the payment can be completed without any active cooperation from the user. Existing face-scanning payment therefore suffers from poor security.
Disclosure of Invention
The application provides a payment method, a payment device, an electronic device and a readable storage medium, which are used for improving the security of face-scanning payment. The technical solution adopted by the application is as follows:
in a first aspect, there is provided a payment method, the method comprising,
collecting an image of a target user;
determining expression information and gesture information of the target user based on the image of the target user;
comparing the determined expression information and gesture information of the target user with pre-stored expression information and gesture information for payment of the target user;
and determining whether to pay or not based on the comparison result of the expression information and the gesture information.
Optionally, the method further comprises:
determining the position relation between the hand and the face of the target user based on the image of the target user;
comparing the determined position relation between the hand and the face of the target user with a pre-stored position relation between the hand and the face for payment of the target user, and determining a comparison result of the position relation between the hand and the face;
the determining whether to pay based on the comparison result of the expression information and the gesture information includes:
and confirming whether to pay or not based on the comparison result of the expression information and the gesture information and the comparison result of the position relation between the hand and the face.
Optionally, determining the position relationship between the hand and the face of the target user based on the image of the target user includes:
carrying out image segmentation on the image of the target user, and extracting a hand image and a face image;
and determining the position relation between the hand and the face of the target user based on the extracted hand image and the extracted face image.
Optionally, the method further comprises:
and acquiring the position relation between the hand and the face for payment input by the user.
Optionally, before the determining of the expression information and gesture information of the target user based on the image of the target user, the method further includes:
identifying the user based on the image of the target user;
and when the user identity is identified to be legal, determining expression information and gesture information of the target user based on the image of the target user.
Optionally, the method further comprises:
and pre-collecting payment expression information and payment gesture information of a target user.
In a second aspect, there is provided a payment device, the device comprising,
the acquisition module is used for acquiring an image of a target user;
the first determination module is used for determining expression information and gesture information of the target user based on the image of the target user;
the first comparison module is used for comparing the determined expression information and gesture information of the target user with prestored expression information and gesture information for payment of the target user;
and the second determination module is used for determining whether to pay or not based on the comparison result of the expression information and the gesture information.
Optionally, the apparatus further comprises:
the second determination module is used for determining the position relation between the hand and the face of the target user based on the image of the target user;
the first comparison module is used for comparing the determined position relationship between the hand and the face of the target user with a pre-stored position relationship between the hand and the face for payment of the target user and determining a comparison result of the position relationship between the hand and the face;
and the second determining module is specifically used for determining whether to pay or not based on the comparison result of the expression information and the gesture information and the comparison result of the position relation between the hand and the face.
Optionally, the second determining module includes:
the image segmentation unit is used for carrying out image segmentation on the image of the target user and extracting a hand image and a face image;
and the determining unit is used for determining the position relation between the hand and the face of the target user based on the extracted hand image and face image.
Optionally, the apparatus further comprises:
and the acquisition module is used for acquiring the position relation between the hand and the face for payment input by the user.
Optionally, the apparatus further comprises:
the identity recognition module is used for carrying out identity recognition on the user based on the image of the target user;
the first determining module is specifically used for determining expression information and gesture information of the target user based on the image of the target user after the user identity is identified to be legal.
Optionally, the apparatus further comprises:
and the pre-acquisition module is used for pre-acquiring payment expression information and payment gesture information of the target user.
In a third aspect, an electronic device is provided, which includes:
one or more processors;
a memory;
one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs being configured to perform the payment method shown in the first aspect.
In a fourth aspect, there is provided a computer-readable storage medium for storing computer instructions which, when run on a computer, cause the computer to perform the payment method of the first aspect.
Compared with the prior art in which only face features are compared to determine whether to pay, the payment method, payment device, electronic device and readable storage medium provided by the application collect an image of a target user; determine expression information and gesture information of the target user based on the image of the target user; compare the determined expression information and gesture information of the target user with pre-stored expression information and gesture information for payment of the target user; and determine whether to pay based on the comparison result of the expression information and the gesture information. Payment is carried out only when the user's expression and gesture match the preset payment expression and gesture, which prevents payment from being made without the user's awareness and improves payment security.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flow chart of a payment method according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a payment apparatus according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 4 is a diagram illustrating an example of a payment process based on expressions and gestures according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to the embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and groups thereof. As used herein, the term "and/or" includes all or any element and all combinations of one or more of the associated listed items.
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The following describes the technical solutions of the present application and how to solve the above technical problems with specific embodiments. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
Embodiment one
An embodiment of the present application provides a payment method, as shown in fig. 1, the method may include the following steps:
step S101, collecting an image of a target user;
specifically, the image of the user may be acquired through a terminal device such as a mobile phone, a PAD, and a wearable device of the target user, or a self-service terminal device of the merchant. Specifically, before the image of the target user is acquired, a prompt for the user to pay for face brushing may be issued.
Step S102, determining expression information and gesture information of a target user based on an image of the target user;
specifically, the expression information and the gesture information of the target user can be determined based on the image of the target user through a corresponding image processing method; specifically, the face or hand position of the target user may be determined by a target recognition algorithm, and then the user's expression (e.g., smile, grin, left-handed person) or gesture (V, thumbs, orchid, etc., as well as the particular number represented by the finger) may be determined.
Step S103, comparing the determined expression information and gesture information of the target user with pre-stored expression information and gesture information for payment of the target user;
specifically, the determined expression information and gesture information of the target user are compared with prestored expression information and gesture information paid by the target user; specifically, it may be determined whether the target user's expression is consistent with the expression for making payment by calculating the similarity between the image features, that is, if the similarity is within a predetermined pre-determined range, the expression is consistent, and if the similarity is beyond the predetermined pre-determined range, the expression is not consistent. Specifically, the comparison of the gesture may be performed in a manner of calculating image feature similarity (that is, pre-storing image features corresponding to the gesture), or may be performed by first recognizing a gesture motion, and then comparing the gesture motion with a text corresponding to the pre-stored gesture motion (that is, the pre-stored gesture is represented by a text form), that is, performing gesture feature recognition, if the gesture motion is recognized as V, and then comparing the gesture motion with the text corresponding to the pre-stored gesture, where if the corresponding text is V, the matching results are consistent.
And step S104, determining whether to pay or not based on the comparison result of the expression information and the gesture information.
Specifically, the payment is performed when both the expression comparison and the gesture comparison pass.
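As a sketch of how steps S103 and S104 could be realized, the comparison below uses cosine similarity between image features for the expression and a plain text match for the gesture; the 0.8 threshold is an assumed example value, not a value given in the application.

```python
import numpy as np

def expression_matches(live_feature: np.ndarray, stored_feature: np.ndarray,
                       threshold: float = 0.8) -> bool:
    # Cosine similarity between the live and pre-stored expression features;
    # "within the predetermined range" is modelled here as exceeding a threshold.
    cos = float(np.dot(live_feature, stored_feature) /
                (np.linalg.norm(live_feature) * np.linalg.norm(stored_feature)))
    return cos >= threshold

def gesture_matches(recognised_gesture: str, stored_gesture_text: str) -> bool:
    # The pre-stored gesture is represented as text, e.g. "V" or "thumbs_up".
    return recognised_gesture == stored_gesture_text

def should_pay(expression_ok: bool, gesture_ok: bool) -> bool:
    # Step S104: pay only when both comparison results pass.
    return expression_ok and gesture_ok
```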
Compared with the prior art in which only face features are compared to determine whether to pay, the payment method of this embodiment collects an image of a target user; determines expression information and gesture information of the target user based on the image of the target user; compares the determined expression information and gesture information with pre-stored expression information and gesture information for payment of the target user; and determines whether to pay based on the comparison result of the expression information and the gesture information. Payment is carried out only when the user's expression and gesture match the preset payment expression and gesture, which prevents payment from being made without the user's awareness and improves payment security.
The embodiment of the present application further provides a possible implementation manner, and optionally, the method further includes:
determining the position relation between the hand and the face of the target user based on the image of the target user; comparing the determined position relation between the hand and the face of the target user with a pre-stored position relation between the hand and the face for payment of the target user, and determining a comparison result of the position relation between the hand and the face;
the determining whether to pay based on the comparison result of the expression information and the gesture information includes:
and confirming whether to pay or not based on the comparison result of the expression information and the gesture information and the comparison result of the position relation between the hand and the face.
For example, it may be specified that the hand making the payment gesture must be beside the left side or the right side of the face, and payment is made only when the payment gesture is recognized at that specified side of the face.
In this embodiment of the application, payment is carried out only when the user's expression and gesture match the preset payment expression and gesture and the hand making the payment gesture is at the specified position relative to the face, which further improves payment security.
Optionally, determining the position relationship between the hand and the face of the target user based on the image of the target user includes:
carrying out image segmentation on the image of the target user and extracting a hand image and a face image; specifically, the hand image and the face image can be extracted through a corresponding target localization and recognition algorithm.
And determining the position relation between the hand and the face of the target user based on the extracted hand image and the extracted face image.
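A minimal sketch of this position check follows, assuming the hand and face have already been extracted as bounding boxes by the segmentation step; the text labels such as "left_of_face" are illustrative names introduced for the example, not terminology from the application.

```python
from typing import Tuple

Box = Tuple[int, int, int, int]  # (x, y, w, h) in image coordinates

def hand_face_relation(hand_box: Box, face_box: Box) -> str:
    # Describe where the hand centre lies relative to the face box as text,
    # so the stored relation can be kept and compared as a plain string.
    hx, _, hw, _ = hand_box
    fx, _, fw, _ = face_box
    hand_cx = hx + hw / 2.0
    if hand_cx < fx:
        return "left_of_face"
    if hand_cx > fx + fw:
        return "right_of_face"
    return "over_face"

def position_matches(hand_box: Box, face_box: Box, stored_relation: str) -> bool:
    return hand_face_relation(hand_box, face_box) == stored_relation
```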
Optionally, the method further comprises:
and acquiring the position relation between the hand and the face for payment input by the user.
Specifically, the payment hand-face position relationship input by the user may be stored in text form, which avoids the heavy data processing that would be required to compute image similarity.
Optionally, before the determining of the expression information and gesture information of the target user based on the image of the target user, the method further includes:
identifying the user based on the image of the target user;
and when the user identity is identified to be legal, determining expression information and gesture information of the target user based on the image of the target user.
Specifically, payment is carried out only when the identity authentication of the target user passes and the expression and gesture match the preset payment expression and gesture, which improves payment security.
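The identity gate described above could look roughly like the sketch below, where `extract_face_embedding` is a hypothetical face-recognition model and the distance threshold is an assumed value; only when the identity check passes does the expression/gesture comparison run.

```python
import numpy as np

def identity_is_legal(live_image, enrolled_embedding: np.ndarray,
                      extract_face_embedding, max_distance: float = 0.6) -> bool:
    # Compare the live face embedding with the enrolled one; a smaller distance
    # means the same person. `extract_face_embedding` is a placeholder model.
    live_embedding = extract_face_embedding(live_image)
    return float(np.linalg.norm(live_embedding - enrolled_embedding)) <= max_distance

def check_with_identity_gate(live_image, enrolled_embedding: np.ndarray,
                             extract_face_embedding, check_expression_and_gesture) -> bool:
    if not identity_is_legal(live_image, enrolled_embedding, extract_face_embedding):
        return False  # identity not legal: skip the expression/gesture comparison
    return check_expression_and_gesture(live_image)
```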
Optionally, the method further comprises:
and pre-collecting payment expression information and payment gesture information of a target user.
As an example, fig. 4 shows a payment flow based on expressions and gestures, which mainly includes the following steps (a code sketch of this flow is given after the step list):
step S01, collecting, through a camera, the specific payment expression selected by the user, such as smiling, grinning, pouting and the like;
step S02, collecting, through the camera, the specific payment gesture action selected by the user, such as a V sign, a thumbs-up, an orchid-finger gesture and the like;
step S03, storing the payment expression characteristic information and the gesture action characteristic information set by the user in a server, and associating the payment expression characteristic information and the gesture action characteristic information with account information;
step S11, the user performs payment activities;
step S12, the payment terminal camera is opened;
step S13, the user faces the camera and makes corresponding payment expression and payment gesture actions at the same time;
step S14, after capturing the expression and the gesture, analyzing the expression characteristics and the gesture characteristics of the user;
step S15, comparing the expression characteristics and the gesture characteristics with those pre-stored on the server; if the expression characteristics are consistent with the pre-stored expression characteristic information and the gesture characteristics are consistent with the pre-stored gesture characteristics, executing step S16, otherwise executing step S17;
step S16, the payment is completed after the verification is passed;
and step S17, prompting that the verification failed, popping up a password box and prompting the user to verify by password.
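The sketch below ties the enrolment steps S01-S03 and the payment steps S11-S17 together under simplifying assumptions: a plain dictionary stands in for the server-side store, `analyse` is a hypothetical callable that returns the expression feature, gesture text and hand-face position text extracted from the captured image, and the similarity threshold is an assumed value.

```python
import numpy as np

def _cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def enroll(account_id: str, expression_feature: np.ndarray,
           gesture_text: str, position_text: str, store: dict) -> None:
    # S01-S03: record the chosen payment expression, gesture and hand position
    # against the account; `store` stands in for the server-side database.
    store[account_id] = {
        "expression": expression_feature,
        "gesture": gesture_text,
        "position": position_text,
    }

def pay(account_id: str, live_image, store: dict, analyse,
        threshold: float = 0.8) -> str:
    # S13-S14: `analyse` extracts (expression_feature, gesture_text, position_text)
    # from the captured image; it is a placeholder for the image-processing models.
    expression_feature, gesture_text, position_text = analyse(live_image)
    record = store[account_id]
    expr_ok = _cosine(expression_feature, record["expression"]) >= threshold
    gesture_ok = gesture_text == record["gesture"]
    position_ok = position_text == record["position"]
    if expr_ok and gesture_ok and position_ok:   # S15-S16: all comparisons pass
        return "payment completed"
    return "verification failed: please enter your payment password"  # S17 fallback
```

In this sketch a failed comparison simply falls back to the password prompt of step S17; a real deployment would of course also log the attempt and limit retries.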
Embodiment two
Fig. 2 is a payment apparatus provided in an embodiment of the present application, where the apparatus 20 includes:
an acquisition module 201, configured to acquire an image of a target user;
a first determining module 202, configured to determine expression information and gesture information of a target user based on an image of the target user;
the first comparison module 203 is configured to compare the determined expression information and gesture information of the target user with prestored expression information and gesture information for payment of the target user;
and a second determining module 204, configured to determine whether to pay based on a comparison result of the expression information and the gesture information.
Compared with the prior art in which only face features are compared to determine whether to pay, the payment device provided by this embodiment collects an image of a target user; determines expression information and gesture information of the target user based on the image of the target user; compares the determined expression information and gesture information with pre-stored expression information and gesture information for payment of the target user; and determines whether to pay based on the comparison result of the expression information and the gesture information. Payment is carried out only when the user's expression and gesture match the preset payment expression and gesture, which prevents payment from being made without the user's awareness and improves payment security.
The payment device of this embodiment can execute a payment method provided in the above embodiments of the present application, and the implementation principles thereof are similar and will not be described herein again.
The embodiment of the present application provides a possible implementation manner, and optionally, the apparatus further includes:
the second determination module is used for determining the position relation between the hand and the face of the target user based on the image of the target user;
the first comparison module is used for comparing the determined position relationship between the hand and the face of the target user with a pre-stored position relationship between the hand and the face for payment of the target user and determining a comparison result of the position relationship between the hand and the face;
and the second determining module is specifically used for determining whether to pay or not based on the comparison result of the expression information and the gesture information and the comparison result of the position relation between the hand and the face.
Optionally, the second determining module includes:
the image segmentation unit is used for carrying out image segmentation on the image of the target user and extracting a hand image and a face image;
and the determining unit is used for determining the position relation between the hand and the face of the target user based on the extracted hand image and face image.
Optionally, the apparatus further comprises:
and the acquisition module is used for acquiring the position relation between the hand and the face for payment input by the user.
Optionally, the apparatus further comprises:
the identity recognition module is used for carrying out identity recognition on the user based on the image of the target user;
the first determining module is specifically used for determining expression information and gesture information of the target user based on the image of the target user after the user identity is identified to be legal.
Optionally, the apparatus further comprises:
and the pre-acquisition module is used for pre-acquiring payment expression information and payment gesture information of the target user.
The embodiment of the present application provides a payment apparatus, which is suitable for the method shown in the above embodiment, and details are not repeated here.
Embodiment three
An embodiment of the present application provides an electronic device. As shown in fig. 3, the electronic device 30 includes: a processor 3001 and a memory 3003. The processor 3001 is coupled to the memory 3003, for example via a bus 3002. Optionally, the electronic device 30 may also include a transceiver 3004. It should be noted that, in practical applications, the transceiver 3004 is not limited to one, and the structure of the electronic device 30 does not limit the embodiment of the present application. The processor 3001 is applied in the embodiment of the present application to implement the functions of the modules shown in fig. 2. The transceiver 3004 includes a receiver and a transmitter.
The processor 3001 may be a CPU, general purpose processor, DSP, ASIC, FPGA or other programmable logic device, transistor logic device, hardware component, or any combination thereof. It may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the disclosure. The processor 3001 may also be a combination that implements computing functions, e.g., a combination of one or more microprocessors, or a combination of a DSP and a microprocessor.
Bus 3002 may include a path that conveys information between the aforementioned components. The bus 3002 may be a PCI bus or an EISA bus, etc. The bus 3002 may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 3, but this does not mean only one bus or one type of bus.
Memory 3003 may be, but is not limited to, a ROM or other type of static storage device that can store static information and instructions, a RAM or other type of dynamic storage device that can store information and instructions, an EEPROM, a CD-ROM or other optical disc storage (including compact disc, laser disc, digital versatile disc, Blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
The memory 3003 is used for storing application program codes for performing the present scheme, and is controlled to be executed by the processor 3001. The processor 3001 is configured to execute application program code stored in the memory 3003 to implement the functions of the payment apparatus provided by the embodiment shown in fig. 2.
Compared with the prior art in which only face features are compared to determine whether to pay, the electronic device provided by the application collects an image of a target user; determines expression information and gesture information of the target user based on the image of the target user; compares the determined expression information and gesture information with pre-stored expression information and gesture information for payment of the target user; and determines whether to pay based on the comparison result of the expression information and the gesture information. Payment is carried out only when the user's expression and gesture match the preset payment expression and gesture, which prevents payment from being made without the user's awareness and improves payment security.
The embodiment of the application provides an electronic device suitable for the method embodiment. And will not be described in detail herein.
The present application provides a computer-readable storage medium, on which a computer program is stored, and when the program is executed by a processor, the method shown in the above embodiments is implemented.
Compared with the prior art in which only face features are compared to determine whether to pay, the computer-readable storage medium provided by the application, when its stored instructions are executed, causes the computer to collect an image of a target user; determine expression information and gesture information of the target user based on the image of the target user; compare the determined expression information and gesture information with pre-stored expression information and gesture information for payment of the target user; and determine whether to pay based on the comparison result of the expression information and the gesture information. Payment is carried out only when the user's expression and gesture match the preset payment expression and gesture, which prevents payment from being made without the user's awareness and improves payment security.
The embodiment of the application provides a computer-readable storage medium which is suitable for the method embodiment. And will not be described in detail herein.
It should be understood that, although the steps in the flowcharts of the figures are shown in an order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the steps are not strictly ordered and may be performed in other orders. Moreover, at least some of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and their execution order is not necessarily sequential; they may be performed in turn or alternately with other steps, or with at least some of the sub-steps or stages of other steps.
The foregoing describes only some embodiments of the present application. It should be noted that those skilled in the art can make various modifications and refinements without departing from the principles of the present application, and such modifications and refinements shall also fall within the protection scope of the present application.

Claims (10)

1. A payment method, comprising:
collecting an image of a target user;
determining expression information and gesture information of the target user based on the image of the target user;
comparing the determined expression information and gesture information of the target user with pre-stored expression information and gesture information for payment of the target user;
and determining whether to pay or not based on the comparison result of the expression information and the gesture information.
2. The method of claim 1, further comprising:
determining a position relationship of a hand and a face of the target user based on the image of the target user;
comparing the determined position relation between the hand and the face of the target user with a pre-stored position relation between the hand and the face for payment of the target user, and determining a comparison result of the position relation between the hand and the face;
the determining whether to pay based on the comparison result of the expression information and the gesture information includes:
and confirming whether to pay based on the comparison result of the expression information and the gesture information and the comparison result of the position relation between the hand and the face.
3. The method of claim 2, wherein determining a positional relationship of a hand and a face of a target user based on an image of the target user comprises:
carrying out image segmentation on the image of the target user, and extracting a hand image and a face image;
and determining the position relation between the hand and the face of the target user based on the extracted hand image and the extracted face image.
4. The method of claim 3, further comprising:
and acquiring the position relation between the hand and the face for payment input by the user.
5. The method of any one of claims 1-4, wherein, before the determining of the expression information and gesture information of the target user based on the image of the target user, the method further comprises:
identifying the user based on the image of the target user;
and when the user identity is identified to be legal, determining expression information and gesture information of the target user based on the image of the target user.
6. The method according to any one of claims 1-4, characterized in that the method further comprises:
and pre-collecting payment expression information and payment gesture information of a target user.
7. A payment device, comprising:
the acquisition module is used for acquiring an image of a target user;
the first determination module is used for determining expression information and gesture information of the target user based on the image of the target user;
the first comparison module is used for comparing the determined expression information and gesture information of the target user with prestored expression information and gesture information for payment of the target user;
and the second determination module is used for determining whether to pay or not based on the comparison result of the expression information and the gesture information.
8. The apparatus of claim 7, further comprising:
a second determination module, configured to determine a position relationship between a hand and a face of the target user based on the image of the target user;
the first comparison module is used for comparing the determined position relationship between the hand and the face of the target user with a pre-stored position relationship between the hand and the face for payment of the target user and determining a comparison result of the position relationship between the hand and the face;
the second determining module is specifically configured to determine whether to pay based on the comparison result of the expression information and the gesture information and the comparison result of the position relationship between the hand and the face.
9. An electronic device, comprising:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to: -executing a payment method according to any one of claims 1 to 6.
10. A computer-readable storage medium for storing computer instructions which, when run on a computer, cause the computer to perform the payment method of any one of claims 1 to 6.
CN202011277942.6A 2020-11-16 2020-11-16 Payment method, payment device, electronic equipment and readable storage medium Pending CN112487885A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011277942.6A CN112487885A (en) 2020-11-16 2020-11-16 Payment method, payment device, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011277942.6A CN112487885A (en) 2020-11-16 2020-11-16 Payment method, payment device, electronic equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN112487885A true CN112487885A (en) 2021-03-12

Family

ID=74930500

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011277942.6A Pending CN112487885A (en) 2020-11-16 2020-11-16 Payment method, payment device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN112487885A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113657903A (en) * 2021-08-16 2021-11-16 支付宝(杭州)信息技术有限公司 Face-brushing payment method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination