WO2021164511A1 - Eye-tracking-based information processing method and system, and payment processing method - Google Patents


Info

Publication number
WO2021164511A1
WO2021164511A1 · PCT/CN2021/073910 · CN2021073910W
Authority
WO
WIPO (PCT)
Prior art keywords
movement trajectory
eye movement
eye
payment
user
Prior art date
Application number
PCT/CN2021/073910
Other languages
English (en)
French (fr)
Inventor
苏涵
宋汉石
黄晓艳
曹宇
汪毅
Original Assignee
中国银联股份有限公司 (China UnionPay Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 中国银联股份有限公司
Publication of WO2021164511A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/38Payment protocols; Details thereof
    • G06Q20/40Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
    • G06Q20/401Transaction verification
    • G06Q20/4014Identity check for transactions
    • G06Q20/40145Biometric identity checks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/19Sensors therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40Spoof detection, e.g. liveness detection
    • G06V40/45Detection of the body part being alive

Definitions

  • the present invention relates to computer technology, in particular to an information processing method based on eye tracking, an information processing system based on eye tracking, a payment processing method based on eye tracking, and a payment processing device based on eye tracking.
  • Face-swiping payment has gradually become a new payment hotspot and brings great convenience to mobile payment: users complete a payment simply by scanning their face, without carrying any device or card. While face payment is convenient and fast, its security cannot be ignored. In face payment, determining the user's willingness to pay is a crucial step. With traditional card swiping and QR-code payment, the user must enter a payment password, or unlock a mobile phone to display or scan a QR code; these operations reflect the user's willingness to pay to a certain extent.
  • the existing face-scanning payment is mainly divided into two categories according to the payment information provided by the user: (1) facial information payment; (2) facial information + payment password payment.
  • the normal process of facial information payment is that the face-scanning device starts the face-scanning payment, collects the user's facial information, verifies the facial information, and completes the payment. During this process, the user may be required to enter the complete mobile phone number or the last four digits of the mobile phone number, but as the number of face scans increases, this operation will gradually be cancelled.
  • the normal process of facial information + payment password payment is that the facial swiping device starts the swipe payment, collects the user's facial information, the user enters the payment password, verifies the facial information and the payment password, and completes the payment.
  • the main problem of the facial-information payment solution in these existing technologies is that the user's willingness to pay cannot be confirmed. Especially in the mode that does not require entering a mobile phone number, a malicious merchant can easily collect a user's face without the user's awareness and complete a facial-information payment. Even when a mobile phone number is required, the number can easily be leaked, allowing the user's funds to be stolen.
  • the problem with the facial information + payment password solution is that the user must remember the payment password and enter it on a keypad, which makes the procedure cumbersome.
  • the present invention aims to propose an information processing method based on eye tracking, an information processing system based on eye tracking, a payment processing method based on eye tracking, and a payment processing device based on eye tracking.
  • the setting step is to set the eye movement trajectory and the control instruction corresponding to the eye movement trajectory
  • the capturing step is to capture the eye movement trajectory of the user and obtain the eye movement trajectory
  • the processing step is to determine whether the eye movement trajectory captured in the capturing step is consistent with the eye movement trajectory set in the setting step, and when it is determined that the degree of agreement between the two is above a predetermined ratio, the control instruction corresponding to the eye movement trajectory is executed.
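The processing step's agreement test can be sketched as follows. This is only an illustrative reading of "degree of agreement above a predetermined ratio", not the patent's actual algorithm; the function names, the point-wise tolerance, and the 0.8 threshold are all assumptions.

```python
# Illustrative sketch only: compare a captured gaze trajectory against the
# stored template point by point, and execute the mapped control instruction
# when the fraction of matching samples reaches a threshold. The tolerance,
# threshold, and sampling scheme are invented for this example.

def agreement(captured, template, tol=0.1):
    """Fraction of corresponding samples within `tol` in both x and y
    (coordinates normalized to the 0..1 screen)."""
    n = min(len(captured), len(template))
    if n == 0:
        return 0.0
    hits = sum(
        1
        for (cx, cy), (tx, ty) in zip(captured, template)
        if abs(cx - tx) <= tol and abs(cy - ty) <= tol
    )
    return hits / n

def process_step(captured, template, instruction, threshold=0.8):
    """Run `instruction` (a callable) only when the trajectories agree."""
    if agreement(captured, template) >= threshold:
        return instruction()
    return None  # trajectory rejected; no instruction executed
```

For a left-to-right template sampled at ten points, a captured sweep that stays within the tolerance band triggers the instruction, while an unrelated gaze path returns `None`.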
  • after the setting step and before the capturing step, the method further includes:
  • the prompt step prompts the user to perform the eye movement trajectory.
  • a schematic diagram related to the eye movement trajectory is displayed on the display screen to prompt the user.
  • the captured current eye movement trajectory of the user is displayed on the display screen.
  • the facial image is captured while capturing the eye movement trajectory of the user.
  • image capture for living body detection is performed while capturing the eye movement trajectory of the user.
  • the setting step is to set the eye movement trajectory and the payment-related control instructions corresponding to the eye movement trajectory;
  • the capturing step is to capture the eye movement trajectory of the user and obtain the eye movement trajectory
  • the processing step is to judge whether the eye movement trajectory captured in the capturing step is consistent with the eye movement trajectory set in the setting step, and when it is judged that the degree of agreement between the two is above a predetermined ratio, the payment-related control instruction corresponding to the eye movement trajectory is executed.
  • after the setting step and before the capturing step, the method further includes:
  • the display step is to display payment-related information and display the eye movement trajectory that needs to be performed by the user.
  • payment-related information is displayed on the display screen, and a schematic diagram related to the eye movement trajectory is displayed.
  • the captured current eye movement trajectory of the user is dynamically displayed on the display screen.
  • the facial image is captured while capturing the eye movement trajectory of the user.
  • image capture for living body detection is performed while capturing the eye movement trajectory of the user.
  • the payment-related control instruction includes any one of the following:
  • the setting module is used to set the correspondence between the eye movement trajectory and the control command
  • the capture module is used to capture the eye movement trajectory of the user and obtain the eye movement trajectory
  • the processing module determines whether the eye movement trajectory captured by the capture module is consistent with the eye movement trajectory set by the setting module, and executes the control instruction corresponding to the eye movement trajectory when it is judged that the degree of agreement between the two is above a specified ratio.
  • a prompting module for prompting the user to follow the eye movement trajectory.
  • the prompt module prompts the user by displaying a schematic diagram related to the eye movement trajectory on the display screen.
  • the capturing module displays the captured current eye movement trajectory of the user on the display screen while capturing the eye movement trajectory of the user.
  • the setting module is used to set and store the eye movement trajectory and payment related control instructions corresponding to the eye movement trajectory;
  • the capture module is used to capture the eye movement trajectory of the user and obtain the eye movement trajectory
  • the processing module is used to determine whether the eye movement trajectory captured by the capture module is consistent with the eye movement trajectory set and stored in the setting module, and to execute the payment-related control instruction corresponding to the eye movement trajectory when it is judged that the degree of agreement between the two is above a specified ratio.
  • a display module for displaying payment-related information and displaying the eye movement trajectory required by the user.
  • the display module includes:
  • the first display sub-module is used to display payment related information
  • the second display sub-module is used to display schematic diagrams related to eye movement trajectories.
  • the capture module includes:
  • the first capturing sub-module is used to capture the eye movement trajectory of the user.
  • the second capture sub-module is used to capture the face image.
  • the capture module further includes:
  • the third capturing sub-module is used to capture images for living body detection.
  • the second display submodule is further configured to dynamically display the user's current eye movement track captured by the first capture submodule on the display screen.
  • the payment-related control instruction includes any one of the following:
  • the computer readable medium of the present invention has a computer program stored thereon, and is characterized in that the computer program, when executed by a processor, implements the above-mentioned eye tracking-based payment processing method.
  • the computer device of the present invention includes a storage module, a processor, and a computer program that is stored on the storage module and can run on the processor, and is characterized in that, when the processor executes the computer program, the aforementioned eye-tracking-based payment processing method is implemented.
  • the computer-readable medium of the present invention has a computer program stored thereon, and is characterized in that the computer program implements the above-mentioned information processing method based on eye tracking when the computer program is executed by a processor.
  • the computer device of the present invention includes a storage module, a processor, and a computer program that is stored on the storage module and can run on the processor, and is characterized in that, when the processor executes the computer program, the aforementioned eye-tracking-based information processing method is implemented.
  • the eye-tracking-based payment processing method and payment processing device (such as a face-swiping device) of the present invention can display guidance for eye operations to the user through a display device, so that an operation is completed with the eyes. The payment processing device captures and monitors the user's eye movements through a camera: if the user's eyes complete the operation correctly, the transaction is treated as the user's autonomous payment request; if not, it is not treated as the user's autonomous payment request. This simplifies user operations and improves convenience.
  • FIG. 1 is a schematic flowchart showing the information processing method based on eye tracking of the present invention.
  • FIG. 2 is a block diagram showing the structure of the information processing system based on eye tracking of the present invention.
  • Fig. 3 shows a framework diagram of a payment processing method based on eye tracking according to an embodiment of the present invention.
  • FIG. 4 is a schematic flowchart showing a payment processing method based on eye tracking according to an embodiment of the present invention.
  • FIGS. 5 to 7 are specific schematic diagrams showing instructions related to eye operation.
  • FIG. 8 is a block diagram showing the structure of a payment processing device based on eye tracking according to an embodiment of the present invention.
  • FIG. 9 is a schematic flowchart showing an on-board information processing method based on eye tracking according to an embodiment of the present invention.
  • FIG. 10 is a block diagram showing the structure of an in-vehicle information processing device based on eye tracking according to an embodiment of the present invention.
  • Before describing the information processing method based on eye tracking, the information processing system based on eye tracking, the payment processing method based on eye tracking, and the payment processing device based on eye tracking of the present invention, the eye tracking technology will be briefly described.
  • Eye tracking is an applied technology with three main approaches: tracking the eyeball and its surrounding features; tracking the iris angle; and actively projecting infrared or other light beams onto the iris to extract features. With eye tracking, for example, users can turn pages without touching the screen. In principle, eye tracking studies the acquisition, modeling, and simulation of eye-movement information, and it has a wide range of uses. Besides infrared devices, eye-movement information can be obtained with an image-acquisition device; even the camera on an ordinary computer or mobile phone can achieve eye tracking with software support.
  • the principle of eye tracking technology is that when a person's eyes look in different directions, subtle changes occur in the eyes, and these changes produce extractable features.
  • the computer can extract these features through image capture or scanning to track eye changes in real time, predict the user's state and needs, and respond accordingly, thereby achieving the purpose of controlling a device with the eyes.
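As a toy illustration of turning image changes into an extractable feature (not the method of this disclosure), the pupil center can be estimated as the centroid of the darkest pixels in a grayscale eye image; the darkness threshold and the list-of-lists image representation are assumptions for this sketch.

```python
# Toy feature extraction: treat the pupil as the darkest region of a
# grayscale eye image and return the centroid of pixels below a darkness
# threshold. Real eye trackers use far more robust methods (infrared
# glints, ellipse fitting); this only illustrates the basic idea.

def pupil_center(image, dark_threshold=50):
    """image: nested lists of 0..255 intensities.
    Returns the (row, col) centroid of dark pixels, or None if none found."""
    row_sum = col_sum = count = 0
    for r, line in enumerate(image):
        for c, value in enumerate(line):
            if value < dark_threshold:
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None
    return (row_sum / count, col_sum / count)
```

Tracking this centroid frame by frame yields the sequence of gaze samples from which an eye movement trajectory can be built.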
  • the supporting equipment of eye tracking technology mainly includes: infrared equipment and image acquisition equipment.
  • the infrared projection method has a relatively large advantage: it can be accurate to within 1 cm on a 30-inch screen. With the help of blink recognition and gaze recognition technologies, it can already replace the mouse and touchpad for certain limited operations.
  • other image capture devices such as cameras on computers or mobile phones, can also achieve eye tracking with the support of software, but they differ in accuracy, speed, and stability.
  • FIG. 1 is a schematic flowchart showing the information processing method based on eye tracking of the present invention.
  • the information processing method based on eye tracking of the present invention includes:
  • Setting step S10: set one or more kinds of eye movement trajectories and the control instructions corresponding to each kind of eye movement trajectory;
  • Prompt step S20: prompt the user to perform the eye movement trajectory;
  • Capture step S30: capture the user's eye movement and obtain the eye movement trajectory;
  • Processing step S40: determine whether the obtained eye movement trajectory is consistent with a preset eye movement trajectory, and execute the control instruction corresponding to the eye movement trajectory when it is judged that the degree of agreement between the two is above a predetermined ratio.
  • an eye movement trajectory and a control instruction corresponding to the eye movement trajectory may be set; for example, control instruction A is executed when an eye movement trajectory from left to right is detected.
  • the prompt step S20 is an optional step, and the user may not be prompted to perform the eye movement trajectory.
  • multiple eye movement trajectories may be set to correspond to multiple control instructions; for example, a first eye movement trajectory (e.g., an eyeball movement from left to right) corresponds to control instruction A, and a second eye movement trajectory (e.g., an eyeball movement from right to left) corresponds to control instruction B.
  • a schematic diagram (animated or still image) related to the eye movement trajectory can be displayed on the display screen to prompt the user.
  • the user can also be prompted by voice or text to perform the eye movement trajectory.
  • the captured current eye movement trajectory of the user is displayed on the display screen.
  • the face image is captured while capturing the eye movement trajectory of the user.
  • face image capture and living-body-detection image capture are performed while capturing the eye movement trajectory of the user.
  • the eye movement trajectory is an eye movement trajectory based on a prescribed direction, for example including: an eye movement trajectory from left to right; an eye movement trajectory from right to left; an eye movement trajectory from top to bottom; and an eye movement trajectory from bottom to top.
  • the eye movement trajectory is an eye movement trajectory based on a prescribed pattern, for example, it includes: an eye movement trajectory based on a circle; an eye movement trajectory based on a polygon; and an eye movement trajectory based on an irregular pattern.
  • the eye movement trajectory is an eye movement trajectory based on random points, for example, it includes: an eye movement trajectory at a regular change point; and an eye movement trajectory at an irregular change point.
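The direction-based trajectories above can be recognized, in a minimal sketch, from the net displacement of the gaze samples. The coordinate convention (normalized screen coordinates, y increasing downward) and the travel threshold are assumptions, not the patent's specification.

```python
# Minimal direction classifier (illustrative, not the patent's method):
# look at the net displacement between the first and last gaze samples and
# report the dominant axis. Assumes normalized screen coordinates with
# y increasing downward, so positive dy means top-to-bottom.

def classify_direction(points, min_travel=0.2):
    """points: sequence of (x, y) gaze samples.
    Returns 'left_to_right', 'right_to_left', 'top_to_bottom',
    'bottom_to_top', or None when the movement is too small."""
    if len(points) < 2:
        return None
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if max(abs(dx), abs(dy)) < min_travel:
        return None  # jitter, not a deliberate sweep
    if abs(dx) >= abs(dy):
        return "left_to_right" if dx > 0 else "right_to_left"
    return "top_to_bottom" if dy > 0 else "bottom_to_top"
```

Pattern-based and random-point trajectories would need a richer matcher (e.g., comparing against a sampled template as in the processing step), but the same gaze samples serve as input.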
  • FIG. 2 is a block diagram showing the structure of the information processing system based on eye tracking of the present invention.
  • the information processing system 10 based on eye tracking of the present invention includes:
  • the setting module 11 is used to set the eye movement trajectory and the corresponding relationship between the eye movement trajectory and the control instruction;
  • the prompt module 12 is used for prompting the eye movement trajectory that needs to be performed by the user;
  • the capturing module 13 is used to capture the eye movement trajectory of the user and obtain the eye movement trajectory;
  • the processing module 14 judges whether the eye movement trajectory captured by the capture module 13 is consistent with the eye movement trajectory set in the setting module 11, and executes the control command corresponding to the eye movement trajectory when it is judged that the degree of agreement between the two is above a prescribed ratio.
  • the prompt module 12 is an optional module.
  • the prompting module 12 may be configured to prompt the user by displaying a schematic diagram related to the eye movement trajectory on the display screen.
  • the capturing module 13 displays the captured current eye movement trajectory of the user on the display screen while capturing the eye movement trajectory of the user.
  • the eye movement trajectory is an eye movement trajectory based on a prescribed direction, for example including: an eye movement trajectory from left to right; an eye movement trajectory from right to left; an eye movement trajectory from top to bottom; and an eye movement trajectory from bottom to top.
  • the eye movement trajectory is an eye movement trajectory based on a prescribed pattern, for example, it includes: an eye movement trajectory based on a circle; an eye movement trajectory based on a polygon; and an eye movement trajectory based on an irregular pattern.
  • the eye movement trajectory is an eye movement trajectory based on random points, for example, it includes: an eye movement trajectory at a regular change point; and an eye movement trajectory at an irregular change point.
  • Fig. 3 shows a framework diagram of a payment processing method based on eye tracking according to an embodiment of the present invention.
  • the payment processing method is implemented by a payment processing device.
  • As an example, as shown in FIG. 3, a face-swiping device is used as the payment processing device.
  • the payment process is completed through the interaction between the face-swiping device and the payment server as the backend.
  • FIG. 4 is a schematic flowchart showing a payment processing method based on eye tracking according to an embodiment of the present invention.
  • a payment processing method based on eye tracking includes:
  • Setting step S1 Pre-setting one or more kinds of eye movement trajectories and control instructions corresponding to the eye movement trajectories;
  • Display step S2: the face-swiping device initiates a face collection request and at the same time displays, on the display screen, the eye operation instruction (that is, the eyeball trajectory) that the user's eyes need to complete;
  • Operation step S3 the user points the face at the camera, and at the same time completes the eye operation instructions on the display screen;
  • Capture step S4: the face-swiping device simultaneously collects the user's facial information and captures the eye movement trajectory;
  • Judging step S5: the face-swiping device judges whether the captured eye movement trajectory complies with the required eye operation instruction;
  • Step S6: if they are judged to be consistent (the judgment rule can require complete consistency, or consistency above a certain degree), the device uses the collected facial information to interact with the payment server and complete the face-swiping payment; if not, the face-swiping payment is terminated.
  • the control instruction (i.e., the eye operation instruction) corresponding to the eye movement trajectory includes any one of the following: confirmation of bill payment; confirmation of fee payment; and confirmation of willingness to pay.
  • the eye operation instructions can also be divided into two options: sliding confirmation and bill confirmation.
  • FIGS. 5 to 7 are specific schematic diagrams of eye operation instructions, described below with reference to FIGS. 5 to 7.
  • the user's eyes replace the fingers to complete the sliding operation.
  • the face-swiping device displays a sliding-confirmation effect indicating that this is the payment confirmation process, telling the user to slide their eyes from left to right; during this time the camera captures the user's eye movements, and once the left-to-right eye action is confirmed as completed, the payment confirmation operation is complete.
  • the user's face recognition and living body detection are also completed in this process.
  • the sliding confirmation button on the screen also slides synchronously with the eye movements. It should be clarified that left to right here is merely an example with a good experience; other confirmation methods are possible, such as top to bottom, random-point confirmation, and so on.
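The synchronized slider can be sketched as a mapping from the horizontal gaze position to slider progress. The slider endpoints, the normalized coordinate range, and the sweep-based confirmation rule below are invented for this example.

```python
# Hypothetical sketch of the "slide to confirm" feedback: the on-screen
# button's progress follows the horizontal gaze position, and the payment
# is treated as confirmed once the gaze has swept from the slider's start
# to its end. Endpoint values are illustrative.

def slider_progress(gaze_x, start_x=0.1, end_x=0.9):
    """Map a normalized gaze x-coordinate to slider progress in 0..1."""
    progress = (gaze_x - start_x) / (end_x - start_x)
    return max(0.0, min(1.0, progress))

def swipe_confirmed(gaze_xs, start_x=0.1, end_x=0.9):
    """Confirmed when the gaze has visited both ends of the slider."""
    return any(x <= start_x for x in gaze_xs) and any(x >= end_x for x in gaze_xs)
```

Redrawing the slider at `slider_progress(x)` each frame gives the user the visual feedback that the button is tracking their eyes.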
  • the face swiping device displays the product name, product quantity, payment time and payment amount in order from top to bottom on the display screen. As shown in Figure 6, each time one item is zoomed in and highlighted, the other items are displayed blurred and zoomed out.
  • the display sequence is from top to bottom, gradually guiding the user's eye focus to slide from top to bottom.
  • the camera monitors the user's eye movement in real time, and after recognizing the user's eye movement from top to bottom, it is judged that the current user is willing to pay.
  • FIG. 8 is a block diagram showing the structure of a payment processing device based on eye tracking according to an embodiment of the present invention.
  • the payment processing device here refers to the face swiping device described above.
  • the payment processing device 100 based on eye tracking of the present invention includes:
  • the setting module 110 is used to set and store the eye movement trajectory and payment related control instructions corresponding to the eye movement trajectory;
  • the display module 120 is used to display payment related information and display the eye movement trajectory required by the user;
  • the capturing module 130 is used to capture the eye movement trajectory of the user and obtain the eye movement trajectory;
  • the processing module 140 is used to determine whether the eye movement trajectory captured by the capture module is consistent with the eye movement trajectory set and stored in the setting module, and to execute the payment-related control instruction corresponding to the eye movement trajectory when the degree of agreement between the two is above a prescribed ratio.
  • the display module 120 is an optional module, and the payment processing device 100 of this embodiment can also be implemented without the display module 120.
  • the display module 120 includes:
  • the first display sub-module 121 is used to display payment related information
  • the second display sub-module 122 is used to display schematic diagrams related to the eye movement trajectory.
  • the capturing module 130 includes:
  • the first capturing sub-module 131 is used to capture the eye movement trajectory of the user
  • the second capturing sub-module 132 is used to capture a face image
  • the third capturing sub-module 133 is used to capture images for living body detection.
  • the second display sub-module 122 is further configured to dynamically display the user's current eye movement track captured by the first capture sub-module 131 on the display screen.
  • payment-related control instructions include any of the following: confirmation of payment of bills; confirmation of payment of fees; and confirmation of willingness to pay.
  • the eye movement trajectory is an eye movement trajectory based on a prescribed direction, for example including: an eye movement trajectory from left to right; an eye movement trajectory from right to left; an eye movement trajectory from top to bottom; and an eye movement trajectory from bottom to top.
  • the eye movement trajectory is an eye movement trajectory based on a prescribed pattern, for example, it includes: an eye movement trajectory based on a circle; an eye movement trajectory based on a polygon; and an eye movement trajectory based on an irregular pattern.
  • the eye movement trajectory is an eye movement trajectory based on random points, for example, it includes: an eye movement trajectory at a regular change point; and an eye movement trajectory at an irregular change point.
  • According to the payment processing method and payment processing device based on eye tracking of this embodiment, a user can confirm willingness to pay through eye tracking. This ensures that the current face-swiping payment reflects the user's true intention, prevents the user's face from being inadvertently collected by a face-swiping device to passively complete a payment, and does not require cumbersome operations such as entering a password or mobile phone number; the payment can be completed merely by moving the eyes, which offers high convenience.
  • FIG. 9 is a schematic flowchart showing an on-board information processing method based on eye tracking according to an embodiment of the present invention.
  • a vehicle-mounted information processing method includes:
  • Setting step S11: for example, an eye movement trajectory from left to right corresponds to the instruction to turn on the air conditioner, and a trajectory from right to left corresponds to the instruction to turn off the air conditioner; a trajectory from top to bottom corresponds to the instruction to open the sunroof, and a trajectory from bottom to top corresponds to the instruction to close the sunroof;
  • Display step S12: the on-board information processing device displays, on the display screen, the eye operation instruction (that is, the eyeball trajectory) that the user's eyes need to complete;
  • Operation step S13 the user aims at the camera to complete the eye operation instruction on the display screen;
  • step S14 the vehicle-mounted information processing device captures the eye movement trajectory
  • Judging step S15 the vehicle-mounted information processing device judges whether the captured eye movement trajectory is in compliance with the required eye operation instruction.
  • Step S16: if they are judged to be consistent (the judgment rule can require complete consistency, or consistency above a certain degree), the in-vehicle information processing device executes the corresponding control instruction.
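The air conditioner and sunroof examples above reduce to a lookup from a matched trajectory label to a vehicle instruction. The label strings and command names below are placeholders; a real head unit would call its own control APIs.

```python
# Illustrative dispatch table for the in-vehicle embodiment: a matched
# eye movement trajectory selects a vehicle instruction. Labels and
# command names are invented, not a real vehicle API.

VEHICLE_COMMANDS = {
    "left_to_right": "air_conditioner_on",
    "right_to_left": "air_conditioner_off",
    "top_to_bottom": "sunroof_open",
    "bottom_to_top": "sunroof_close",
}

def execute_instruction(trajectory_label, commands=VEHICLE_COMMANDS):
    """Return the instruction mapped to the matched trajectory, or None
    when no trajectory was matched (no instruction is executed)."""
    return commands.get(trajectory_label)
```

Only a trajectory that passed the judging step (S15) would be dispatched; an unmatched capture yields `None` and nothing is executed.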
  • FIG. 10 is a block diagram showing the structure of an in-vehicle information processing device based on eye tracking according to an embodiment of the present invention.
  • the in-vehicle information processing device 200 based on eye tracking of the present invention includes:
  • the setting module 210 is used to set and store the eye movement trajectory and the on-board information processing instructions corresponding to the eye movement trajectory;
  • the capturing module 230 is used to capture the eye movement trajectory of the user and obtain the eye movement trajectory;
  • the processing module 240 is used to determine whether the eye movement trajectory captured by the capture module 230 is consistent with the eye movement trajectory set and stored in the setting module, and to execute the vehicle-mounted information processing instruction corresponding to the eye movement trajectory when it is determined that the degree of agreement between the two is above a prescribed ratio.
  • the display module 220 is an optional module, and the display module 220 may also be omitted.
  • the display module 220 includes:
  • the first display sub-module 221 is used to display content related to vehicle information processing.
  • the second display sub-module 222 is used to display schematic diagrams related to the eye movement trajectory.
  • the capturing module 230 is used to capture the eye movement trajectory of the user.
  • the second display sub-module 222 is further configured to dynamically display, on the display screen, the user's current eye movement trajectory captured by the capturing module 230.
  • the in-vehicle information processing instructions include, for example, any of the following: control instructions related to vehicle display; control instructions related to vehicle operation; and related control instructions to confirm driving intentions.
  • the eye movement trajectory is an eye movement trajectory based on a prescribed direction, for example, including: eye movement trajectory from left to right; eye movement trajectory from right to left; eye movement trajectory from top to bottom; and from bottom to top Eye movement trajectory.
  • the eye movement trajectory is an eye movement trajectory based on a prescribed pattern, for example, it includes: an eye movement trajectory based on a circle; an eye movement trajectory based on a polygon; and an eye movement trajectory based on an irregular pattern.
  • the eye movement trajectory is an eye movement trajectory based on random points, for example, it includes: an eye movement trajectory at a regular change point; and an eye movement trajectory at an irregular change point.
  • with the eye-tracking-based in-vehicle information processing device and in-vehicle information processing method of this embodiment, it is possible to provide users with an eye-tracking-based method for confirming intent, which ensures that the current instruction reflects the user's true will; the user does not need to use hands, and the operation can be accomplished merely by moving the eyes, which offers high convenience.
  • the present invention also provides a computer-readable medium on which a computer program is stored, wherein, when the computer program is executed by a processor, the above-mentioned eye-tracking-based payment processing method is implemented.
  • the present invention also provides a computer device, including a storage module, a processor, and a computer program stored in the storage module and executable on the processor, wherein the processor, when executing the computer program, implements the above-mentioned eye-tracking-based payment processing method.
  • the present invention also provides a computer-readable medium on which a computer program is stored, wherein, when the computer program is executed by a processor, the above-mentioned eye-tracking-based information processing method is implemented.
  • the present invention also provides a computer device, including a storage module, a processor, and a computer program stored in the storage module and executable on the processor, wherein the processor, when executing the computer program, implements the above-mentioned eye-tracking-based information processing method.

Abstract

一种基于眼球追踪的信息处理方法及其信息处理系统、基于眼球追踪的支付处理方法及其支付处理装置。该信息处理方法包括:设置步骤,设置眼球运动轨迹以及与该眼球运动轨迹对应的控制指令;捕捉步骤,捕捉用户的眼球运动轨迹并得到眼球运动轨迹;以及处理步骤,判断所述捕捉步骤捕捉到的眼球运动轨迹是否与所述设置步骤中设置的眼球运动轨迹一致,当判断两者一致程度在规定比例以上的情况下执行与该眼球运动轨迹对应的控制指令。能够采集到用户的真实意愿并且能够简化用户操作。

Description

基于眼球追踪的信息处理方法及系统、支付处理方法 技术领域
本发明涉及计算机技术,具体地涉及一种基于眼球追踪的信息处理方法、基于眼球追踪的信息处理系统、基于眼球追踪的支付处理方法以及基于眼球追踪的支付处理装置。
背景技术
刷脸支付逐渐成为新的支付热点,为移动支付带来极大的便捷。用户在不用携带任何设备和卡片的前提下,只需刷一下脸即可完成支付。在感受刷脸支付的方便快捷的同时,刷脸支付的安全性也不容忽视。刷脸支付中,确定用户的支付意愿是至关重要的一步。传统的刷卡和二维码支付,用户需要输入支付密码或者自主打开手机,展示二维码或者扫描二维码,这些操作都可在一定程度上体现用户的支付意愿。
具体地,现有的刷脸支付根据用户需要提供的支付信息主要分为两大类:(1)人脸信息支付;(2)人脸信息+支付密码支付。
(1)人脸信息
人脸信息支付的正常流程为刷脸设备启动刷脸支付，采集用户人脸信息，验证人脸信息，完成支付。在此过程中，可能会需要用户输入完整的手机号码或者手机号码后四位，但随着刷脸次数的增加，会逐步取消该操作。
(2)人脸信息+支付密码支付
人脸信息+支付密码支付的正常流程为,刷脸设备启动刷脸支付,采集用户人脸信息,用户输入支付密码,验证人脸信息和支付密码,完成支付。
在这些现有技术中存在的问题是,人脸信息支付方案的主要问题是无法确认用户支付意愿。特别是在无需输入手机号码的模式中,用户很容易在不经意间被恶意商户采集人脸信息,完成刷脸支付。即使需要输入手机号码,手机号码也很容易泄露,导致用户被盗刷。
而且,人脸信息+支付密码支付方案中,存在的问题是用户需要记住支付密码并在键盘上输入支付密码,操作手续会比较繁琐。
另外,还有一种应用场景,例如在车载终端等的不方便用户通过手动进行操作的情况下,也希望提供一种能够利用眼睛或者眼球的运动轨迹进行信息操作的信息处理方法以及信息处理系统。
发明内容
鉴于上述问题,本发明旨在提出一种能够基于眼球追踪技术实现用户意愿表示或者用户意愿确认的基于眼球追踪的信息处理方法、基于眼球追踪的信息处理系统、基于眼球追踪的支付处理方法以及基于眼球追踪的支付处理装置。
本发明的基于眼球追踪的信息处理方法,其特征在于,包括:
设置步骤,设置眼球运动轨迹以及与该眼球运动轨迹对应的控制指令;
捕捉步骤,捕捉用户的眼球运动轨迹并得到眼球运动轨迹;以及
处理步骤,判断所述捕捉步骤捕捉到的眼球运动轨迹是否与所述设置步骤中设置的眼球运动轨迹一致,当判断两者一致程度在规定比例以上的情况下执行与该眼球运动轨迹对应的控制指令。
可选地,在所述设置步骤之后、所述捕捉步骤之前进一步包括:
提示步骤,提示用户需要进行的眼球运动轨迹。
可选地,在所述提示步骤中,在显示屏上显示眼球运动轨迹相关的示意图来提示用户。
可选地,在所述捕捉步骤中,在捕捉用户的眼球运动轨迹的同时将捕捉到的用户的当前眼球运动轨迹显示在显示屏上。
可选地,在所述捕捉步骤中,在捕捉用户的眼球运动轨迹的同时进行人脸图像捕捉。
可选地,在所述捕捉步骤中,在捕捉用户的眼球运动轨迹的同时进行活体检测的图像捕捉。
本发明的基于眼球追踪的支付处理方法,其特征在于,包括:
设置步骤,设置眼球运动轨迹以及与眼球运动轨迹对应的支付相关控制指令;
捕捉步骤,捕捉用户的眼球运动轨迹并得到眼球运动轨迹;以及
处理步骤,判断所述捕捉步骤捕捉到的眼球运动轨迹是否与所述设置步骤中设置的眼球运动轨迹一致,当判断两者一致程度在规定比例以上的情况下执行与所述眼球运动轨迹对应的支付相关控制指令。
可选地,在所述设置步骤之后、所述捕捉步骤之前进一步包括:
显示步骤,显示支付相关信息以及显示需要用户进行的眼球运动轨迹。
可选地,在所述显示步骤中,在显示屏上显示支付相关信息,并且显示眼球运动轨迹相关的示意图。
可选地,在所述捕捉步骤中,在捕捉用户的眼球运动轨迹的同时将捕捉到的用户的当前眼球运动轨迹动态地显示在显示屏上。
可选地,在所述捕捉步骤中,在捕捉用户的眼球运动轨迹的同时进行人脸图像捕捉。
可选地,在所述捕捉步骤中,在捕捉用户的眼球运动轨迹的同时进行活体检测的图像捕捉。
可选地,所述支付相关控制指令包括以下任意一项:
支付账单确认;
支付费用确认;以及
支付意愿确认。
本发明的基于眼球追踪的信息处理系统,其特征在于,包括:
设置模块,用于设置眼球运动轨迹与控制指令之间的对应关系;
捕捉模块,用于捕捉用户的眼球运动轨迹并得到眼球运动轨迹;以及
处理模块,判断所述捕捉模块捕捉到的眼球运动轨迹是否与所述设置模块设置的眼球运动轨迹一致,当判断两者一致程度在规定比例以上的情况下执行与所述眼球运动轨迹对应的控制指令。
可选地,进一步包括:提示模块,用于提示用户进行眼球运动轨迹。
可选地,所述提示模块通过在显示屏上显示眼球运动轨迹相关的示意图来提示用户。
可选地,所述捕捉模块在捕捉用户的眼球运动轨迹的同时将捕捉到的用户的当前眼球运动轨迹显示在显示屏上。
本发明的基于眼球追踪的支付处理装置,其特征在于,包括:
设置模块,用于设置并存储眼球运动轨迹以及与眼球运动轨迹对应的支付相关控制指令;
捕捉模块,用于捕捉用户的眼球运动轨迹并得到眼球运动轨迹;以及
处理模块,用于判断所述捕捉模块捕捉到的眼球运动轨迹是否与所述设置模块中设置并存储的眼球运动轨迹一致,当判断两者的一致程度在规定比例以上的情况下执行与所述眼球运动轨迹对应的支付相关控制指令。
可选地，进一步包括：显示模块，用于显示支付相关信息以及显示需要用户进行的眼球运动轨迹。
可选地,所述显示模块包括:
第一显示子模块,用于显示支付相关信息;
第二显示子模块,用于显示眼球运动轨迹相关的示意图。
可选地,所述捕捉模块包括:
第一捕捉子模块,用于捕捉用户的眼球运动轨迹;以及
第二捕捉子模块,用于进行人脸图像的捕捉。
可选地,所述捕捉模块进一步包括:
第三子模块,用于进行活体检测的图像的捕捉。
可选地,所述第二显示子模块进一步用于将由第一捕捉子模块捕捉到的用户的当前眼球运动轨迹动态地显示在显示屏上。
可选地,所述支付相关控制指令包括以下任意一项:
支付账单确认;
支付费用确认;以及
支付意愿确认。
本发明的计算机可读介质,其上存储有计算机程序,其特征在于,该计算机程序被处理器执行时实现上述的基于眼球追踪的支付处理方法。
本发明的计算机设备,包括存储模块、处理器以及存储在存储模块上并可在处理器上运行的计算机程序,其特征在于,所述处理器执行所述计算机程序时实现上述的基于眼球追踪的支付处理方法。
本发明的计算机可读介质,其上存储有计算机程序,其特征在于,该计算机程序被处理器执行时实现上述的基于眼球追踪的信息处理方法。
本发明的计算机设备,包括存储模块、处理器以及存储在存储模块上并可在处理器上运行的计算机程序,其特征在于,所述处理器执行所述计算机程序时实现上述的基于眼球追踪的信息处理方法。
如上所述，本发明的基于眼球追踪的支付处理方法以及支付处理装置（例如刷脸装置），能够通过显示设备向用户显示有关眼部操作的指引，由用户用眼睛完成一段操作过程，例如左右滑动和确认账单信息等，支付处理装置通过摄像头捕捉并监测用户的眼球动作：如果用户眼球正确地完成了操作，则认为此时是用户的自主支付请求；如果没有正确完成，则认为此时不是用户的自主支付请求。由此能够简化用户操作，提高便捷性。
附图说明
图1是表示本发明的基于眼球追踪的信息处理方法的流程示意图。
图2是表示本发明的基于眼球追踪的信息处理系统的构造框图。
图3表示本发明一实施方式的基于眼球追踪的支付处理方法的构架图。
图4是表示本发明一实施方式的基于眼球追踪的支付处理方法的流程示意图。
图5~图7是表示与眼部操作指令相关的具体示意图。
图8是表示本发明一实施方式的基于眼球追踪的支付处理装置的构造框图。
图9是表示本发明一实施方式的基于眼球追踪的车载信息处理方法的流程示意图。
图10是表示本发明一实施方式的基于眼球追踪的车载信息处理装置的构造框图。
具体实施方式
下面介绍的是本发明的多个实施例中的一些,旨在提供对本发明的基本了解。并不旨在确认本发明的关键或决定性的要素或限定所要保护的范围。
出于简洁和说明性目的,本文主要参考其示范实施例来描述本发明的原理。但是,本领域技术人员将容易地认识到,相同的原理可等效地应用于所有类型的基于眼球追踪的信息处理方法、基于眼球追踪的信息处理系统、基于眼球追踪的支付处理方法以及基于眼球追踪的支付处理装置,并且可以在其中实施这些相同的原理,以及任何此类变化不背离本专利申请的真实精神和范围。
而且，在下文描述中，参考了附图，这些附图图示特定的示范实施例。在不背离本发明的精神和范围的前提下可以对这些实施例进行电、机械、逻辑和结构上的更改。此外，虽然本发明的特征可能仅结合若干实施方式/实施例中的其中之一来公开，但是如针对任何给定或可识别的功能可能是期望和/或有利的，可以将此特征与其他实施方式/实施例的一个或多个其他特征进行组合。因此，下文描述不应视为限制性的，并且本发明的范围由所附权利要求及其等效物来定义。
诸如“具备”和“包括”之类的用语表示除了具有在说明书和权利要求书中有直接和明确表述的单元(模块)和步骤以外,本发明的技术方案也不排除具有未被直接或明确表述的其它单元(模块)和步骤的情形。
在说明本发明的基于眼球追踪的信息处理方法、基于眼球追踪的信息处理系统、基于眼球追踪的支付处理方法以及基于眼球追踪的支付处理装置之前,先简单说明眼球追踪技术。
眼球追踪技术是一项科学应用技术，其实现方式主要有三种：一是根据眼球和眼球周边的特征变化进行跟踪，二是根据虹膜角度变化进行跟踪，三是主动投射红外线等光束到虹膜来提取特征。借助眼球追踪，用户例如无需触摸屏幕即可翻动页面。从原理上看，眼球追踪主要是研究眼球运动信息的获取、建模和模拟，用途颇广。而获取眼球运动信息的设备除了红外设备之外，还可以是图像采集设备，甚至一般电脑或手机上的摄像头，其在软件的支持下也可以实现眼球跟踪。
眼球追踪技术的原理在于,当人的眼睛看向不同方向时,眼部会有细微的变化,这些变化会产生可以提取的特征,计算机可以通过图像捕捉或扫描提取这些特征,从而实时追踪眼睛的变化,预测用户的状态和需求,并进行响应,达到用眼睛控制设备的目的。
眼球追踪技术的支持设备主要包括:红外设备和图像采集设备。在精度方面,红外线投射方式有比较大的优势,大概能在30英寸的屏幕上精确到1厘米以内,辅以眨眼识别、注视识别等技术,已经可以在一定程度上替代鼠标、触摸板,进行一些有限的操作。此外,其他图像采集设备,如电脑或手机上的摄像头,在软件的支持下也可以实现眼球跟踪,但是在准确性、速度和稳定性上各有差异。
图1是表示本发明的基于眼球追踪的信息处理方法的流程示意图。
如图1所示,本发明的基于眼球追踪的信息处理方法包括:
设置步骤S10:设置一种或者多种的眼球运动轨迹以及与该种类型的眼球运动轨迹对应的控制指令;
提示步骤S20:提示用户进行眼球运动轨迹;
捕捉步骤S30:捕捉用户的眼球运动轨迹并得到眼球运动轨迹;以及
处理步骤S40：判断获得的眼球运动轨迹是否与预设的眼球运动轨迹一致，当判断两者一致程度在规定比例以上的情况下执行与该眼球运动轨迹对应的控制指令。
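上述处理步骤S40的判断逻辑可以示意性地概括如下。以下代码仅为说明性草案：函数名、容差半径与阈值（0.8）均为本文假设，并非专利原文规定的实现方式。

```python
# 示意性草案：将捕捉到的眼球运动轨迹与预设轨迹比较，
# 当一致程度在规定比例以上时执行对应的控制指令（对应步骤S40）。

def agreement_ratio(captured, preset):
    """一致程度：捕捉轨迹中落在预设轨迹容差范围内的采样点比例"""
    if not captured or not preset:
        return 0.0
    hits = 0
    for (cx, cy) in captured:
        # 若某个预设点在容差半径内，则认为该采样点"匹配"（容差为假设值）
        if any((cx - px) ** 2 + (cy - py) ** 2 <= 0.01 for (px, py) in preset):
            hits += 1
    return hits / len(captured)

def process(captured, preset, instruction, threshold=0.8):
    """一致程度达到规定比例（此处假设为0.8）时才执行控制指令"""
    if agreement_ratio(captured, preset) >= threshold:
        return instruction()
    return None

preset = [(x / 10, 0.5) for x in range(11)]      # 预设：从左到右的水平轨迹
captured = [(x / 10, 0.52) for x in range(11)]   # 捕捉到的轨迹与预设接近
result = process(captured, preset, lambda: "指令A")
```

此处以逐点容差匹配作为"一致程度"的一种最简单的度量方式；实际系统也可以采用其他轨迹相似度算法，专利原文对此未作限定。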
在设置步骤S10中，作为一个示例，可以是设置一种眼球运动轨迹以及与该种眼球运动轨迹对应的控制指令，例如设置眼球运动轨迹从左到右的情况下执行控制指令A。
另外,提示步骤S20是可选步骤,也可以不提示用户进行眼球运动轨迹。
作为另一个示例,也可以设置多种眼球运动轨迹以及分别对应多种控制指令,例如设置第一种眼球运动轨迹(例如眼球运动轨迹从左到右)对应控制指令A,第二种眼球运动轨迹(例如眼球运动轨迹从右到左)对应控制指令B。
在提示步骤S20中，作为可选的方式，可以是在显示屏上显示眼球运动轨迹相关的示意图（动画图或者静止图）来提示用户，当然，也可以通过声音或者文字方式提示需要用户进行的眼球运动轨迹。
在捕捉步骤S30中，在捕捉用户的眼球运动轨迹的同时将捕捉到的用户的当前眼球运动轨迹显示在显示屏上。在捕捉步骤S30中，在同时需要进行人脸图像检测的情况下，在捕捉用户的眼球运动轨迹的同时进行人脸图像捕捉。在捕捉步骤S30中，在同时需要进行人脸图像检测并需要进行活体检测的情况下，在捕捉用户的眼球运动轨迹的同时进行人脸图像捕捉和活体检测的图像捕捉。
作为一个示例,眼球运动轨迹是基于规定方向的眼球运动轨迹,例如包括:从左向右的眼球运动轨迹;从右向左的眼球运动轨迹;从上向下的眼球运动轨迹;以及从下向上的眼球运动轨迹。
作为一个示例,眼球运动轨迹是基于规定图形的眼球运动轨迹,例如包括:基于圆形的眼球运动轨迹;基于多边形的眼球运动轨迹;以及基于不规则图形的眼球运动轨迹。
作为一个示例,眼球运动轨迹是基于随机点的眼球运动轨迹,例如包括:规则变化点的眼球运动轨迹;以及不规则变化点的眼球运动轨迹。
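对于基于规定方向的眼球运动轨迹，其识别可以示意性地通过比较首末注视点在两个轴向上的净位移来实现。以下为假设性草案，函数名与判别方式均为本文示意，并非专利原文的实现。

```python
# 示意性草案：根据首末采样点的净位移，将轨迹分类为四种规定方向之一。
# 屏幕坐标系中 y 向下增大，因此 dy > 0 对应"从上向下"。

def classify_direction(points):
    """points: [(x, y), ...] 按时间顺序排列的注视点采样"""
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if abs(dx) >= abs(dy):
        return "left_to_right" if dx > 0 else "right_to_left"
    return "top_to_bottom" if dy > 0 else "bottom_to_top"

direction = classify_direction([(0.1, 0.5), (0.4, 0.52), (0.9, 0.5)])
```

基于规定图形（圆形、多边形）或随机点的轨迹则需要更细的逐点比对，可沿用前述一致程度比例的思路。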
图2是表示本发明的基于眼球追踪的信息处理系统的构造框图。
如图2所示,本发明的基于眼球追踪的信息处理系统10包括:
设置模块11,用于设置眼球运动轨迹以及眼球运动轨迹与控制指令之间的对应关系;
提示模块12,用于提示需要用户进行的眼球运动轨迹;
捕捉模块13,用于捕捉用户的眼球运动轨迹并得到眼球运动轨迹;以及
处理模块14,判断捕捉模块13捕捉到的眼球运动轨迹与设置模块11中设置的眼球运动轨迹是否一致,当判断两者一致程度在规定比例以上的情况下执行与眼球运动轨迹对应的控制指令。
其中,提示模块12是可选模块。提示模块12可以设置为通过在显示屏上显示眼球运动轨迹相关的示意图来提示用户。
捕捉模块13在捕捉用户的眼球运动轨迹的同时将捕捉到的用户的当前眼球运动轨迹显示在显示屏上。
作为一个示例,眼球运动轨迹是基于规定方向的眼球运动轨迹,例如包括:从左向右的眼球运动轨迹;从右向左的眼球运动轨迹;从上向下的眼球运动轨迹;以及从下向上的眼球运动轨迹。
作为一个示例,眼球运动轨迹是基于规定图形的眼球运动轨迹,例如包括:基于圆形的眼球运动轨迹;基于多边形的眼球运动轨迹;以及基于不规则图形的眼球运动轨迹。
作为一个示例,眼球运动轨迹是基于随机点的眼球运动轨迹,例如包括:规则变化点的眼球运动轨迹;以及不规则变化点的眼球运动轨迹。
接下来对于本发明的基于眼球追踪的信息处理方法、基于眼球追踪的信息处理系统的具体应用场景的实施方式进行说明。首先说明人脸支付场景下的实施方式,其次说明车载信息操作场景下的实施方式。
图3表示本发明一实施方式的基于眼球追踪的支付处理方法的构架图。该支付处理方法由支付处理装置来实现,作为一个示例,如图3所示,这里例举采用刷脸设备作为支付处理装置的示例。通过刷脸设备与作为后台的支付服务器之间的交互完成支付流程。
图4是表示本发明一实施方式的基于眼球追踪的支付处理方法的流程示意图。
如图4所示,本发明一实施方式的基于眼球追踪的支付处理方法,包括:
设置步骤S1:预先设置一种或者多种的眼球运动轨迹以及与该眼球运动轨迹对应的控制指令;
请求步骤S2:刷脸设备发起人脸采集请求,同时在显示屏上展示需要用户眼部完成的眼球操作指令(即眼球运动轨迹);
操作步骤S3:用户将人脸对准摄像头,同时完成显示屏上的眼部操作指令;
捕捉步骤S4:刷脸设备同时采集用户人脸信息并且捕捉眼球运动轨迹;
判断步骤S5：刷脸设备判定捕捉到的眼球运动轨迹是否符合要求进行的眼球操作指令；以及
执行步骤S6:如果判断符合(这里判断规则可以设置为完全一致,也可以设置为一定程度比例以上一致),则采集人脸信息并与支付服务器交互完成刷脸支付,如果不符合则结束刷脸支付。
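步骤S2～S6的控制流程可以示意性地概括为：只有轨迹符合且人脸验证通过时才完成支付。以下函数名与返回值字符串均为本文假设，与支付服务器的实际交互在此从略。

```python
# 示意性草案：判断步骤S5与执行步骤S6的门控逻辑。
# 轨迹不符合眼部操作指令，或人脸验证未通过，均结束本次刷脸支付。

def run_face_payment(captured, required, face_verified):
    """captured/required: 轨迹标签（如 "left_to_right"）；face_verified: 人脸验证结果"""
    if captured != required:        # 判断步骤S5：轨迹不符合要求的眼部操作指令
        return "payment_aborted"    # 执行步骤S6：结束刷脸支付
    if not face_verified:           # 人脸信息验证（实际由支付服务器完成）
        return "payment_aborted"
    return "payment_completed"      # 执行步骤S6：与支付服务器交互完成刷脸支付
```

实际实现中"符合"可按专利所述设置为完全一致或一定比例以上一致，此处为简明起见用相等判断代替。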
作为示例,该眼球运动轨迹对应的控制指令(即眼部操作指令)包括以下任意一项:支付账单确认;支付费用确认;以及支付意愿确认。
另外,也可以将眼部操作指令分为两种方案,滑动确认和账单确认。图5~图7是表示与眼部操作指令相关的具体示意图,参照图5~图7进行说明。
首先描述滑动确认。
如图5所示,类似于智能手机上的滑动解锁,这里用户的眼睛代替了手指完成滑动操作。在商户发起一次刷脸支付时,刷脸设备显示一个滑动确认的特效同时显示这是支付确认的过程,告知用户需要用眼睛从左到右滑动,在此期间摄像头捕捉用户眼球运动情况,确认眼球完成了从左到右的动作,完成支付确认操作,与此同时,用户的人脸识别和活体检测也在此过程完成。
为了提高用户体验,屏幕的滑动确认按钮也会和眼睛动作同步滑动。需要明确的是此处从左到右,只是一种体验较好的实施例,还存在其他可能的确认方式,如从上到下,随机点确认等。
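滑动确认按钮与眼睛动作同步的效果，可以示意性地通过把注视点横坐标映射为滑块进度来实现。以下阈值（0.05、0.95）与函数名均为本文假设。

```python
# 示意性草案：注视点横坐标驱动屏幕上的滑动确认按钮，
# 注视先接近左端、后到达右端时，判定完成"从左到右"的滑动确认。

def slider_progress(gaze_x, left=0.0, right=1.0):
    """把注视点横坐标裁剪映射为 [0, 1] 的滑块进度，按钮随之移动"""
    return min(1.0, max(0.0, (gaze_x - left) / (right - left)))

def swipe_confirmed(gaze_xs, start_tol=0.05, end_threshold=0.95):
    """gaze_xs: 按时间顺序采样的注视点横坐标序列"""
    p = [slider_progress(x) for x in gaze_xs]
    return min(p) <= start_tol and max(p) >= end_threshold
```

从上到下滑动、随机点确认等其他方式只需替换坐标轴或目标点序列，逻辑相同。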
其次,描述确认账单信息。
刷脸设备在显示屏中由上到下依次显示商品名称、商品数量、支付时间和支付金额。如图6所示，每次放大突出显示一项内容，其他项内容模糊缩小显示。显示顺序由上到下，逐步引导用户眼睛焦点由上到下滑动。同时摄像头实时监控用户眼球运动情况，当识别出用户眼球从上到下的运动轨迹后，判断当前用户有支付意愿。
作为一个优选方式,人脸识别和活体检测也在以上过程中间完成。在逐项信息显示结束后,刷脸设备会将所有信息突出放大展示,如图7所示。需要明确的是此处由上到下的确认账单,只是一种体验较好的实施例,还存在其他可能的确认方式,如从左到右等。账单内容也只是一种示例,账单具体内容可根据不同场景变化。
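账单确认的逐项显示与由上到下的轨迹判定，可以示意性地表示如下。渲染帧结构与跨度阈值（0.5）均为本文假设，账单项目沿用文中示例。

```python
# 示意性草案：账单各项由上到下逐项放大突出显示，其余项模糊缩小；
# 当识别出注视点纵坐标总体由上到下移动时，判定用户有支付意愿。

BILL_ITEMS = ["商品名称", "商品数量", "支付时间", "支付金额"]

def render_frames(items):
    """每帧放大突出一项（enlarged），其余项缩小显示（dimmed）"""
    return [[(name, "enlarged" if i == j else "dimmed")
             for j, name in enumerate(items)]
            for i in range(len(items))]

def intent_confirmed(gaze_ys, min_span=0.5):
    """屏幕 y 向下增大：注视 y 单调不减且跨度足够时，判定由上到下"""
    return gaze_ys == sorted(gaze_ys) and gaze_ys[-1] - gaze_ys[0] >= min_span

frames = render_frames(BILL_ITEMS)
```

逐项显示结束后将所有信息突出放大展示（对应图7）的一帧，可在 frames 末尾追加全部 "enlarged" 的一帧实现。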
图8是表示本发明一实施方式的基于眼球追踪的支付处理装置的构造框图。这里的支付处理装置是指上文中说明的刷脸装置。
如图8所示,本发明的基于眼球追踪的支付处理装置100包括:
设置模块110,用于设置并存储眼球运动轨迹以及与眼球运动轨迹对应的支付相关控制指令;
显示模块120，用于显示支付相关信息以及显示需要用户进行的眼球运动轨迹；
捕捉模块130,用于捕捉用户的眼球运动轨迹并得到眼球运动轨迹;以及
处理模块140,用于判断所述捕捉模块捕捉到的眼球运动轨迹是否与所述设置模块中设置并存储的眼球运动轨迹一致,当判断两者的一致程度在规定比例以上的情况下执行与所述眼球运动轨迹对应的支付相关控制指令。
其中，显示模块120是可选模块，即使不设置显示模块120也同样能够实现本实施方式的支付处理装置100。
其中,显示模块120包括:
第一显示子模块121,用于显示支付相关信息;以及
第二显示子模块122,用于显示眼球运动轨迹相关的示意图。
其中,捕捉模块130包括:
第一捕捉子模块131,用于捕捉用户的眼球运动轨迹;
第二捕捉子模块132,用于进行人脸图像的捕捉;以及
第三捕捉子模块133,用于进行活体检测的图像的捕捉。
这样,第二显示子模块122进一步用于将由第一捕捉子模块131捕捉到的用户的当前眼球运动轨迹动态地显示在显示屏上。
其中，支付相关控制指令包括以下任意一项：支付账单确认；支付费用确认；以及支付意愿确认。
作为一个示例,眼球运动轨迹是基于规定方向的眼球运动轨迹,例如包括:从左向右的眼球运动轨迹;从右向左的眼球运动轨迹;从上向下的眼球运动轨迹;以及从下向上的眼球运动轨迹。
作为一个示例,眼球运动轨迹是基于规定图形的眼球运动轨迹,例如包括:基于圆形的眼球运动轨迹;基于多边形的眼球运动轨迹;以及基于不规则图形的眼球运动轨迹。
作为一个示例,眼球运动轨迹是基于随机点的眼球运动轨迹,例如包括:规则变化点的眼球运动轨迹;以及不规则变化点的眼球运动轨迹。
根据本实施方式的基于眼球追踪的支付处理方法以及支付处理装置,能够为用户提供一种基于眼球追踪的支付意愿确认方法,能够确保当前刷脸支付为用户真实意愿,避免用户人脸在不经意间被刷脸设备采集,被动的完成刷脸支付,而且不需要用户输入密码或者手机号码等的繁琐操作,只需用户移动眼睛目光就能够实现,具有便利性高的优点。
接着,说明将本发明应用于车载终端的场景。
图9是表示本发明一实施方式的基于眼球追踪的车载信息处理方法的流程示意图。
如图9所示,本发明一实施方式的车载信息处理方法,包括:
设置步骤S11：预先在车载信息处理装置中设置一种或者多种的眼球运动轨迹以及与该眼球运动轨迹对应的控制指令，例如眼球运动轨迹为从左到右对应打开空调指令，眼球运动轨迹为从右到左对应关闭空调指令，又例如眼球运动轨迹为从上到下对应打开天窗指令，眼球运动轨迹为从下到上对应关闭天窗指令等；
请求步骤S12:车载信息处理装置在显示屏上展示需要用户眼部完成的眼球操作指令(即眼球运动轨迹);
操作步骤S13:用户对准摄像头完成显示屏上的眼部操作指令;
捕捉步骤S14:车载信息处理装置捕捉眼球运动轨迹;
判断步骤S15：车载信息处理装置判定捕捉到的眼球运动轨迹是否符合要求进行的眼球操作指令；以及
执行步骤S16：如果判断符合（这里判断规则可以设置为完全一致，也可以设置为一定程度比例以上一致），则车载信息处理装置完成对应的控制指令。
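步骤S11中轨迹与车载控制指令的对应关系，以及步骤S15～S16的分发逻辑，可以示意性地表示如下。映射表沿用文中示例指令，函数名为本文假设。

```python
# 示意性草案：规定方向的轨迹标签映射到车载控制指令（对应设置步骤S11的示例）。

COMMANDS = {
    "left_to_right": "打开空调",
    "right_to_left": "关闭空调",
    "top_to_bottom": "打开天窗",
    "bottom_to_top": "关闭天窗",
}

def dispatch(matched_label):
    """判断步骤S15/执行步骤S16：轨迹符合时执行对应指令，否则不执行"""
    return COMMANDS.get(matched_label)  # 未匹配任何预设轨迹时返回 None

action = dispatch("left_to_right")
```

实际装置中 matched_label 可由前述的方向分类或一致程度判定得出，专利原文对具体算法未作限定。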
图10是表示本发明一实施方式的基于眼球追踪的车载信息处理装置的构造框图。
如图10所示,本发明的基于眼球追踪的车载信息处理装置200包括:
设置模块210,用于设置并存储眼球运动轨迹以及与眼球运动轨迹对应的车载信息处理指令;
捕捉模块230,用于捕捉用户的眼球运动轨迹并得到眼球运动轨迹;以及
处理模块240,用于判断所述捕捉模块230捕捉到的眼球运动轨迹是否与所述设置模块中设置并存储的眼球运动轨迹一致,当判断两者的一致程度在规定比例以上的情况下执行与所述眼球运动轨迹对应的车载信息处理指令。
其中,显示模块220是可选模块,也可以省略设置显示模块220。
其中,显示模块220包括:
第一显示子模块221,用于显示车载信息处理相关内容;以及
第二显示子模块222,用于显示眼球运动轨迹相关的示意图。
其中,捕捉模块230用于捕捉用户的眼球运动轨迹。
这样，第二显示子模块222进一步用于将由捕捉模块230捕捉到的用户的当前眼球运动轨迹动态地显示在显示屏上。
其中,车载信息处理指令例如包括以下任意一项:与车载显示相关的控制指令;与车辆操作相关的控制指令;以及确认驾驶意愿的相关控制指令。
作为一个示例,眼球运动轨迹是基于规定方向的眼球运动轨迹,例如包括:从左向右的眼球运动轨迹;从右向左的眼球运动轨迹;从上向下的眼球运动轨迹;以及从下向上的眼球运动轨迹。
作为一个示例,眼球运动轨迹是基于规定图形的眼球运动轨迹,例如包括:基于圆形的眼球运动轨迹;基于多边形的眼球运动轨迹;以及基于不规则图形的眼球运动轨迹。
作为一个示例,眼球运动轨迹是基于随机点的眼球运动轨迹,例如包括:规则变化点的眼球运动轨迹;以及不规则变化点的眼球运动轨迹。
根据本实施方式的基于眼球追踪的车载信息处理装置以及车载信息处理方法，能够为用户提供一种基于眼球追踪的用户意愿确认方法，能够确保当前指令为用户真实意愿，而且用户不需要用手进行操作，只需移动眼睛目光就能够实现，具有便利性高的优点。
如上所述,本发明还提供一种计算机可读介质,其上存储有计算机程序,其特征在于,该计算机程序被处理器执行时实现上述的基于眼球追踪的支付处理方法。
本发明还提供一种计算机设备,包括存储模块、处理器以及存储在存储模块上并可在处理器上运行的计算机程序,其特征在于,所述处理器执行所述计算机程序时实现上述的基于眼球追踪的支付处理方法。
本发明还提供一种计算机可读介质,其上存储有计算机程序,其特征在于,该计算机程序被处理器执行时实现上述的基于眼球追踪的信息处理方法。
本发明还提供一种计算机设备,包括存储模块、处理器以及存储在存储模块上并可在处理器上运行的计算机程序,其特征在于,所述处理器执行所述计算机程序时实现上述的基于眼球追踪的信息处理方法。
以上例子主要说明了本发明的基于眼球追踪的信息处理方法、基于眼球追踪的信息处理系统、基于眼球追踪的支付处理方法以及基于眼球追踪的支付处理装置。尽管只对其中一些本发明的具体实施方式进行了描述,但是本领域普通技术人员应当了解,本发明可以在不偏离其主旨与范围内以许多其他的形式实施。因此,所展示的例子与实施方式被视为示意性的而非限制性的,在不脱离如所附各权利要求所定义的本发明精神及范围的情况下,本发明可能涵盖各种的修改与替换。

Claims (22)

  1. 一种基于眼球追踪的信息处理方法,其特征在于,包括:
    设置步骤,设置眼球运动轨迹以及与该眼球运动轨迹对应的控制指令;
    捕捉步骤,捕捉用户的眼球运动轨迹并得到眼球运动轨迹;以及
    处理步骤,判断所述捕捉步骤捕捉到的眼球运动轨迹是否与所述设置步骤中设置的眼球运动轨迹一致,当判断两者一致程度在规定比例以上的情况下执行与该眼球运动轨迹对应的控制指令。
  2. 如权利要求1所述的基于眼球追踪的信息处理方法,其特征在于,
    在所述设置步骤之后、所述捕捉步骤之前进一步具备:
    提示步骤,提示用户需要进行的眼球运动轨迹。
  3. 如权利要求2所述的基于眼球追踪的信息处理方法,其特征在于,
    在所述提示步骤中,在显示屏上显示眼球运动轨迹相关的示意图来提示用户。
  4. 如权利要求3所述的基于眼球追踪的信息处理方法,其特征在于,
    在所述捕捉步骤中,在捕捉用户的眼球运动轨迹的同时将捕捉到的用户的当前眼球运动轨迹显示在显示屏上。
  5. 如权利要求1所述的基于眼球追踪的信息处理方法,其特征在于,
    所述眼球运动轨迹是以下任意一种：
    基于规定方向的眼球运动轨迹；
    基于规定图形的眼球运动轨迹；以及
    基于随机点的眼球运动轨迹。
  6. 如权利要求1所述的基于眼球追踪的信息处理方法,其特征在于,
    在所述捕捉步骤中,在捕捉用户的眼球运动轨迹的同时进行人脸图像捕捉,或者,在捕捉用户的眼球运动轨迹的同时进行活体检测的图像捕捉。
  7. 一种基于眼球追踪的支付处理方法,其特征在于,包括:
    设置步骤,设置眼球运动轨迹以及与眼球运动轨迹对应的支付相关控制指令;
    捕捉步骤,捕捉用户的眼球运动轨迹并得到眼球运动轨迹;以及
    处理步骤，判断所述捕捉步骤捕捉到的眼球运动轨迹是否与所述设置步骤中设置的眼球运动轨迹一致，当判断两者一致程度在规定比例以上的情况下执行与所述眼球运动轨迹对应的支付相关控制指令。
  8. 如权利要求7所述的基于眼球追踪的支付处理方法,其特征在于,
    在所述设置步骤之后、所述捕捉步骤之前进一步具备:
    显示步骤,显示支付相关信息以及显示需要用户进行的眼球运动轨迹。
  9. 如权利要求8所述的基于眼球追踪的支付处理方法，其特征在于，
    在所述显示步骤中,在显示屏上显示支付相关信息,并且显示眼球运动轨迹相关的示意图。
  10. 如权利要求8所述的基于眼球追踪的支付处理方法,其特征在于,
    在所述捕捉步骤中,在捕捉用户的眼球运动轨迹的同时将捕捉到的用户的当前眼球运动轨迹动态地显示在显示屏上。
  11. 如权利要求7所述的基于眼球追踪的支付处理方法,其特征在于,
    所述支付相关控制指令包括以下任意一项:
    支付账单确认;
    支付费用确认;以及
    支付意愿确认。
  12. 一种基于眼球追踪的信息处理系统,其特征在于,包括:
    设置模块,用于设置眼球运动轨迹与控制指令之间的对应关系;
    捕捉模块,用于捕捉用户的眼球运动轨迹并得到眼球运动轨迹;以及
    处理模块,判断所述捕捉模块捕捉到的眼球运动轨迹是否与所述设置模块设置的眼球运动轨迹一致,当判断两者一致程度在规定比例以上的情况下执行与所述眼球运动轨迹对应的控制指令。
  13. 一种基于眼球追踪的支付处理装置,其特征在于,包括:
    设置模块,用于设置并存储眼球运动轨迹以及与眼球运动轨迹对应的支付相关控制指令;
    捕捉模块,用于捕捉用户的眼球运动轨迹并得到眼球运动轨迹;以及
    处理模块,用于判断所述捕捉模块捕捉到的眼球运动轨迹是否与所述设置模块中设置并存储的眼球运动轨迹一致,当判断两者的一致程度在规定比例以上的情况下执行与所述眼球运动轨迹对应的支付相关控制指令。
  14. 如权利要求13所述的基于眼球追踪的支付处理装置,其特征在于,进一步具备:
    显示模块，用于显示支付相关信息以及显示需要用户进行的眼球运动轨迹。
  15. 如权利要求13所述的基于眼球追踪的支付处理装置,其特征在于,
    所述显示模块包括:
    第一显示子模块,用于显示支付相关信息;
    第二显示子模块,用于显示眼球运动轨迹相关的示意图。
  16. 如权利要求15所述的基于眼球追踪的支付处理装置,其特征在于,
    所述捕捉模块包括:
    第一捕捉子模块,用于捕捉用户的眼球运动轨迹;以及
    第二捕捉子模块,用于进行人脸图像的捕捉。
  17. 如权利要求16所述的基于眼球追踪的支付处理装置,其特征在于,所述捕捉模块进一步包括:
    第三子模块,用于进行活体检测的图像的捕捉。
  18. 如权利要求17所述的基于眼球追踪的支付处理装置,其特征在于,
    所述第二显示子模块进一步用于将由第一捕捉子模块捕捉到的用户的当前眼球运动轨迹动态地显示在显示屏上。
  19. 一种计算机可读介质,其上存储有计算机程序,其特征在于,
    该计算机程序被处理器执行时实现权利要求1~6任意一项所述的基于眼球追踪的信息处理方法。
  20. 一种计算机设备，包括存储模块、处理器以及存储在存储模块上并可在处理器上运行的计算机程序，其特征在于，所述处理器执行所述计算机程序时实现权利要求1~6任意一项所述的基于眼球追踪的信息处理方法。
  21. 一种计算机可读介质,其上存储有计算机程序,其特征在于,
    该计算机程序被处理器执行时实现权利要求7~11任意一项所述的基于眼球追踪的支付处理方法。
  22. 一种计算机设备，包括存储模块、处理器以及存储在存储模块上并可在处理器上运行的计算机程序，其特征在于，所述处理器执行所述计算机程序时实现权利要求7~11任意一项所述的基于眼球追踪的支付处理方法。
PCT/CN2021/073910 2020-02-20 2021-01-27 基于眼球追踪的信息处理方法及系统、支付处理方法 WO2021164511A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010104148.5 2020-02-20
CN202010104148.5A CN111667265A (zh) 2020-02-20 2020-02-20 基于眼球追踪的信息处理方法及系统、支付处理方法

Publications (1)

Publication Number Publication Date
WO2021164511A1 true WO2021164511A1 (zh) 2021-08-26

Family

ID=72382538

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/073910 WO2021164511A1 (zh) 2020-02-20 2021-01-27 基于眼球追踪的信息处理方法及系统、支付处理方法

Country Status (2)

Country Link
CN (1) CN111667265A (zh)
WO (1) WO2021164511A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111667265A (zh) * 2020-02-20 2020-09-15 中国银联股份有限公司 基于眼球追踪的信息处理方法及系统、支付处理方法
CN112270210B (zh) * 2020-10-09 2024-03-01 珠海格力电器股份有限公司 数据处理、操作指令识别方法、装置、设备和介质

Citations (4)

Publication number Priority date Publication date Assignee Title
CN105205379A (zh) * 2015-10-28 2015-12-30 广东欧珀移动通信有限公司 一种终端应用的控制方法、装置和终端
CN105847538A (zh) * 2016-03-16 2016-08-10 惠州Tcl移动通信有限公司 基于眼球追踪控制vr眼镜操作的手机及方法
CN110209269A (zh) * 2019-05-07 2019-09-06 谷东科技有限公司 基于ar技术的线下购物方法、装置、存储介质及终端设备
CN111667265A (zh) * 2020-02-20 2020-09-15 中国银联股份有限公司 基于眼球追踪的信息处理方法及系统、支付处理方法

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
CN103616953A (zh) * 2013-11-27 2014-03-05 福州瑞芯微电子有限公司 一种屏幕解锁和应用启动的方法及装置
CN104834446B (zh) * 2015-05-04 2018-10-26 惠州Tcl移动通信有限公司 一种基于眼球追踪技术的显示屏多屏控制方法及系统
CN106686295A (zh) * 2015-11-11 2017-05-17 中兴通讯股份有限公司 一种控制摄像设备的方法和装置
CN110633978A (zh) * 2019-08-09 2019-12-31 苏州若依玫信息技术有限公司 一种基于多重认证的智能移动支付系统


Also Published As

Publication number Publication date
CN111667265A (zh) 2020-09-15

Similar Documents

Publication Publication Date Title
WO2021031522A1 (zh) 一种支付方法及装置
US11765163B2 (en) Implementation of biometric authentication
CN112243510A (zh) 生物识别认证的实现
US20190050866A1 (en) Image analysis for user authentication
CN112749377A (zh) 基于生物识别特征的一个或多个部分的注册状态提示移动
EP3567535A1 (en) Virtual reality scene-based business verification method and device
EP3518130A1 (en) Method and system for 3d graphical authentication on electronic devices
CN105427107B (zh) 基于智能眼镜的支付方法及装置
WO2021164511A1 (zh) 基于眼球追踪的信息处理方法及系统、支付处理方法
TW201814440A (zh) 基於虛擬實境場景的業務實現方法及裝置
CN106600855A (zh) 基于面部识别的支付装置和方法
WO2020051016A1 (en) Method, apparatus, and system for resource transfer
KR20150049550A (ko) 복합 생체 정보를 이용한 보안을 제공하는 전자 장치 및 방법
JP7006584B2 (ja) 生体データ処理装置、生体データ処理システム、生体データ処理方法、生体データ処理プログラム、生体データ処理プログラムを記憶する記憶媒体
CN111292092A (zh) 刷脸支付方法、装置及电子设备
JP7180869B2 (ja) 自動販売機決済システム、自動販売機、顔認証サーバ、自動販売機決済方法及びプログラム
CN106651340B (zh) 结算方法及装置
CN109493079A (zh) 支付认证方法和设备
CN206271123U (zh) 基于面部识别的支付装置
KR20190128536A (ko) 전자 장치 및 그 제어 방법
JP2018173891A (ja) 認証装置、認証方法、認証プログラム、およびデータベース
CN110032849B (zh) 生物识别认证的实现
US10997446B2 (en) Enrollment scheme for an electronic device
WO2022222735A1 (zh) 信息处理方法、装置、计算机设备和存储介质
CN107340962B (zh) 基于虚拟现实设备的输入方法、装置及虚拟现实设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21757919

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21757919

Country of ref document: EP

Kind code of ref document: A1