CN115705567A - Payment method and related device

Payment method and related device

Info

Publication number
CN115705567A
CN115705567A (application number CN202110905685.4A)
Authority
CN
China
Prior art keywords
electronic device
interface
application
payment
screen
Prior art date
Legal status
Granted
Application number
CN202110905685.4A
Other languages
Chinese (zh)
Other versions
CN115705567B (en)
Inventor
毛璐
刘兴宇
黄龙
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202110905685.4A priority Critical patent/CN115705567B/en
Priority to PCT/CN2022/092134 priority patent/WO2023010935A1/en
Publication of CN115705567A publication Critical patent/CN115705567A/en
Application granted granted Critical
Publication of CN115705567B publication Critical patent/CN115705567B/en
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 20/00 Payment architectures, schemes or protocols
    • G06Q 20/08 Payment architectures
    • G06Q 20/10 Payment architectures specially adapted for electronic funds transfer [EFT] systems; specially adapted for home banking systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 20/00 Payment architectures, schemes or protocols
    • G06Q 20/30 Payment architectures, schemes or protocols characterised by the use of specific devices or networks
    • G06Q 20/32 Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 20/00 Payment architectures, schemes or protocols
    • G06Q 20/38 Payment protocols; Details thereof
    • G06Q 20/40 Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists

Abstract

The application discloses a payment method and a related device, relating to the field of terminal technologies. The method includes the following steps: when the screen of the electronic device 100 is in the bright-screen and unlocked state, the electronic device 100 may detect whether the gesture of the electronic device 100 satisfies a preset condition. If so, the electronic device 100 may capture one or more frames of images through a designated camera. When the electronic device 100 determines, based on the one or more frames of images, that a code scanning device is present, and the distance between the electronic device 100 and the code scanning device is within preset distance threshold 1, the electronic device 100 may display the two-dimensional code. In this way, the code scanning device can read the information in the two-dimensional code more quickly, so that the code scanning device and the electronic device 100 can perform the payment operation.

Description

Payment method and related device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a payment method and a related device.
Background
With the development of the terminal technology field, a user purchasing goods can pay for them in cash, or can pay for them using an electronic device. For example, when a user needs to pay via the electronic device while purchasing an item, the electronic device may receive a touch operation (e.g., a click) performed by the user on a designated payment application (the original text embeds figures of specific third-party payment application logos here). In response to the touch operation, the electronic device may display a two-dimensional code for payment on a screen display interface based on the payment application. When a code scanning device exists within a preset range of the electronic device, the code scanning device can scan the two-dimensional code displayed on the screen display interface of the electronic device, so that an amount of money in the account of the device can be transferred to the account of the seller, completing the payment for the goods.
However, as can be seen from the above flow, when the user performs the payment operation in this manner, the user's payment two-dimensional code has to be brought onto the screen display interface of the electronic device through multiple touch operations; the operation is therefore cumbersome, and payment efficiency is low.
Disclosure of Invention
The technical solution provided by this application reduces the probability of misrecognition when the electronic device 100 recognizes the first target device, simplifies the user's payment operation, and improves the user's payment efficiency.
In a first aspect, the present application provides a payment method, which may include: the electronic device displays a first interface, where the first interface is not a lock screen interface. The electronic device detects a first operation. In response to the first operation, the electronic device detects a state of the electronic device. The electronic device determines that the state of the electronic device satisfies a first preset condition, and a first camera of the electronic device captures a first image. The electronic device determines that the first image includes a first target object. The electronic device displays a second interface, where the second interface includes a two-dimensional code. In this way, the probability of misrecognition when the electronic device 100 recognizes the first target device can be reduced, the user's payment operation is simplified, and the user's payment efficiency is improved.
In one possible implementation manner, the first preset condition may include: an included angle between the display screen of the electronic device and the ground plane is smaller than a first threshold, and the time the electronic device spends in the hovering state is greater than a second threshold. In this way, checking the state of the electronic device reduces the probability that the two-dimensional code is displayed by a false trigger, and the user's payment operation is simplified.
In one possible implementation, before the electronic device displays the second interface, the method further includes: a second camera of the electronic device acquires a second image and a third image, wherein the second image is different from the first image, and the third image is different from the first image. The electronic device determines that the second image and the third image both include the first target object. Therefore, the first target object is identified through a plurality of images, the first target object can be determined more efficiently and accurately, and the payment efficiency of the user is improved.
In one possible implementation, before the electronic device displays the second interface, the method further includes: determining that a distance between the electronic device and the first target object is less than a third threshold. In this way, the intention of the user can be more accurately judged by identifying the distance between the electronic equipment and the first target object, and the payment efficiency of the user is improved.
In one possible implementation, the method further includes: the first interface does not include a two-dimensional code.
In one possible implementation, the second camera is the same as the first camera. Therefore, the first target object can be simply, conveniently and quickly identified, the identification time is reduced, and the payment efficiency of the user is improved.
In one possible implementation, the first camera is a low power consumption camera and the second camera is a non-low power consumption camera. In this way, the first target object can be identified more accurately, reducing the probability of misidentification.
In a possible implementation manner, the method specifically includes: the electronic device transmits a pulse wave toward the first target object through a TOF sensor. The electronic device acquires the time of flight from the emission of the pulse wave until the pulse wave is reflected back to the TOF sensor by the first target object. The electronic device determines, based on the time of flight and the velocity of the pulse wave, that the distance between the electronic device and the first target object is less than the third threshold. Judging the distance between the electronic device and the first target object with the sensor allows the user's intention to be determined more accurately, improving the user's payment efficiency.
In one possible implementation manner, before the electronic device displays the second interface, the method further includes: the electronic device detects the first interface. When the electronic device determines that the first interface does not display a two-dimensional code, the electronic device displays the second interface.
In a possible implementation manner, the electronic device sends the information of the two-dimensional code to the first server based on the first target object. The information of the two-dimensional code includes an identifier of a first application and information of a first account, and the first server is the server corresponding to the first application. When the information of the first account triggers the first server to determine that the amount of money in the first account is equal to or greater than a first amount and to transfer the first amount from the first account to a second account, the electronic device receives the payment success prompt information sent by the first server.
In one possible implementation, the method further includes: the first interface includes an icon of a second application. When the information of the first account triggers the first server to determine that the amount of money in the first account is smaller than the first amount of money, the electronic device receives payment failure prompt information sent by the first server. The electronic device receives a first input by a user on an icon of the second application. Responding to the first input, the electronic equipment displays a third interface of the second application, wherein the third interface comprises a two-dimensional code corresponding to the second application. Therefore, the user can quickly switch the payment mode, the payment operation of the user is simplified, and the payment efficiency of the user is improved.
In one possible implementation manner, before the electronic device is in the unlocked state and displays the first interface, the method further includes: the electronic device displays a fourth interface, wherein the fourth interface includes a first control. The electronic device receives a second input directed to the first control by the user. In response to the second input, the electronic device selects the first application as a default payment application. Therefore, the operation of the user in payment can be simplified, and the payment efficiency of the user can be improved.
In a second aspect, the present application provides an electronic device, which may include: one or more processors, one or more sensors, one or more memories, a display screen, and a transceiver. The one or more memories are coupled to the one or more processors and the one or more memories are configured to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any of the possible implementations of the first aspect. Thus, the probability of occurrence of the misrecognition phenomenon when the electronic device 100 recognizes the first target device is reduced, and meanwhile, the operation of user payment is simplified, and the efficiency of user payment is improved.
In a third aspect, the present application provides a computer storage medium, which may include computer instructions that, when executed on an electronic device, cause the electronic device to perform the method in any one of the possible implementations of the first aspect. Therefore, the probability of occurrence of the phenomenon of false recognition when the electronic device 100 recognizes the first target device is reduced, the operation of user payment is simplified, and the efficiency of user payment is improved.
In a fourth aspect, the present application provides a computer program product, which when run on an electronic device, causes the electronic device to perform the method of any one of the possible implementations of the first aspect. Thus, the probability of occurrence of the misrecognition phenomenon when the electronic device 100 recognizes the first target device is reduced, and meanwhile, the operation of user payment is simplified, and the efficiency of user payment is improved.
Drawings
Fig. 1 is a schematic hardware structure diagram of an electronic device 100 according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a misrecognition scene provided in an embodiment of the present application;
FIG. 3 is a flow chart of a payment method provided by an embodiment of the present application;
FIG. 4A is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 4B is a schematic diagram of a user gesture provided by an embodiment of the present application;
FIG. 4C is a schematic diagram of coordinate axes provided by an embodiment of the present application;
fig. 4D is a schematic diagram of an identification scenario provided in an embodiment of the present application;
fig. 4E is a schematic diagram of another recognition scenario provided in the embodiment of the present application;
FIGS. 4F-4G are a set of schematic user interfaces provided by embodiments of the present application;
FIGS. 5A-5D are a set of schematic user interfaces provided by embodiments of the present application;
FIGS. 6A-6K are a set of schematic user interfaces provided by embodiments of the present application;
FIGS. 7A-7G are a set of schematic user interfaces provided by embodiments of the present application;
FIG. 8 is a diagram of a software architecture provided by an embodiment of the present application;
fig. 9 is another software architecture diagram provided by an embodiment of the present application.
Detailed Description
The terminology used in the following embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to limit the present application. As used in the specification and the appended claims of this application, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein encompasses any and all possible combinations of one or more of the listed features. In the embodiments of the present application, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the application, unless stated otherwise, "plurality" means two or more.
First, an exemplary electronic device 100 provided in the embodiment of the present application is introduced.
Referring to fig. 1, fig. 1 schematically illustrates a hardware structure of an electronic device 100 according to an embodiment of the present disclosure.
The electronic device 100 may be an electronic device such as a mobile phone, a tablet computer, a PC, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA); the embodiments of the present application do not particularly limit the specific type of the electronic device.
The electronic device 100 may include a processor 101, a memory 102, a wireless communication module 103, a display 104, a sensor module 105, an audio module 106, a speaker 107, a mobile communication module 108, and the like. The modules may be connected by a bus or in other manners, and the embodiment of the present application takes the bus connection as an example.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may also include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 101 may include one or more processor units, for example, the processor 101 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 101 for storing instructions and data. In some embodiments, the memory in the processor 101 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 101. If the processor 101 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 101 and thus increases the efficiency of the system.
In some embodiments, processor 101 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a USB interface, etc.
Memory 102 is coupled to processor 101 for storing various software programs and/or sets of instructions. In particular implementations, memory 102 may include volatile memory, such as random access memory (RAM); it may also include non-volatile memory, such as ROM, flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); memory 102 may also comprise a combination of the above kinds of memories. The memory 102 may also store program code, so that the processor 101 may call the program code stored in the memory 102 to implement the method implemented in the electronic device 100 according to the embodiments of the present application. The memory 102 may store an operating system, such as an embedded operating system like uCOS, VxWorks, or RTLinux.
The wireless communication module 103 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 103 may be one or more devices integrating at least one communication processing module. The wireless communication module 103 receives electromagnetic waves via an antenna, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 101. The wireless communication module 103 may also receive a signal to be transmitted from the processor 101, perform frequency modulation and amplification on it, and convert it into electromagnetic waves radiated through the antenna. In some embodiments, the electronic device 100 may further transmit a signal to detect or scan a device near the electronic device 100 through a Bluetooth module (not shown in fig. 1) or a WLAN module (not shown in fig. 1) in the wireless communication module 103, establish a wireless communication connection with the nearby device, and transmit data. The Bluetooth module may provide a solution including one or more of classic Bluetooth (BR/EDR) or Bluetooth Low Energy (BLE) communication, and the WLAN module may provide a solution including one or more of Wi-Fi Direct, Wi-Fi LAN, or Wi-Fi SoftAP communication.
The display screen 104 may be used to display images, video, and the like. The display screen 104 may include a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 104, N being a positive integer greater than 1.
The sensor module 105 may include a touch sensor 105A or the like. The touch sensor 105A may also be referred to as a "touch device". The touch sensor 105A may be disposed on the display screen 104, and the touch sensor 105A and the display screen 104 form a touch screen, which is also called a "touch screen". The touch sensor 105A may be used to detect a touch operation acting thereon or nearby. Optionally, the sensor module 105 may further include a gyroscope sensor (not shown in fig. 1), an acceleration sensor (not shown in fig. 1), and the like. Where a gyroscope sensor may be used to determine the motion pose of the electronic device 100, in some embodiments, the electronic device 100 may determine the angular velocity of the electronic device 100 about three axes (i.e., x, y, and z axes) via the gyroscope sensor. Acceleration sensors may be used to detect the magnitude of acceleration of electronic device 100 in various directions (typically the x, y, and z axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary.
The audio module 106 may be used to convert digital audio information into an analog audio signal output and may also be used to convert an analog audio input into a digital audio signal. The audio module 106 may also be used to encode and decode audio signals. In some embodiments, the audio module 106 may also be disposed in the processor 101, or some functional modules of the audio module 106 may be disposed in the processor 101.
The speaker 107, which may also be referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music through the speaker 107 or to a hands free phone.
The mobile communication module 108 may provide a solution including 2G/3G/4G/5G wireless communication applied on the electronic device 100.
Next, a payment method provided by the present application is described.
The electronic device 100 may capture one or more frames of images through a designated camera (e.g., a front-facing low-power-consumption camera). The electronic device 100 may identify the code scanning device based on the one or more frames of images according to shape features (e.g., squares and/or rectangles, etc.). Then, the electronic device 100 may run application 1 (a designated payment application; the original text embeds figures of specific third-party payment application logos here) and display the payment two-dimensional code 1 corresponding to application 1 on the screen display interface. The payment two-dimensional code 1 may include an identification of application 1 and information of specified account 1 (e.g., an identification (ID) number of specified account 1). The code scanning device and the electronic device 100 can perform a payment operation based on payment two-dimensional code 1.
As can be seen from the above flow, when the electronic device 100 identifies code scanning devices within a preset range through a designated camera (e.g., a front-facing low-power camera), a false identification condition is very likely to occur.
Illustratively, as shown in fig. 2, when the electronic device 100 captures the square ceiling light 201, and/or the long tubular ceiling light 202, and/or the central air conditioner 203, their shape features are similar to those of the code scanning device 200. The electronic device 100 may therefore mistake the square ceiling light 201, the long tubular ceiling light 202, or the central air conditioner 203 for the code scanning device 200 and display the payment two-dimensional code 1 of the specified application on the screen display interface, causing a misrecognition phenomenon.
Accordingly, the present application provides a method of payment.
In the embodiment of the present application, the electronic device 100 may provide the payment method based on the intelligent assistant "YOYO" of the electronic device 100, that is, the "YOYO" quick payment method, which is taken as an example in the following description. The "YOYO" may be the name of the intelligent assistant of the electronic device 100.
When the screen of the electronic device 100 is in the bright-screen and unlocked state, the electronic device 100 may detect whether the gesture of the electronic device 100 satisfies a preset condition (e.g., whether the electronic device 100 has been flipped over by gesture 1 of the user, whether the angle between the screen of the electronic device 100 and the ground is within a specified angle range, and whether the hover time of the electronic device 100 is greater than preset time value 1, etc.). If so, the electronic device 100 may capture one or more frames of images through a designated camera (e.g., a front-facing low-power-consumption camera). When the electronic device 100 determines the code scanning device based on the one or more frames of images, and the distance between the electronic device 100 and the code scanning device is within preset distance threshold 1 (e.g., 8 cm to 20 cm), the electronic device 100 may display the two-dimensional code. In this way, the code scanning device can read the information in payment two-dimensional code 1 more quickly, so that the code scanning device and the electronic device 100 can perform the payment operation. For the description of payment two-dimensional code 1, reference may be made to the related description above; details are not repeated here.
Therefore, the method for implementing the quick YOYO payment can reduce the probability of the occurrence of the false recognition phenomenon, simplify the operation of the user and improve the payment efficiency of the user.
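Before the step-by-step flow in fig. 3, the overall decision chain can be summarized in one sketch. The Kotlin outline below is illustrative only: every helper function is a hypothetical placeholder for a step described in S301 to S309, and the 8 cm to 20 cm window is merely the example value given for distance threshold 1.

```kotlin
// Illustrative outline of the "YOYO" quick-payment flow. Each helper is a
// hypothetical placeholder for a step detailed in S301-S309 below.
fun maybeShowPaymentCode() {
    if (!isScreenBrightAndUnlocked()) return   // S301-S302: screen display state
    if (!wristFlipDetected()) return           // S302: gesture 1 (first operation)
    if (!facingDownAndHovering()) return       // S303-S304: orientation + hover time
    if (!scannerConfirmedInFrames()) return    // S304-S305: image 1, then images 3 and 4
    if (tofDistanceCm() !in 8.0..20.0) return  // S306: distance threshold 1 (example)
    if (qrAlreadyOnInterface1()) return        // S307: skip if a code is already shown
    showInterface2WithPaymentQr()              // S308: display the two-dimensional code
}

// Stubs only; a real implementation would back these with SensorManager,
// the camera pipeline, and the TOF driver.
fun isScreenBrightAndUnlocked(): Boolean = TODO()
fun wristFlipDetected(): Boolean = TODO()
fun facingDownAndHovering(): Boolean = TODO()
fun scannerConfirmedInFrames(): Boolean = TODO()
fun tofDistanceCm(): Double = TODO()
fun qrAlreadyOnInterface1(): Boolean = TODO()
fun showInterface2WithPaymentQr(): Unit = TODO()
```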
A method for quick payment of YOYO provided in the embodiment of the present application is described below.
Referring to fig. 3, fig. 3 is a flowchart illustrating an example of a method for quick payment of YOYO according to an embodiment of the present application.
S301, the electronic device 100 detects the screen display state.
Specifically, the electronic device 100 may obtain the current screen display state through a preset mechanism (e.g., a broadcast mechanism). The screen display state may include a screen-off state, a lock-screen-interface display state, a bright-screen and unlocked state, and the like.
The screen-off state may refer to a state in which, while the electronic device 100 is locked, no interface element is displayed on the screen, or only a partial area of the screen remains constantly lit to display the time, notifications, and other information. The lock-screen-interface display state means that the user interface of the electronic device 100 is in the lock-screen state. The bright-screen and unlocked state refers to the unlocked state in which the electronic device 100 may receive user input based on the interface displayed in this state and execute one or more applications in response to that input.
When the screen state of the electronic device 100 is the screen-off state, the electronic device 100 may, in response to a user input that lights up the screen, display a lock screen interface. The lock screen interface may refer to an interface on which some functions of the electronic device 100 are locked, that is, the electronic device 100 does not provide those functions. The functions that the electronic device 100 does provide on the lock screen interface have low data security requirements. For example, they may include: answering a call, hanging up a call, adjusting music volume, starting the camera application, turning the airplane mode on/off, etc.
When the screen state of the electronic device 100 is the lock-screen-interface display state, the electronic device 100 may verify the user's identity through an authentication method to unlock the device, so that the screen display state becomes the bright-screen and unlocked state. The authentication method may include one or more of password authentication, face authentication, fingerprint authentication, iris authentication, voiceprint authentication, and the like. Bright screen may refer to a state in which the entire display area of the screen on the electronic device 100 is lit.
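The broadcast mechanism is named above only as one example; the patent does not fix an implementation. On a stock Android system, one plausible realization uses the standard screen and keyguard broadcasts, as in the following hypothetical sketch (the three state labels mirror the states defined above; ACTION_SCREEN_ON fires when the screen lights up, which may still show the lock screen, and ACTION_USER_PRESENT fires once the keyguard is dismissed):

```kotlin
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.content.IntentFilter

// Observes the three screen states via Android's standard system broadcasts.
class ScreenStateReceiver(private val onState: (String) -> Unit) : BroadcastReceiver() {
    override fun onReceive(context: Context, intent: Intent) {
        when (intent.action) {
            Intent.ACTION_SCREEN_OFF -> onState("screen-off")
            Intent.ACTION_SCREEN_ON -> onState("lock-screen-interface")  // lit, may still be locked
            Intent.ACTION_USER_PRESENT -> onState("bright-and-unlocked") // keyguard dismissed
        }
    }

    fun register(context: Context) {
        val filter = IntentFilter().apply {
            addAction(Intent.ACTION_SCREEN_OFF)
            addAction(Intent.ACTION_SCREEN_ON)
            addAction(Intent.ACTION_USER_PRESENT)
        }
        // These broadcasts cannot be declared in the manifest; they must be
        // registered at runtime against a live Context.
        context.registerReceiver(this, filter)
    }
}
```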
S302, when the electronic device 100 displays the interface 1 (which may be referred to as a first interface), and it is determined that the current screen display state is a bright screen and unlocked state, the electronic device 100 detects a gesture 1 (which may also be referred to as a first operation) of a user.
Specifically, the interface 1 displayed by the electronic device 100 in the bright-screen and unlocked state may be a desktop, a user interface displayed when an application 3 (e.g., a music application, a news application, a video application, etc.) is running, and the like. Note that the application 3 is different from the application 1, and is not used for payment of money.
Illustratively, the interface 1 displayed by the electronic device 100 may be a desktop. The electronic device 100 may detect that the current screen display state is a bright screen and unlocked state. As shown in fig. 4A, the electronic device 100 may display a desktop 400. One or more application icons may be displayed in the desktop 400. The one or more application icons may include a weather application icon, a stock application icon, a calculator application icon, a settings application icon, a mail application icon, a theme application icon, a calendar application icon, a video application icon, and the like, among others.
Optionally, a status bar, a page indicator, and a tray icon area may also be displayed in desktop 400. The status bar may include one or more signal strength indicators for mobile communication signals (which may also be referred to as cellular signals), a signal strength indicator for wireless fidelity (Wi-Fi) signals, a battery status indicator, a time indicator, and the like. The page indicator may be used to indicate the positional relationship of the currently displayed page with other pages. The tray icon area includes a plurality of tray icons (e.g., a dialing application icon, an information application icon, a contacts application icon, a camera application icon, etc.) that remain displayed during page switching. The page may also include a plurality of application icons and a page indicator; the page indicator may not be a part of the page and may exist alone; the tray icons are likewise optional. The embodiments of the present application do not limit this.
In addition, the electronic device 100 may further include a front-facing low power consumption camera 401. The front-facing low power camera 401 may be used to capture images. The electronic device 100 may recognize and determine the code scanning device based on the captured image. The process of identifying and determining the code scanning device will be described in detail in the following embodiments, and will not be described herein again.
When the electronic device 100 displays interface 1 and determines that the current screen display state is the bright-screen and unlocked state, the electronic device 100 may detect gesture 1 of the user based on a motion sensor. The motion sensor may include an accelerometer (which may also be referred to as an acceleration sensor), and/or a gyroscope, and/or a gravity sensor, among others. Gesture 1 may be a wrist-flipping gesture of the user (which may also be referred to as a wrist-flipping motion), or the like. When the electronic device 100 detects, through the motion sensor, that the motion data satisfies a preset condition 1, for example, acceleration 1 in a specified axial direction (e.g., the Z axis) changes from a specified direction 1 (e.g., opposite to the direction of gravitational acceleration) to a specified direction 2 (e.g., the same as or perpendicular to the direction of gravitational acceleration), and the value of acceleration 1 is within preset acceleration threshold 1 (e.g., 0.75g to 1.1g), the electronic device 100 may detect gesture 1 of the user.
For example, the electronic device 100 may detect a wrist-flipping gesture of the user through an acceleration sensor. When the electronic device 100 detects that the acceleration in the Z-axis direction changes from the direction opposite to the gravity acceleration direction to the same direction as the gravity acceleration direction based on the acceleration sensor, and the value of the acceleration in the Z-axis direction is 0.75g-1.1g, the electronic device 100 may determine the wrist-flipping gesture of the user.
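A minimal sketch of this check, assuming Android's accelerometer conventions (the Z axis reads roughly +g with the screen facing up and roughly -g with the screen facing down); the 0.75g to 1.1g window is the example acceleration threshold 1 from the text:

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import kotlin.math.abs

// Hypothetical wrist-flip detector: gesture 1 is reported when the Z-axis
// reading flips sign (screen-up to screen-down) while its magnitude stays
// within the example window of 0.75g to 1.1g.
class WristFlipDetector(private val onWristFlip: () -> Unit) : SensorEventListener {
    private val g = 9.81f
    private var lastZ = Float.NaN

    override fun onSensorChanged(event: SensorEvent) {
        if (event.sensor.type != Sensor.TYPE_ACCELEROMETER) return
        val z = event.values[2] // reads about +g when the screen faces up
        val inWindow = abs(z) in 0.75f * g..1.1f * g
        if (!lastZ.isNaN() && lastZ > 0 && z < 0 && inWindow) {
            onWristFlip() // Z flipped from opposing gravity to aligned with it
        }
        lastZ = z
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
}
```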
S303, after the electronic device 100 determines the gesture 1 of the user, the electronic device 100 may detect an orientation of a screen on the electronic device 100 and a hovering time of the electronic device 100.
Specifically, the electronic apparatus 100 may detect an acceleration value 2 in the X axis direction, an acceleration value 3 in the Y axis direction, and an acceleration value 4 in the Z axis direction based on the motion sensors, and detect the orientation of the screen based on the above data.
For example, as shown in fig. 4B, the user's wrist-flipping motion may bring the screen of the electronic device 100 to an orientation within a specified angle range (e.g., 90 degrees to 180 degrees) relative to the ground. For example, when the screen of the electronic device 100 is oriented at 90 degrees to the ground, the screen is perpendicular to the ground; when it is oriented at 180 degrees to the ground, the screen faces the ground. The electronic device 100 may detect the orientation of the screen based on a gravity sensor.
As shown in fig. 4C, an axis perpendicular to the upper screen of the electronic device 100 is a Z-axis, an axis perpendicular to the upper side and the lower side of the electronic device 100 is a Y-axis, and an axis perpendicular to the left side and the right side of the electronic device 100 is an X-axis. When the electronic apparatus 100 acquires, through the gravity sensor, that the value of the acceleration 4 in the Z-axis direction is-g, the electronic apparatus 100 may determine that the orientation of the screen on the electronic apparatus 100 is facing the ground. When the electronic apparatus 100 acquires that the value of the acceleration 2 in the X-axis direction is g through the gravity sensor, the electronic apparatus 100 may determine that the screen on the electronic apparatus 100 is oriented perpendicular to the ground and the right side of the electronic apparatus 100 is facing upward. When the electronic apparatus 100 acquires that the value of the acceleration 3 in the Y-axis direction is g through the gravity sensor, the electronic apparatus 100 may determine that the screen on the electronic apparatus 100 is perpendicular to the ground and the upper side of the electronic apparatus 100 is oriented upward.
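The axis-to-orientation mapping just described can be sketched as follows, assuming Android's gravity sensor (values[0..2] give X, Y, Z in m/s²) and adding a tolerance because real readings never hit exactly ±g; the enum labels are illustrative, not from the patent:

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import kotlin.math.abs

enum class ScreenOrientation { FACING_GROUND, VERTICAL_RIGHT_UP, VERTICAL_TOP_UP, UNKNOWN }

// Maps a gravity-sensor reading to the orientations described in the text.
fun classifyOrientation(event: SensorEvent, g: Float = 9.81f, tol: Float = 1.5f): ScreenOrientation {
    if (event.sensor.type != Sensor.TYPE_GRAVITY) return ScreenOrientation.UNKNOWN
    val (x, y, z) = event.values
    return when {
        abs(z + g) < tol -> ScreenOrientation.FACING_GROUND     // Z reads -g: screen faces the ground
        abs(x - g) < tol -> ScreenOrientation.VERTICAL_RIGHT_UP // X reads +g: right side facing up
        abs(y - g) < tol -> ScreenOrientation.VERTICAL_TOP_UP   // Y reads +g: upper side facing up
        else -> ScreenOrientation.UNKNOWN
    }
}
```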
S304: when the electronic device 100 determines that the state of the electronic device 100 is such that the included angle between the screen orientation of the electronic device 100 and the ground is within a specified angle range (e.g., 90 degrees to 180 degrees; this may also be expressed as the included angle being smaller than a first threshold, for example smaller than 90 degrees), and the hover time of the electronic device 100 is greater than preset time value 1 (which may also be referred to as a second threshold, e.g., 2 seconds), the electronic device 100 may capture image 1 (which may also be referred to as a first image) through designated camera 1 (which may also be referred to as a first camera, e.g., a front-facing low-power-consumption camera). The electronic device 100 may identify a code scanning device (which may also be referred to as a first target object) based on image 1. The code scanning device may be a code scanning box or a large code scanning machine; this application does not limit this.
Here, hovering may refer to the user holding the electronic device 100 so that it stays within a certain area for longer than a predetermined time while the angle between the screen orientation and the ground remains within the specified angle range. Alternatively, the state of the electronic device hovering over the code scanning device may be interpreted as the device staying above the code scanning device, stationary or within a certain area (e.g., with small changes in position caused by the user's hand shaking or a change of posture), for a time exceeding preset time value 1.
Specifically, when the electronic device 100 detects, through the motion sensor, that the acceleration value 2 in the X-axis direction is at an acceleration threshold 2 (e.g., -g to g), the acceleration value 3 in the Y-axis direction is at an acceleration threshold 3 (e.g., 0-0.2), and the acceleration value 4 in the Z-axis direction is at an acceleration threshold 4 (e.g., -g to 0), the electronic device 100 may determine that the screen orientation on the electronic device 100 is within a specified angle range (e.g., 90-180 degrees) from the ground.
The electronic device 100 may be preset with an image 2 that includes a code scanning device. The electronic device 100 can capture a preset number (e.g., one) of images 1 through designated camera 1 (e.g., the front-facing low-power-consumption camera). When the electronic device 100 identifies the code scanning device based on image 1, the electronic device 100 may compare image 1 with image 2 to calculate their similarity. When calculation result 1 is within the specified similarity threshold range 1, the electronic device 100 may determine that a code scanning device is included in image 1.
For example, as shown in fig. 4D, when the electronic device 100 determines that the screen of the electronic device 100 is facing the ground, the electronic device 100 may capture an image 410 (which may also be referred to as image 1) through the front-facing low-power-consumption camera 401. For a description of the front-facing low-power-consumption camera 401, reference may be made to the related description in fig. 4A; details are not repeated here. The electronic device 100 may calculate the similarity between image 410 and the preset image 2, which includes a code scanning device. When calculation result 1 is within the specified similarity threshold range 1, the electronic device 100 may determine that the code scanning device 410A is included in image 410.
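The patent does not specify how the similarity between image 1 and preset image 2 is calculated. As an illustrative stand-in only, the sketch below compares normalized grayscale histograms and checks the score against a threshold; both the measure and the threshold value are assumptions:

```kotlin
// Builds a normalized grayscale histogram from 0-255 pixel values.
fun histogram(pixels: IntArray, bins: Int = 32): DoubleArray {
    val h = DoubleArray(bins)
    for (p in pixels) h[(p.coerceIn(0, 255) * bins) / 256] += 1.0
    for (i in h.indices) h[i] /= pixels.size
    return h
}

// Histogram intersection: 1.0 for identical distributions, 0.0 for disjoint ones.
fun similarity(a: IntArray, b: IntArray): Double =
    histogram(a).zip(histogram(b)).sumOf { (x, y) -> minOf(x, y) }

// Stand-in for "calculation result 1 is within similarity threshold range 1";
// the 0.7 cut-off is purely illustrative.
fun looksLikeScanner(image1: IntArray, presetImage2: IntArray, threshold1: Double = 0.7): Boolean =
    similarity(image1, presetImage2) >= threshold1
```

Step S305 below can re-run the same comparison on images 3 and 4 against the tighter similarity threshold range 2.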
S305, when the electronic device 100 identifies the code scanning device based on the image 1, the electronic device 100 may capture a plurality of images by specifying the camera 1 (e.g., a front-facing low-power camera). The electronic device 100 may identify the code scanning device based on the plurality of captured images. The plurality of captured images include an image 1, an image 3 (which may also be referred to as a second image), and an image 4 (which may also be referred to as a third image).
Specifically, the electronic apparatus 100 may perform the comparison calculation of the similarity between the plurality of images captured based on the designated camera 1 (e.g., the front-facing low-power camera) and the image 2 preset in the foregoing step S304. When the calculation result of each image is within the specified similarity threshold range 2, for example, the calculation result 1 of the image 1, the calculation result 2 of the image 3, and the calculation result 3 of the image 4 are all within the similarity threshold range 2, the electronic device 100 may further determine the code scanning device. The interval of the similarity threshold range 2 is smaller than the interval of the similarity threshold range 1 in S304.
For example, as shown in fig. 4E, after the electronic device 100 identifies the code scanning device based on the aforementioned image 410 in fig. 4D, the electronic device 100 may capture a plurality of images through the front low power consumption camera 401. For a description of the front-end low power consumption camera 401, reference may be made to the related description in fig. 4A, and details are not repeated here. The plurality of images may include an image 410 (may also be referred to as image 1), an image 411 (may also be referred to as image 3), an image 412 (may also be referred to as image 4), and the like. The electronic device 100 may compare the images 410, 411, 412, etc. with the image 2 for similarity calculation, respectively. When the calculation result of each image, for example, the calculation result 1 of the image 410, the calculation result 2 of the image 411, and the calculation result 3 of the image 412, are all within the specified similarity threshold range 2, the electronic device 100 may determine that the code scanning device 410A is included in all of the image 410, the image 411, and the image 412, and thus, the electronic device 100 may further determine the code scanning device 410A.
Optionally, when the code scanning device is applied in a shopping mall, it is a shopping fee-deduction code scanning device; when it is applied in public transportation, it is a transportation fee-deduction code scanning device.
In one possible implementation, the electronic device 100 may capture multiple images including code scanning devices at the same angle. In another possible implementation, the electronic device 100 may also capture multiple images including code scanning devices at different angles. This is not limited by the present application.
In one possible implementation, the electronic device 100 may capture image 1 with the designated camera 1 in step S304, and capture images 3 and 4 with the designated camera 2 (which may also be referred to as a second camera) in step S305. Wherein, the designated camera 1 and the designated camera 2 may be different. For example, in some embodiments, designated camera 1 may be a low power camera and designated camera 2 may be a non-low power camera. In some embodiments, designated camera 1 may be a close-range camera and designated camera 2 may be a distant-range camera. This is not limited by the present application.
In one possible implementation, the electronic device 100 may also capture only image 3 in step S305. The electronic device 100 may compare both image 1 and image 3 with the preset image 2 to calculate their similarity. When the electronic device 100 determines that calculation result 1 of image 1 and calculation result 2 of image 3 are both within similarity threshold range 2, the electronic device 100 may further confirm the code scanning device. Optionally, the interval of similarity threshold range 2 is smaller than the interval of similarity threshold range 1 in the aforementioned S304.
S306, the electronic device 100 may detect whether the distance between the code scanning device and the electronic device 100 is within a distance threshold of 1 (which may also be referred to as a third threshold, for example, 8 cm to 20 cm).
Specifically, the electronic device 100 may detect the distance of the code scanning device from the electronic device 100 based on a specified sensor (e.g., a time of flight (TOF) sensor). When the electronic device 100 determines that the electronic device 100 is no longer moved in a stable state according to the acceleration sensor, the designated sensor can calculate the distance between the code scanning device and the electronic device 100.
For example, the electronic device 100 may detect the distance of the code scanning device from the electronic device 100 based on a TOF sensor. When the electronic device 100 determines that the electronic device 100 is no longer moving according to the acceleration sensor and is in a stable state, the TOF sensor may obtain the distance between the code scanning device and the electronic device 100 by calculating the flight time from the time when the pulse wave is emitted to the time when the pulse wave is reflected back to the TOF sensor by the code scanning device.
In one possible implementation, the TOF sensor may acquire multiple sets of distance values between the code scanning device and the electronic device 100 based on multiple emitted pulse waves, and determine an average value of the multiple sets of distance values as the distance between the code scanning device and the electronic device 100.
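The arithmetic behind S306 is the standard round-trip relation: the pulse travels to the target and back, so the one-way distance is half of speed times flight time. A sketch, assuming an optical TOF sensor (propagation at the speed of light) and the example 8 cm to 20 cm window for distance threshold 1:

```kotlin
const val SPEED_OF_LIGHT_M_S = 299_792_458.0

// One-way distance from a round-trip time of flight.
fun distanceMeters(timeOfFlightSeconds: Double, speed: Double = SPEED_OF_LIGHT_M_S): Double =
    speed * timeOfFlightSeconds / 2.0

// Averaging several pulses, as in the possible implementation above.
fun averageDistanceCm(timesOfFlight: List<Double>): Double =
    timesOfFlight.map { distanceMeters(it) }.average() * 100.0

// Example check: a 1.0 ns round trip corresponds to roughly 15 cm, which
// falls inside the example 8 cm to 20 cm window for distance threshold 1.
fun withinThreshold1(distanceCm: Double): Boolean = distanceCm in 8.0..20.0
```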
S307, the electronic device 100 may detect whether the two-dimensional code is displayed on the interface 1.
Specifically, the electronic device 100 may capture a screenshot of interface 1. The electronic device 100 can then detect, based on that screenshot, whether a two-dimensional code is displayed on interface 1.
S308, when the electronic device 100 determines that the distance between the code scanning device and the electronic device 100 is within a distance threshold 1 (for example, 8 cm to 20 cm), and the two-dimensional code is not displayed on the interface 1, the electronic device 100 may display an interface 2 (which may also be referred to as a second interface) corresponding to the application 1. The interface 2 may include a two-dimensional code and a floating window, among other things.
Optionally, when the code scanning device is a shopping fee-deduction code scanning device applied in a shopping mall, the two-dimensional code may be a payment two-dimensional code for shopping; when the code scanning device is a transportation fee-deduction code scanning device, the two-dimensional code may be a subway two-dimensional code or a bus two-dimensional code.
The present embodiment takes the two-dimensional code as payment two-dimensional code 1 applied to shopping as an example. Application 1 may be a default application preset by the electronic device 100 for payment (the original text embeds figures of specific third-party payment application logos here). The payment two-dimensional code 1 may include an identification of application 1 and information of specified account 1 (e.g., the ID number of specified account 1). The floating window may include icons of one or more payment applications, and the one or more icons may include the icon of application 1. The icons of the one or more payment applications may receive input from the user, so that the electronic device 100 runs the designated payment application in response to the input to perform an amount payment operation.
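The payload described for payment two-dimensional code 1 reduces to two fields: an identification of application 1 and information of specified account 1. A hypothetical sketch of its shape follows; the field names and the string encoding are illustrative, since the patent fixes neither:

```kotlin
// Hypothetical shape of the payment two-dimensional code 1 payload.
data class PaymentQrPayload(
    val appId: String,     // identification of application 1
    val accountId: String, // e.g., the ID number of specified account 1
) {
    // A QR library (e.g., ZXing) would render this string into the code's
    // pixels; the key-value encoding here is an assumption, not the patent's.
    fun encode(): String = "app=$appId&account=$accountId"
}
```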
Illustratively, as shown in FIG. 4F, the electronic device 100 may display a user interface 420 (which may also be referred to as interface 2). The user interface 420 can display a page title, a return control 421A, a payment two-dimensional code 421B (which may also be referred to as payment two-dimensional code 1), and a floating window 422. The page title may be a text prompt such as "pay merchant". The return control 421A may be configured to receive a touch operation (e.g., a click) from the user, so that the electronic device 100 returns to the upper-level page in response to the touch operation. The floating window 422 may include an application icon 422A and an application icon 422B (the original text embeds figures of specific third-party payment application logos here), a checkbox 422C, and a close control 422D. The checkbox 422C may prompt the user with the application currently selected for paying the amount. The close control 422D may be used to receive a touch operation (e.g., a click) from the user, so that the electronic device 100 no longer displays the floating window 422 in response to the touch operation. In the embodiment of the application, the application corresponding to icon 422A (which may also be referred to as application 1) may be the application preset by the electronic device 100 for payment by default.
In one possible implementation, electronic device 100 may not display a floating window in interface 2 for the user to switch payment applications. This is not limited by the present application.
In one possible implementation, when electronic device 100 displays interface 2, electronic device 100 may output a voice message and/or a vibration prompt to inform the user that electronic device 100 has displayed interface 2 at this time. This is not limited by the present application.
It should be noted that the above steps S305 to S307 may be executed in parallel. When the electronic device 100 determines that the code scanning device is not included in the image, and/or that the distance between the code scanning device and the electronic device 100 is not within distance threshold 1 (e.g., 8 cm to 20 cm), and/or that a two-dimensional code is already displayed on interface 1, the electronic device 100 may terminate the flow of the YOYO quick payment method provided by the present application.
S309, the electronic device 100 can execute payment operation based on the two-dimensional code and the code scanning device.
Specifically, the present embodiment takes the two-dimensional code as payment two-dimensional code 1 applied to shopping as an example. When the code scanning device reads the information in payment two-dimensional code 1, the identification of application 1 and the information of specified account 1 in payment two-dimensional code 1 can be sent, based on the code scanning device, to server 1 (which may also be referred to as a first server). Server 1 may be the server corresponding to application 1 (which may also be referred to as a first application). Server 1 may query the amount of money in specified account 1 (which may also be referred to as a first account) based on the information of specified account 1. When server 1 determines that the amount of money in specified account 1 is equal to or greater than the specified amount to be paid (which may also be referred to as a first amount), server 1 transfers the specified amount to be paid from specified account 1 into specified account 2 (which may also be referred to as a second account) to perform the payment operation. After performing the payment operation, server 1 may send a payment success prompt message to the electronic device 100. Based on the payment success prompt message, the electronic device 100 may display interface 3 for prompting the user that the payment operation is completed. The payment two-dimensional code 1 may be an Aztec code, a QR code, a PDF417 code, or the like; this application does not limit this.
Illustratively, as shown in fig. 4G, when the code scanning device reads the information in the payment two-dimensional code 421B, server 1 corresponding to application 1 may transfer the specified amount to be paid from specified account 1 corresponding to application 1 into specified account 2, so as to perform the payment operation. The electronic device 100 may display a user interface 430 (which may also be referred to as interface 3). The user interface 430 may include payment amount information, payment method information, information of specified account 2, and a completion control 431. The payment amount information may be the text prompt "10.8", the payment method information may be the text prompt "account balance", and the information of specified account 2 may be the name of specified account 2, "convenience store chain". The completion control 431 may be used to receive a touch operation (e.g., a click) from the user, so that the electronic device 100 no longer displays the user interface 430 in response to the touch operation.
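On server 1's side, S309 amounts to a balance check followed by a transfer, with the two outcomes mapping to the success and failure prompts described here and below. A minimal sketch under obvious assumptions (amounts kept in integer cents to avoid floating-point money; all names illustrative):

```kotlin
data class Account(val id: String, var balanceCents: Long)

sealed class PaymentResult {
    object Success : PaymentResult()           // triggers the payment success prompt
    object InsufficientFunds : PaymentResult() // triggers the payment failure prompt
}

// Checks that specified account 1 covers the first amount, then moves it
// into specified account 2; otherwise reports failure without transferring.
fun transfer(payer: Account, payee: Account, amountCents: Long): PaymentResult {
    if (payer.balanceCents < amountCents) return PaymentResult.InsufficientFunds
    payer.balanceCents -= amountCents // debit specified account 1
    payee.balanceCents += amountCents // credit specified account 2
    return PaymentResult.Success
}
```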
It should be noted that the above steps are only used for exemplary explanation of the payment method provided in the present application, and do not constitute a specific limitation to the present application. In particular implementations, embodiments of the present application may include fewer steps. This is not limited by the present application.
For example, in some embodiments, when the electronic device 100 detects that the interface 1 is not a lock screen interface, detects the gesture 1 of the user, and determines that the state of the electronic device 100 is that the included angle between the screen of the electronic device 100 and the ground is within the specified angle range and the hovering time is greater than the preset time value 1, the electronic device 100 may recognize a code scanning device based on the image 1 captured by the designated camera 1 (e.g., a front-facing low-power camera), and then display the interface 2 including the two-dimensional code.
In other embodiments, after detecting that the interface 1 is not a lock screen interface and detecting the gesture 1 of the user, the electronic device 100 may recognize the code scanning device based on the image 1 captured by the designated camera 1 (e.g., a front-facing low-power camera), and then display the interface 2 including the two-dimensional code.
In other embodiments, after the electronic device 100 detects gesture 1 of the user, the electronic device 100 may perform step S304 and step S305, and then display the interface 2 including the two-dimensional code.
In other implementations, after detecting gesture 1 of the user, the electronic device 100 may perform steps S303 and S304, and then display the interface 2 including the two-dimensional code.
In other implementations, after detecting gesture 1 of the user, the electronic device 100 may perform step S303, step S304, and step S305, and then display an interface 2 including a two-dimensional code.
In other embodiments, after detecting gesture 1 of the user, the electronic device 100 may perform step S303, step S304, step S305, and step S306, and then display the interface 2 including the two-dimensional code. This is not limited by the present application.
In a possible implementation, when the amount of money in the designated account 1 is less than the specified amount to be deducted (which may also be referred to as a first amount), so that the server 1 cannot transfer the specified amount from the designated account 1 to the designated account 2, the electronic device 100 may receive a payment failure prompt message sent by the server 1. The electronic device 100 may display the interface 2 based on the payment failure prompt message and display the payment failure prompt message on the interface 2. The payment failure prompt message can be used to prompt the user that the electronic device 100 failed to run application 1 for payment. The electronic device 100 may receive a user input on the application 2 icon in the floating window (which may also be referred to as a first input), run application 2 (which may also be referred to as a second application, e.g., another payment application), and display interface 4 (which may also be referred to as a third interface). The interface 4 may include a payment two-dimensional code 2 corresponding to application 2. Application 2 and application 1 are different, but both can be used for money payment. The information in the payment two-dimensional code 2 may include information of the designated account 3 (e.g., the ID number of the designated account 3) and the identifier of application 2. When the code scanning device reads the information in the payment two-dimensional code 2, the code scanning device can send the identifier of application 2 and the information of the designated account 3 carried in the payment two-dimensional code 2 to the server 2. The server 2 may be the server corresponding to application 2. The server 2 may query the amount of money in the designated account 3 based on the information of the designated account 3. When the server 2 determines that the amount of money in the designated account 3 is equal to or greater than the specified amount to be paid, the server 2 transfers the specified amount from the designated account 3 to the designated account 2 to perform the payment operation.
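The failure-and-fallback flow described above can be sketched as follows; this is a minimal illustration, and the names (PaymentResult, onPaymentResult, and the stub functions) are assumptions rather than the application's actual interfaces.

```kotlin
// Hypothetical client-side handling of the failure/fallback flow; names are illustrative.
sealed class PaymentResult {
    object Success : PaymentResult()
    data class Failure(val reason: String) : PaymentResult()
}

fun onPaymentResult(result: PaymentResult) {
    when (result) {
        is PaymentResult.Success -> showInterface3()    // "payment completed" interface
        is PaymentResult.Failure -> {
            showFailurePrompt(result.reason)            // e.g., "insufficient balance"
            showFloatingWindowWithPaymentApps()         // icons for application 2, ...
        }
    }
}

// Stubs so the sketch is self-contained.
fun showInterface3() = println("display interface 3")
fun showFailurePrompt(reason: String) = println("prompt: $reason")
fun showFloatingWindowWithPaymentApps() = println("show floating window with payment app icons")

fun main() = onPaymentResult(PaymentResult.Failure("insufficient balance"))
```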
For example, as shown in fig. 5A, when the amount of money in the designated account 1 corresponding to application 1 is less than the specified amount to be deducted, causing a payment failure, the electronic device 100 may display a prompt box 501 on the user interface 420. The prompt box 501 may include a payment failure prompt message, which may be, for example, the text "sorry, insufficient balance, retry". The prompt box 501 may also include a control 501A and a control 501B. The control 501A may be used to receive a touch operation (e.g., a click) by the user, so that the electronic device 100 no longer displays the user interface 420 in response to the touch operation. The control 501B may be used to receive a touch operation (e.g., a click) by the user, so that the electronic device 100 can switch the payment method (e.g., to payment with a bank card account) in response to the touch operation.
When the electronic device 100 receives a touch operation (e.g., a click) by the user on the application 2 icon 422B in the floating window 422, the electronic device 100 may run application 2 and display user interface 510 (which may also be referred to as interface 4).
As shown in fig. 5B, user interface 510 may include a page title, a return control 511, and a payment two-dimensional code 512 (which may also be referred to as payment two-dimensional code 2). The page title may be a text prompt, such as "collect and pay". The return control 511 may be used to receive a touch operation (e.g., a click) by the user, so that the electronic device 100 returns to the previous page in response to the touch operation.
When the code scanning device reads the information in the payment two-dimensional code 512, the server 2 corresponding to application 2 can transfer the specified amount to be paid from the designated account 3 to the designated account 2 to perform the payment operation. The electronic device 100 may display a user interface 520 for prompting the user that the payment operation has been successfully performed.
As shown in fig. 5C, user interface 520 may include payment amount information, information of the designated account 2, and a completion control 521. The payment amount information may be the text prompt "10.8", and the information of the designated account 2 may be the name of the designated account 2, "convenience store chain". The completion control 521 may be used to receive a touch operation (e.g., a click) by the user, so that the electronic device 100 no longer displays the user interface 520 in response to the touch operation.
In one possible implementation, when the electronic device 100 runs application 1 and displays the interface 2 including the payment two-dimensional code 1, the electronic device 100 may output a voice message and/or a vibration prompt to inform the user that the interface 2 is being displayed. The electronic device 100 may receive and respond to a touch operation (e.g., a click) by the user on the application 2 icon in the floating window, run application 2 (e.g., another payment application), and display interface 4. Application 2 and application 1 are different, but both can be used for money payment. For the description of the interface 4 and the payment two-dimensional code 2, reference may be made to the corresponding description in the previous embodiments, and details are not repeated here.
Illustratively, taking application 1 as an example, as shown in fig. 5D, when the electronic device 100 runs application 1 and displays user interface 420, the electronic device 100 may output a voice message or a vibration prompt to inform the user that user interface 420 is being displayed. The electronic device 100 may receive and respond to a touch operation (e.g., a click) by the user on the application 2 icon in the floating window 422, run application 2, and display user interface 510 (which may also be referred to as interface 4). For the description of the user interface 510, reference may be made to the foregoing description of the embodiment shown in fig. 5B, and details are not repeated here.
When the code scanning device reads the information in the payment two-dimensional code 512, the server 2 corresponding to application 2 can transfer the specified amount from the designated account 3 to the designated account 2 to perform the payment operation. The electronic device 100 may display a user interface 520 for prompting the user that the payment operation has been successfully performed. For the description of the user interface 520, reference may be made to the foregoing description of the embodiment shown in fig. 5C, and details are not repeated here.
In one possible implementation, when application 1 is not installed on the electronic device 100, the electronic device 100 may display a floating window on the interface 1. The floating window may include a prompt message indicating that running application 1 failed, one or more payment application icons, and the like. The electronic device 100 may receive and respond to a touch operation (e.g., a click) by the user on the application 2 icon in the floating window, run application 2 (e.g., another payment application), and display interface 4. Application 2 and application 1 are different, but both can be used for money payment. For the description of the interface 4 and the payment two-dimensional code 2, reference may be made to the corresponding description in the foregoing embodiments, and details are not repeated here.
Illustratively, taking application 1 as an example, as shown in fig. 6A, when application 1 is not installed on the electronic device 100, the electronic device 100 may display a floating window 601 on the desktop 400. The floating window 601 may include a prompt message indicating that running application 1 failed, an application icon 422A, an application icon 422B, a checkbox 422C, a close control 422D, and so on. The prompt message may be the text "the default payment method fails, please reselect". For the descriptions of the checkbox 422C and the close control 422D, reference may be made to the description in the embodiment shown in fig. 4F, and details are not repeated here.
As shown in fig. 6B, when the electronic device 100 receives a touch operation (e.g., a click) by the user on the application icon 422B in the floating window 601, the electronic device 100 may run application 2 and display user interface 510 (which may also be referred to as interface 4). For the description of the user interface 510, reference may be made to the foregoing description of the embodiment shown in fig. 5B, and details are not repeated here.
When the code scanning device reads the information in the payment two-dimensional code 512, the server 2 corresponding to application 2 can transfer the specified amount from the designated account 3 to the designated account 2 to perform the payment operation. The electronic device 100 may display a user interface 520 for prompting the user that the payment operation has been successfully performed. For the description of the user interface 520, reference may be made to the foregoing description of the embodiment shown in fig. 5C, and details are not repeated here.
In a possible implementation, when the electronic device 100 does not store the data of the designated account 1, the electronic device 100 may display an interface 5 before running application 1 to display the payment two-dimensional code 1. The interface 5 may include a floating window that may display icons of one or more payment applications. For the description of the one or more payment application icons, reference may be made to the description in the foregoing step S308, which is not repeated here. The electronic device 100 may acquire the account information (e.g., an account ID and an account password) of the designated account 1 based on the user's input on the interface 5, and then send the account information to the server 1 corresponding to application 1. The server 1 can verify whether the account password in the account information matches the account ID. When the account password matches the account ID, the server 1 may send an indication that the verification succeeded to the electronic device 100. After receiving the indication, the electronic device 100 may obtain the data of the designated account 1 from the server 1 and display the interface 2 including the payment two-dimensional code 1.
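A minimal sketch of this verification exchange, assuming a simple ID/password lookup, might look as follows; all names are illustrative, and a real server would store only salted password hashes rather than plaintext.

```kotlin
// Illustrative sketch of the account-verification exchange; names and data shapes
// are assumptions, not the application's API.
data class AccountInfo(val accountId: String, val password: String)

// "Whether the account password in the account information matches the account ID".
// NOTE: plaintext comparison is for illustration only; production code would hash.
fun verifyOnServer(stored: Map<String, String>, info: AccountInfo): Boolean =
    stored[info.accountId] == info.password

fun main() {
    val serverRecords = mapOf("user-001" to "secret")
    val info = AccountInfo("user-001", "secret") // gathered via interface 5
    if (verifyOnServer(serverRecords, info)) {
        println("verification succeeded: fetch account data, show interface 2")
    } else {
        println("verification failed")
    }
}
```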
Illustratively, taking application 1 as an example, as shown in fig. 6C, when the electronic device 100 does not store the data of the designated account 1 corresponding to application 1, the electronic device 100 may display user interface 610. User interface 610 may include an exit control 611, an input prompt box 612, a floating window 613, and the like. The exit control 611 may receive a touch operation (e.g., a click) by the user, so that the electronic device 100 no longer displays user interface 610 in response to the touch operation. The input prompt box 612 may display the text prompt "please input a mobile phone number" for prompting the user to input the specified data information, and may display the specified data information based on the user's input. The floating window 613 may include an application icon 613A. The application icon 613A may receive a touch operation (e.g., a click) by the user, so that the electronic device 100 runs the corresponding payment application and completes the payment in response to the touch operation.
As shown in figs. 6C-6D, when the electronic device 100 receives a touch operation by the user on any blank area of the user interface 610, the floating window 613 may be hidden. The electronic device 100 may acquire the account information (e.g., account ID, account password) of the designated account 1 based on the specified data information entered by the user in the input prompt box 612, and then send the account information to the server 1 corresponding to application 1. The server 1 may verify whether the account password in the account information matches the account ID. When the account password matches the account ID, the server 1 may send an indication that the verification succeeded to the electronic device 100. After receiving the indication, the electronic device 100 may obtain the data of the designated account 1 from the server 1 and display the user interface 420. For the description of the user interface 420, reference may be made to the foregoing description of the embodiment shown in fig. 4F, and details are not repeated here.
When the code scanning device reads the information in the payment two-dimensional code 421B, the server 1 corresponding to application 1 can transfer the specified amount from the designated account 1 to the designated account 2 to perform the payment operation. The electronic device 100 may display a user interface 430 for prompting the user that the payment operation has been successfully performed. For the description of the user interface 430, reference may be made to the description of the embodiment shown in fig. 4G, and details are not repeated here.
In one possible implementation, the interface 1 may be a user interface displayed after the electronic device 100 unlocks based on the lock screen interface. The electronic device 100 may execute the flow of the YOYO quick payment method provided by the present application based on the interface 1. The screen locking interface refers to an interface of the electronic device 100 in a screen locking state.
As shown in fig. 6E, electronic device 100 may display a lock screen interface 620. The lock screen interface 620 may display a status bar, a calendar indicator, and a lock screen icon 621. The status bar may include one or more signal strength indicators for mobile communication signals (which may also be referred to as cellular signals), one or more signal strength indicators for wireless fidelity (Wi-Fi) signals, a battery status indicator, and the like. The calendar indicator may be used to indicate the current time, such as the date, the day of the week, and the hour and minute. Below the lock screen icon 621, a text prompt "face is being recognized" may be displayed.
When the electronic device 100 displays the lock screen interface, the electronic device 100 may verify the identity of the user through identity authentication. The authentication method may include one or more of password verification, face verification, fingerprint verification, iris verification, voiceprint verification, and the like.
As shown in fig. 6F, when the electronic device 100 displays the lock screen interface 620, the electronic device 100 can be unlocked through face verification. Electronic device 100 may display an unlock icon 622 on lock screen interface 620 for prompting the user that the device has been unlocked. Below the unlock icon 622, a text prompt "face unlocked" may be displayed. The electronic device 100 may execute the flow of the YOYO quick payment method provided by the present application based on the user interface 620. For the description of the flow, reference may be made to the description of the foregoing steps S301 to S309, which is not repeated here.
In one possible implementation, the interface 1 may be a user interface displayed when the electronic device 100 runs an application 3. The electronic device 100 may execute the flow of the YOYO quick payment method provided by the present application based on the interface 1. The application 3 may be an immersive application that provides video playback or game play and does not display a status bar, or may be a non-immersive application that cannot be used for money payment.
Illustratively, taking application 3 as a video application in an immersive application as an example, as shown in fig. 6G, electronic device 100 may display video interface 630 while running application 3. The video interface 630 may include a video screen 631 and a progress bar 632. The electronic device 100 may execute the flow of the YOYO quick payment method provided by the present application based on the video interface 630. For the description of the flow, reference may be made to the description of the foregoing steps S301 to S309, which are not described herein again.
Illustratively, taking the application 3 as a music application in a non-immersive application as an example, as shown in fig. 6H, the electronic device 100 may display a music playing interface 640 when the application 3 is executed. The music playing interface 640 may include a music playing screen 641, a progress bar 642, one or more controls (e.g., a music pause/play control, a play next music control, a play previous music control, etc.), and so on. The electronic device 100 may execute the flow of the YOYO quick payment method provided by the present application based on the music playing interface 640. For the description of the flow, reference may be made to the description of the foregoing steps S301 to S309, and details are not repeated herein.
In one possible implementation, interface 1 may be a negative one-screen interface. The negative one-screen interface refers to the leftmost page of the mobile phone, which can display the user's frequently used applications or shortcut functions provided by applications. The electronic device 100 may execute the flow of the YOYO quick payment method provided by the present application based on the interface 1.
Illustratively, as shown in fig. 6I, the electronic device 100 may display a negative one-screen interface 650 when the user performs a right-swipe gesture on the desktop 400. The negative one-screen interface 650 can include a search box 651, one or more shortcut functions (e.g., a scan shortcut function, a payment shortcut function, a top-up shortcut function, etc.), and a message prompt box 652. The message prompt box 652 may include the text prompts "2598 steps today" and "356 card consumed". The electronic device 100 may execute the flow of the YOYO quick payment method provided by the present application based on the negative one-screen interface 650. For the description of the flow, reference may be made to the description of the foregoing steps S301 to S309, which is not repeated here.
In one possible implementation, interface 1 may be a drop-down interface. Wherein the drop-down interface may include one or more pieces of notification information. The electronic device 100 may execute the flow of the YOYO quick payment method provided by the present application based on the interface 1.
For example, as shown in fig. 6J, when the electronic device 100 detects a slide-down gesture by the user on the desktop 400, the electronic device 100 may display a drop-down interface 660 in response to the gesture. The drop-down interface 660 can include a close control 661 and a notification window 662. The notification window 662 may display notification information, such as "Beijing city, today to tomorrow, cloudy, 18-29 degrees … …". The close control 661 can receive a touch operation (e.g., a click) by the user, so that the electronic device 100 can clear all notifications on the drop-down interface in response to the touch operation. The electronic device 100 may execute the flow of the YOYO quick payment method provided by the present application based on the drop-down interface 660. For the description of the flow, reference may be made to the description of the foregoing steps S301 to S309, which is not repeated here.
In one possible implementation, interface 1 may also be a card interface. The card interface refers to a user interface on which the electronic device 100 can display one or more cards (e.g., a flight card, a schedule card, a weather card, etc.). The electronic device 100 may execute the flow of the YOYO quick payment method provided by the present application based on the interface 1. This is not limited by the present application.
In one possible implementation, after electronic device 100 displays interface 2, electronic device 100 may receive and respond to an input by a user acting on interface 2, no longer displaying interface 2, but rather displaying a desktop.
For example, as shown in fig. 6K, the electronic device 100 may receive a touch operation in which the user slides upward from the lower edge of the user interface 420. In response to the touch operation, the electronic device 100 may display the desktop 400 instead of the user interface 420. For the description of the desktop 400, reference may be made to the foregoing description of the embodiment shown in fig. 4A, and details are not repeated here.
In one possible implementation, the electronic device 100 may be configured with a primary screen and a secondary screen, where the primary screen faces the same direction as the front camera and the secondary screen faces the same direction as the rear camera. When the electronic device 100 determines that the current screen state is bright and unlocked, and has determined the gesture 1 of the user, the orientation of the screen, and the hovering time, the electronic device 100 may capture one or more frames of images through the rear camera. The electronic device 100 may identify the code scanning device through the one or more frames of images. When the electronic device 100 recognizes the code scanning device and determines that the distance between the code scanning device and the electronic device 100 is within the distance threshold 1, the electronic device 100 may run application 1 and display the payment two-dimensional code 1 on the primary screen and/or the secondary screen. The code scanning device can perform a payment operation based on the payment two-dimensional code 1 on the primary screen or the secondary screen.
When the payment two-dimensional code 1 of application 1 is displayed on both the primary screen and the secondary screen, the electronic device 100 may receive the user's input on the primary screen. In response to the input, the electronic device 100 may run application 2, so that the primary screen and the secondary screen display the payment two-dimensional code 2 and no longer display the payment two-dimensional code 1. The code scanning device can perform a payment operation based on the payment two-dimensional code 2. In this way, the user can switch payment applications more conveniently, improving payment efficiency.
In some embodiments, before the electronic device 100 performs the foregoing steps S301 to S309, the electronic device 100 may display a fourth interface, where the fourth interface includes a first control. The electronic device 100 may receive and respond to a second input by the user on the first control to set application 1 as the default application for amount payment.
Illustratively, taking application 1 as an example, as shown in fig. 7A, the electronic device 100 may display a desktop 400. For the description of the desktop 400, reference may be made to the description of the embodiment shown in fig. 4A, and details are not repeated here.
The electronic apparatus 100 may receive a touch operation (e.g., a click) by the user on the setting application icon 701. In response to the touch operation, the electronic apparatus 100 may display a setting interface.
As shown in fig. 7B, the electronic device 100 may display a settings interface 710. The settings interface 710 may include one or more settings entries (e.g., a flight mode settings entry, a Wi-Fi settings entry, a Bluetooth settings entry, a mobile network settings entry, a do-not-disturb mode settings entry, a display and brightness settings entry, an account settings entry, and a YOYO quick payment settings entry 711). The electronic device 100 may receive a touch operation (e.g., a click) by the user on the YOYO quick payment settings entry 711. In response to the touch operation, the electronic device 100 may display a YOYO quick payment settings interface.
As shown in fig. 7C, the electronic device 100 may display a YOYO quick payment settings interface 720 (which may also be referred to as a fourth interface). The YOYO quick payment settings interface can include a page title, a return control 721, a YOYO quick payment settings control 722, a quick payment mode information column 723, and a disclaimer entry. The page title may be the text "YOYO quick pay". The return control 721 may receive a touch operation (e.g., a click) by the user, so that the electronic device 100 returns to the previous page in response to the touch operation. The text on the YOYO quick payment settings control 722 may be the prompt "OFF", indicating that YOYO quick payment is disabled. The YOYO quick payment settings control 722 may be used to receive a touch operation (e.g., a click) by the user, so that the electronic device 100 enables or disables the YOYO quick payment function in response to the touch operation.
As shown in figs. 7D-7E, when electronic device 100 receives and responds to a touch operation (e.g., a click) on the YOYO quick payment settings control 722, electronic device 100 may display a prompt box 731 on the YOYO quick payment settings interface 720. The prompt box 731 may include an application 1 icon and a corresponding control 731A (which may also be referred to as a first control), an application 2 icon and a corresponding control 731B, a cancel control 731C, a confirm control 731D, and the like. When electronic device 100 receives a touch operation (which may also be referred to as a second input, e.g., a click) by the user on control 731A and a touch operation (e.g., a click) on the confirm control 731D, electronic device 100 may set application 1 as the default application for amount payment. As shown in fig. 7F, the electronic device 100 may display the information of application 1 in the quick payment mode information column 723 of the YOYO quick payment settings interface 720, for example, the name "Alipay". The text on the YOYO quick payment settings control 722 may then be the prompt "ON", indicating that YOYO quick payment is enabled.
In one possible implementation, when the electronic device 100 receives and responds to the user's input to set application 1 as the default application for amount payment, and application 1 is not bound to a bank card account, the electronic device 100 may display a prompt box on the YOYO quick payment settings interface. The prompt box may include a control for binding a bank card account.
Illustratively, taking application 1 as an example, as shown in fig. 7G, electronic device 100 may display a prompt box 741 on the YOYO quick payment settings interface 720. The prompt box 741 may include a text prompt such as "your payment account is not bound to a bank card; this function will be available after a card is bound", a control 741A, and a control 741B. Controls 741A and 741B may be used to receive an input (e.g., a click) by the user, so that electronic device 100 performs a corresponding operation in response to the input. When electronic device 100 receives and responds to an input (e.g., a click) by the user on control 741B, electronic device 100 can bind the designated bank card account.
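On an Android-based device, persisting the user's choice of default payment application could be sketched as follows using the standard SharedPreferences API; the preference file name and key are assumptions for illustration.

```kotlin
import android.content.Context

// Hypothetical helper for storing the default payment application chosen in figs. 7D-7F.
object QuickPaySettings {
    private const val PREFS = "yoyo_quick_pay"       // assumed preference file name
    private const val KEY_DEFAULT_APP = "default_payment_app"

    fun setDefaultPaymentApp(context: Context, packageName: String) {
        context.getSharedPreferences(PREFS, Context.MODE_PRIVATE)
            .edit()
            .putString(KEY_DEFAULT_APP, packageName) // e.g., application 1's package name
            .apply()
    }

    fun getDefaultPaymentApp(context: Context): String? =
        context.getSharedPreferences(PREFS, Context.MODE_PRIVATE)
            .getString(KEY_DEFAULT_APP, null)
}
```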
Next, a software architecture that can be applied to the electronic device 100 according to the embodiment of the present application is described.
Referring to fig. 8, fig. 8 is a diagram illustrating an exemplary software architecture provided by an embodiment of the present application.
As shown in fig. 8, the software architecture provided in this embodiment of the present application may include an application layer, an application framework layer, a hardware abstraction layer, a hardware layer, a low power consumption awareness module, and a Trusted Execution Environment (TEE).
The application layer may include a series of application packages. As shown in fig. 8, the application packages may include a calculator, a camera, a calendar, weather, a memo, an application 1 for money payment (e.g., a third-party payment application), and a YOYO quick payment application.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions. As shown in fig. 8, in the embodiment of the present application, the application framework layer may include a fusion awareness module. The fusion perception module can comprise a northbound interface, a screen display state detection module, a wrist-turning action detection module, a screen orientation/hovering detection module, a screen two-dimensional code detection module, a distance detection module, a visual angle perception module and a data bus. The functions of the modules will be described in detail in the following embodiments, which are not repeated herein.
The hardware abstraction layer may provide an interface between other modules and the hardware layer. As shown in fig. 8, the hardware abstraction layer may include a sensor interface module, an audio interface module, and an Always On (AO) module. Regarding the functions of the AO module and the sensor interface module, detailed descriptions will be given in the following embodiments, and will not be repeated herein.
The hardware layer may comprise a series of hardware modules. As shown in fig. 8, the hardware layer may include a measurement device module and a camera module. The measurement device module may include a gyroscope sensor, an acceleration sensor, a gravity sensor, a TOF sensor, and the like. Regarding the functions of each sensor, reference may be made to the descriptions in the foregoing steps S301 to S309, which are not repeated here.
The low power consumption sensing module may be configured to acquire data information detected by each sensor in the hardware layer. The low power consumption sensing module may include a screen orientation/hover module, a distance detection module, and a wrist rollover recognition module. The functions of the modules will be described in detail in the following embodiments, and are not described herein again.
The TEE may provide a secure environment for the operation of data and storage of information that is isolated from other modules. In an embodiment of the present application, the TEE may include a code scanning device identification module and a suspected code scanning device identification module. The functions of the modules will be described in detail in the following embodiments, and are not described herein again.
Next, taking gesture 1 as a wrist flipping gesture as an example, a module interaction flow provided by the embodiment of the present application is described with reference to the software architecture diagram shown in fig. 8.
S1, the YOYO quick payment application can send a screen-display-state detection data instruction to the screen display state detection module in the fusion perception module, triggering the screen display state detection module to detect the screen display state.
Specifically, for the description of the screen display state, reference may be made to the description in step S301, and details are not repeated here.
S2, the screen display state detection module can acquire the current screen display state through a preset mechanism (e.g., a broadcast mechanism) and send the screen display state information to the YOYO quick payment application.
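On Android, the "broadcast mechanism" mentioned in step S2 could correspond to listening for the standard screen and unlock broadcasts, as in the following sketch; the callback shape is an assumption.

```kotlin
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.content.IntentFilter

// Illustrative screen-display-state detection via Android broadcasts (step S2).
class ScreenStateReceiver(private val onState: (String) -> Unit) : BroadcastReceiver() {
    override fun onReceive(context: Context, intent: Intent) {
        when (intent.action) {
            Intent.ACTION_SCREEN_ON -> onState("screen on")
            Intent.ACTION_SCREEN_OFF -> onState("screen off")
            Intent.ACTION_USER_PRESENT -> onState("unlocked") // bright and unlocked state
        }
    }
}

fun registerScreenStateReceiver(context: Context, receiver: ScreenStateReceiver) {
    val filter = IntentFilter().apply {
        addAction(Intent.ACTION_SCREEN_ON)
        addAction(Intent.ACTION_SCREEN_OFF)
        addAction(Intent.ACTION_USER_PRESENT)
    }
    context.registerReceiver(receiver, filter)
}
```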
S3, when the YOYO quick payment application determines that the current screen display state is bright and unlocked, it can send a data instruction triggering wrist-flip detection to the wrist-flip action detection module in the fusion perception module. After receiving the data instruction, the wrist-flip action detection module may trigger the wrist-flip recognition module in the low-power-consumption sensing module to communicate with the sensor interface module in the hardware abstraction layer, so that the sensor interface module drives the measurement device module in the hardware layer to detect the data (e.g., acceleration data) used to determine whether the user has flipped the wrist.
S4, when the measurement device module detects the data (e.g., acceleration data) used to determine whether the user has flipped the wrist, it can send the data to the wrist-flip recognition module in the low-power-consumption sensing module through the hardware abstraction layer. After receiving the data, the wrist-flip recognition module can send it to the wrist-flip action detection module in the fusion perception module, so that the wrist-flip action detection module determines whether the user has flipped the wrist based on the data and sends the result to the YOYO quick payment application.
Specifically, for the related description of the wrist-flipping motion detection process, reference may be made to the description in step S302, and details are not repeated here.
In a possible implementation manner, the wrist-flipping recognition module in the low-power-consumption sensing module may also determine whether the user has flipped the wrist based on the data information, and then send the determination result to the YOYO fast payment through the wrist-flipping motion detection module in the fusion sensing module. This is not limited by the present application.
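A minimal sketch of wrist-flip detection from acceleration samples (steps S3-S4) is shown below; the z-axis heuristic and the threshold values are assumptions for illustration, not the application's actual algorithm.

```kotlin
// Sketch of wrist-flip detection from acceleration samples; thresholds are assumptions.
data class Sample(val timestampMs: Long, val z: Float) // z: acceleration along the screen normal

class WristFlipDetector(private val windowMs: Long = 800) {
    private val samples = ArrayDeque<Sample>()

    // Reports a flip when, within the time window, the screen normal goes from
    // pointing roughly away from the sky (z strongly negative) to roughly toward it.
    fun onSample(s: Sample, threshold: Float = 7f): Boolean {
        samples.addLast(s)
        while (s.timestampMs - samples.first().timestampMs > windowMs) samples.removeFirst()
        return samples.any { it.z < -threshold } && s.z > threshold
    }
}

fun main() {
    val detector = WristFlipDetector()
    println(detector.onSample(Sample(0, -9.5f)))  // false: screen facing away
    println(detector.onSample(Sample(300, 9.5f))) // true: flipped within the window
}
```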
S5, after the YOYO quick payment application receives the result, sent by the wrist-flip action detection module, determining that the user has flipped the wrist, it can send a data instruction triggering screen orientation/hover detection to the screen orientation/hover detection module in the fusion perception module. After receiving the data instruction, the screen orientation/hover detection module may trigger the screen orientation/hover detection module in the low-power-consumption sensing module to communicate with the sensor interface module in the hardware abstraction layer, so that the sensor interface module drives the measurement device module in the hardware layer to detect the data (e.g., acceleration data) used to determine the orientation of the screen and the hovering time.
And S6, when the measuring and calculating device module detects the data information (for example, acceleration data) for judging the screen orientation and the hovering, the data information can be sent to the screen orientation/hovering detection module in the low-power-consumption sensing module through the hardware abstraction layer. After receiving the data information, the screen orientation/hover detection module in the low power consumption sensing module may send the data information to the screen orientation/hover detection module in the fusion sensing module, so that the screen orientation/hover detection module in the fusion sensing module determines the orientation and hover time of the screen based on the data information, and sends the result to the YOYO fast payment.
Specifically, for the description about the screen orientation and the hover detection process of the electronic device 100, reference may be made to the description in step S303, and details are not repeated here.
In a possible implementation manner, the screen orientation/hover detection module in the low power consumption sensing module may also determine the orientation and hover time of the screen based on the data information, and then send the determination result to the YOYO fast payment through the screen orientation/hover detection module in the fusion sensing module. This is not limited by the present application.
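The screen-orientation and hover determination of steps S5-S6 can be sketched as follows; the tilt computation from the gravity vector is standard, while the stillness criterion for hovering is an illustrative assumption.

```kotlin
import kotlin.math.abs
import kotlin.math.acos
import kotlin.math.sqrt

const val G = 9.81f

// Tilt of the device: angle between the device z-axis (screen normal) and the
// vertical, computed from the gravity vector reported by the gravity sensor.
fun screenTiltDegrees(gx: Float, gy: Float, gz: Float): Double {
    val norm = sqrt((gx * gx + gy * gy + gz * gz).toDouble())
    return Math.toDegrees(acos(gz / norm)) // 0° when the device lies flat, screen up
}

// Hovering: the most recent acceleration magnitudes stay close to g (the device
// is nearly still) for at least minHoverMs.
fun isHovering(magnitudes: List<Float>, sampleIntervalMs: Long, minHoverMs: Long): Boolean {
    val stillSamples = magnitudes.takeLastWhile { abs(it - G) < 0.5f }.size
    return stillSamples * sampleIntervalMs >= minHoverMs
}

fun main() {
    println(screenTiltDegrees(0f, 0f, 9.81f))          // ≈ 0.0 degrees
    println(isHovering(List(25) { 9.8f }, 100, 2_000)) // true: still for 2.5 s
}
```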
S7, when the YOYO quick payment application receives, from the screen orientation/hover detection module in the fusion perception module, a determination that the included angle between the screen orientation and the ground is within the specified angle range (e.g., 90 to 180 degrees) and that the hovering time is greater than the preset time value 1 (e.g., 2 seconds), it can send a code scanning device recognition data instruction to the visual angle perception module in the fusion perception module. After receiving the data instruction, the visual angle perception module may trigger the AO service module in the hardware abstraction layer to capture an image 1 through the camera module in the hardware layer. Then, the camera module may send the captured image 1 to the suspected code scanning device recognition module in the TEE for code scanning device recognition.
S8, after the suspected code scanning device recognition module in the TEE performs the code scanning device recognition operation based on the image 1, it can send the recognition result to the visual angle perception module in the fusion perception module through the AO service module, so that the YOYO quick payment application obtains the recognition result.
Specifically, for the description of the code scanning device identification process, reference may be made to the description in step S304, and details are not repeated here.
S9, when the YOYO quick payment application receives, from the visual angle perception module, a determination that the image 1 includes a code scanning device, it sends a data instruction for further confirming the code scanning device to the visual angle perception module in the fusion perception module. After receiving the data instruction, the visual angle perception module may trigger the AO service module in the hardware abstraction layer to capture a plurality of images through the camera module in the hardware layer. Then, the camera module may send the captured images to the code scanning device recognition module in the TEE for further confirmation of the code scanning device. The captured images include the image 1 and an image 3.
S10, after the code scanning device recognition module in the TEE performs the recognition and confirmation operation based on the plurality of images, it can send the result to the visual angle perception module in the fusion perception module through the AO service module, so that the YOYO quick payment application obtains the result.
Specifically, for the description of the code scanning device identification process, reference may be made to the description in step S304, and details are not repeated here.
S11, after the YOYO quick payment application obtains the determination that the captured images include the code scanning device, it can send a data instruction triggering detection of the distance between the electronic device 100 and the code scanning device to the distance detection module in the fusion perception module. After receiving the data instruction, the distance detection module may send a distance detection instruction to the distance detection module in the low-power-consumption sensing module, so that the low-power-consumption sensing module can drive, through the sensor interface module in the hardware abstraction layer, the measurement device module in the hardware layer to detect and acquire the distance data between the electronic device 100 and the code scanning device.
S12, after the measurement device module acquires the distance data between the electronic device 100 and the code scanning device, it can send the distance data through the hardware abstraction layer to the distance detection module in the low-power-consumption sensing module, which forwards it to the distance detection module in the fusion perception module. The latter can determine whether the distance is within the distance threshold 1 (e.g., 8 cm to 20 cm) and send the determination result to the YOYO quick payment application.
Specifically, the obtaining of the distance data may refer to the related description in the step S306, and is not repeated herein.
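The TOF distance computation underlying steps S11-S12 follows from distance = (speed × flight time) / 2, since the pulse travels to the target and back. A sketch, with example threshold values:

```kotlin
// Sketch of the TOF distance check in steps S11-S12; threshold values are examples.
const val SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

// The pulse covers the distance twice (out and back), hence the division by 2.
fun tofDistanceMeters(flightTimeSeconds: Double): Double =
    SPEED_OF_LIGHT_M_PER_S * flightTimeSeconds / 2.0

fun withinDistanceThreshold1(distanceMeters: Double): Boolean =
    distanceMeters in 0.08..0.20 // preset range 1: 8 cm to 20 cm

fun main() {
    val t = 1.0e-9                 // 1 ns round trip
    val d = tofDistanceMeters(t)   // ≈ 0.15 m
    println("distance = $d m, in range = ${withinDistanceThreshold1(d)}")
}
```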
S13, after the YOYO quick payment application receives, from the distance detection module in the fusion perception module, the result determining that the distance between the electronic device 100 and the code scanning device is within the distance threshold 1 (e.g., 8 cm to 20 cm), it can send a screen two-dimensional code detection data instruction to the screen two-dimensional code detection module in the fusion perception module. After receiving the data instruction, the screen two-dimensional code detection module may capture an image of the interface 1 currently displayed by the electronic device 100 and detect whether a two-dimensional code is displayed on the interface 1.
S14, the screen two-dimensional code detection module can send data information including the screen two-dimensional code detection result to the YOYO quick payment application.
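Detecting whether the currently displayed interface already shows a two-dimensional code (steps S13-S14) can be sketched with the open-source ZXing decoder; the screenshot capture that supplies the pixel buffer is assumed to happen elsewhere.

```kotlin
import com.google.zxing.BinaryBitmap
import com.google.zxing.NotFoundException
import com.google.zxing.RGBLuminanceSource
import com.google.zxing.common.HybridBinarizer
import com.google.zxing.qrcode.QRCodeReader

// Sketch of steps S13-S14: does the screenshot of interface 1 contain a QR code?
// `pixels` is the ARGB pixel buffer of the captured screen image.
fun screenShowsQrCode(pixels: IntArray, width: Int, height: Int): Boolean {
    val source = RGBLuminanceSource(width, height, pixels)
    val bitmap = BinaryBitmap(HybridBinarizer(source))
    return try {
        QRCodeReader().decode(bitmap) // succeeds only if a QR code is found
        true
    } catch (e: NotFoundException) {
        false
    }
}
```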
In some embodiments, the code scanning device confirmation process of S9-S10, the distance detection process of S11-S12, and the screen two-dimensional code detection process of S13-S14 may be performed in parallel. When the electronic device 100 determines that the images do not include a code scanning device, and/or that the distance between the code scanning device and the electronic device 100 is not within the preset distance range 1 (for example, 8 cm to 20 cm), and/or that a two-dimensional code is already displayed on the interface 1, the electronic device 100 may terminate the flow of the YOYO quick payment method provided by the present application.
S15, when the YOYO quick payment application determines that the distance between the code scanning device and the electronic device 100 is within the preset distance threshold 1 (e.g., 8 cm to 20 cm) and that no two-dimensional code is displayed on the interface 1, it may send a data instruction to application 1 (e.g., a payment application), so that application 1 runs and the interface 2 corresponding to application 1 is displayed. The interface 2 may include the payment two-dimensional code 1 and a floating window. The electronic device 100 may perform a payment operation based on the payment two-dimensional code 1 and the code scanning device.
Specifically, for the description of the application 1, the interface 2, the payment two-dimensional code 1, the floating window, and the payment operation, reference may be made to the related description in the foregoing steps S308 to S309, and details are not repeated herein.
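Generating the payment two-dimensional code 1 displayed on interface 2 could be sketched with ZXing's QRCodeWriter; the payload layout combining the application identifier and the account ID is an assumption based on the description above, not the application's actual encoding.

```kotlin
import com.google.zxing.BarcodeFormat
import com.google.zxing.qrcode.QRCodeWriter

// Sketch of producing the payment two-dimensional code 1 shown on interface 2 (S15).
// The "appId|accountId" payload format is hypothetical.
fun buildPaymentQrMatrix(appId: String, accountId: String, sizePx: Int = 400) =
    QRCodeWriter().encode("$appId|$accountId", BarcodeFormat.QR_CODE, sizePx, sizePx)

fun main() {
    val matrix = buildPaymentQrMatrix("application-1", "designated-account-1")
    println("QR matrix ${matrix.width}x${matrix.height}") // render to a bitmap in the UI layer
}
```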
Next, another software architecture that can be applied to the electronic device 100 provided by the embodiment of the present application is described.
Referring to fig. 9, fig. 9 is a diagram illustrating an exemplary software architecture provided by an embodiment of the present application.
As shown in fig. 9, the software architecture provided in this embodiment of the present application may include an application layer, an application framework layer, a hardware abstraction layer, a hardware layer, a low power consumption awareness module, and a Trusted Execution Environment (TEE). In the software architecture, the application layer may include a YOYO quick payment application and a fusion-aware application. The fusion awareness application may include a northbound interface, a screen display state detection module, a wrist-flipping action detection module, a screen orientation/hover detection module, a screen two-dimensional code detection module, a distance detection module, a perspective awareness module, and a data bus. The functions of the modules will be described in detail in the following embodiments, which are not repeated herein.
For the descriptions of the application layer, the application framework layer, the hardware abstraction layer, the hardware layer, the low power consumption sensing module and the trusted execution environment, reference may be made to the related description in the foregoing embodiment of fig. 8, which is not repeated herein.
Next, a module interaction flow provided by the embodiment of the present application is described in conjunction with the software architecture diagram shown in fig. 9.
Step S1 to step S12 can refer to the description in step S1 to step S12 shown in the embodiment of fig. 8, and are not repeated herein.
S13, after receiving, from the distance detection module in the fusion perception module, the result determining that the distance between the electronic device 100 and the code scanning device is within the distance threshold 1 (e.g., 8 cm to 20 cm), the YOYO quick payment application can send a screen two-dimensional code detection data instruction to the screen two-dimensional code detection module in the fusion perception module. After receiving the data instruction, the screen two-dimensional code detection module may call a specified interface of the application framework layer to capture an image of the interface 1 currently displayed by the electronic device 100.
S14, after the screen two-dimensional code detection module receives the image captured through the specified interface in the application framework layer, it detects whether a two-dimensional code is displayed on the interface 1. The screen two-dimensional code detection module can then send data information including the screen two-dimensional code detection result to the YOYO quick payment application.
Step S15 may refer to the description of step S15 in the embodiment shown in fig. 8, and is not described herein again.
In one possible implementation, a fusion awareness module may also be included in the YOYO quick payment. This is not limited by the present application.
In one possible implementation, data instructions of the modules in the low-power-consumption sensing module (e.g., the screen orientation/hover detection module, the distance detection module, and the wrist-flip recognition module) may trigger, via the application framework layer and the hardware abstraction layer, the measurement device module in the hardware layer to detect data. The detected data may then be sent back to these modules via the hardware abstraction layer and the application framework layer.
In one possible implementation, a sensor driver module may be disposed alongside the low-power-consumption sensing module. Data instructions of the modules in the low-power-consumption sensing module (e.g., the screen orientation/hover detection module, the distance detection module, and the wrist-flip recognition module) may trigger, via the sensor driver module, the measurement device module in the hardware layer to detect data. The detected data may then be sent back to these modules via the sensor driver module.
As used in the above embodiments, the term "when …" may be interpreted to mean "if …" or "after …" or "in response to determination …" or "in response to detection of …", depending on the context. Similarly, the phrase "in determining …" or "if (a stated condition or event) is detected" may be interpreted to mean "if … is determined" or "in response to … is determined" or "in response to (a stated condition or event) is detected", depending on the context.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, cause the processes or functions described in accordance with the embodiments of the application to occur, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, fiber optic, digital subscriber line) or wirelessly (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by hardware related to instructions of a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. And the aforementioned storage medium includes: various media capable of storing program codes, such as ROM or RAM, magnetic or optical disks, etc.

Claims (15)

1. A payment method, comprising:
the method comprises the steps that the electronic equipment displays a first interface, wherein the first interface is not a screen locking interface;
the electronic device detects a first operation;
in response to the first operation, the electronic device detecting a state of the electronic device;
the electronic equipment determines that the state of the electronic equipment meets a first preset condition, and a first camera of the electronic equipment acquires a first image;
the electronic device determines that the first image includes a first target object;
the electronic equipment displays a second interface, wherein the second interface comprises a two-dimensional code.
2. The method according to claim 1, wherein the first preset condition comprises:
an included angle between a display screen of the electronic device and the ground plane is smaller than a first threshold value, and the time of the electronic device in the hovering state is larger than a second threshold value.
3. The method according to claim 1 or 2, wherein before the electronic device displays the second interface, the method further comprises:
a second camera of the electronic equipment acquires a second image and a third image, wherein the second image is different from the first image, and the third image is different from the first image;
the electronic device determines that the second image and the third image both include a first target object.
4. The method of any of claims 1-3, wherein before the electronic device displays the second interface, the method further comprises:
determining that a distance between the electronic device and the first target object is less than a third threshold.
5. The method according to any one of claims 1-4, wherein the first interface does not include a two-dimensional code.
6. The method of claim 3, wherein the second camera is the same as the first camera.
7. The method of claim 3, wherein the first camera is a low power camera and the second camera is a non-low power camera.
8. The method according to claim 4, wherein the determining specifically comprises:
the electronic device transmits a pulse wave through a TOF sensor;
the electronic device acquires the flight time from the emission of the pulse wave until the pulse wave is reflected by the first target object back to the TOF sensor;
the electronic device determines, based on the flight time and the velocity of the pulse wave, that the distance between the electronic device and the first target object is less than the third threshold.
9. The method of claim 6, wherein before the electronic device displays the second interface, the method further comprises:
the electronic device detects the first interface;
and when the electronic device determines that the first interface does not display the two-dimensional code, the electronic device displays the second interface.
10. The method of claim 1, further comprising:
the electronic device sends the information of the two-dimensional code to a first server based on the first target object, wherein the information of the two-dimensional code comprises an identifier of a first application and information of a first account, and the first server is a server corresponding to the first application;
when the information of the first account triggers the first server to determine that the amount in the first account is equal to or greater than a first amount and to transfer the first amount from the first account to a second account, the electronic device receives a payment success prompt sent by the first server (a sketch of this server-side decision follows the claims).
11. The method of claim 10, wherein the first interface comprises an icon of a second application, and the method further comprises:
when the information of the first account triggers the first server to determine that the amount in the first account is smaller than the first amount, the electronic device receives a payment failure prompt sent by the first server;
the electronic device receives a first input by a user acting on the icon of the second application;
in response to the first input, the electronic device displays a third interface of the second application, wherein the third interface comprises a two-dimensional code corresponding to the second application.
12. The method of claim 1, wherein before the electronic device, in the unlocked state, displays the first interface, the method further comprises:
the electronic device displays a fourth interface, wherein the fourth interface comprises a first control;
the electronic device receives a second input by the user for the first control;
in response to the second input, the electronic device selects the first application as a default payment application.
13. An electronic device, comprising: one or more processors, one or more sensors, one or more memories, a display screen, and a transceiver, wherein the one or more memories are coupled to the one or more processors and are configured to store computer program code, the computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any of claims 1-12.
14. A computer storage medium comprising computer instructions that, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-12.
15. A computer program product, wherein when the computer program product runs on an electronic device, the electronic device is caused to perform the method of any of claims 1-12.
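
The sketches below illustrate several of the claimed computations. They are illustrative only, not part of the claims; every name, type, and threshold value in them is an assumption introduced for illustration. Claim 1 as a whole is a gated pipeline: a trigger operation, a device-state check, an image capture, a target check, and then the two-dimensional-code interface. A minimal Kotlin sketch of that control flow, with all collaborators (StateSensor, Camera, TargetDetector, Display) as hypothetical stand-ins:

    // Hypothetical collaborators; claim 1 fixes only the order of the checks.
    interface StateSensor { fun meetsFirstPresetCondition(): Boolean }
    interface Camera { fun acquireImage(): ByteArray }
    interface TargetDetector { fun containsFirstTargetObject(image: ByteArray): Boolean }
    interface Display { fun showSecondInterfaceWithQrCode() }

    class PaymentTrigger(
        private val sensor: StateSensor,
        private val firstCamera: Camera,
        private val detector: TargetDetector,
        private val display: Display
    ) {
        // Called in response to the "first operation" while a
        // non-lock-screen first interface is shown.
        fun onFirstOperation() {
            if (!sensor.meetsFirstPresetCondition()) return   // state gate
            val firstImage = firstCamera.acquireImage()       // first-camera capture
            if (!detector.containsFirstTargetObject(firstImage)) return
            display.showSecondInterfaceWithQrCode()           // second interface
        }
    }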
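
The first preset condition of claim 2 combines a tilt check with a stillness check. A minimal Kotlin sketch, assuming a gravity vector (gx, gy, gz) in the device coordinate frame and a caller-maintained stillness timer; the function names and the concrete threshold values are hypothetical, since the claim only names a first and a second threshold:

    import kotlin.math.acos
    import kotlin.math.sqrt

    // Hypothetical threshold values standing in for the claim's
    // "first threshold" and "second threshold".
    const val MAX_TILT_DEGREES = 30.0
    const val MIN_HOVER_MILLIS = 800L

    // Included angle between the display screen and the ground plane,
    // derived from a gravity vector in device coordinates. Lying flat
    // with the screen facing up puts gravity on the z axis, giving 0 degrees.
    fun screenTiltDegrees(gx: Double, gy: Double, gz: Double): Double {
        val norm = sqrt(gx * gx + gy * gy + gz * gz)
        val cosTilt = (gz / norm).coerceIn(-1.0, 1.0)
        return Math.toDegrees(acos(cosTilt))
    }

    // First preset condition: nearly flat, and held still long enough.
    // hoverMillis is how long motion has stayed below a stillness
    // threshold, tracked by the caller from accelerometer events.
    fun meetsFirstPresetCondition(gx: Double, gy: Double, gz: Double, hoverMillis: Long): Boolean =
        screenTiltDegrees(gx, gy, gz) < MAX_TILT_DEGREES && hoverMillis > MIN_HOVER_MILLIS

On Android such a gravity vector is available from a TYPE_GRAVITY sensor, but any equivalent source serves for the check itself.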
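
The distance determination of claim 8 reduces to the standard time-of-flight relation: the pulse covers the path twice, so the one-way distance is the pulse velocity multiplied by the time of flight, divided by two. A worked Kotlin sketch, assuming an optical pulse (velocity c) and a round-trip time reported in seconds; the 0.5 m default for the "third threshold" is an illustrative guess:

    // Speed of light in m/s; the claim only speaks of "the velocity of
    // the pulse wave", which for an optical TOF sensor is c.
    const val PULSE_SPEED_M_PER_S = 299_792_458.0

    // One-way distance: the pulse travels out and back, hence the division by two.
    fun distanceMeters(roundTripSeconds: Double): Double =
        PULSE_SPEED_M_PER_S * roundTripSeconds / 2.0

    // Comparison against the "third threshold" of claim 4.
    fun withinScanRange(roundTripSeconds: Double, thirdThresholdMeters: Double = 0.5): Boolean =
        distanceMeters(roundTripSeconds) < thirdThresholdMeters

For example, a round trip of 2 nanoseconds yields 299792458 x 2e-9 / 2, roughly 0.3 m, comfortably inside a typical code-scanning distance.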
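
Claims 10 and 11 fix the server-side decision: the two-dimensional code carries an identifier of the first application and information of the first account, and the first server transfers the first amount only when the first account covers it, otherwise returning a failure that lets the device fall back to the second application. A minimal Kotlin sketch with in-memory balances; every type and member name here (QrCodeInfo, FirstServer.settle, and so on) is hypothetical:

    // Hypothetical representation of the data carried by the two-dimensional code.
    data class QrCodeInfo(val appId: String, val accountId: String)

    sealed interface PaymentResult
    data class PaymentSuccess(val amount: Long) : PaymentResult
    data class PaymentFailure(val reason: String) : PaymentResult

    // Stand-in for the "first server"; balances are kept in memory
    // purely for illustration.
    class FirstServer(private val balances: MutableMap<String, Long>) {
        fun settle(info: QrCodeInfo, secondAccountId: String, firstAmount: Long): PaymentResult {
            val balance = balances[info.accountId] ?: return PaymentFailure("unknown account")
            return if (balance >= firstAmount) {
                // Amount is sufficient: transfer and report success (claim 10).
                balances[info.accountId] = balance - firstAmount
                balances[secondAccountId] = (balances[secondAccountId] ?: 0L) + firstAmount
                PaymentSuccess(firstAmount)
            } else {
                // Insufficient balance: the device can then offer the
                // second application's code instead (claim 11).
                PaymentFailure("insufficient balance")
            }
        }
    }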
CN202110905685.4A 2021-08-06 2021-08-06 Payment method and related device Active CN115705567B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110905685.4A CN115705567B (en) 2021-08-06 2021-08-06 Payment method and related device
PCT/CN2022/092134 WO2023010935A1 (en) 2021-08-06 2022-05-11 Payment method and related apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110905685.4A CN115705567B (en) 2021-08-06 2021-08-06 Payment method and related device

Publications (2)

Publication Number Publication Date
CN115705567A (en) 2023-02-17
CN115705567B (en) 2024-04-19

Family

ID=85154232

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110905685.4A Active CN115705567B (en) 2021-08-06 2021-08-06 Payment method and related device

Country Status (2)

Country Link
CN (1) CN115705567B (en)
WO (1) WO2023010935A1 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2003236887A1 (en) * 2002-05-22 2003-12-02 Yates Web Marketing Limited Internet payment
US20160019512A1 (en) * 2013-04-21 2016-01-21 SSI America INC. Transaction facilitation methods and apparatuses
CN106096946A * 2016-06-06 2016-11-09 Zhuhai Meizu Technology Co., Ltd. Electronic payment method and payment device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017124899A1 * 2016-01-20 2017-07-27 Nubia Technology Co., Ltd. Information processing method, apparatus and electronic device
US20190019177A1 (en) * 2017-07-13 2019-01-17 Samsung Electronics Co., Ltd. Electronic device for displaying information and method thereof
CN108596584A * 2018-04-25 2018-09-28 Hefei Shangcheng Information Technology Co., Ltd. Method and system for information transactions between an e-commerce platform and a third-party payment platform
CN113177471A * 2021-04-28 2021-07-27 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Motion detection method, motion detection device, electronic device, and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116341586A * 2023-02-27 2023-06-27 Honor Device Co Ltd Code scanning method, electronic equipment and storage medium
CN116341586B * 2023-02-27 2023-12-01 Honor Device Co Ltd Code scanning method, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN115705567B (en) 2024-04-19
WO2023010935A1 (en) 2023-02-09
WO2023010935A9 (en) 2023-04-27

Similar Documents

Publication Publication Date Title
US20200177714A1 (en) Electronic device for performing operation based on status information thereof and operating method thereof
EP3996358B1 (en) Display method for foldable electronic device, and electronic device
EP3245106B1 (en) Device and method of controlling the device
WO2019134591A1 (en) Electronic transaction method and terminal
AU2016224175B2 (en) Electronic device and control method thereof
EP2765807A1 (en) Undesired NFC pairing
KR20100124591A (en) Mobile terminal system and control method thereof
US20230098616A1 (en) Method for Invoking NFC Application, Electronic Device, and NFC Apparatus
WO2021135618A1 (en) Interface display method and related apparatus
US11521165B2 (en) Information processing system and information processing method
WO2023010935A9 (en) Payment method and related apparatus
KR20100037489A (en) Navigation apparatus and method thereof
US20190195641A1 (en) Electronic device for providing operation information of vehicle and method for the same
US20210200189A1 (en) Method for determining movement of electronic device and electronic device using same
KR20140089975A (en) Apparatus and method for saving power battery of mobile telecommunication terminal
CN105103079B Low-power management of a multi-sensor integrated chip architecture
KR101832370B1 (en) Mobile Terminal
KR20170020039A (en) Method and program for providing real-time traffic informaion
KR20100050322A (en) Navigation apparatus and method thereof
US20150163340A1 (en) Portable terminal and a method for operating the same
WO2021244118A1 (en) Smart card sharing method, electronic device, and computer-readable storage medium
CN115708372A (en) Vehicle position sharing method, system, device, equipment and readable storage medium
CN111047328A (en) Mobile payment method, device, system and storage medium
US11568386B2 (en) Method and system for active NFC payment device management
US20220398563A1 (en) Method and system for active nfc payment device management

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant